Securing embedded systems based on Open System Architectures

Many of the standards developed by VITA working groups define modules that are part of Open System Architectures (OSA) - whether they are VME, VPX, PMC, FMC or one of many other standards. These modules are used to build critical embedded systems that are deployed in a variety of application platforms. Today these platforms are typically connected via a network, a network that is often susceptible to cyberattacks. This article introduces you to the high-level concerns and challenges facing designers using OSA modules.

Computer technology continues to grow more complex, making it harder each day to develop next-generation products. Advances in hardware and software make it possible to build systems that are highly capable, and more so with each iteration. Those same systems face increasing scale, computation, and security challenges that can quickly overwhelm a project.

Use of OSA modules based on popular industry standards, for both hardware and software, is a common method to address these challenges. OSA modules reduce the cycle time needed to develop new systems and insert new technology into legacy systems. The ecosystem of module suppliers creates a competitive marketplace that reduces program costs and spreads risk. Because OSAs use nonproprietary system architectural standards in which various payloads can be shared among various platforms, technology upgrades are easy to access and implement.

However, adding security to an OSA could interfere with its ‘openness.’ As most current security approaches are ad hoc, proprietary, and expensive, they are incompatible with OSA principles, especially when each platform developer individually implements and manages the platform security. Therefore, developing a system-level secure embedded system architecture that will seamlessly work with various OSA components is a challenge.

In past years, protecting real-time embedded systems was a lower priority because each system was largely isolated; now, security is of paramount importance in embedded computing systems. These systems are becoming more intelligent and are connected via wireless networks that are especially vulnerable. Combine that with the fact that major initiatives across multiple industries, from telecommunications to defense, are using OSA solutions to develop their next generation of platforms, and concerns arise about the security capability of OSA modules (Figure 1).

Figure 1: In an ideal secure embedded system design process, functionality (blue) and security (gold) are co-designed, yet they are appropriately decoupled during testing so that security does not interfere with functionality. This co-design is often difficult to achieve because functionality and security are two very different disciplines.

What is an Open System Architecture?

An OSA is any system (or software) architecture that exhibits the following three beneficial characteristics:

  1. It is modular, being decomposed into architectural components that are cohesive, loosely coupled with other components (and external systems), and encapsulate (hide) their implementations behind visible interfaces.
  2. Its key interfaces between architectural components conform to open interface standards (that is, consensus based, widely used, and easily available to potential users).
  3. Its key interfaces have been verified to conform to the associated open interface standards.1

Two of the most important benefits are to increase competition among developers of the architectural components and to avoid single sourcing when it comes to acquiring and updating these components.

The definition does not state that all interfaces must conform to open interface standards, but rather only key interfaces must be open. Moreover, if one examines actual system architectures, one quickly learns that openness is not black and white but rather a matter of degree. Some interfaces are highly open (i.e., they conform to widely used international standards), some interfaces are relatively open (i.e., they conform to less-widely used, more application-domain-specific standards or widely used de facto standards), some interfaces are slightly open (i.e., they conform to product-line-specific conventions), and some interfaces are essentially closed (i.e., they are system-unique or conform to contractor-proprietary “standards”).

Common OSA interfaces

There are countless standards that define hardware interfaces for OSAs, ranging from highly open to application-specific: USB, SATA, DisplayPort, HDMI, DVI, Ethernet, RS-232/422/485, and many more. Others define system, board, and software architectures: standards such as the VITA VPX family, PICMG’s Advanced Mezzanine Card, numerous small form factors for boards, and Linux for open software.

Challenges in securing embedded systems

Security has an asymmetric nature—an attacker can compromise a system by discovering a single, unexpected vulnerability, while a defender must defend against all vulnerabilities. Because it is impossible to correctly predict every future attack, securing an embedded system to prevent attacks is not a guarantee of a secure platform.

An embedded system provides very little, if any, margin for security, especially under size, weight, and power (SWaP) constraints; thus, security must not impose excessive overhead on the protected system. Security technologies must also be compatible with embedded systems that use the commercial off-the-shelf (COTS) processor hardware platforms common to OSAs.

Computer security threats can be divided into four categories, according to whether they threaten confidentiality, integrity, usability, or availability. Breaking security into these elements makes the evaluation of potential solutions easier and more effective.

Confidentiality is the property violated whenever information is disclosed to an unauthorized principal, whether that principal is a person or another computing device. Integrity is violated whenever information is altered in an unauthorized way, either at a host or in transit between devices. Usability is a qualitative measure of the system’s suitability to a task: a system that is highly secure but incapable of delivering the required functionality is not well designed. Usability metrics evaluate a design by considering throughput, resilience, portability, upgradability, SWaP, and similar parameters. Availability is the property that a system always honors legitimate requests by authorized principals; it is violated when an attacker succeeds in denying service to legitimate users, typically by exhausting the available resources.
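To make the integrity property concrete, the following sketch shows one common mechanism for detecting unauthorized alteration of data in transit: a keyed message authentication code (HMAC). The key, messages, and function names here are hypothetical illustrations, not from any particular OSA standard; in a real system the key would come from secure key storage, never a source-code literal.

```python
import hmac
import hashlib

# Hypothetical shared key, provisioned to both endpoints. In practice this
# would live in secure key storage, not in source code.
KEY = b"demo-shared-secret"

def protect(message: bytes) -> tuple:
    """Attach an HMAC-SHA256 tag so the receiver can detect tampering."""
    tag = hmac.new(KEY, message, hashlib.sha256).digest()
    return message, tag

def verify(message: bytes, tag: bytes) -> bool:
    """Recompute the tag; constant-time compare resists timing attacks."""
    expected = hmac.new(KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

msg, tag = protect(b"set actuator to 40%")
assert verify(msg, tag)                         # unmodified data passes
assert not verify(b"set actuator to 99%", tag)  # altered in transit: detected
```

Note that an HMAC addresses integrity only; confidentiality would additionally require encrypting the message, and availability requires entirely different defenses.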

Security is critical from the hardware through the layers of software all the way to the end application. Each is important to ensure the most secure system possible. To manage your risk, you must be sensitive to security threats through the entire system architecture.2


Security starts at the processor. A root of trust must be established to provide the security services upon which a robust security environment is built. Today’s mainstream processors typically include the hooks and features needed to support a robust security strategy.
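The idea behind a hardware root of trust can be sketched as a measured-boot hash chain: each stage measures the next before handing over control, extending a register the way a TPM extends a PCR. The stage names and images below are hypothetical stand-ins, and a real implementation runs in immutable boot ROM and silicon, not application Python.

```python
import hashlib

def extend(register: bytes, component_image: bytes) -> bytes:
    """PCR-style extend: new = SHA-256(old_register || SHA-256(image))."""
    return hashlib.sha256(
        register + hashlib.sha256(component_image).digest()
    ).digest()

# Hypothetical boot stages, each measured by the stage before it.
stages = [b"boot ROM", b"bootloader", b"kernel", b"application"]

register = bytes(32)  # starts at all zeros, like a TPM PCR
for image in stages:
    register = extend(register, image)
golden = register  # measurement recorded for a known-good configuration

# A tampered bootloader yields a different final measurement, so the
# platform (or a remote verifier) can refuse to release secrets to it.
register = bytes(32)
for image in [b"boot ROM", b"evil bootloader", b"kernel", b"application"]:
    register = extend(register, image)
assert register != golden
```

Because each measurement folds in every measurement before it, an attacker cannot swap one stage without changing the final value, which is what makes the chain anchor trust in the first, immutable stage.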


The next layer of defense is at the Basic Input/Output System (BIOS) level. Attacks on the BIOS are growing, with reports of intrusions becoming more common. The National Institute of Standards and Technology (NIST) provides security guidelines for updating the BIOS, the point at which the security threat is greatest. Through these guidelines, NIST SP 800-147, NIST sets standards that require authentication of BIOS update mechanisms.

BIOS providers have taken the security challenge seriously and offer suites of products providing multiple levels of security. They support the latest processor technology, which allows users to manage, inventory, diagnose, and repair their systems in efficient, remote, and streamlined ways, all without compromising system security. The BIOS providers support the NIST SP 800-147 guidelines and offer multiple other security options to protect flash and other storage devices. Users prefer to keep as much of the security responsibility at the hardware and BIOS level as possible because that is where the defense is strongest.
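An authenticated BIOS update gate of the kind NIST SP 800-147 calls for can be sketched as follows. This toy version checks a candidate image against vendor-published digests and refuses rollback to an older release; the version strings, image bytes, and allow-list are all hypothetical, and a real implementation verifies a vendor digital signature inside the protected flash-update routine itself, where software cannot bypass it.

```python
import hashlib

# Hypothetical vendor-published digests of approved BIOS releases.
APPROVED = {
    "2.1": hashlib.sha256(b"bios-image-2.1").hexdigest(),
    "2.2": hashlib.sha256(b"bios-image-2.2").hexdigest(),
}
installed_version = "2.1"

def accept_update(version: str, image: bytes) -> bool:
    """Gate a BIOS update: known release, intact image, no rollback."""
    if version not in APPROVED:
        return False                      # unknown release
    if hashlib.sha256(image).hexdigest() != APPROVED[version]:
        return False                      # image corrupted or tampered
    if version <= installed_version:      # toy lexicographic version compare
        return False                      # rollback to older firmware refused
    return True

assert accept_update("2.2", b"bios-image-2.2")
assert not accept_update("2.2", b"bios-image-2.2-trojan")
assert not accept_update("2.1", b"bios-image-2.1")  # rollback refused
```

The anti-rollback check matters because an attacker who can reinstall an older, signed-but-vulnerable image gains all of that image's known flaws.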

Operating system

Operating systems play many roles in providing increased levels of security. The most recent advancement became feasible with the introduction of multicore processors, which make it possible to run multiple operating system instances on one processor. This has led to hypervisor architectures that can protect key elements of the software environment (Figure 2). Operating systems must address:

  • Authentication – the process of ensuring that users, devices and software on a network are correctly identified.
  • Authorization – grants users and devices the right to access resources and perform specified actions.
  • Network Access Control – mechanisms that limit access to the network to authenticated and authorized devices, software and users.
  • Confidentiality – using ciphers to transform data to make it unreadable to anyone except those authorized and authenticated to view the data.
  • Integrity – checking mechanisms designed to detect unauthorized changes to transmitted data through the lifecycle of a device, its software and its data.
  • Remote Management – a method to remotely monitor, update and manage manufactured and fielded devices.

Figure 2: Typical application illustrates the vulnerable points in a system.
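The first two items on the list above, authentication and authorization, are distinct checks that are easy to conflate. The sketch below separates them with a toy role table; the user names, roles, and actions are hypothetical, and a real operating system would back this with its own user database and enforce it in the kernel or a reference monitor rather than in application code.

```python
# Hypothetical role-based access control (RBAC) tables.
ROLE_PERMISSIONS = {
    "operator":   {"read_telemetry"},
    "maintainer": {"read_telemetry", "update_firmware"},
}
AUTHENTICATED = {"alice": "maintainer", "bob": "operator"}  # user -> role

def authorize(user: str, action: str) -> bool:
    """Authentication first (is the principal known?), then authorization."""
    role = AUTHENTICATED.get(user)
    if role is None:
        return False                       # unauthenticated principal
    return action in ROLE_PERMISSIONS.get(role, set())

assert authorize("alice", "update_firmware")      # authenticated + authorized
assert not authorize("bob", "update_firmware")    # authenticated, not authorized
assert not authorize("mallory", "read_telemetry") # not authenticated at all
```

Keeping the two checks separate means a compromised credential only grants its role's permissions, and network access control can be layered on the same tables by treating devices as principals too.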

Supply chain evolution

OSA module suppliers have changed the way they address the challenges of security. In the past, security was given little or no thought. Now suppliers have refocused key staff on security, creating and staffing positions such as Secure Processing Solutions or Secure Embedded Solutions whose teams live and breathe the challenge of ensuring that their products can meet the demanding requirements of security.

Others have added whole new business units. Mercury Systems formed its Security Center of Excellence in response to rapidly evolving cyber threats. Its team of security and systems analysts, as well as cryptography, hardware, and software engineers, is dedicated to proactively addressing the most critical security issues across multiple vertical markets to help customers create a safer, more secure world. Mercury Systems’ BuiltSECURE technology starts at the design stage and carries all the way through manufacturing to delivery, ensuring that modules and systems designed and built by Mercury Systems meet the most stringent security requirements.

Curtiss-Wright Defense Solutions has a similar program: the TrustedCOTS program, designed to address the protection of critical military technologies and data for its customers.

Ensuring that a system is trustworthy begins with the first instruction on trusted hardware. Attacks on computers and networks continue to proliferate despite extensive software approaches to prevent these attacks. Establishing a strong digital identity for both the user and the computer system through hardware-based security is a significant step beyond software-only strategies.

Although an OSA will almost certainly result in important benefits (especially if openness is maintained through the development and lifecycle), good system/software architecture engineering will recognize that 100 percent open system architecture is typically unachievable. Moreover, openness must be weighed against competing requirements to obtain the best architectural solution.

Getting the balance right between openness and security will produce the most secure and cost-effective solutions possible. For those interested in learning more about computer security and cybersecurity, visit the National Institute of Standards and Technology (NIST) website.


1. Donald Firesmith, “Open System Architectures: When and Where to Be Closed,” Software Engineering Institute, Carnegie Mellon University, October 19, 2015.

2. “Secure Embedded Systems,” Lincoln Laboratory Journal, 2016.