Requirements for Devices Around Us: Embedded Systems, Part 2


 The first article in this series discussed certain requirements issues for embedded and other real-time systems, focusing on system requirements, architecture, and requirements allocation. In this article we look at some quality attributes that are particularly vital to explore when specifying requirements for embedded systems projects.

 Quality attributes for embedded systems can be much more complex and intertwined than those for other applications. Business software is generally used in an office where there’s not much variance in the environment. In contrast, the operating environment for embedded systems could involve temperature extremes, vibration, shock, and other factors that dictate specific quality considerations.

 Embedded systems also are subject to quality attributes and constraints that apply only to physical systems. These include size, shape, weight, flammability, connectors, durability, cost, noise levels, and the choice and strength of materials. All of these can increase the effort needed to validate the requirements adequately. There could be business and political reasons to avoid using materials whose supply might be threatened by conflict or boycott, causing prices to skyrocket. Other materials are best avoided because of their environmental impacts. Avoiding the use of otherwise optimal materials can force trade-offs in performance, weight, cost, or other attributes. Optimizing the product’s many attributes and constraints is a real balancing act.

 Because quality characteristics often have a profound impact on a complex product’s architecture, it’s essential to perform the quality attribute prioritization and trade-off analysis before getting too far into design. Quality categories that are particularly important to these kinds of products include performance, efficiency, reliability, robustness, safety, security, and usability.


Performance

 A real-time system must satisfy the timing needs and constraints of the operating environment. Therefore, include all processing deadlines for specific operations in the requirements. However, performance goes beyond operational response times. It includes such aspects as startup and reset times, power consumption, battery life, battery recharge time, and heat dissipation. Energy management alone has multiple dimensions. How should the system behave if the voltage drops momentarily, or under a particularly high current load during startup, or if external power is lost and the device must switch to battery backup power? Unlike software, hardware components can degrade over time. What are the requirements for how long a battery must maintain a given power profile before it needs to be replaced?
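As a sketch of how such timing requirements might be checked against measured behavior, here is a minimal latency monitor in Python; the operation names and deadline values are invented for illustration, not taken from any particular product:

```python
import time

# Hypothetical deadline table: operation name -> maximum allowed latency in
# seconds. These operations and limits are assumptions for illustration.
DEADLINES_S = {
    "sensor_read": 0.005,      # 5 ms
    "actuator_update": 0.010,  # 10 ms
    "cold_start": 2.0,         # startup time is a performance requirement too
}

def check_deadline(operation: str, elapsed_s: float) -> bool:
    """Return True if the measured latency meets the specified deadline."""
    return elapsed_s <= DEADLINES_S[operation]

def timed_call(operation: str, fn, *args):
    """Run fn, measure wall-clock latency, and flag any deadline miss."""
    start = time.monotonic()
    result = fn(*args)
    elapsed = time.monotonic() - start
    if not check_deadline(operation, elapsed):
        print(f"DEADLINE MISS: {operation} took {elapsed * 1000:.1f} ms")
    return result
```

Writing the deadlines as explicit data like this keeps them visible and traceable back to the requirements that mandated them.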


Efficiency

 Efficiency is the internal quality counterpart to the externally observable attribute of performance. Efficiency aspects of embedded systems focus on the consumption of resources, including processor capacity, memory, disk space, communication channels, electrical power, and network bandwidth. Requirements, architecture, and design become tightly coupled with these matters. For instance, if the total power demand of the device could exceed the power available, can it be designed to cut power to components that don’t need it all the time? The requirements should specify the maximum anticipated consumption of various system resources so designers can provide sufficient slack resources for future growth and unexpected operating conditions. This is one of those situations for which concurrent hardware and software design is vital.
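The load-shedding idea above can be sketched as a simple power-budget planner. The component names, wattages, and the 5 W budget are assumptions made up for this example:

```python
BUDGET_W = 5.0  # hypothetical total power available

COMPONENTS = [
    # (name, draw in watts, essential?)
    ("cpu", 2.0, True),
    ("radio", 1.5, False),
    ("display_backlight", 1.2, False),
    ("sensor_array", 0.8, True),
]

def plan_power(components, budget_w):
    """Enable essential loads first, then optional loads while budget remains.

    Returns the list of enabled component names and the unused headroom,
    which the requirements might mandate as slack for unexpected conditions.
    """
    enabled, remaining = [], budget_w
    # Stable sort: essentials first, otherwise original order preserved.
    for name, draw, essential in sorted(components, key=lambda c: not c[2]):
        if essential or draw <= remaining:
            enabled.append(name)
            remaining -= draw
    return enabled, remaining
```

With the numbers above, the backlight would be shed because enabling it would exceed the budget, leaving some slack for growth.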


Reliability

 Real-time systems often have stringent reliability and availability requirements. Life-critical systems such as medical devices and airplane avionics offer little room for failure. For instance, an artificial cardiac pacemaker that’s implanted into a patient’s body must be expected to work reliably for years. When specifying reliability requirements, realistically assess the likelihood and impact of failure so you don’t over-engineer a product whose true reliability requirements aren’t as demanding as you might think.
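Reliability requirements are often quantified with standard measures such as steady-state availability, computed from mean time between failures (MTBF) and mean time to repair (MTTR). A minimal sketch, with example figures chosen only to show the arithmetic:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: the fraction of time the system is up,
    given mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Example: 10,000 h MTBF and a 1 h repair time gives roughly "four nines".
four_nines = availability(10_000, 1)
```

Expressing the requirement this way forces the stakeholders to state both how often failure is tolerable and how quickly recovery must happen.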


Robustness

 Robustness has to do with how well the system responds to unexpected operating conditions. One aspect of robustness is survivability. A good example of an embedded system designed for high survivability is the aircraft “black box,” an electronic recording device (orange, actually) designed to survive the horrific trauma of an airplane crash.

 Other aspects of robustness have to do with how the system deals with faults, or exceptions, that occur during execution and can lead to system failures. Both hardware and software faults can lead to failures. I once attempted to withdraw $140 from an ATM. The ATM gave me a receipt for $140, all right, but it only issued $80 in cash. I waited 15 minutes while a bank employee rooted around in the back of the ATM; then she handed me my missing $60. Apparently there was a mechanical failure: several bills were stuck together and jammed the exit slot. The ATM thought the transaction had gone just fine—it never detected the problem. This is a robustness shortcoming.
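The missing check in the ATM story can be sketched as a dispense routine that verifies what physically left the machine instead of trusting that the mechanism worked. The note-counting sensor callback is hypothetical; real cash dispensers vary:

```python
class DispenseError(Exception):
    """Raised when the delivered amount does not match the request."""

def dispense(requested_usd: int, count_notes_at_slot) -> int:
    """Dispense cash and verify delivery.

    count_notes_at_slot is a hypothetical sensor callback that returns the
    dollar value that actually passed through the exit slot.
    """
    delivered = count_notes_at_slot(requested_usd)
    if delivered != requested_usd:
        # Detect the fault rather than silently recording a clean transaction.
        raise DispenseError(
            f"requested ${requested_usd}, delivered ${delivered}"
        )
    return delivered
```

In the jammed-bills scenario, a callback reporting $80 against a $140 request would raise the error, letting the system alert staff and post only what was delivered.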


Safety

 Safety requirements are vastly more significant for real-time systems than for information systems; people are rarely injured by exploding spreadsheets. Begin your investigation of safety requirements by performing a hazard analysis. This will reveal potential risks that your product could present. A fault tree analysis is a graphical, root-cause analysis technique for thinking about safety threats and what factors could lead to them. This helps you focus on how to avoid specific combinations of risk factors materializing into a problem. Safety requirements should address the risks and state what the system must do—or must not do—to avoid them.
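Under the usual simplifying assumption that basic events are independent, the AND and OR gates of a fault tree combine probabilities as sketched below. The overheating example and its probabilities are invented for illustration:

```python
from math import prod

def p_and(*probs):
    """AND gate: the output event occurs only if ALL inputs occur."""
    return prod(probs)

def p_or(*probs):
    """OR gate: the output event occurs if ANY input occurs."""
    return 1 - prod(1 - p for p in probs)

# Invented example: the belt motor overheats if the thermal sensor fails
# AND (the cooling fan fails OR the vent is blocked).
p_overheat = p_and(0.001, p_or(0.01, 0.02))
```

Working the numbers through the tree shows which combinations of basic events dominate the top-level hazard, which in turn tells you where mitigating requirements buy the most safety.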

 Hardware devices often include some kind of emergency stop button or dead man’s switch that will quickly turn the device off. My home exercise treadmill had a safety requirement something like the following:

Stop.Emergency: The treadmill shall have an emergency stop mechanism that brings the belt to a halt within 1 second when activated.

This requirement led to the design of a flat plastic key that must be inserted in the front of the treadmill before the treadmill can be powered up. Removing this safety key immediately turns off the treadmill.
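The interlock behavior can be sketched as a tiny state model. The class and method names are invented, and a real implementation would of course live in firmware with hard real-time guarantees, not in Python:

```python
class Treadmill:
    """Minimal sketch of the safety-key interlock described above."""

    def __init__(self):
        self.key_inserted = False
        self.running = False

    def insert_key(self):
        self.key_inserted = True

    def remove_key(self):
        # Stop.Emergency: removing the key halts the belt immediately.
        self.key_inserted = False
        self.running = False

    def start(self) -> bool:
        # Interlock: the belt cannot be powered up without the key.
        if not self.key_inserted:
            return False
        self.running = True
        return True
```

Note how the design makes the safe state the default: with the key absent, neither starting nor continuing to run is possible.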


Security

 The security of embedded systems is under much discussion these days because of concerns about cyber attacks that could take over, disrupt, or disable power plants, railroad control systems, and other critical infrastructure. Theft of intellectual property from the memory of embedded systems is also a risk. An attacker could potentially reverse engineer code to learn how the system works, either to copy it or to attack it. Protecting embedded systems involves some of the same security measures that host-based information systems need. These include encryption, authentication, data integrity checks, and data privacy.
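As one concrete example of the integrity and authentication measures mentioned, here is a minimal sketch of HMAC-based verification of a firmware image before it is accepted. Key provisioning and secure key storage, which matter enormously in practice, are glossed over here:

```python
import hashlib
import hmac

def sign_image(key: bytes, image: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the firmware image bytes."""
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_image(key: bytes, image: bytes, tag: bytes) -> bool:
    """Accept the image only if the tag matches.

    compare_digest performs a constant-time comparison, avoiding a timing
    side channel in the verification step.
    """
    return hmac.compare_digest(sign_image(key, image), tag)
```

A device that refuses to boot or install an image failing this check resists both tampered updates and some classes of cloned firmware.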


Usability

 Many embedded systems include some kind of human-computer interface. Certain aspects of usability might be important when a person is using a physical device in the field as opposed to a keyboard in the office. For instance, the display screens on products to be used outdoors must accommodate different lighting situations. I once used a bank whose drive-up ATM’s screen was completely unreadable when sunlight hit it at certain angles. Some usability constraints are imposed by legislation such as the Americans with Disabilities Act, which requires compliant systems to provide accessibility aids for people who have physical limitations. Embedded systems must accommodate users with a range of audio acuity and frequency response, visual acuity and color vision, handedness and manual dexterity, and body size and reach.
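One measurable slice of display readability is contrast. The WCAG contrast-ratio formula, a common accessibility benchmark (not specific to this article), compares two relative luminances, where 0.0 is black and 1.0 is white; a ratio of at least 4.5:1 is the usual minimum for normal-size text:

```python
def contrast_ratio(lum_a: float, lum_b: float) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = max(lum_a, lum_b), min(lum_a, lum_b)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background yields the maximum ratio of about 21:1;
# identical luminances yield the minimum of 1:1.
```

A requirement stated as a minimum contrast ratio under specified ambient lighting is testable, unlike a vague "the screen shall be readable outdoors."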

 If you’re developing the requirements for an embedded system, you can still use all of the same elicitation, analysis, specification, and validation techniques that work for information systems. But keep in mind these areas of special emphasis, including architecture, requirements allocation, and quality attribute specification.

Author: Karl Wiegers & Joy Beatty

Karl Wiegers is Principal Consultant at Process Impact. Joy Beatty is a Vice President at Seilevel. Karl and Joy are co-authors of the recently released book Software Requirements, 3rd Edition (Microsoft Press, 2013), from which this article is adapted.



Copyright 2006-2024 by Modern Analyst Media LLC