MEMS Get Smarter, Making Developers’ Jobs Tougher

Fueled by new technologies, sensors have become ubiquitous, providing better user interface experiences and more natural interaction with devices.

Rapid advances in MEMS and related technologies are enabling some of the most fascinating, innovative product developments since the PC. But the pace of innovation drives challenges as well. Those range from sensor fusion complexities in highly integrated devices, to stringent board layout requirements (these are mechanical parts, despite their chip-like characteristics), to restrictive power envelopes for mobile, battery-powered devices. Our panel of experts explains these challenges, and offers resources to help developers address them. Get the straight scoop from Dave Rothenberg, director of marketing and partner alliances at Movea; Mike Stanley, manager of the consumer Systems & Product Definition team in Freescale’s Sensors & Actuators Solutions Division; Antonio Cirone, product marketing at STMicroelectronics; and Leopold Beer, regional president Asia Pacific at Bosch Sensortec.

EECatalog: What are some of the hottest new application trends for MEMS?


Dave Rothenberg, Movea: Fueled by a new generation of motion sensors that are both smaller and less expensive, with higher performance and lower power requirements, sensors have become ubiquitous in our daily lives, giving us a significantly better user interface experience and helping us interact with our devices in a more natural way. More and more, we’ve come to expect the integration of sensors in applications such as:

Mobile devices

  • Help us find our way in a shopping mall, airport, large retail store
  • Anticipate our needs based on our activity and location
  • Navigate mobile applications more naturally
  • Provide users with a more immersive gaming experience

Home entertainment

  • Navigate the SmartTV interface using motion
  • Contextually aware TVs and STBs can pause or stop the movie when the viewer falls asleep or leaves the room

Sports, wellness and healthcare

  • Measure our physical activity with accuracy to help sports enthusiasts and athletes evaluate and keep track of their performance and progress
  • Track our daily activity and help us reach our health goals, fusing data with other information such as calorie consumption, environmental conditions (altitude, weather, etc.) to help us make better decisions
  • Monitor activity and range of motion for elderly patient activity tracking, physical therapy and rehabilitation


Mike Stanley, Freescale: There are several new trends worth mentioning here:

  1. Location-based services require a constantly updated estimate of a consumer’s location. MEMS and magnetic sensors can be used for dead reckoning calculations that interpolate and assist GPS and/or wireless triangulation methods. I just listened to an NPR report (“Facebook Releases Quarterly Earnings”) which mentioned that Facebook is now bringing in significant ad revenue from material presented on portable devices: NPR reports that Facebook claims 750 million active mobile users and that mobile advertising accounts for 30% of Facebook’s ad sales.
  2. An aging population is driving development of systems for home health and activity monitoring. This was an area that was clearly exploding at the 2013 Consumer Electronics Show. As the Baby Boomer generation enters its retirement years, we can expect the market to respond with systems designed to make in-home health care more practical.
  3. Industrial use of MEMS is increasing as those devices get smarter. MEMS sensors can be used for everything from vibration monitoring to measuring water levels in home washing machines.
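The dead reckoning mentioned in the first trend above can be illustrated with a minimal pedestrian dead-reckoning step: a fused heading estimate plus a step-length estimate advances a 2D position between GPS fixes. This is an illustrative sketch, not any vendor’s algorithm; the heading convention and step length are assumptions.

```python
import math

def dead_reckon(position, heading_deg, step_length_m):
    """Advance an (x, y) position estimate by one detected step.

    heading_deg would come from a fused magnetometer/gyro estimate;
    step_length_m is typically inferred from accelerometer step peaks.
    Convention (assumed here): 0 degrees = north (+y), 90 = east (+x).
    """
    x, y = position
    heading = math.radians(heading_deg)
    return (x + step_length_m * math.sin(heading),
            y + step_length_m * math.cos(heading))

# Walk four 0.7 m steps east, then two steps north
pos = (0.0, 0.0)
for _ in range(4):
    pos = dead_reckon(pos, 90.0, 0.7)
for _ in range(2):
    pos = dead_reckon(pos, 0.0, 0.7)
# pos is now roughly (2.8, 1.4)
```

In a real system, each GPS fix would reset the accumulated position to bound the drift inherent in integrating noisy step and heading estimates.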


Antonio Cirone, STMicroelectronics: MEMS sensors are becoming ubiquitous in our everyday life, abundant in our mobile phones, laptops/tablets, smart watches and gaming consoles. In the consumer market, MEMS sensors help us interact with portable devices in a more intuitive way, or act as “airbags” to protect sensitive hard disk drives in notebooks. By combining complementary sensors, we are accelerating toward the next generation of advanced motion recognition and enabling new applications like optical image stabilization with gyros for better pictures in sub-optimal light, as well as location-based services (LBS), augmented reality and indoor navigation. MEMS sensors are also entering fitness and wellness markets, speeding us toward applications that allow smart remote monitoring for enhanced health care.


Leopold Beer, Bosch Sensortec: The trend clearly goes toward sensing user context and user environment. This means that besides motion, position and orientation sensors, ambient air temperature and air pressure sensors are also employed to broaden the user-context picture. Further applications are enabled by combining various sensor signals to provide “synthetic sensor” outputs. One example of such a sensor cluster is the Absolute Orientation Sensor (BNO055), which frees application developers from the burden of doing sensor data fusion and sensor data correction. With this, motion-tracking and position-tracking applications are easy to implement in a wide range of devices.
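The sensor data fusion Beer describes can be illustrated with the simplest fusion building block, a complementary filter: the gyro’s integrated rate dominates short-term (it is smooth but drifts), while an accelerometer-derived tilt angle corrects it long-term (it is noisy but drift-free). This is a generic textbook sketch, not the BNO055’s internal algorithm; the blend factor is an assumption.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate (deg/s) with an accelerometer-derived tilt
    angle (deg). alpha weights the gyro path; (1 - alpha) lets the
    accelerometer slowly pull the estimate back, bounding drift.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Stationary device: gyro has a +1 deg/s bias, true tilt is 10 deg.
angle = 0.0
for _ in range(2000):          # 20 seconds at 100 Hz
    angle = complementary_filter(angle, 1.0, 10.0, dt=0.01)
# angle settles near 10.5 deg: the bias produces a small bounded
# offset instead of unbounded drift from pure gyro integration.
```

A fused “synthetic” orientation output is essentially this idea extended to three axes with a magnetometer added for heading, which is why doing it well requires detailed knowledge of each sensor’s error characteristics.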

EECatalog: MEMS suppliers continue to innovate, integrating more functionality into a single device. What do developers need to be aware of as they design around these integrated devices?

Rothenberg, Movea: Applications such as augmented reality, gaming and pedestrian navigation rely on the accuracy of the inertial sensing and sensor fusion techniques. When developing on an integrated device, developers who want to offer a better user experience need to extract the best possible accuracy. Therefore, they should be wary of the performance metrics of devices with additional functionality. Many suppliers offer sensor fusion, but not all implementations are created equal, nor do they deliver the same level of accuracy. Developers should pay particular attention to static accuracy, dynamic accuracy, anomaly detection/rejection, as well as auto-calibration.
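One of the auto-calibration concerns Rothenberg raises can be made concrete with a minimal sketch: estimating a gyroscope’s zero-rate offset from a window of samples captured while the device appears stationary. The function name, threshold and sample values are illustrative assumptions, not Movea’s method.

```python
from statistics import mean, pstdev

def estimate_gyro_bias(samples, stillness_threshold=0.05):
    """Estimate a gyro's zero-rate offset (deg/s) from a sample window.

    Returns the bias to subtract from future readings, or None if the
    window shows too much motion to be trusted for calibration.
    The 0.05 deg/s stillness threshold is an assumed tuning value.
    """
    if pstdev(samples) > stillness_threshold:
        return None              # device is moving: skip calibration
    return mean(samples)

still = [0.31, 0.29, 0.30, 0.32, 0.30]   # biased but quiet readings
moving = [0.3, 1.8, -0.9, 2.4, 0.1]      # too dynamic to calibrate
bias = estimate_gyro_bias(still)          # about 0.304 deg/s
```

The anomaly-rejection step (refusing to calibrate during motion) matters as much as the averaging itself: calibrating against a moving window would bake real rotation into the “bias” and corrupt every subsequent fused output.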

Stanley, Freescale: MEMS and magnetic sensors measure physical quantities. Measurements can be affected by physical placement of the sensor within the product. Placement guidelines are different for each type of sensor, and I’ve previously outlined some of these guidelines. It is extremely important that design engineers understand these placement guidelines BEFORE they start PCB layout and physical design of the product.

Cirone, STMicroelectronics: The new trend is to have an intelligent sensor hub integrate multiple sensors and a general-purpose microcontroller into a tiny single package with embedded algorithms to reduce the load on the host processors and lower the power budget of the entire system. This intelligent hub collects the data generated from embedded or external accelerometers, gyroscopes, compasses as well as pressure and other sensors, and, through dedicated fast interfaces, synthesizes an accurate and stable output, immune to external interference.

Beer, Bosch Sensortec: If integration is not done properly, the outcome can be poor overall sensor performance and sometimes even parasitic sensor signals. Put simply, not every sensor can be combined with every other sensor; developers need to decide carefully which solution they choose, and the integrated sensor data fusion concepts or software that comes along with the hardware should also be an important decision criterion. As of today, sensor manufacturers are the most skilled at sensor data fusion implementation because these concepts are strongly hardware-dependent, so sensor suppliers that own a broad spectrum of technologies have an advantage. Designers usually know quite well what the capabilities of the various sensor manufacturers are, and are well-advised to rely upon experienced integration providers.

EECatalog: There’s been a lot of talk about the lack of standards in data sheet specs for sensors. What do engineers need to know about the issue and how to address it?

Rothenberg, Movea: From the sensor standpoint the only difference from one vendor to the next is how they specify their noise numbers; other performance metrics have de-facto standards. Standardization can happen at many levels: in hardware, the standards specify how to state performance metrics; in software, standardization typically focuses on APIs or common interfaces. Sensor fusion is a combination of hardware and software, making it even trickier. There are several initiatives from the likes of Android (Google), Win 8 (Microsoft), Intel and NVIDIA that attempt to tackle this problem.

Engineers should know that noise and drift numbers are not presented in the same format from vendor to vendor, and that those numbers directly impact sensor performance. Sensor fusion vendors that are sensor-agnostic take into account the differences in the sensors and normalize the fused outputs as best they can, but the quality of the input signal directly impacts the quality of the output signal.
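As an illustration of why spec formats matter: one vendor may quote accelerometer noise as a spectral density (e.g. ug/sqrt(Hz)) while another quotes RMS noise at a stated bandwidth, and the two are only comparable after a conversion. A minimal sketch, assuming standard equivalent-noise-bandwidth factors for low-order filters (the vendor figures below are made up for illustration):

```python
import math

def rms_noise(density, bandwidth_hz, filter_order=1):
    """Convert a noise spectral density (e.g. ug/sqrt(Hz)) to RMS
    noise at a given filter bandwidth.

    The equivalent-noise-bandwidth (ENBW) factor depends on the
    output filter: ~1.57 for first order, ~1.11 for second, ~1.05
    for third. A datasheet quoting RMS has already applied one.
    """
    enbw_factor = {1: 1.57, 2: 1.11, 3: 1.05}[filter_order]
    return density * math.sqrt(enbw_factor * bandwidth_hz)

# Hypothetical vendor A quotes 220 ug/sqrt(Hz); hypothetical vendor B
# quotes 1.8 mg RMS at 50 Hz. Converting A to the same terms:
vendor_a_rms_mg = rms_noise(220, 50) / 1000.0   # about 1.95 mg RMS
```

Without this conversion, vendor A’s smaller-looking headline number could be mistaken for better performance, which is exactly the apples-to-oranges problem the standardization effort targets.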

On May 2, 2013, the MEMS Industry group released “Standardized Sensor Performance Parameter Definitions,” authored in collaboration with leading MEMS sensors ecosystem players. This document is one of the first to provide an apples-to-apples comparison of disparate MEMS sensors from diverse vendors, enabling consumer platform developers to easily evaluate sensor solutions for their applications.

Stanley, Freescale: There is a lot of variation in the ways that various manufacturers specify sensor performance. During 2012, Intel and Qualcomm initiated a draft document entitled “Standardized Sensor Performance Parameter Definitions” to address at least part of this issue. This included inputs from at least ten other industry leaders. Version 1.0 of the spec is expected to be blessed by the MEMS Industry Group in the next month or so.

Cirone, STMicroelectronics: MEMS sensors combine mechanical parts with electronics, all connected in a single package. Each of these ingredients is equally important for the product’s performance and, at the system level, for the application specs. Each supplier produces guidelines for developers with the key parameters and the rules for designing MEMS sensors into a final product in a way that optimizes system performance and thus the product specifications. We agree that standards will simplify the lives of systems developers and shorten time to market.

Editor’s Note: “Standardized Sensor Performance Parameter Definitions” is available to the public for download.

Beer, Bosch Sensortec: In recent weeks there has been significant progress in this field, with the issuing of data sheet standards driven by the MIG (MEMS Industry Group). The transparency of data sheets has increased; still, there are quite big differences in the market when it comes to the sensor performance, quality and reliability of various sensor suppliers’ products.

EECatalog: As developers face complexities around applying sensor fusion data, what tradeoffs should they be considering in terms of development environments?

Rothenberg, Movea: Implementing data fusion creates a new set of challenges for developers. Developers must learn to work in a three-dimensional space, create testing solutions that involve motion and understand the complexities of calibrating three different types of 3-axis sensors. Developers should be looking for computer-aided engineering tools to support the engineering process from concept to released product. Tools such as Movea’s Motion Studio allow the developer to record and visualize motion data from a wide range of MEMS devices, give detailed support for all calibration requirements and allow access to a comprehensive set of algorithms, from low-level building blocks through to complete functional blocks. With the ability to record and replay data either directly from sensors or after the data processing has been completed, such a multi-time-domain tool is invaluable in getting new designs to market with comprehensive regression-testing support.

Stanley, Freescale: They need to evaluate their application to determine exactly what information they need to extract from their sensor subsystem. Do they need simple portrait/landscape? Angular rates? Linear acceleration? Orientation? Position? Gesture recognition? How variable is the sensor environment? Is it pseudo-stationary, or subject to active acceleration/rotation? How much CPU bandwidth can they afford to spend on computations? Can the system justify use of a separate processor as sensor hub? What are the power limitations? How much memory can they afford? Once they figure out the boundaries of their problem space, they can evaluate fusion options to determine what makes the most sense. This may then impact choice of software development toolkit.

Another topic that should be considered is: what is the communication channel for fusion data? Is it a wired interface? Bluetooth? Wi-Fi or some other mechanism? I have an Android app that I use for visualization. My working version of the tool supports both Wi-Fi and Bluetooth. It’s very handy for evaluating fusion tradeoffs and educating users about the strengths and weaknesses of fusion options. By the way, this app will be the subject of a presentation at Sensors Expo in June. (An early version is available now; the Bluetooth/Wi-Fi version will be posted this summer.)

Cirone, STMicroelectronics: Developers need turnkey solutions and dedicated porting support to enjoy the benefits of the sensor hub or sensor-fusion algorithms, and they’d ideally prefer to skip all the complexity behind this emerging but well-established technique. With the embedded microprocessor for sensors, i.e., the “Brain,” the developer can use a standard tool for development and debug and, especially if they have access to the appropriate software drivers, they will reduce their overall development time.

Beer, Bosch Sensortec: Developers are well-advised to use solutions provided by sensor hardware and software suppliers since this is the guarantee that sensor matching is done properly and hardware-to-software adaptation has been tested in all circumstances.

EECatalog: What’s coming next for MEMS technologies? What should developers be thinking about for future projects?

Rothenberg, Movea: MEMS devices are becoming cheaper every day, driving fast commoditization into devices such as smartphones and tablets. Combined with increasingly pervasive cloud computing, MEMS will revolutionize the way we use these devices and their services. Our mobile devices and even our environments will become context-aware, accessing public and private information such as emails, GPS location, weather, transit schedules, etc., to deliver smarter services to consumers.

For example, your mobile phone can alert you in the morning that your train is late and provide the best itinerary or route to reach your destination on time. Or, if you are in a shopping mall, your mobile device can alert you that your favorite store is having a sale and, using indoor location services, give you directions to it.

In order to gather and process all these data points, developers need to leverage data-fusion platforms of models and developer tools to help them create smarter devices, applications and services.

Although MEMS technology is experiencing tremendous growth, challenges still remain. These include:

  • The wide variety of data sources in vastly different formats across heterogeneous networks
  • Creating adaptive solutions that can be tuned to new data types and new use cases
  • Managing different data rates, data synchronization and data loss
  • Creating effective learning strategies when no a-priori knowledge exists about mappings from data to response
  • Lack of data and metadata representation standards

Great strides have been made to make our devices smart and intuitive, but in order to make a truly connected world a reality, manufacturers, developers and OEMs must collaborate more closely to deliver the next wave of consumer devices.

Stanley, Freescale: Integration, integration, integration. More sensors merged into smaller and smaller devices. Inclusion of integrated MCUs and logic dedicated to fusion acceleration. MEMS manufacturers are working on ways to reduce test and packaging costs. Chip-scale packaging will eventually become common. Vendor-supported fusion libraries will be required, allowing easy integration of sensor subsystems into more complicated systems. Extensive support ecosystems are evolving to support these needs.

Cirone, STMicroelectronics: As MEMS sensors continue on the path of deep integration with multi-axis degrees of freedom, you’ll see them combine embedded processing capabilities, wireless communication and energy harvesting capabilities. Continuing deep integration will enable indoor navigation, LBS, augmented reality and enhanced motion tracking, as well as wireless sensors networks for the next evolution of the Internet of Things and the smart home.

Beer, Bosch Sensortec: The trend towards application-specific sensor nodes provides additional independence from hardware-specific constraints. Developers should focus on what provides maximum customer satisfaction as sensor manufacturers take over the “tricky” work of sensor parameter matching and sensor data fusion programming.

Cheryl Berglund Coupé is the editor. Her articles have appeared in EE Times, Electronic Business, Microsoft Embedded Review and Windows Developer’s Journal, and she has developed presentations for the Embedded Systems Conference and ICSPAT. She has held a variety of production, technical marketing and writing positions within technology companies and agencies in the Northwest.
