July - 2018
- AI and Embedded Systems: Top 5 Use Cases
Embedded systems have been a major part of modern life for decades, while artificial intelligence (AI) has only recently experienced rapid growth in popularity. The inherent synergies between these two technologies make them an ideal combination for a number of applications. Embedded systems produce a wealth of data points and act on real-time inputs, while AI thrives on parsing aggregated data and can be used to control embedded systems in real time.
Given that, we can expect a surge of innovations combining these two powerful technologies over the next five to ten years.
Do You Want AI with That?
Embedded systems are found throughout retail and quick service restaurant (QSR) locations everywhere. Point of sale (POS) systems, kiosks, and barcode scanners are all examples of embedded systems commonly found at these locations. Recently, AI has begun to make its presence felt in the retail and QSR sector as well. For example, CaliBurger, a California-based hamburger chain, has introduced two very interesting combinations of AI and embedded systems at its locations:
[caption id="attachment_26" align="alignleft" width="556"]A $100K burger-flipping robot outpaced humans at first but was put back on the job and "moves like a ninja." (Source: www.pasadenanow.com)[/caption]
Flippy the robotic kitchen assistant: CaliBurger partnered with Miso Robotics to create Flippy, a robot that helps prepare hamburgers (Figure 1). Flippy is promoted as the world's first burger-flipping robot and is currently in use at one of the Pasadena CaliBurger locations.
A facial recognition loyalty program: According to CNBC, CaliBurger launched a face-based loyalty program at the end of 2017. This program allows users to pay by simply smiling at a screen. By combining touchscreen technology at ordering kiosks with the facial recognition payment system, CaliBurger can reduce labor costs. The two innovations being piloted at CaliBurger may serve as an indicator of what is to come for the retail and QSR sector as firms continue to combine AI and embedded systems.
Industry 4.0: Why Analytics' Influence Will Grow
Industry 4.0 has been a popular buzzword over the last few years. With grand visions of smart factories and automation, proponents of Industry 4.0 explain how it will fundamentally shift the way we do business, but what exactly will this technology-driven industrial revolution consist of? The Boston Consulting Group has identified nine technologies that make up what is commonly referred to as Industry 4.0: augmented reality, big data and analytics, autonomous robots, simulation, horizontal and vertical system integration, the Industrial Internet of Things, cybersecurity, the cloud, and additive manufacturing. As we transition into the next decade, we can expect firms to continue to iterate and innovate on these technologies, eventually leading to an overall paradigm shift in how manufacturing processes work. AI and embedded systems will work together to play a large part in driving this change.
The Industrial Internet of Things (if you need a quick overview of the Internet of Things, check out The Internet of Things: Living in a World of Connected Devices) will contain a variety of embedded systems connected to a network. These embedded systems will perform a variety of tasks on the factory floor, ranging from monitoring temperature to controlling mechanical parts. The embedded devices will generate data that serves as input to "big data" databases and will respond based on inputs and insights from the analytics programs run against this data. For example, if a temperature sensor reports that a device has begun to overheat, logic could be triggered to shut down or scale back that device, as sketched below. While this sort of automation is helpful, the insights and advancements that the analytics make possible may become even more influential. All the sensors and devices on the factory floor will generate data on a scale no human could process; however, business intelligence and analytics software will let firms process this data in a way that leads to insights and pattern identification that can optimize operations and enable further automation.
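To make the overheat example concrete, here is a minimal Python sketch of that threshold rule. The device class, limits, and history store are hypothetical placeholders rather than any particular vendor's API.

```python
class Device:
    """Hypothetical stand-in for a networked machine on the factory floor."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.state = "running"

    def scale_back(self):
        self.state = "throttled"

    def shut_down(self):
        self.state = "stopped"

OVERHEAT_C = 90.0  # hard limit: shut the device down (assumed threshold)
WARN_C = 80.0      # soft limit: scale the device back (assumed threshold)
history = []       # stand-in for the "big data" store the analytics mine

def on_temperature_reading(device, temp_c):
    """React to one temperature report, then archive it for analytics."""
    if temp_c >= OVERHEAT_C:
        device.shut_down()
    elif temp_c >= WARN_C:
        device.scale_back()
    history.append((device.device_id, temp_c, device.state))

press = Device("press-07")
for reading in (72.0, 83.5, 91.2):
    on_temperature_reading(press, reading)
print(press.state)  # -> "stopped"
```

The archived history is what the analytics layer would later mine for patterns, for example learning that a machine tends to drift hot before a part fails.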
Smarter Homes, Smarter Cities
Intelligent devices designed to make life at home more convenient and connected have flooded the marketplace over the last few years. Many smart home devices are actually embedded systems connected to the network. Examples of these devices include:
Ecobee4: The Ecobee4 is a smart thermostat that comes with Amazon's Alexa virtual assistant built in. In addition to helping automate the HVAC system in a home, the added Alexa functionality enables it to do things like find recipes and order groceries based on voice commands.
Video Doorbells: These video doorbells from Ring can stream video from the front door, have built-in motion sensors, and provide two-way talk, all of which can be accessed from a cell phone app.
Smart Refrigerators: These refrigerators from Samsung can connect with a smartphone app and allow users to share photos on the fridge's screen, manage shopping lists, and stream TV or music.

While there are some legitimate security concerns that need to be resolved as the smart home movement progresses (for a quick primer on how to secure your smart embedded devices, check out this article on the topic), we expect the market to continue to grow and AI to play a big role in it over the next five to ten years. As this market matures, more and more smart home devices will become available that improve the convenience of everyday life. With advancements in the AI technology that ties these different systems together, expect to see more automation and predictive analytics brought to users. For example, an AI program could learn your shopping habits and add items to your grocery list automatically based on the current contents of your fridge (see the toy sketch below).
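As a toy illustration of that fridge-to-list idea, the sketch below restocks against hard-coded par levels. A real product would learn those levels from purchase history; every name and value here is invented.

```python
# Desired stock levels ("par levels") a real system would learn over time.
par_levels = {"milk": 1, "eggs": 6, "butter": 1}
# Current inventory as reported by the (hypothetical) fridge sensors.
fridge_contents = {"milk": 0, "eggs": 2, "butter": 1}

# Anything below its par level goes on the shopping list.
shopping_list = [
    item for item, wanted in par_levels.items()
    if fridge_contents.get(item, 0) < wanted
]
print(shopping_list)  # -> ['milk', 'eggs']
```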
Smart cities are a concept similar to smart homes but scaled to the metropolitan level. Sensors, cameras, and smart meters are some of the many embedded devices you find in smart cities. These devices are coupled with business intelligence and analytics software that helps cities make better decisions aimed at improving the quality of life for residents while reducing cost and minimizing waste. The smart city movement already boasts a number of success stories. For example, according to the Smart Cities Council, the City of Calgary was able to use a data-driven approach to predict and mitigate floods. Moving forward, AI may be able to push the smart cities movement even further. At the heart of AI technology is the ability to capture data inputs and "learn" from them. A city full of embedded devices will provide a wealth of data, allowing AI to help cities solve real-world problems, potentially before a problem even occurs.
Tackling Healthcare Challenges
Embedded systems are just as ubiquitous in healthcare. Embedded devices are integral parts of a variety of healthcare systems, ranging from hospital medical equipment to personal activity monitors that encourage a healthy lifestyle. Over the last few years, AI has also made a huge impact on healthcare. The healthcare industry generates a wealth of data that can provide insights and lead to better diagnoses if we are able to process it all, and AI is helping to do just that. Products like IBM Watson Health use AI to parse healthcare data and help tackle some of the major challenges facing healthcare professionals today. AI can assist in areas ranging from prevention to early detection to diagnosis. Watson in particular is working on solving complex problems related to cancer, diabetes, and drug discovery. Moving forward, we can expect to see more integration of AI with embedded systems that generate valuable healthcare data, helping healthcare professionals improve outcomes for patients and helping consumers make better health-related decisions.
Conclusion
The ability of embedded systems to interact with the real world and produce a large number of data points makes them well suited for applications where AI needs to interact with the real world. Over the next decade, you can expect to see these two technologies enable innovations in retail, Industry 4.0, smart homes, smart cities, and healthcare.
Author: Gil Ben-Dov is a 20-year technology veteran with experience at industry leaders Cisco Systems, ResMed, and Air Liquide. He currently serves as the CEO of Total Phase and has held the role since October 2014. Prior to that appointment, Mr. Ben-Dov served as VP/General Manager from 2013 to 2014. He was previously director of sales, joining Total Phase in 2012.
May - 2018
- Finding my way around
[caption id="attachment_16" align="alignnone" width="300"]
Figure 1 Keeping tabs on tags keeps track of stock[/caption] I was at an exhibition recently and got lost â really lost, in the halls. I began to panic: âThey close at six! And I cannot, for the life of me, work out how to get out of this hall and on to my next appointmentâ. Thankfully, I did find my way back to base, although I may take some breadcrumbs or a ball of string to mark my return route for my next foray into an exhibition hall. I was reminded of my condition (âexhibition anxietyâ) when I read about shops using RF to track products and to update prices on ePaper price tags. Finnish company, Noccela, has created a positioning app for a mobile phone that works with a smart tag on merchandise in a store to make sure that stock is monitored and controlled. Receivers analyse movements of the tag and can send an alert if it is taken out of a specific area of the store. If the tag âstraysâ or is tampered with, a real time alert is sent to a mobile phone, showing its location. It can also be used, says Noccela, to alert assistants that a customer has picked up, so is interested in an article on the other side of the store and they can go there to help with the sale or offer more information. The real-time positioning system acts with the tag, which is accurate to 0.5meters (around 19 inches). The alarm is also activated and sent to all mobile phones equipped with the Cloud software if a tag is broken, the wire is cut or put into a foil-lined bad to try and evade detection. The app locates where in the store the activity is happening, classifies what kind of activity it is, and adds a time-stamp to help track any further evidence on CCTV. The second retail-related inspiration was from RFID Journal LIVE! in Orlando, Florida. Here, Powercast demonstrated a batteryless electronic Ultra-High Frequency (UHF) retail price tag which wirelessly updates an ePaper price display (Figure 2). [caption id="attachment_17" align="alignnone" width="300"]
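Out of curiosity, here is a conceptual Python sketch of that zone-exit alert. Noccela's actual system and data formats are proprietary, so the zone bounds, field names, and function here are all invented for illustration.

```python
from datetime import datetime, timezone

# Hypothetical allowed area for a tagged item, in meters from a store corner.
ALLOWED_ZONE = {"x": (0.0, 20.0), "y": (0.0, 15.0)}

def check_tag(tag_id, x, y):
    """Return an alert record if the tag has strayed outside its zone."""
    in_x = ALLOWED_ZONE["x"][0] <= x <= ALLOWED_ZONE["x"][1]
    in_y = ALLOWED_ZONE["y"][0] <= y <= ALLOWED_ZONE["y"][1]
    if in_x and in_y:
        return None
    return {
        "tag": tag_id,
        "position": (round(x, 1), round(y, 1)),          # ~0.5 m accuracy
        "time": datetime.now(timezone.utc).isoformat(),  # for the CCTV match
    }

print(check_tag("tag-0042", 21.3, 4.0))  # stray position -> alert record
```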
The second retail-related inspiration was from RFID Journal LIVE! in Orlando, Florida. Here, Powercast demonstrated a batteryless electronic Ultra-High Frequency (UHF) retail price tag which wirelessly updates an ePaper price display (Figure 2).

[caption id="attachment_17" align="alignnone" width="300"]Figure 2: Just hanging around for price changes[/caption]

It uses RF-to-DC power harvesting, saving on batteries, and can update a shelf or rail of products in less time than it takes to put out the stock. Data is sent over the air using a standard UHF Radio Frequency IDentification (RFID) reader, which can be handheld or fixed. The operating range is up to two meters (over six feet). Power is sent the same way, for reliable operation without batteries or time-consuming and expensive battery changes.

The company's PCC110 Powerharvester wireless power receiver chip is embedded in the price tag. When a reader is pointed at the tag, the chip converts RF to DC to power the tag and updates the display with the new price in a few seconds. Powercast says that the tags can harvest enough power to "operate perpetually" from the UHF RFID reader. The tag is a segmented ePaper display from E Ink, chosen because it maintains the image whether it is outside the UHF field or directly powered by it. The concept tag is available for license now. Powercast will also work with customers to design batteryless price tags using the PCC110 chip. (An evaluation kit for the Batteryless Electronic UHF Retail Price Tag is expected at the end of the year from distributors Arrow Electronics and Mouser Electronics.)

These examples of wireless connectivity set me thinking: if I could have a tag that can track where I am and in which "zone" I should be, or helpfully display where I am headed, throughout long days, I should be able to navigate exhibitions in a timely and efficient manner.
February - 2018
- Intel Movidius Empowers Mobile, Cloud-Free Artificial Intelligence
Artificial Intelligence (AI), long relegated to the realm of science fiction and, more recently, to high-powered computing machinery, is slowly finding its way to lower-end embedded hardware. For about $100 USD, it is now possible to acquire all the necessary hardware and software to build a customized, vision-based AI solution. Last month Google released its AIY Vision Kit, which is powered by the Intel® Movidius MA2450 Vision Processing Unit (VPU). This is the same embedded hardware that has powered Intel's Neural Compute Stick USB platform, Google's Project Tango, and more recent generations of DJI-branded drones. Put simply, VPUs are customized microprocessor hardware built to handle the specialized machine learning algorithms that enable onboard machine vision processing. These algorithms include convolutional neural networks (CNN) and the scale-invariant feature transform (SIFT). VPUs are necessary for efficiently handling neural network computer vision algorithms, just as Graphics Processing Units (GPUs) were developed apart from Central Processing Units (CPUs) to better handle graphics-intensive tasks.
[caption id="attachment_9" align="alignleft" width="585"]AIY Vision Kit's do-it-yourself assembly (Source: Google)[/caption]
The onboard processing is a key distinction. Instead of relying on cloud-based processing of locally captured images, the $45 AIY Vision Kit's VisionBonnet (the plug-in board that contains the MA2450 and associated circuitry) can run its neural network algorithms and process imagery without the need for Internet connectivity. Sometimes you do not want an IoT device to require connectivity for processing data. No requirement for constant connectivity means some smarts are in the IoT device itself, which removes potentially unacceptable performance delays and the security risks associated with images traversing the Internet. This concept is sometimes referred to as "edge computing" or "fog computing." Going to the cloud for processing resources is not always desirable. Pushing computation down as close to the sensor node as possible is preferable in many cases, especially for applications such as autonomous vehicles. In this way, latency is improved and connectivity is reserved for things like downloading an improved training model.
The current iteration of the AIY Vision Kit is built specifically for the Raspberry Pi Zero W Linux-based single-board computer. The kit supports two deep machine learning frameworks: Google's own TensorFlow and Caffe from Berkeley's AI Research Lab. It can handle image processing at 30 frames per second.

From a practical perspective, the AIY Vision Kit offers starry-eyed startups unprecedented capability in an extremely affordable and hackable package. The VisionBonnet comes with three pre-built neural network models. The first can both detect faces and determine the emotion emanating from each face. The next relies on MobileNet models (a suite of open source computer vision models built for TensorFlow running on resource-constrained embedded devices) to recognize thousands of different objects. Lastly, there is a model that can differentiate between humans, dogs, and cats. Google has released the TensorFlow source code for the models and a compiler for the intrepid innovators who wish to tweak the models or develop their own.

From the product prototyping and development perspective, the architecture of the AIY Vision Kit allows for very powerful yet straightforward interfacing. On the hardware side, the kit provides four general-purpose input/output (GPIO) pins for interacting with external sensors and actuators. And on the software side, the VisionBonnet can be programmed using the increasingly popular Python language, as in the sketch below.
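For a flavor of that interface, here is a short sketch patterned on the face-detection example Google ships in its aiy-projects-python library. The module paths and helper names come from that library as published and may change between releases; the code only runs on the assembled kit with the VisionBonnet attached.

```python
# On-device face detection on the AIY Vision Kit, following the pattern of
# Google's published examples. No cloud connection is involved: inference
# runs on the VisionBonnet's Movidius VPU.
from picamera import PiCamera
from aiy.vision.inference import CameraInference
from aiy.vision.models import face_detection

with PiCamera(sensor_mode=4, framerate=30) as camera:
    with CameraInference(face_detection.model()) as inference:
        for result in inference.run():
            faces = face_detection.get_faces(result)
            if faces:
                # joy_score is the model's happiness estimate, 0.0 to 1.0.
                print("faces: %d, joy: %.2f"
                      % (len(faces), max(f.joy_score for f in faces)))
```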
Over time, we have seen technology evolve rapidly due in part to falling prices in conjunction with increasing capability. Part of that Moore's Law story is also making affordable tech accessible to those who work with low-end, low-cost embedded hardware such as Arduino and Raspberry Pi. Affordable equates to accessible. Accessibility leads to opportunity. Rock on.
July - 2017
- I2C → I3C Get on the Fast(er) Bus
I2C is 35 years old. Finally, we have a new serial bus communication interface that combines the best of I2C and SPI and is ideal for sensors: I3C, which stands for "Improved Inter-Integrated Circuit" (and is pronounced "Eye-three-See"). I3C's maximum effective data rate is 33.3 Mbps at a 12.5 MHz clock. It sports only two signal lines, and legacy I2C devices can co-exist on the same bus with some limitations. I3C supports dynamic addressing, plus static addressing for legacy I2C devices. I3C includes in-band interrupt support and hot-join support, and multi-master capability is retained, with a clear master ownership and handover mechanism defined. In practice, I3C is limited to a realistic dozen or so devices on a bus.
We are drowning in sensors these days. It's great to gather information, but sometimes you run out of GPIO. How many times have you used I2C or SPI because you ran out of GPIO or really needed to reduce pin count? Sure, I2C isn't as fast as other communications buses, but sometimes you don't need fast; you need more room, and so you climb on the bus with that extra sensor.
I2C was introduced in 1982 by Philips (now NXP) as a means for MCUs to talk to I/O over a simple serial communications bus. I2C is probably older than some readers, but it's still in wide use because it's handy, uncomplicated, and no longer requires licensing fees. I2C is simple and uses two bidirectional, open-drain wires; a typical register read looks like the sketch below.
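As a minimal illustration (not tied to any particular part), this Python sketch reads one register over I2C on a Linux host using the smbus2 package. The 0x48 address and 0x00 register are hypothetical stand-ins for whatever your sensor's datasheet specifies.

```python
from smbus2 import SMBus

# Open bus 1 (/dev/i2c-1, the usual choice on a Raspberry Pi).
with SMBus(1) as bus:
    # Read one byte from register 0x00 of the device at address 0x48.
    raw = bus.read_byte_data(0x48, 0x00)
    print("raw sensor byte:", raw)
```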
For those asking why we have to have yet another standard: I3C is backwards compatible with I2C. Nothing has to change if you don't want it to, but I3C is more power efficient, and addressing is no longer prone to collisions just because two devices happen to ship with the same address (see the conceptual sketch below). SMBus is also a lightweight communication alternative to I2C and was created in 1994. SMBus is still used today and is compatible with I2C, but complete compatibility between the two buses is certain only if you're working below 100 kHz.
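To see why dynamic addressing sidesteps collisions, here is a conceptual Python model of a master enumerating devices and handing each one a unique address. This illustrates the idea only; it is not the MIPI-specified assignment procedure (which I3C performs with the ENTDAA command).

```python
def assign_dynamic_addresses(device_ids, first_addr=0x08):
    """Give each joining device (keyed by a unique ID) its own bus address.

    In I3C, each device presents a unique provisional ID during
    enumeration, so the master can always hand out distinct addresses.
    """
    table = {}
    addr = first_addr
    for dev in sorted(device_ids):
        table[dev] = addr
        addr += 1
    return table

# Two identical parts that would clash at a factory-fixed I2C address
# coexist here because the master assigns each its own address:
print(assign_dynamic_addresses(["accelA-0001", "accelA-0002", "gyro-0001"]))
```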
[caption id="attachment_147" align="alignleft" width="300" caption="The June 2017 I3C Plugfest was held in Atlanta, Georgia. "][/caption]
In 2014, the MIPI Alliance Sensor Working Group set off to create the I3C sensor interface, starting by surveying the MEMS Industry Group to help identify the "pain points" experienced with I2C and SPI. The I3C standard was released in late 2016 and is still under development as the Sensor Working Group works out the kinks in a series of Plugfests. Plugfests are gatherings of adopters who have developed prototypes for interacting with other adopters' devices using the same standard. Plugfests are not only a good way to try out the technology, but also to see whether participants interpret a new specification as intended, and to change the spec if developers have better ideas or have run into difficulty implementing certain areas.
I3C is well suited to sensors and targets the Internet of Things (IoT) and automotive markets. The I3C standard is available at the MIPI Alliance website. The working group is aiming to release the next revision of the I3C standard in early 2018.
June - 2016
- Overcoming the Traditional Bottlenecks in Transistor-level and Cell-level Custom Design Flows
Typically, there are three phases in custom design flows where time to final layout has been a bottleneck: area estimation, the layout/simulation cycle, and final layout. Pulsic, a provider of physical design tools for precision design automation, will demonstrate solutions to overcome those bottlenecks at DAC 2016: Pulsic Animate™ and Pulsic Unity™ Chip Planner.

Pulsic Animate surpasses traditional approaches to automating transistor-level IC design, which have attempted to improve on portions of the design flow but have not managed to generate near "manual-quality" layout without significant user intervention. Animate is the first complete automated layout system built from the ground up for transistor-level analog and custom-digital design. Animate overcomes layout bottlenecks by delivering an easy-to-use flow that reads in a schematic, automatically extracts design constraints, and employs patent-pending, multi-threaded PolyMorphic™ technologies to very quickly produce abstracted "Blueprint" representations of the designs. Many different "Blueprints" are generated in minutes or seconds. These layout "Blueprints" are guaranteed to contain no opens or shorts and can therefore be extracted to produce accurate parasitics that can be fed back into simulation. Animate's constraint recognition capability automatically generates constraints based on netlist topology analysis, eliminating the need for manual constraint entry and management. Layout "Blueprints" can be saved to an OpenAccess database and modified by the user to produce high-quality, fully placed and routed, detailed layouts in a fraction of the time taken by a traditional approach. For more detailed information on Pulsic's automated analog layout solution, see http://www.pulsic.com/Animate/ For an exclusive demonstration of Pulsic's Animate while at DAC, click here.

"For too long now, manual layout has persisted in analog design because previous efforts at automation could not approach the level of quality offered by manual designers," said Mark Williams, co-founder and CEO, Pulsic. "Animate provides value in multiple areas across the design flow. Since the introduction of Animate last year, we have been receiving accolades from both circuit designers and layout engineers, who are able to do accurate simulations early in the design process and to get to faster layout closure with high quality of results."

Pulsic Unity Chip Planner is the first and only hierarchical, top-down and bottom-up floorplanner built for cell-level custom design. Although automated floorplanners are a part of standard digital design flows, they do not address all the needs of custom designers. Custom designers face unique challenges, such as large hard-IP blocks, analog content, and few metal layers available for routing. At leading-edge nodes (28 nm and below), process rules constrain designs in new ways, and the extreme aspect ratios of the routed wires and highly resistive metals make understanding parasitics critical. Unity Chip Planner enables custom design teams to manage growing complexity while accelerating design closure and improving design quality. By providing a high level of automation, Unity Chip Planner gives accurate results quickly and enables custom design teams to respond to netlist changes quickly and easily.
In addition, Unity Chip Planner can produce early estimated parasitic extraction data from an unrouted, or partially routed, floorplan, allowing multiple architectures to be explored and validated without time-consuming detailed implementation of the layout. Unity Chip Planner provides all the necessary tools and technologies within a fully integrated floorplanning environment. The guided flow offered in Unity Chip Planner helps ensure faster design closure with successful results every time. For more detailed information, please visit: http://www.pulsic.com/products/pulsic-planning-solution/unity-chip-planner/ For an exclusive demonstration of Pulsic's Unity Chip Planner while at DAC, click here.
- Aldec Extends Spectrum of Verification Tools for Use in Digital ASIC Designs
Aldec, Inc., a pioneer in mixed HDL language simulation and hardware-assisted verification solutions for digital system designs, announced that its verification tools suite, used for over 30 years in complex FPGA design verification, is available for ASIC chip design. Covering the full digital verification flow from design and test planning through simulation, emulation, and prototyping, Aldec's popular verification tools help designers in small and large fabless companies ensure that complex digital ASIC designs meet all functional and timing requirements before being committed to a mask set, helping to protect against the high cost of a mask respin. Aldec's verification suite includes cost-effective, high-performance tools for HDL verification across the full range of industries, including computing, storage, communications, and the Internet of Things, as well as safety-critical applications within aerospace, medical, automotive, and industrial systems. "Engineers need a reliable verification partner that suits their budgets while still providing a high level of support," said Dr. Stanley Hyduke, Aldec Founder and CEO. "To fill this need, we at Aldec have extended our spectrum of verification tools for use in digital ASIC designs." Aldec tools support all phases of the digital flow, from design planning through to prototyping for software verification.
- Spec-TRACER™ manages requirements/specifications, providing capture, mapping to tests, and full traceability for compliance with critical-systems standards like ISO 26262 for automotive, IEC 61508 for industrial, and DO-254 for avionics.
- ALINT-PRO™ provides VHDL and Verilog code analysis (linting) as well as clock-domain-crossing (CDC) verification. Aldec's libraries contain well-established criteria for basic coding, CDC, DO-254, STARC, and RMM standards.
- Aldec's high-performance Riviera-PRO™ simulator speeds verification through both fast incremental compilation and fast multi-core simulation execution. It accepts code written in VHDL, Verilog, SystemVerilog, SystemC, and mixtures of these languages. It supports the latest verification libraries, such as OVM/UVM, along with many specialized graphical UVM debugging tools. It offers code and functional coverage capabilities with coverage analysis tools to support a metric-driven verification approach. It can handle the multi-million-gate designs typical of ASIC projects.
- The HES-7™ board and HES-DVM™ software combine to provide a hardware emulator with a SCE-MI 2 interface. HES-DVM manages design setup, design compiler integration, and debug instrumentation. It partitions large designs across multiple HES-7 boards and supports several host communication schemes for different emulation modes:
- PLI/VHPI for bit-level acceleration
- SCE-MI 2 and DPI-C for function-based transaction-level and UVM verification
- SCE-MI 2 and TLM for macro-based hybrid emulation with a virtual platform running processor models or a SystemC testbench
- In-circuit emulation with speed adapters for external data streams and interfaces
Aldec hardware debugging provides 100% visibility into the design at the RTL level.
- The CTS™ platform enables at-speed module execution for catching bugs that are evident only at high speeds.
- The HES-7™ boards give software programmers a hardware prototype for high-speed testing of software against the hardware design.
- The verification tools are supported by a broad set of verification IP (VIP) libraries that save time and effort while ensuring thorough design checkout.
