Embedded AI Processors: The Cambrian Explosion

Half a billion years ago, something remarkable occurred: an astonishing, sudden increase in the diversity of life. Paleontologists call it the Cambrian Explosion, and many of the animals on the planet today trace their lineage back to that event.

A similar thing is happening in processors for embedded vision and artificial intelligence (AI) today, and nowhere is it more evident than at the Embedded Vision Summit, an in-person event to be held in Santa Clara, California, May 16-19. The Summit focuses on practical know-how for product creators incorporating AI and vision into their products. These products demand AI processors that balance conflicting needs for high performance, low power, and low cost. The staggering number of embedded AI chips that will be on display at the Summit underscores the industry's response to this demand. While the sheer number of processors targeting computer vision and machine learning (ML) can be overwhelming, there are some natural groupings that make the field easier to comprehend. Here are some themes we're seeing.

First, some processor suppliers are thinking about how best to serve applications that simultaneously apply ML to data from diverse sensor types – for example, audio and video. Synaptics' Katana low-power processor, for instance, fuses inputs from a variety of sensors, including vision, sound, and environmental. Xperi's talk on smart toys of the future touches on this as well.

Second, a subset of processor suppliers are focused on driving power and cost down to the absolute minimum. This is interesting because it enables new applications. For example, Cadence will be presenting on additions to its Tensilica processor portfolio that enable always-on AI applications. Arm will be presenting low-power vision and ML use cases based on its Cortex-M series of processors. And Qualcomm will be covering tools for creating low-power computer vision apps on its Snapdragon family.

Third, although many processor suppliers are focused mainly or exclusively on ML, a few are addressing other types of algorithms typically used in conjunction with deep neural networks, such as classical computer vision and image processing. A good example is quadric, whose new q16 processor is claimed to excel at a wide range of algorithms, including both ML and conventional computer vision.

Finally, a completely new species seems to be coming to the fore: neuromorphic processors. Neuromorphic computing describes approaches that mimic how the brain processes information. For instance, biological vision systems respond to changes (events) in the field of view, as opposed to classical computer vision approaches that typically capture and process all the pixels in a scene at a fixed frame rate that has no relation to the source of the visual information. The Summit's keynote talk, "Event-Based Neuromorphic Perception and Computation: The Future of Sensing and AI" by Prof. Ryad Benosman, will give an overview of the benefits to be gained from neuromorphic approaches. Opteran will be presenting on its neuromorphic processing approach to enable vastly improved vision and autonomy, the design of which was inspired by insect brains.
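To make the contrast concrete, here is a minimal, purely illustrative sketch (not any vendor's actual pipeline) of the difference between frame-based capture and event-based sensing: the frame-based path touches every pixel at a fixed rate, while the event-based path emits an (x, y, time, polarity) event only where intensity changes beyond a threshold.

```python
def frame_based_pixel_reads(frames):
    """Fixed-rate capture: every pixel of every frame is processed,
    regardless of whether anything in the scene changed."""
    return sum(len(frame) * len(frame[0]) for frame in frames)

def event_based(frames, threshold=10):
    """Emit an (x, y, t, polarity) event only for pixels whose
    intensity changed by at least `threshold` since the last frame."""
    events = []
    for t in range(1, len(frames)):
        prev, curr = frames[t - 1], frames[t]
        for y, row in enumerate(curr):
            for x, value in enumerate(row):
                diff = value - prev[y][x]
                if abs(diff) >= threshold:
                    events.append((x, y, t, 1 if diff > 0 else -1))
    return events

# A mostly static 4x4 scene: one pixel brightens at t=1, then holds steady.
frames = [[[0] * 4 for _ in range(4)] for _ in range(3)]
frames[1][2][2] = 50
frames[2][2][2] = 50

print(frame_based_pixel_reads(frames))  # 48 pixel reads, activity or not
print(event_based(frames))              # [(2, 2, 1, 1)] — a single event
```

In a static scene the event stream is nearly empty, which is why event-based approaches can offer large savings in bandwidth and power; real neuromorphic sensors generate such events in analog hardware rather than by differencing frames as this toy sketch does.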

Whatever your application and whatever your requirements, somewhere out there is an embedded AI or vision processor that's the best fit for you. At the Summit, you can learn about many of them and speak with the innovative companies developing them. Come check them out, and be sure to check back in ten years – when we will see how many of 2032's AI processors trace their lineage to this modern-day Cambrian Explosion!