embedded award 2023: Embedded vision nominees
2/17/2023 | Embedded Vision | Expert knowledge | embedded world

The data volumes and data rates are very high for embedded vision systems. Consequently, the demands on the systems, their components and their connections are much higher than for classic sensor systems. As a result, embedded vision is driving the entire industry. The nominees in this category are also among the drivers of the industry.

Cameras in an embedded vision system: embedded vision is driving the entire industry
ADTF3175 Time-of-Flight Depth Sensor

A one megapixel time-of-flight module, an ultra-compact 3D laser profile sensor and a miniature near-eye display module


ADTF3175 – One Megapixel Time-of-Flight Module

Exhibitor: Analog Devices
Hall/Booth: 4A/4A-360

Across many industries, automation is becoming a necessity as available workforce resources diminish and the onshoring of manufacturing and logistics increases. Autonomous automation requires a combination of advanced sensor technology and post-processing (machine learning / AI) to become a reality and to address this global surge in demand for automation.

The ADTF3175 Time-of-Flight Depth Sensor advances the accuracy and quality of depth imaging so that autonomous machines can operate more efficiently and at a faster rate. As machines transition to autonomous modes of operation and take on more and more of the decision-making process, their need to perceive their surroundings accurately and precisely in three dimensions grows.

Today, depth sensing is typically accomplished using triangulation methods (i.e. stereo 2D cameras or structured light), but these have a number of limitations compared with true 3D depth measurement. The ADTF3175 resolves 3D images with an accuracy of +/-3 mm over its full depth range of 20 cm up to approximately 4 m.
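The iToF principle behind this kind of measurement can be sketched in a few lines: depth is recovered from the phase shift of a modulated infrared signal, and the modulation frequency sets the maximum unambiguous range. The 37.5 MHz figure below is an illustrative assumption chosen to reproduce the roughly 4 m range, not an ADTF3175 specification.

```python
# Sketch of the indirect time-of-flight (iToF) principle: depth is
# derived from the phase shift of a modulated infrared signal.
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(phase_rad: float, f_mod_hz: float) -> float:
    """Depth in meters from the measured phase shift (0 .. 2*pi)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum depth before the phase wraps around (aliasing)."""
    return C / (2 * f_mod_hz)

# Illustrative: a 37.5 MHz modulation frequency yields an unambiguous
# range of roughly 4 m, matching the stated 20 cm .. ~4 m working range.
max_range = unambiguous_range(37.5e6)
```

Real iToF sensors typically combine several modulation frequencies to extend the unambiguous range while keeping the depth resolution of the higher frequency.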

Autonomous mobile robots (AMRs) are another example: they need to see all around them and to navigate safely in environments that contain obstacles and span a wide range of illumination conditions. Collaborative robots (cobots) must work alongside humans, where safety is paramount; safety that can only come with the accurate, low-latency imaging of the shared space that the ADTF3175 enables.

The ADTF3175 is the first high-resolution, industrial-quality, indirect Time-of-Flight (ToF) module for 3D depth sensing and vision systems. The module is a turnkey, scalable system that provides high accuracy and robustness over variable environmental conditions.

Moreover, the module abstracts away the burden of optical and electromechanical system design and delivers a fully engineered and calibrated depth system, allowing designers to concentrate on bringing new 3D sensing and vision systems to market.

Time-of-Flight technology is complex for typical customers to implement. It requires capabilities in optical, mechanical, and high-speed electrical design, combined with sophisticated system-engineering expertise and supply-chain breadth around chip-on-board assembly, optics, and more. This complexity has limited access to an otherwise groundbreaking technology.

Custom-engineered, high-quality depth cameras tend to be expensive. The goal of the ADTF3175 is to democratize access to high-performance ToF for the broadest base of users, while bringing to bear Analog Devices' expertise in the industrial market with respect to technical challenges, quality, and product lifetime.

Based on the ADSD3100, a 1 megapixel CMOS indirect Time-of-Flight (iToF) imager, the ADTF3175 also integrates an infrared illumination source (optics, laser, and laser driver) and a receive path with a lens and an optical band-pass filter.


Vision Components VC picoSmart 3D

Exhibitor: Vision Components GmbH
Hall/Booth: 2/2-450

VC picoSmart 3D was developed for the fast and simple design of ultra-compact, low-cost 3D laser profile sensors. The sensor, measuring only 100 x 40 x 41 mm, combines the smallest complete embedded vision system on a circuit board with an ideally matched line laser module. Despite its small size, the integrated processor with operating system enables real-time image processing directly on board the sensor.

The results are shown on an integrated display, which can also be used to set and check the measurements. Possible applications of the VC picoSmart 3D include position & object recognition, adhesive bead inspection, angle, gap & profile measurement, etc.

The 3D profile sensor is available as a freely programmable version, with a customized housing, or adapted for individual applications. This makes it an ideal basis for the rapid development of low-cost OEM 3D sensors. VC picoSmart 3D is developed and produced in Germany.

VC picoSmart 3D contains all components for versatile triangulation tasks, from the laser module and Scheimpflug adapter to the 1 megapixel image sensor and the components for real-time image processing, including an FPGA module, an FPU processor, a real-time OS, memory, and an FPC connector for an interface board.
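The laser triangulation principle such profile sensors rely on can be sketched with a simplified pinhole model: a height change on the surface shifts the projected laser line on the image sensor. All geometry values below (pixel pitch, focal length, working distance, triangulation angle) are hypothetical illustrations, not VC picoSmart 3D calibration data.

```python
import math

def height_from_shift(pixel_shift: float, pixel_pitch_mm: float,
                      focal_length_mm: float, working_dist_mm: float,
                      triangulation_angle_deg: float) -> float:
    """Surface height change (mm) from the laser line's shift on the imager.

    A height change moves the laser line; seen from a camera mounted at
    the triangulation angle, the line shifts on the image sensor in
    proportion to the optical magnification.
    """
    magnification = focal_length_mm / working_dist_mm
    shift_on_sensor_mm = pixel_shift * pixel_pitch_mm
    return shift_on_sensor_mm / (
        magnification * math.sin(math.radians(triangulation_angle_deg)))

# Illustrative numbers: 3 um pixels, 8 mm lens, 100 mm working distance,
# 30 degree triangulation angle, 10-pixel line shift -> 0.75 mm height.
height_mm = height_from_shift(10, 0.003, 8.0, 100.0, 30.0)
```

The Scheimpflug adapter mentioned above tilts the image plane so the entire laser plane stays in focus; this simplified pinhole model ignores that correction.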

The tiny sensor is the ideal basis for the fast and cost-effective development of individual OEM 3D sensors. All components are available long-term and optimized for mass production. The integrated ultra-compact line laser module, developed especially by Vision Components, features a blue high-power class 2 laser with a wavelength of 450 nm and 130 mW average output power.

Due to the Ambient Light Suppression Technology developed by VC, which combines an extremely powerful laser with very short shutter times, the VC picoSmart 3D laser profiler enables measurements even at ambient light levels of up to 100,000 lux. To our knowledge, VC picoSmart 3D is the first and only freely programmable 3D laser profiler with such a small design, a highly competitive price-performance ratio, and onboard data processing that has been developed especially for OEM applications.

It is optimized for mass production and low series costs, can be adapted to OEM customers’ needs and fulfills all requirements for industrial-grade quality including long-term availability of all components.

embedded award Nominee

Direct viewing miniature near-eye display module

Exhibitor: WiseChip Semiconductor Inc.
Hall/Booth: 1/1-153

The direct-view miniature near-eye display system developed by WiseChip improves on the conventional optical engine mechanism: the device has only 5% light loss. It is not susceptible to interference from ambient light, and the panel needs a brightness of only 100 nits, which is sufficient even under strong ambient light.
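The arithmetic behind the 5% light-loss claim is simple: the luminance reaching the eye is the panel luminance scaled by the optical engine's efficiency. The 90% comparison figure below is a hypothetical value for a conventional engine, used only for illustration, not a WiseChip specification.

```python
def eye_luminance(panel_nits: float, loss_percent: float) -> float:
    """Luminance reaching the eye after optical-engine light loss."""
    return panel_nits * (100.0 - loss_percent) / 100.0

# WiseChip's stated 5% loss: 95 of 100 panel nits reach the eye.
low_loss = eye_luminance(100, 5)
# Hypothetical conventional engine with 90% loss: only 10 nits remain,
# forcing a much brighter (and hungrier) panel for the same visibility.
high_loss = eye_luminance(100, 90)
```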

Furthermore, because the device's energy consumption, light loss, and weight are reduced, it supports all-day outdoor operation without tiring the user.

WiseChip's direct-view miniature near-eye display module combines micro-lens, micro-display, and thin-film encapsulation technologies. This combination greatly reduces the weight and light loss of the optical engine, ambient light interference, power consumption, device cost, the thickness of the display panel, and the panel border.

In addition, the panel is shrunk through miniaturization technology to roughly the size of a fingernail. The combination of micro-display and micro-lens forms the image in the user's eye. Because the product is so small, it can be placed in the non-primary eye area, so that users see it out of the corner of their eye. This design does not obscure the user's view of the environment or create a hazard during use.

The direct-view miniature near-eye display module is well suited to outdoor or high-intensity light environments. It can give users information during outdoor sports, such as heart rate, rhythm, temperature, and humidity, allowing athletes to focus more on their movement.

It can also be used in manufacturing, for example to help engineers confirm instrument readings and procedure steps, and in vehicle, aerospace, and military applications, such as advanced driver-assistance information, battlefield combat data, and mission planning support.

In low-ambient-light indoor environments, the product can also provide auxiliary information for consumer behavior, such as price and style.


embedded award Nominee