Safe driving takes centre stage in AutoTronics
Driven by the 2018 E-NCAP and C-NCAP test requirements, autonomous emergency braking (AEB) has become an almost compulsory function for every vehicle.
AEB requires one or more kinds of sensors, depending on the level of AEB performance targeted. Haitec’s system integrates signals from those sensors to track objects with its own fusion engine. After running the result through a vehicle dynamics model, “we can trigger the brake system at the most accurate timing for AEB,” the company explained. “By doing so, the false-positive rate can be enormously reduced. Additionally, by adding suitable sensors and making the appropriate software changes, the system can be upgraded to fulfill other ADAS functions.”
Figure 1: Haitec's hardware-independent AEB system can be upgraded to other ADAS functions.
The selling point of Haitec’s AEB solution, however, is that “it’s a highly generic system,” explained Chihhan Chang, image processing department manager at Haitec’s Advanced Engineering Division. “We designed it to work with various sensors and brake systems across different car OEMs’ models, with a simple calibration process.”
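For illustration only (this is not Haitec’s implementation), trigger logic of the kind described above can be sketched as follows: fused object tracks are checked against a simple constant-deceleration vehicle model, and the brake is commanded only when the remaining gap or time-to-collision demands it. The track fields, thresholds, and helper functions are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A fused object track in the ego frame (metres, metres/second)."""
    distance: float        # longitudinal gap to the object
    closing_speed: float   # positive when the object is getting closer

def stopping_distance(ego_speed: float, max_decel: float = 6.0) -> float:
    """Distance needed to stop, from a constant-deceleration vehicle model."""
    return ego_speed ** 2 / (2.0 * max_decel)

def should_trigger_aeb(track: Track, ego_speed: float,
                       reaction_time: float = 0.3) -> bool:
    """Trigger the brake only when braking is actually required.

    Conditioning the trigger on both the remaining gap and a short
    time-to-collision window keeps it late, which is one way such a
    system could hold down false positives.
    """
    if track.closing_speed <= 0.0:        # object pulling away: do nothing
        return False
    ttc = track.distance / track.closing_speed
    needed = reaction_time * ego_speed + stopping_distance(ego_speed)
    return track.distance <= needed or ttc < 0.6

# Example: ego at 20 m/s, stationary obstacle 12 m ahead -> trigger braking
print(should_trigger_aeb(Track(distance=12.0, closing_speed=20.0), ego_speed=20.0))
```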
Driving simulator
In testing highly automated vehicles, nothing is more important than vehicle simulators, noted Victor Yeh, deputy manager of the mechatronics and system integration technology group in the R&D Division at the Automotive Research & Testing Center (ARTC).
The driving simulator integrates a virtual environment, sensor models, vehicle dynamics models, driver interfaces, and real-time simulation with a variety of I/O interfaces to establish a real-time co-simulation platform. Road scenarios, sensor models, and simulated signals can be arranged to match test requirements, and the ADAS algorithms or ECUs under development can be verified and validated on the platform.
Figure 2: Victor Yeh demonstrates how to use ARTC's driving simulator.
According to ARTC, the driving simulator can run simulations at different levels, such as hardware-in-the-loop (HIL), to support algorithm verification, functional tests, and fault-injection tests before on-vehicle testing. Interactions between drivers and the systems can also be evaluated with driver-in-the-loop simulation.
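A minimal sketch of such a closed co-simulation loop is shown below, assuming a scripted scenario, an idealised sensor model, a point-mass vehicle model, and a stand-in for the algorithm under test; none of the class names or parameters come from ARTC’s platform.

```python
class SensorModel:
    """Idealised range sensor: reports the gap to a scripted obstacle."""
    def measure(self, ego_pos: float, obstacle_pos: float) -> float:
        return max(obstacle_pos - ego_pos, 0.0)

class VehicleModel:
    """Point-mass ego vehicle integrated with a fixed time step."""
    def __init__(self, speed: float):
        self.pos, self.speed = 0.0, speed
    def step(self, accel: float, dt: float) -> None:
        self.speed = max(self.speed + accel * dt, 0.0)
        self.pos += self.speed * dt

def algorithm_under_test(gap: float, speed: float) -> float:
    """Stand-in for the ADAS algorithm or ECU being verified:
    command full braking once the gap nears the stopping distance."""
    return -6.0 if gap < speed ** 2 / (2.0 * 6.0) + 5.0 else 0.0

# Closed loop: scenario -> sensor model -> algorithm -> vehicle dynamics
ego, sensor, obstacle_pos, dt = VehicleModel(speed=15.0), SensorModel(), 60.0, 0.01
for _ in range(int(10.0 / dt)):                 # 10 s of simulated time
    gap = sensor.measure(ego.pos, obstacle_pos)
    ego.step(algorithm_under_test(gap, ego.speed), dt)
print(f"final gap to obstacle: {sensor.measure(ego.pos, obstacle_pos):.1f} m")
```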
Vision-based detection system
Putting bounding boxes around objects that need to be detected is something everyone is doing these days to advance their ADAS systems.
ARTC demonstrated its vision-based system, in which the team uses only a camera. “We use a deep learning method, a CNN running on an embedded system, to detect pedestrians and bicycles with a pinhole camera. A real-time forward collision warning system is applied in advanced driver assistance systems,” explained Han-Wen Huang, who works in the applied sensor technology group at ARTC.
Figure 3: The vision-based system provides real-time forward collision warning.
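As a rough sketch of how a detector’s output can feed such a warning (the CNN itself is stubbed out here), the height of a bounding box seen through a pinhole camera can be converted into a range estimate and a time-to-collision; the object height, focal length, and threshold below are assumed values, not ARTC’s.

```python
def distance_from_bbox(bbox_height_px: float,
                       object_height_m: float = 1.7,
                       focal_length_px: float = 1000.0) -> float:
    """Pinhole-camera range estimate: distance = f * H / h."""
    return focal_length_px * object_height_m / bbox_height_px

def forward_collision_warning(bbox_height_px: float,
                              prev_bbox_height_px: float,
                              dt: float,
                              ttc_threshold_s: float = 2.0) -> bool:
    """Warn when the estimated time-to-collision falls below a threshold.

    The bounding boxes would come from the CNN detector; only the
    geometry that follows detection is sketched here.
    """
    d_now = distance_from_bbox(bbox_height_px)
    d_prev = distance_from_bbox(prev_bbox_height_px)
    closing_speed = (d_prev - d_now) / dt
    if closing_speed <= 0.0:                     # not approaching
        return False
    return d_now / closing_speed < ttc_threshold_s

# Example: a pedestrian's bounding box grows from 100 px to 130 px in 0.5 s
print(forward_collision_warning(130.0, 100.0, dt=0.5))
```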
Of course, CNN, deep learning...
But isn’t that what everyone else in the ADAS space is already doing?
Noting that Taipei’s road traffic is crowded with motorcycles, Huang said, “We’ve collected far more data about bicycles and motorcycles weaving through our streets. Our system is significantly more effective at detecting motorbikes.”