AI-SEE [Artificial Intelligence enhancing vehicle vision in low visibility conditions]
The automotive industry is facing one of the most demanding challenges in its history: how to make automated travel safe in all conditions. Great advances towards automation have been made, with new vehicles increasingly equipped with advanced driver assistance systems (ADAS). The biggest remaining barrier to full automation is safe driving in poor weather and low visibility. The AI-SEE project aims to build a novel, robust sensing system supported by Artificial Intelligence (AI) that will enable automated travel in varied traffic, lighting and weather conditions. It will extend the Operational Design Domain (ODD) of automated vehicles (i.e. the scope of what they can do), taking the technology from SAE level 3 (conditional automation) to level 4 (high automation), where vehicles drive themselves with no human interaction in most circumstances.
With advanced and autonomous vehicles entering the market, solving problems linked to illumination and weather conditions such as rain, fog and snow is key to ensuring a safe environment for drivers, passengers and pedestrians. However, moving from level 3 to level 4 requires solutions to four key challenges: (i) mass production of powerful computing platforms, (ii) improved sensing capabilities and lower-cost sensors, (iii) the necessary technical standards, and (iv) infrastructure. AI-SEE focuses primarily on the second challenge by increasing the environmental and situational awareness of vehicles.
Humans ‘see’ by combining stored memories and sensory input to interpret events and anticipate upcoming scenarios. Today’s automated vehicles cannot yet perform this inferential thinking, nor communicate in real time with their environment. For automated vehicles to drive without human intervention, the information content from current sensors needs to be enhanced significantly. But this will generate ever larger volumes of data, transmitted at very high rates, which, together with the additional sensors, will quickly exceed the limits of in-vehicle storage, computing and energy resources.
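To make the scale concrete, here is a back-of-envelope sketch (in Python) of the raw data a multi-sensor suite can produce. The sensor counts and per-sensor rates below are illustrative assumptions, not AI-SEE figures.

```python
# Back-of-envelope estimate of raw data rates for a hypothetical
# level-4 sensor suite. All sensor specs below are illustrative
# assumptions, not figures from the AI-SEE project.

SENSORS = {
    # name: (raw data rate in megabytes per second, number of units)
    "camera (1080p, 30 fps, uncompressed)": (1920 * 1080 * 3 * 30 / 1e6, 6),
    "lidar (high-resolution)":              (70.0, 2),
    "radar":                                (1.0, 5),
    "thermal camera":                       (15.0, 1),
}

total_mb_s = sum(rate * count for rate, count in SENSORS.values())
print(f"Aggregate raw sensor stream: ~{total_mb_s:,.0f} MB/s")
print(f"Per hour of driving:         ~{total_mb_s * 3600 / 1e3:,.1f} GB")
```

Even under these rough assumptions the raw stream runs to several terabytes per hour of driving, which illustrates why onboard storage, computing and energy resources are so quickly exhausted.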
Together, the high number of sensors needed for 360-degree environment perception and situation awareness, and the high cost of the LiDAR (Light Detection and Ranging) sensors used to measure distances to objects, represent significant barriers to the wider roll-out of automated driving.
Taking technologies to the next level
AI-SEE will address these challenges by combining complex hardware and software development to create automotive perception systems that go beyond today’s state of the art. Its goal is to deliver reliable, secure and trustworthy sensors and software through self-diagnosis, adaptation and robustness.
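As one way to picture what self-diagnosis and adaptation could mean in practice, the sketch below flags a LiDAR whose point returns drop in bad weather. The return-ratio metric and thresholds are hypothetical and only illustrate the general idea, not the project's actual diagnostics.

```python
# Minimal sketch of the self-diagnosis idea: monitor a simple health
# indicator per sensor and flag degradation so downstream software can
# adapt. The "return ratio" metric and thresholds are assumptions chosen
# for illustration.

from dataclasses import dataclass

@dataclass
class LidarFrame:
    expected_points: int   # points the sensor should return per sweep
    received_points: int   # points actually returned this sweep

def diagnose_lidar(frame: LidarFrame, min_ratio: float = 0.6) -> str:
    """Classify sensor health from the fraction of returned points.

    Heavy fog, rain or snow absorbs and scatters laser pulses, so a low
    return ratio is treated here as a sign of weather-induced degradation.
    """
    ratio = frame.received_points / max(frame.expected_points, 1)
    if ratio >= min_ratio:
        return "nominal"
    elif ratio >= 0.3:
        return "degraded"        # e.g. reduce trust, adapt gain or exposure
    else:
        return "unreliable"      # e.g. fall back to radar/thermal sensing

print(diagnose_lidar(LidarFrame(expected_points=120_000, received_points=45_000)))
# -> "degraded"
```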
The AI-SEE concept is built on four main blocks:
- A 24/365 high-resolution adaptive all-weather sensor suite
- An AI platform for predictive detection of prevailing environmental conditions, including signal enhancement and sensor adaptation
- Smart sensor data fusion to create the 24/365 adaptive all-weather robust perception system (a minimal fusion sketch follows this list)
- A demonstrator and system validation plan, with testing carried out in simulations and in real-world environments in northern Europe
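As a minimal illustration of how the second and third blocks could interact, the sketch below re-weights per-sensor detection confidences according to an estimated weather condition before fusing them. The condition labels, weights and sensor names are assumptions chosen for illustration, not the project's actual fusion design.

```python
# Weather-adaptive sensor fusion sketch: per-sensor detection confidences
# are re-weighted according to an estimated weather condition before being
# combined. All values below are illustrative assumptions.

# How much each sensor is trusted under each estimated condition.
WEATHER_WEIGHTS = {
    "clear": {"camera": 1.0, "lidar": 1.0, "radar": 0.8, "thermal": 0.6},
    "fog":   {"camera": 0.3, "lidar": 0.4, "radar": 1.0, "thermal": 0.9},
    "snow":  {"camera": 0.5, "lidar": 0.3, "radar": 0.9, "thermal": 0.8},
}

def fuse_detection(confidences: dict[str, float], condition: str) -> float:
    """Weighted average of per-sensor confidences for one candidate object."""
    weights = WEATHER_WEIGHTS[condition]
    num = sum(weights[s] * c for s, c in confidences.items())
    den = sum(weights[s] for s in confidences)
    return num / den

# Example: in fog, the camera's weak detection counts for little, while
# radar and thermal dominate the fused score.
score = fuse_detection(
    {"camera": 0.2, "lidar": 0.35, "radar": 0.85, "thermal": 0.8},
    condition="fog",
)
print(f"fused confidence: {score:.2f}")
```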
Read the full project profile for the co-labelled Penta and Euripides² project here.