Invisible-to-Visible visualizes real and virtual world information through augmented reality to create the ultimate connected-car experience for drivers and passengers
Invisible-to-Visible (I2V) uses a 3D, augmented reality interface that merges the real world and virtual world to make information visible which the driver would not otherwise see. By connecting to the virtual world (called the Metaverse), Invisible-to-Visible creates limitless possibilities for services and communications that will make driving more convenient, comfortable, and exciting.
Nissan is continuing I2V research and development, with the technology initially forecast to fully emerge sometime beyond 2025.
I2V technology offers two unique advantages:
Information collected by sensors inside and outside the vehicle is combined with data from the Omni-Sensing Cloud to provide the driver with enhanced information about the surrounding area, including predictive information ahead of the vehicle. This includes potential hazards hidden behind buildings and obstacles beyond a blind corner, all displayed seamlessly to the driver to support a confident driving experience.
By connecting the driver and passengers to the virtual world (the Metaverse), distant family, friends, and others can appear inside the vehicle as 3D AR avatars. Avatars can join as driving companions or support the driver with advice and suggestions.
* Metaverse: A virtual world constructed on the Internet where people can interact freely in a variety of ways and forms. By utilizing technologies such as AR (augmented reality), VR (virtual reality), MR (mixed reality), and XR (cross reality), the Metaverse is able to connect the digital world and the real world.
By connecting to the Metaverse, I2V can connect drivers and passengers with people all across the world.
Invisible-to-Visible visually communicates rich information collected by the vehicle and Omni-Sensing Cloud, providing endless possibilities for service and communication in real-time.
Information collected by Nissan's Omni-Sensing Cloud technology, real-time traffic analysis from Nissan's Seamless Autonomous Mobility (SAM) technology, logged driving-environment data, and other information is projected into the driver's field of view, allowing the driver to see "invisible" information.
• Typical driving scenario: The system displays information such as road environment conditions and potential hidden obstacles or pedestrians.
• Traffic congestion scenario: The system displays upcoming traffic situations, including their cause, so the driver can choose the optimal lane or route to take.
• Parking scenario: The system displays where and when the next parking space will become available.
• Mountain driving scenario: The system projects an image of oncoming vehicles or an obstruction that is hidden beyond a curve.
• Autonomous driving scenario: The system can project a scene of clear skies on the vehicle windows during poor weather conditions to improve the driving experience.
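The distillation step behind these scenarios can be illustrated with a short sketch. This is purely an assumption about how such a filter might look; the names (`Hazard`, `select_overlays`, the 300 m horizon) are hypothetical and do not come from Nissan's actual I2V system.

```python
from dataclasses import dataclass

# Hypothetical sketch only: Hazard, select_overlays, and the horizon
# threshold are illustrative assumptions, not Nissan's I2V API.

@dataclass
class Hazard:
    kind: str          # e.g. "pedestrian", "congestion", "parking"
    distance_m: float  # distance ahead of the vehicle
    occluded: bool     # hidden behind a building or blind corner

def select_overlays(hazards, horizon_m=300.0):
    """Distill raw cloud/sensor reports down to what the driver needs:
    keep only hazards within the look-ahead horizon, ranking occluded
    ones first, since those are the 'invisible' items I2V surfaces."""
    relevant = [h for h in hazards if h.distance_m <= horizon_m]
    return sorted(relevant, key=lambda h: (not h.occluded, h.distance_m))

hazards = [
    Hazard("pedestrian", 120.0, occluded=True),   # behind a building
    Hazard("congestion", 250.0, occluded=False),
    Hazard("cyclist", 400.0, occluded=True),      # beyond the horizon
]
overlays = select_overlays(hazards)
# The distant cyclist is filtered out; the occluded pedestrian ranks first.
```

The point of the sketch is the ordering of concerns: the system's value lies less in collecting data than in deciding which small subset is worth projecting into the driver's field of view.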
People joining through the Metaverse appear as 3D avatars in the real world through MR (Mixed Reality), letting people in both worlds share the experience of space and movement in real-time.
• Companion scenario: A family member or friend from anywhere in the world can join the vehicle as an avatar via the Metaverse to accompany the occupants on the journey. The avatar occupies a seating position to give the sense that they are actually in the vehicle.
• Tourism scenario: If traveling to a destination, a local guide can be requested from the Metaverse to join the drive as an avatar to answer questions and give recommendations on places of interest in real-time.
• Service scenario: If driving instruction or guidance is needed, a professional driver from the Metaverse could join the drive as an avatar and assist in real-time.
• Service scenario: While on a long drive, services such as business consultations, private advice, or self-enriching activities such as language lessons can be offered by real people connecting inside the vehicle as 3D avatars through the Metaverse.
I2V utilizes several systems to provide rich information and data from inside and outside the vehicle, including area infrastructure.
Nissan Omni-Sensing technology: A virtual hub that gathers real-time data from traffic environments and the vehicle's exterior and interior surroundings. Exterior information can include: road status, visibility, signage, and nearby cars and pedestrians. Interior information can include: The driver's level of alertness, facial expressions such as confusion, and body tracking for virtual avatar or Virtual Personal Assistant (VPA) interaction.
Seamless Autonomous Mobility (SAM): Analyzes the road environment through relevant real-time information.
ProPILOT: Identifies the conditions immediately around the vehicle, and interior sensors identify the interior environment.
Digital twin: A digital representation of the physical world including vehicles, buildings and infrastructure is created in real-time through massive amounts of data collected through Omni-Sensing technology. Linked to virtual spaces and the real world, the Digital Twin can be utilized to share information and support augmented and mixed reality interfaces in the real world.
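A digital twin of this kind is, at its core, a continuously refreshed state store keyed by real-world object. The toy sketch below shows one minimal way such a store could behave; every name in it (`DigitalTwin`, `ingest`, `snapshot`) is a hypothetical assumption for illustration, not part of Nissan's technology.

```python
# Illustrative sketch only: a toy digital twin keyed by object id,
# refreshed from incoming observations. All names are assumptions.

class DigitalTwin:
    def __init__(self):
        self.objects = {}  # object id -> {"state": ..., "t": timestamp}

    def ingest(self, obj_id, state, timestamp):
        """Merge a real-world observation (vehicle, building, pedestrian)
        into the twin, keeping only the newest state per object."""
        current = self.objects.get(obj_id)
        if current is None or timestamp >= current["t"]:
            self.objects[obj_id] = {"state": state, "t": timestamp}

    def snapshot(self):
        """A consistent view of the modeled world, ready to be shared
        with virtual spaces or rendered as an AR/MR layer."""
        return {k: v["state"] for k, v in self.objects.items()}

twin = DigitalTwin()
twin.ingest("car-42", {"pos": (10, 0)}, timestamp=1.0)
twin.ingest("car-42", {"pos": (12, 0)}, timestamp=2.0)  # newer wins
twin.ingest("car-42", {"pos": (8, 0)}, timestamp=1.5)   # stale, ignored
```

The last-write-wins-by-timestamp rule matters because sensor reports from many sources arrive out of order; the twin must converge on the most recent real-world state rather than the most recently delivered message.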
I2V augmented reality is currently experienced through the use of AR goggles that the driver and passengers wear, and virtual reality goggles that the user in the Metaverse wears.
- While driving, a range of information from the Metaverse and from infrastructure data is collected by the Omni-Sensing Cloud. The information is distilled down to what is needed and projected into the driver's field of view as 3D AR information. When useful, near-future predictions obtained from simulation results or past information obtained from the Omni-Sensing Cloud can also be projected for the driver to see.
- By connecting to the Metaverse, the driver and passengers can connect with people who are active in the virtual world. The Metaverse user is projected as a "bit-out" presence into the real world (inside the vehicle), while at the same time the real-world users (those inside the vehicle) are projected as an "atom-in" presence on the Metaverse side, allowing users in both worlds to share the experience of space and movement together.
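This two-way presence exchange can be pictured as a bridge that mirrors pose updates between the two worlds. The sketch below is an assumption about the general shape of such a bridge; `PresenceBridge` and its methods are invented names for illustration only.

```python
# Hypothetical sketch of the two-way presence exchange: a pose update from
# either side is mirrored into the other world, so vehicle occupants and
# Metaverse users share the same motion in real time. Names are assumptions.

class PresenceBridge:
    def __init__(self):
        self.real_world = {}  # Metaverse users rendered as in-car avatars
        self.metaverse = {}   # in-car occupants rendered in virtual space

    def update_from_metaverse(self, user, pose):
        """A Metaverse user's pose appears in the vehicle as an avatar."""
        self.real_world[user] = pose

    def update_from_vehicle(self, occupant, pose):
        """An occupant's pose appears on the Metaverse side."""
        self.metaverse[occupant] = pose

bridge = PresenceBridge()
bridge.update_from_metaverse("guide", {"seat": "rear-left"})
bridge.update_from_vehicle("driver", {"seat": "front-left"})
```

The symmetry is the essential design point: neither world is the master copy, so each side only publishes its own users' poses and renders the other side's.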