
Published on January 3rd, 2018 | by Emergent Enterprise

Nissan Tech Allows a Car to Read Your Mind to Boost Reaction Times

E-E says: When we discuss IoT, we are usually talking about connectivity between devices and machines. What happens when one of the “machines” is the human brain? Research and innovation are underway in healthcare, industry and government to harness the power and guidance of the brain so we can direct machines to do things just by thinking the commands. In most autonomous vehicles the machine does all the thinking and the passenger is just along for the ride. Nissan may have another approach: tapping into the driver’s grey matter to anticipate reactions and respond to needs. Do we want our thoughts to become data that can be accessed and widely known? Share your thoughts below (if you dare).

Source: Darrell Etherington, techcrunch.com, January 3, 2018

Nissan’s latest research project is ‘brain-to-vehicle’ (aka ‘B2V’) tech that could have your next car anticipating your driving reactions before you can even translate them into a turn of the wheel or a press of the brake. The neural interface, which can not only improve reaction times but also manage cabin comforts based on signals it reads from your brain, is one of the things Nissan will be showing off at CES this year.

The automaker shared a look at its B2V tech ahead of the show, demonstrating how it improves reaction times by around 0.2 to 0.5 seconds. That may seem like a small window, but it could make a big difference on the road, where split-second decision-making can mean the difference between an accident and narrowly avoiding one.

By anticipating actions like braking, accelerating or turning, Nissan could develop more advanced driver assistance system (ADAS) features, or it could more safely bridge the gap between semi-autonomous and fully autonomous vehicles. It could also help with non-driving functions; Nissan imagines being able to detect discomfort from a driver, which could lead to changing the way the vehicle drives in order to fit the driver’s expectations, and potentially using augmented reality to change what the driver sees to make the driving environment more amenable to safe conduct on the road.

Nissan will show off aspects of the tech using a driving simulator at CES, so attendees will get a chance to see what this could look like in practice. It sounds like the premise for a ‘Black Mirror’ episode, but it could be something that improves ADAS now and paves the way for much smarter and more capable fully autonomous driving down the road, thanks to the data it provides.


About the Author

The Emergent Enterprise (EE) website brings together current and important news in enterprise mobility and the latest in innovative technologies in the business world. The articles are hand selected by Emergent Enterprise and not the result of automated electronic aggregating. The site is designed to be a one-stop shop for anyone who has an ongoing interest in how technology is changing how the world does business and how it affects the workforce from the shop floor to the top floor. EE encourages visitor contributions and participation through comments, social media activity and ratings.
