Autonomous Cars With Eyes?
I’ll make no secret of the fact that I’m not a fan of autonomous cars. For one thing, a lot of people like the feeling of being in charge of where they’re going. For another, we’ve all had those moments when other electronic bits and pieces flop, crash and generally don’t do what they’re supposed to do. An autocorrect fail is not usually life-threatening, and an app that refuses to open won’t kill you – but we can all imagine what could go wrong with a car that (supposedly) thinks for itself. On the other hand, computers don’t get drunk or distracted, so the idea is that autonomous cars will make the roads safer overall.
Shared zones, however, are one of several things that autonomous cars have problems with. Shared zones are those parts of the road where pedestrians and cars share the same space. They’re usually found in commercial areas of town with lots of shops and eateries. You’ve probably used one at some point – I know I have. The thing with these spaces is that the question of who gives way to whom is often sorted out through a complex series of gestures and eye contact between drivers and pedestrians. For example, if I’m driving through one of these shared zones and I spot a person on the side of the road who looks like they want to cross my path, I can make eye contact with him or her, then tell him or her to go first with a wave of my hand or a jerk of my head – and the pedestrian may do the same, or accept the offer with a nod, a smile, a thumbs-up… or just by stepping out.
The problem is that autonomous cars just aren’t equipped for this. Part of the problem is that they can’t cope with body language and all the subtle nuances that humans read without thinking – we’re good at this sort of thing. However, another part of the problem, according to some Japanese researchers, is that pedestrians don’t know whether the car is “looking” in their direction or is about to move a certain way. Indicators and brake lights help, but they can only convey big-picture information: left, right and stop. Human drivers give off subtle cues that they’re about to do something – inching forwards, adjusting their position on the road before making a move – which another human can pick up on. Autonomous cars, like the Nike ad, just do it.
What if cars could somehow make eye contact with pedestrians, telegraph what they’re about to do, and let pedestrians know that the car has “seen” them? Well, those same Japanese researchers have tried exactly that, deciding that the solution is to give autonomous cars big googly eyes. It’s called the Gazing Car concept. The idea is that the big eyes will “look” at the part of the road that the sensors are focused on, so pedestrians will know whether the car has registered their presence or is about to move in that particular direction.
You can see the promo video for the Gazing Car here.
If you watched the video and saw the graph showing the reduction in unsafe crossings, please remember that the trial involved nine guys who crossed the road a combined total of 60 times, so the results aren’t conclusive and more research will need to be done.
Is this technology likely to be taken up? Given the track record of other whimsical pedestrian safety features (e.g., Tesla’s proposal to have bleating goat noises or farts as the low-speed audio warning sound on its EVs), I’d say it may not catch on. But what do you all think? Are these eyes useful, creepy, cute or just plain silly? And am I the only one who thinks that a car with eyes like these ought to talk as well?