Dave's Notebook
Writing Practice by David Rickmann.
Informal Communications Between Cars and Humans
There are two things that humans do very well, often without consciously thinking about it: tool use and communication.
Subconscious communication is a topic of special interest to me, because there are a lot of things that human brains do automatically which mine mostly doesn't. I have a condition called prosopagnosia, an inability to automatically recognise faces. To be precise, I have a variant of this called prosopamnesia. In brief, there is a specialised bit of brain which handles face recognition and storage. Folks with prosopagnosia don't have this. Folks with prosopamnesia are missing the encoding part. I did a test where I was shown a face and then had to pick it out of a lineup. I was very good at this test. I was then shown a face, then a fuzzy screen for a second, and then the lineup. I was very bad at this test. Funnily enough, I am actually far, far better than most people at this task if the faces are upside down. The upshot of this is that I had to learn to mimic some of the social signals that humans produce subconsciously. Did you know that when you meet someone you recognise, and you want to talk to them, you do a tiny twitch of your eyebrows? Well, you do. If you meet someone who you think should recognise you and they don't twitch their eyebrows, you get annoyed. No part of your conscious brain manages this; most people are unaware of this communication.
What does this have to do with autonomous vehicles? Well, it has to do with negotiations. How does a car tell a pedestrian (or another car) what it wants to do? Now, there's a reason I said car there and not driver, and it's to do with the other thing that humans are great at doing subconsciously. There was a French philosopher called Merleau-Ponty. I'm not good at philosophy, partially because Lisa McNulty does it for me, so please excuse this very, very bad paraphrase of Merleau-Ponty, who said, more or less, that if you hold a stick then the stick is part of you. Humans perceive the world through their objects. I usually relate an anecdote here about my friend E. He was holding a walnut with some kitchen tongs so that I could smash the walnut with a hammer. As I brought the hammer down he leapt back and yelled, "Ahhh! My tongs!" His brain had included the tongs in the model of his body, and he reacted to protect them. Drivers of cars are much the same. When you drive a car, your brain, to some extent, includes the car as "you".
This means that cars, as entities, make and recognise subconscious signals, both between each other and with pedestrians. At a simple level this could be a driver inclining their head by a degree to indicate that it's safe for a pedestrian to cross, a quick flash of lights at a junction to tell another car that they should go first, or a slow, pausing roll forward to say, "yes, I've seen you and I'll move when you're clear". Humans look at these signals, speed, car position, how the car is accelerating or braking, and build a composite picture of the driver's mood, intention, and so on: all tools to help predict how and where they'll go. But most people would find it quite difficult to put into words specifically how or why they had come to a certain conclusion.
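To make that concrete, here is a minimal sketch, in Python, of what "reading" one of these signals might look like if you tried to write the rules down explicitly. The observation fields, thresholds, and labels are all my own invented placeholders, not anything from real autonomous-vehicle software; the point is how thin the explicit version feels next to the composite judgement a pedestrian makes without thinking.

```python
from dataclasses import dataclass


@dataclass
class VehicleObservation:
    """Kinematic cues a pedestrian might read from an approaching car."""
    speed_mps: float               # current speed, metres per second
    accel_mps2: float              # positive = speeding up, negative = braking
    distance_to_crossing_m: float  # how far the car is from the crossing


def infer_yield_intent(obs: VehicleObservation) -> str:
    """A toy, hand-tuned guess at whether the car is 'yielding'.

    This collapses the composite picture built from speed, position,
    and braking into a few crude, made-up thresholds.
    """
    # A car that has all but stopped short of the crossing reads as yielding.
    if obs.speed_mps < 0.5 and obs.distance_to_crossing_m > 1.0:
        return "yielding"
    # Braking firmly while still some way off reads as "I've seen you".
    if obs.accel_mps2 < -1.5 and obs.distance_to_crossing_m > 5.0:
        return "probably yielding"
    # Holding or gaining speed close to the crossing reads as "not stopping".
    if obs.accel_mps2 >= 0.0 and obs.distance_to_crossing_m < 15.0:
        return "not yielding"
    return "unclear"


if __name__ == "__main__":
    # A car rolling slowly to a pause a few metres short of the crossing.
    print(infer_yield_intent(VehicleObservation(0.3, -0.2, 4.0)))   # yielding
    # A car still doing about 30 mph and not braking.
    print(infer_yield_intent(VehicleObservation(13.4, 0.1, 12.0)))  # not yielding
```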
The challenge then, if we are to integrate autonomous vehicles onto neighbourhood streets, interacting with human-driven cars and pedestrians, is this: how do we teach an autonomous vehicle to recognise, or mimic, these subconscious communication behaviours, especially if (like the eyebrow flash) we don't even know we're doing them?