As a technologist, I’ve always been interested in human interfaces. Trained as an engineer, I’ve struggled with human-to-human interface issues. As a husband and father, I’ve struggled with the interface between me and my wife and children. Let’s be fair: they’ve struggled with interfacing with me as well. But that has very little to do with trucking. Instead, let’s talk about interfaces between humans and machines.
Early in my career I worked in medical equipment. We worked to find the best ways to provide warnings to doctors and nurses and to quickly give them the information they needed to react to a medical emergency. We had lights and sounds and printers spitting out EKG data while software tried to automatically interpret the EKG for the doctor and exclaim “VENTRICULAR BRADYCARDIA!” I then went on to work with industrial automation equipment. Now the task was to tell a worker that a drill inside a hole in an engine block had just broken, and that the machine had better be stopped before an entire engine block was wasted. Again, we had lights and sounds to work with to inform the operator. But the operator could not possibly react fast enough to prevent damage to the block. So we had one more thing at our disposal: we could automatically control the CNC machine and stop the drill from turning almost immediately. Saving an engine block, or an even more valuable titanium part in the aerospace business, was important. So the human-machine interface was reduced to informing the human after the fact, not before.
Next, I spent some time working on the human-machine interface for nuclear power reactors on submarines. If things get too hot or out of control in a nuclear reactor thousands of feet under the sea, you really do want somebody to take notice and do something about it. In 1968, some 12 to 15 years before my time on the project, a Soviet submarine had sunk due to a control rod failure in its reactor. So this was a real possibility. Protocol seems to have required that a human be involved in the decision-making process. Of course, that was three decades ago now, and we can do so much more with logic in computers than we could then.
Then I moved on to the human-machine interface in cars and trucks. I worked on automated mechanical transmissions, anti-lock brakes, instrument clusters, electronically controlled engines, fleet management systems, air conditioning, and radar cruise control. Yikes! The human-machine interface in a truck has become unbelievably complex. And we still rely on the driver to be in the decision loop.
Now along come various advanced driver assistance systems (ADAS), semi-autonomous vehicles, autonomous vehicles, and even drones in the sky. The human-machine interface now extends beyond the driver: the machine also has to inform those around it. It began a few years ago when Japan wanted electric vehicles to make noise so that blind pedestrians would know a vehicle was in the area. What is a pedestrian supposed to do when he or she looks both ways and sees a car with a driver reading a newspaper instead of watching the road? What happened to the research that says you look the driver in the eye to make sure they understand you mean business: don’t move that vehicle while I’m crossing the street? Well, there is an answer.
A hundred years ago, traffic lights were just getting started. Green did not always mean go, and red did not always mean stop. Green was not always on the bottom. It took time to agree on the standard for the traffic light and all the other signs we deal with as drivers. Now we have a first attempt, a Master’s degree project, at a standardized set of lights on the outside of a vehicle to inform the pedestrian. The lights tell you whether the vehicle is about to stop, stopped and resting, or about to start moving. Check it out here: AVIP. Nothing is said about sounds, though. Will it have to have some sound, like a group of humans attacking and yelling, “Whoop, whoop!”?
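To make the idea concrete, here is a rough sketch of the concept (the state names and light patterns below are my own illustration, not the actual AVIP specification): the external lights amount to a small state machine that maps the vehicle’s intent to a pattern a pedestrian can read at a glance.

```python
from enum import Enum, auto


class VehicleIntent(Enum):
    """Hypothetical intent states a pedestrian-facing light might signal."""
    ABOUT_TO_STOP = auto()
    STOPPED_AND_RESTING = auto()
    ABOUT_TO_MOVE = auto()


# Hypothetical mapping of vehicle intent to an external light pattern.
# The real AVIP project defines its own patterns; these are placeholders.
LIGHT_PATTERN = {
    VehicleIntent.ABOUT_TO_STOP: "slow pulsing",
    VehicleIntent.STOPPED_AND_RESTING: "steady on",
    VehicleIntent.ABOUT_TO_MOVE: "rapid flashing",
}


def signal_for(intent: VehicleIntent) -> str:
    """Return the light pattern a pedestrian would see for a given intent."""
    return LIGHT_PATTERN[intent]


if __name__ == "__main__":
    # A pedestrian about to step off the curb would see "rapid flashing".
    print(signal_for(VehicleIntent.ABOUT_TO_MOVE))
```

The point of such a scheme is the same one traffic lights eventually settled: a small, fixed vocabulary of signals that every pedestrian can learn once and trust everywhere.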