It’s May 2020, and police have issued an “amber alert” because it appears a young girl has been kidnapped. They have photos of the girl and the suspected kidnapper, along with a description and partial plate of the vehicle. Amber alerts ask all drivers to be on the lookout for the vehicle and its occupants, and sometimes they succeed. This time the trail is cold.
In this hypothetical manhunt, though, police ask Tesla for a special favor. Tesla has a system it uses to train the machine learning systems in its cars: when it wants more training data on a particular type of road feature or obstacle, it transmits software that looks for that type of thing, and the cars transmit matching images back to Tesla for training.
This time, police ask Tesla to load in images of the child and kidnapper, as well as the car and partial plate. At this point, many vehicles run Tesla’s new high-powered processor and can easily check all video frames for things that look like the desired images. Tesla owners see an alert pop up, asking their permission to have their car search for the girl. While it takes some time for the neural network to download, once active it’s very effective in California, where a Tesla drives by every minute. Minutes after the kidnapper starts to move, he passes by a Tesla and is spotted. Police confirm the photos as other Teslas track the movements of the vehicle, now with a complete plate. Police apprehend the kidnapper and the girl is saved.
Who can argue with something that just rescued a kidnapped child? Few would dare, but this on-demand search presents a chilling capability Orwell may not have dreamed of. Police will want more. With their fast processors, Teslas could easily be given lists of license plates to look for, and could find any of them in minutes once they go on the roads. Identifying human faces on the sidewalks, though harder, is also possible.
In the opening scenario, drivers opted in to participating in the search. Yet how long before they decide to simply opt in to all such searches? How long before the scope of the searches extends to anybody with a warrant out on them? How long before it extends to persons of interest? How long before police crave the ability to do this without the owner opting in at all, so long as they have a warrant?
What happens when the Teslas are in China, and the orders come to search for a dissident?
You may think that’s a crazy, Orwellian picture, but this was the pattern in telecommunications. Back in the 1990s, CALEA (the Communications Assistance for Law Enforcement Act) required makers of telecommunications equipment to build in wiretap functions to make it easier for police to execute their wiretaps. Those same functions were then used in China and other nations for wiretaps that Americans might not find so savory. At the same time, the National Security Agency tapped into most major fiber switching centers with gateways outside the country and engaged in large numbers of warrantless wiretaps. Only because of a whistleblower did we at the Electronic Frontier Foundation learn of this and act to warn others and stop it. Our attempts to block it through lawsuits were thwarted when Congress passed legislation retroactively making the wiretaps legal, mooting the suit. Later, Edward Snowden blew the whistle on a far grander wiretap capability.
This stuff happens. It happens in repressive countries and it happens in liberal democracies. And unless careful steps are taken, it will happen with cars.
The earliest robocars used LIDAR and simple cameras, without much computer vision processing power. LIDAR is low resolution and in most cases can’t really identify people, car models, or even license plates. They had forward-facing cameras, but not enough processing power to be hunting for people while they drove. That has changed. The issue is not simply somebody hacking their vehicle to use its cameras for surveillance; the problem is that all the tools are already present in the car and remotely available to its maker.
It’s time to establish precedents that the fleets of advanced cars on the road do not become a giant surveillance apparatus. That it should be illegal for police to request that car fleets perform surveillance for them. That companies operating fleets resist such requests when they come, in the courts if they have to.
And yes, that includes not building a system that would let individual car owners decide to “opt in” to being part of the surveillance machine. Or at the very least, have the opt-in question say, “By opting in, you realize you are becoming part of a vast surveillance machine which, though it might be used for a good purpose today, history tells us never remains good for long.”
It should be noted that this will be a problem with all advanced robocar fleets. I mention Tesla just because they are the first company to deploy a large fleet of cars with camera arrays and powerful neural network processors. Others will follow.
Yes, many cities already have cameras on their street corners, though not as many in the USA, and people have fought that too. But the public and private companies should not be co-opted into building a system even Orwell didn’t dream of.