When we think of Internet privacy, our thoughts often gravitate towards big tech companies like Google and Facebook that mine our online activity with a view toward feeding us ads we are likely to respond to. But is there something else going on?
The New York Times has reported that the fledgling company Clearview AI has been used by law enforcement to track down criminals by searching a massive database of photos scraped from across the Internet. That candid shot your girlfriend took and posted on Facebook, the one where you photobombed your best friend, and your LinkedIn profile picture are all part of the company’s database.
If you’re not a criminal, this may not irk you. Nevertheless, privacy advocates have challenged the company in court. To be clear, law enforcement has always used surveillance footage from CCTV to identify criminals. To do so, they have relied upon human intervention, comparing the video to mug shots, for example. Clearview AI has supercharged that capability by using artificial intelligence (AI) to scan its database of billions of photos and identify suspects. And it has succeeded in helping the cops catch the robbers. Sounds great, right? Still, you have to wonder what happens when AI begins tracking people who simply appear suspicious.
Take, for example, the webcams many of us have in our smart homes: cameras that activate and alert you when the home security system detects motion. Those systems can’t tell the difference between a burglar and a chipmunk. So another company, IC Realtime, plans to take things to the next level. Using AI, its systems aim to spot trouble before it happens: dilated pupils and clenched jaws can be precursors to violence, so police could be dispatched before the bar fight begins.
These systems use machine learning to improve their ability to make such judgments. One of the challenges, of course, is that such learning starts with inputs from human beings, and humans have prejudices that are already being mainstreamed into such systems. It seems inevitable that the AI will begin to reflect the bias of those doing the tracking. The bias of one cop toward people of color may affect hundreds of people; institutionalizing that bias in software could affect millions.
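To make that concrete, here is a deliberately oversimplified sketch, not any vendor’s actual system; the groups, thresholds, and numbers are invented purely for illustration. If the humans labeling the training data judge one group more harshly for identical behavior, a model trained on those labels will reproduce that gap, only faster and at far greater reach.

```python
import random

random.seed(0)

def human_label(group, behavior_score):
    # Hypothetical biased labeler: identical behavior, but group "B"
    # is flagged as "suspicious" at a lower threshold than group "A".
    threshold = 0.5 if group == "A" else 0.3
    return 1 if behavior_score > threshold else 0

# Behavior is distributed identically in both groups.
data = [(group, random.random()) for group in ("A", "B") for _ in range(10_000)]
labels = [human_label(group, score) for group, score in data]

# "Train" the simplest possible model: learn each group's flag rate from the labels.
flag_rate = {}
for g in ("A", "B"):
    flagged = sum(label for (group, _), label in zip(data, labels) if group == g)
    total = sum(1 for group, _ in data if group == g)
    flag_rate[g] = flagged / total

print(flag_rate)  # roughly {'A': 0.50, 'B': 0.70} -- same behavior, unequal outcomes
```

The model here is trivial, but the point carries over to any learning system: biased labels in, biased judgments out, just automated and applied to millions of faces at once.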
In my upcoming novel, such systems are used by the villains to track down the heroes when they are on the run. You’ll have to buy it to find out what happens next.
Sign up for this newsletter by clicking HERE.