
A life ruled by artificial intelligence? Sign in here


French philosopher Michel Foucault defined the relationship between power and knowledge in 1979. Describing its manifestation as a disciplinary society, he outlined how normative behaviors are defined and then enforced: deviations are punished, repeated deviations are punished more severely, and rewards reinforce good behavior. To be clear, he wasn’t talking about criminal behavior. He was describing how institutions – governmental, religious, educational – reinforce social norms. His views were the subject of heated debate in the 1970s.


We are left to wonder: why would anyone sign up to be manipulated by institutions?


Yet, we have. It’s widely accepted that a Russian cabal endeavored to manipulate the outcome of the 2016 election, using social media to influence the attitudes of voters. We weren’t coerced into following the groups set up on Facebook. We willingly signed up for them. And Facebook’s algorithms spread the word to our friends and family, attracting others to the cause, no matter how nefarious. The social pressure once present in our communities and neighborhoods has been amplified by social media. Express an inelegant thought to the wrong audience and you will be punished. Conform and you will be rewarded with “Likes.”


Can we resist social media manipulation in the future? It may not matter. Religious and educational institutions are already experimenting with robots that interact with congregants and students, respectively – and with positive results.


Gabriele Trovato, a professor at Japan’s Waseda University, has created an empathetic social robot intended to alleviate loneliness among senior citizens. He predicts that such robots might make religion more accessible. Quoted in The Wall Street Journal, he points out that religion has morphed over centuries from “oral tradition to written tradition to press to mass media. So it’s very reasonable to think that AI and robotics will help religion to spread…” If that seems far-fetched, consider that more than 10,000 people received blessings from an ATM converted to that purpose during Germany’s celebration of the 500th anniversary of the Protestant Reformation.


Meanwhile, MIT Technology Review has reported that Chinese company Squirrel AI is already experimenting with AI-enabled tutors to help students learn math. The effort has attracted investors such as Facebook’s Mark Zuckerberg and the Bill & Melinda Gates Foundation to a joint venture between the company and Carnegie Mellon University.


It’s important to note that AI-enabled robots are not programmed to perform a narrow task. They are programmed to learn what works with their human subjects and what doesn’t. They will adjust their approach to achieve their objectives, whether that’s conformance with religious teachings or math proficiency.


Does that make you nervous? Would it make you want to turn off your computer? It’s worth noting that current AI can detect when you’re getting nervous – by interpreting facial expressions or changes in tone of voice, for example – and adjust its behavior to put you at ease.


In my upcoming novel, The Awakening of Artemis, people sign up to have AI-enabled chips implanted in their necks to measure secretions of hormones and enzymes. An environment able to read the body’s biological indicators could make sure you only receive stimuli that please you – just as those Russian algorithms fed us what we wanted to hear even when we didn’t know what that was.


As we become accustomed to AI providing tutorials, religious teachings and empathy, the manipulation that characterized the Russian election interference may become the norm rather than the exception.

