The near-term AI threat isn't a direct military attack out of the Terminator movies. What grips us now is early-stage reality manipulation. Some of this computational propaganda, like #ReleaseTheMemo, serves those already in power.
In 2016 Aviv Ovadya warned that "the web and the information ecosystem that developed around it was wildly unhealthy." Core civic institutions are being upended; what he calls an "infocalypse" is unfolding.
In other words, we're collectively at risk of losing touch with reality. Let me break it into six categories.
Technologies that can be used to enhance and distort what is real are evolving faster than our ability to understand, control, or mitigate them.
“At the time, it felt like we were in a car careening out of control and it wasn’t just that everyone was saying, ‘we’ll be fine’ — it’s that they didn't even see the car,” he said.
Ovadya saw early what many — including lawmakers, journalists, and Big Tech CEOs — wouldn’t grasp until months later: Our platformed and algorithmically optimized world is vulnerable — to propaganda, to misinformation, to dark targeted advertising from foreign governments — so much so that it threatens to undermine a cornerstone of human discourse: the credibility of fact. [emphasis mine]
That's how I feel: like we're in an out-of-control car, and everybody not only thinks we're fine but doesn't even see the car. Two out-of-control variables are erupting: climate destabilization and the attack on science and truth.
Hyperrealistic photo, audio, and video fakes
Already available tools for audio and video manipulation have begun to look like a potential fake news Manhattan Project. In the murky corners of the internet, people have begun using machine learning algorithms and open-source software to easily create pornographic videos that realistically superimpose the faces... — or anyone for that matter — on the adult actors’ bodies. At institutions like Stanford, technologists have built programs that combine and mix recorded video footage with real-time face tracking to manipulate video. Similarly, computer scientists at the University of Washington successfully built a program capable of “turning audio clips into a realistic, lip-synced video of the person speaking those words.” As proof of concept, both teams manipulated broadcast video to make world leaders appear to say things they never actually said.
Adobe also recently piloted two projects — Voco and Cloak — the first a "Photoshop for audio," the second a tool that can seamlessly remove objects (and people!) from video in a matter of clicks.
Here's a sample of how video can simulate a real person saying something scripted.
Someone could already use this technology to throw an election or start a war by releasing a fake video.
There’s “diplomacy manipulation,” in which a malicious actor uses advanced technology to “create the belief that an event has occurred” to influence geopolitics. Imagine, for example, ... Donald Trump or North Korean dictator Kim Jong Un, ... near-perfect — and virtually impossible to distinguish from reality — audio or video clip of the leader declaring nuclear or biological war. “It doesn’t have to be perfect — just good enough to make the enemy think something happened, so that it provokes a knee-jerk and reckless response of retaliation.”
Another scenario, which Ovadya dubs “polity simulation,” is a dystopian combination of political botnets and astroturfing, where political movements are manipulated by fake grassroots campaigns.
This summer, more than one million fake bot accounts flooded the FCC’s open comments system to “amplify the call to repeal net neutrality protections.” Researchers concluded that automated comments — some using natural language processing to appear real — obscured legitimate comments, undermining the authenticity of the entire open comments system. Ovadya nods to the FCC example as well as the recent bot-amplified #releasethememo campaign as a blunt version of what's to come. "It can just get so much worse," he said.
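The researchers' finding — that many of the flood comments were near-copies of shared templates — can be illustrated with a short sketch. This is a naive, hypothetical approach for illustration only, not the method the FCC researchers actually used; the sample comments are invented:

```python
from difflib import SequenceMatcher

def near_duplicates(comments, threshold=0.9):
    """Group comments whose similarity to a group's first member
    exceeds the threshold. A naive O(n^2) sketch: bot-generated
    comments tend to be near-copies of a template, so they cluster
    tightly, while organic comments end up in singleton groups."""
    groups = []
    for text in comments:
        placed = False
        for group in groups:
            if SequenceMatcher(None, group[0], text).ratio() >= threshold:
                group.append(text)
                placed = True
                break
        if not placed:
            groups.append([text])
    return groups

# Invented examples: the first two differ by a single character,
# as templated bot comments often do.
comments = [
    "I strongly oppose net neutrality regulations.",
    "I strongly oppose net neutrality regulation.",
    "Please keep the open internet rules in place.",
]
groups = near_duplicates(comments)
```

Real analyses used more scalable clustering and language models, but the underlying intuition is the same: authenticity leaves statistical fingerprints, and so does automation.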
For details on the latter read How Twitter Bots and Trump Fans Made #ReleaseTheMemo Go Viral.
Automated laser phishing is inevitable
Laser phishing is jargon for precisely personalized phishing. Once software spreads that makes it easy to harvest detailed data from your social media and generate texts, voicemails, and videos that sound exactly like your friends and relatives, people will be overwhelmed. They'll turn off their email and retreat into fantasy.
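To see why this scales, note that once the profile data is harvested, the personalization itself is trivial templating. A deliberately harmless, hypothetical sketch — every name and field below is invented:

```python
# Hypothetical illustration: personalization is just slotting
# harvested profile fields into a message template, which is why
# "laser phishing" can be automated across millions of targets.
profile = {
    "name": "Alex",        # invented example data
    "friend": "Jordan",
    "city": "Portland",
    "interest": "the marathon",
}

TEMPLATE = (
    "Hey {name}, it's {friend}! Great seeing you in {city}. "
    "Here are the photos from {interest}: <link>"
)

message = TEMPLATE.format(**profile)
```

The hard part for an attacker is the data collection and the convincing voice or video, not the assembly — which is exactly what makes the coming automation worrying.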
“In the next two, three, four years we’re going to have to plan for hobbyist propagandists who can make a fortune by creating highly realistic, photo realistic simulations,” Justin Hendrix, the executive director of NYC Media Lab, told BuzzFeed News. “And should those attempts work, and people come to suspect that there's no underlying reality to media artifacts of any kind, then we're in a really difficult place. It'll only take a couple of big hoaxes to really convince the public that nothing’s real.”
Beset by a torrent of constant misinformation, people simply start to give up. Ovadya is quick to remind us that this is common in areas where information is poor and thus assumed to be incorrect. The big difference, Ovadya notes, is the arrival of that apathy in a developed society like ours. [emphasis mine]
I find this pretty scary, Ruth. Of course, I suspected it because of what I see politically these days. Most of us think we know what was said by one party or the other, but it seems to get mixed up today. People claim certain things were said by the Dems, and later you hear it back as having been said by the GOP. I'm seeing that a lot now. After this has gone on a while, many will ask: how do you know the difference?
This carries over into Trump's many lies, to where fact-checking goes out the window. New technology has you doubting what was really said, so you just pick whatever you want to identify with and claim it as true. Everybody will be doing it, politicians and advertisers alike, and it's all to milk the public.
Push this to the extreme and even history will be rewritten. A few generations down the line, nobody will know an accurate past. People will be treated like cattle, all for a profit. It's all very scary indeed.
If there's a silver lining to all this shit, it's that it makes it a bit easier not to regret the fact that we're (individually, I mean) not going to be around longer than we are.
Ruth, You are a ninja for things I don't understand or normally look into!