Starred In Any Porn Videos Lately? Are You Sure?

“Deep fakes.”

According to Great Britain’s “The Guardian” newspaper, that’s the next big threat to privacy rights, economic stability and the remaining vestiges of civil discourse.

A “deep fake” is a high-tech forgery created with a machine learning technique called a “generative adversarial network” (or GAN): two pieces of software train against each other, one generating fake footage and the other trying to spot it, until the fakes become convincing. The result is a realistic computer-generated replication of a person saying or doing whatever the “puppet master” software user wants them to say or do. Think “Photoshop on steroids.”
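For readers curious about what that adversarial tug-of-war actually looks like, here is a minimal toy sketch, assuming the PyTorch library. The tiny networks, the made-up target distribution and every number in it are illustrative assumptions only, nowhere near an actual face-swapping system, but the generator-versus-discriminator loop is the same basic idea.

```python
# Toy GAN sketch (hypothetical example): a generator learns to mimic samples
# from a simple 1-D Gaussian while a discriminator tries to tell real samples
# from generated ones. Assumes PyTorch is installed.
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.5 + 4.0   # "real" data: samples near 4.0
    noise = torch.randn(64, 8)
    fake = generator(noise)

    # Discriminator step: label real samples 1, generated samples 0.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    d_opt.step()

    # Generator step: try to make the discriminator call its output "real".
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    g_opt.step()

# After training, generated samples should cluster near 4.0.
print(generator(torch.randn(5, 8)).detach().flatten())
```

Swap the 1-D numbers for video frames and the stakes change considerably.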

I have dreamed of such technology being used to give the world an inexhaustible supply of new performances by long-departed stars such as John Wayne or Lucille Ball, but I now fear that nefarious abuses would outweigh the good.

When I was growing up, we never imagined such opportunities for mischief. We might put “devil horns” behind someone’s head during a class picture, but nowadays a doctored video could con viewers into gasping, “Look! They’ve killed the opposing team’s baby seal mascot and are sacrificing it to the Lord of Darkness himself.”

Think of the societal impact if the face of a squeaky-clean actress was superimposed on the writhing body of an XXX-rated starlet, if a virtual Bill Gates announced that all Microsoft products would spontaneously combust in 24 hours, if the lily-white police chief of a powder-keg city seemingly started singing “De Camptown Races” at a press conference, if a special-effects Vice President Mike Pence failed to don gloves and provide a chastity belt for a little old lady before helping her cross the street…

Deceptive editing has already provided Facebook and YouTube with a plethora of misleading videos. When marginally savvy troublemakers can up the ante and use artificial intelligence to manipulate the words and gestures of politicians, businessmen and religious leaders, we’ll be more polarized than ever, since most people follow the mantra “Seeing is believing.”

(“Seeing is believing – unless I’m seeing a socialist country like Venezuela crash and burn. Then I say they just need to double down and increase the taxes on that rich guy wrestling a stray dog for scraps from the garbage can.”)

On the other hand, the long-term danger is that people will get burned one time too many and simply start disbelieving everything they see. (“That alarmist consumer reporter story about sewer rats here at my favorite restaurant was obviously just a CGI prank and… Ouch! Make it stop biting! Why don’t they warn us about stuff like this?”)

The algorithms that power “deep fakes” are growing more and more sophisticated. Actually, the software fine-tuning is just overkill in the case of people who already see what they want to see. (“Yeah, those stuck-up private-school girls are obviously guilty of shooting at JFK from the grassy knoll. I saw it on my Etch A Sketch. With corroborating evidence from my Wooly Willy and Spirograph, I might add.”)

There is currently a frantic arms race to find ways of identifying and debunking fake videos before evildoers concoct ways of making them even more realistic. Maybe truth and justice will triumph, but only if the forensics experts aren’t deceived themselves.

“I know I’m supposed to pull a double shift scrutinizing campaign ads, but an anonymous tipster just sent me a video of my wife having an affair with James Dean and Clark Gable. I gotta go home and see if this marriage can be salvaged…”

Copyright 2019 Danny Tyree. Danny welcomes email responses at [email protected] and visits to his Facebook fan page “Tyree’s Tyrades.” Danny’s weekly column is distributed exclusively by Cagle Cartoons Inc. newspaper syndicate.