The term '''Deepfake''' is a combination of "deep learning" and "fake"; it refers to movies, audio, or any other production generated by [[Artificial intelligence]].<ref>[https://www.politico.eu/article/spa-donald-trump-belgium-paris-climate-agreement-belgian-socialist-party-circulates-deep-fake-trump-video/ Belgian socialist party circulates ‘deep fake’ Donald Trump video]</ref>

Deepfake technology goes beyond impersonation: it creates a computer-generated image, or even a computer-generated video, of a subject doing or saying things that are often wildly out of character.
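
One widely described early face-swap approach pairs a single shared encoder with one decoder per person: each decoder learns to rebuild its own person's face from the same compressed representation, so a face encoded from person A can be decoded as person B wearing A's pose and expression. The sketch below is a minimal illustration of that idea only; the choice of PyTorch, the 64-pixel face crops, the layer sizes, and all names are assumptions made for illustration, not any particular deepfake product.

<pre>
# Minimal illustrative sketch of the shared-encoder / per-identity-decoder
# autoencoder idea behind early face-swap deepfakes.  All sizes, names, and
# the use of PyTorch are assumptions for illustration only.
import torch
import torch.nn as nn

IMG = 64  # assumed 64x64 RGB face crops

class Encoder(nn.Module):
    """Shared encoder: compresses any face into a small latent code."""
    def __init__(self, latent=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs one specific person's face."""
    def __init__(self, latent=256):
        super().__init__()
        self.fc = nn.Linear(latent, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()   # would be trained only on faces of person A
decoder_b = Decoder()   # would be trained only on faces of person B

# Training (not shown): each decoder learns to reconstruct its own person's
# faces from the shared latent space, e.g.
#   loss = mse(decoder_a(encoder(face_a)), face_a)

# The "swap": encode a frame of person A, then decode it with B's decoder,
# yielding B's face in A's pose and expression.
frame_of_a = torch.rand(1, 3, IMG, IMG)   # placeholder for a real video frame
fake_b = decoder_b(encoder(frame_of_a))   # tensor of shape (1, 3, 64, 64)
</pre>

Because the decoder never sees the source identity directly, the output keeps person A's motion while taking on person B's appearance, which is what makes the resulting footage convincing.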
== In popular culture ==
The earliest cultural reference to anything approaching a deepfake was the 1962 [[motion picture]] ''[[The Manchurian Candidate]]''. In it, [[China|Chinese]] operatives capture a [[United States Army]] patrol in [[Korea]] and condition the soldiers to say glowing things about one of their number, whom the Chinese also program as an assassin. They use a treatment, part light-induced and part drug-induced, to implant false memories in the soldiers' minds and post-hypnotic commands in the mind of their chosen assassin.

Next was the [[television]] series ''Mission: Impossible'' from the 1960s and early 1970s. The "Impossible Missions Force" was a secret cadre of con artists who performed "impossible missions," first for the [[intelligence]] community, then in aid of [[law enforcement]]. Some of the most memorable &ndash; even frightening &ndash; confidence tricks they played on their targets resembled the modern deepfake. The Force used a variety of techniques, from a directional sound "beamcast" that only the subject could hear, to surgical implants and hidden cameras and speakers, and even to kidnapping a subject, staging a medical intervention that he barely perceived, and returning him to his environment, ready to accept orders from Force members whom he would then trust implicitly.
== The deepfake as a production tool ==
At least one [[Hollywood]] motion picture studio has already used deepfake techniques to complete a motion-picture project after a leading performer died. Actress [[Carrie Fisher]], who portrayed "[[General]] Leia Organa," died in December 2016, before ''Star Wars Episode Nine: The Rise of Skywalker'' (2019) went into production. To make her appearance in the film plausible, the studio generated footage of Fisher, in costume, giving "last orders" to her executive officer.
== The deepfake as a political tool ==
Author Linda Goudsmit (''The Book of Humanitarian Hoaxes: Killing America with "Kindness"'', ''Dear America: Who's Driving the Bus?'') warned on 22 September 2020 that opponents of [[President of the United States|President]] [[Donald Trump]] might generate deepfake footage of him saying something outrageous, appearing to confirm their image of him as imitating [[Adolf Hitler]], or something equally frightening.<ref>Goudsmit L, "Ultimate Election Malfeasance: Manipulation of Reality," on Pundicity and at Conservative News and Views, retrieved 22 September 2020. <http://goudsmit.pundicity.com/24574/ultimate-election-malfeasance-the-manipulation> <https://www.conservativenewsandviews.com/2020/09/22/accountability/news-media/deepfakes-ultimate-election-malfeasance/></ref>
  
 
==References==
<references/>
 