Everything You Need to Know About Deep Fake Technology

Apr 12, 2021
4 min read

Have you seen old baby pictures singing random songs surface all over Instagram Reels, with audiences going gaga over them? Or a legend or national freedom fighter coming alive in a photograph, blinking or smiling back at you? Or influencers animating their grandparents' pictures and filming their parents' reactions? All of this reflects how advanced and robust technology has become. This kind of facial manipulation is possible because of deep fake technology, and here is what you need to know about it.

What is Deep Fake Technology?
Google defines deep fake technology as synthetic media in which a person in an existing image or video is replaced with someone else's likeness. The definition sounds simple, but the technology behind it is far more complex.

Deep fake technology can manipulate a person's facial movements using artificial intelligence: it can replace faces, synthesize faces, and synthesize speech. Used for entertainment it can be fun; used for unlawful purposes it can land people in serious trouble.

Deep fake technology first surfaced publicly in 2017, when a Reddit user posting under the name "deepfakes" used it to manipulate people's faces for pornographic purposes; the technology was subsequently named after that username.

This technology has touched many public figures, from celebrities to politicians. Deep fake incidents that have surfaced across social media include a public service announcement featuring former President Obama, the slowed-down Nancy Pelosi video, the "deep fake roundtable" compilation featuring Hollywood stars such as Tom Cruise, Donald Trump spliced into Breaking Bad, and many more.

How Does a Deep Fake Technology Work?
Deep fake images are usually created by manipulating facial expressions or swapping faces using AI. The model is trained on hundreds of images of a face, learning to identify its expressions and reconstruct their patterns. It also learns to analyze a face from different angles, so it can convincingly transpose one face onto another and make the result look real. Selfies are a common source of training material because your face and features can be easily traced from them.
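One building block of the pipeline described above, aligning the source face's landmark points onto the target face before blending, can be sketched as a least-squares affine fit. The landmark coordinates below are made up for illustration; real systems obtain them from a facial-landmark detector.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2x3 affine transform mapping src landmarks onto dst.
    src, dst: (N, 2) arrays of matching facial-landmark coordinates."""
    n = len(src)
    # Homogeneous coordinates: [x, y, 1] @ M.T -> [x', y']
    A = np.hstack([src, np.ones((n, 1))])
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T

# Three illustrative landmarks (e.g. two eyes and the mouth), with the
# target face simply shifted by (5, 5) pixels.
src = np.array([[30.0, 40.0], [70.0, 40.0], [50.0, 80.0]])
dst = src + 5.0

M = fit_affine(src, dst)
warped = np.hstack([src, np.ones((3, 1))]) @ M.T
print(warped)  # lands on dst: the source face is now aligned to the target
```

With the faces aligned this way, the generated face can be pasted over the target region; production systems add color correction and seamless blending on top.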

Another way to create deep fakes is a GAN, or Generative Adversarial Network. A GAN pits two algorithms against each other. The first, the generator, produces candidate images of the target face from random input. These generated images are fed, alongside real photographs, to the second algorithm, the discriminator, which tries to tell real from fake. Each round of this contest gives both networks feedback on their performance: the discriminator gets better at spotting fakes, which in turn forces the generator to produce more convincing ones. After enough cycles, the generator starts producing realistic faces of the target person, yielding the final fake image or video.
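The adversarial loop can be shown at toy scale. The sketch below trains a two-parameter generator to imitate a 1-D Gaussian "real" distribution against a two-parameter logistic discriminator; real deep fake GANs use deep convolutional networks, but the generator-vs-discriminator feedback cycle is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data the generator must learn to imitate: a 1-D Gaussian.
REAL_MEAN, REAL_STD = 4.0, 1.25

a, b = 1.0, 0.0   # generator g(z) = a*z + b
w, c = 0.1, 0.0   # discriminator D(x) = sigmoid(w*x + c)
lr, batch = 0.05, 64

for step in range(3000):
    z = rng.standard_normal(batch)
    x_real = rng.normal(REAL_MEAN, REAL_STD, batch)
    x_fake = a * z + b

    # Discriminator update: push D(real) -> 1 and D(fake) -> 0.
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w -= lr * np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    c -= lr * np.mean(-(1 - d_real) + d_fake)

    # Generator update: fool the freshly updated discriminator
    # (non-saturating loss -log D(fake), gradients by the chain rule).
    d_fake = sigmoid(w * (a * z + b) + c)
    a -= lr * np.mean(-(1 - d_fake) * w * z)
    b -= lr * np.mean(-(1 - d_fake) * w)

samples = a * rng.standard_normal(5000) + b
print(f"generated mean ~ {samples.mean():.2f} (real mean {REAL_MEAN})")
```

After training, the generator's output distribution has drifted from its starting mean of 0 toward the real data, exactly the dynamic that lets image-scale GANs converge on realistic faces.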

Why is Deep Fake Technology Dangerous?
It is reported that more than 180 million photos are uploaded to social media every day. The technology has been weaponized to defame celebrities: with advanced AI, their pictures can easily be transposed onto nude and pornographic images.

Because the technology can also create deep fake videos, existing footage of political leaders can be manipulated into fabricated announcements, defaming the politician and sowing chaos among the public.

It does not affect only celebrities; it can damage ordinary people's lives too. Many women are reportedly targeted by predators who threaten to leak fake intimate videos online or send them to their families.

The technology seems like fun on social media platforms like Instagram, but it can cause just as much, or more, damage to an individual's life.

How to Spot Deep Fake Images?
Spotting a deep fake video or image can be difficult, but a little extra attention can keep you from falling into such traps. Here are two ways to spot them:

  1. Poor Quality - Deep fake videos and images are generally of poor quality. Certain features may not move like authentic facial muscles: inconsistent eye blinking, patchy skin tone, and so on. Blinking is one of the most telling signs, because models trained mostly on open-eyed photos rarely learn natural blinking movements.
  2. Fine Details - Creators often overlook fine details. Paying close attention to details like hair, jewelry, and teeth can help you tell a real image from a fake one.
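The blink cue can even be made quantitative. A common heuristic in deep fake detection research is the eye aspect ratio (EAR), computed from six landmark points per eye (the ordering below assumes a dlib-style 68-point landmark model; the coordinates are made up for illustration):

```python
import math

def eye_aspect_ratio(eye):
    """EAR for one eye given six (x, y) landmarks, with p1/p4 the
    horizontal corners, p2/p3 the upper lid, p5/p6 the lower lid."""
    p1, p2, p3, p4, p5, p6 = eye
    dist = lambda u, v: math.hypot(u[0] - v[0], u[1] - v[1])
    vertical = dist(p2, p6) + dist(p3, p5)   # lid-to-lid distances
    horizontal = dist(p1, p4)                # corner-to-corner distance
    return vertical / (2.0 * horizontal)

# Toy landmarks: a wide-open eye versus a nearly closed one.
open_eye   = [(0, 0), (2, -2), (4, -2), (6, 0), (4, 2), (2, 2)]
closed_eye = [(0, 0), (2, -0.3), (4, -0.3), (6, 0), (4, 0.3), (2, 0.3)]

# A commonly used threshold is roughly 0.2: below it, the eye reads as closed.
print(eye_aspect_ratio(open_eye))    # well above 0.2
print(eye_aspect_ratio(closed_eye))  # well below 0.2
```

Tracking EAR frame by frame reveals whether a face blinks at a natural rate; a video whose subject never dips below the threshold is a candidate fake.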

How to Spot Deep Fake Videos?
Deep fake videos are harder to identify because of the AI involved. One tell is a mismatch between the person and the background: if the face is sharply in focus while the background looks out of sync, blurred, or otherwise off, the video you are watching is likely manipulated.

How to Avoid Being Victim to Deep Fake Technology Traps?
Many researchers and scientists have suggested that Artificial Intelligence can play a major role in detecting such images and videos. Tech firms such as Microsoft and Intel have introduced deep fake detection tools.

However, building such technologies takes time; until then, one of the best ways to avoid these traps is to limit how many pictures or videos you upload, especially close-ups where your features can be easily traced.

Bottom Line
Deep fake technology can be very amusing as long as it is used purely for entertainment, but you should always verify images before forwarding them to anyone. One promising direction is advanced cameras that sign images and videos with digital signatures that cannot be copied or tampered with by a hacker. As the technology advances and its practitioners grow more skilled, telling real images from fakes will only get harder.