When artificial intelligence first came into existence, someone, somewhere, said it wasn’t right. Perhaps that someone knew where things would head with time and how dangerous it could prove to be.

Well, guess what? That someone was right. Thanks to artificial intelligence, a new trend is here.

What is it, you ask?

Now you can actually paste a photo of someone you like (read: someone you have an agenda against) over another person’s face in a video of your choice using some open-source software, and create fake content of all kinds – fake speech, fake news and even fake porn.

When artificial intelligence first came into existence, there was jubilation. Humans had given machines the potential to, well, surpass their own intelligence.

In simple terms, ‘artificial intelligence’ refers to a machine imitating cognitive functions that are often associated with human minds, such as ‘learning’ and ‘problem-solving’.

“The pace of progress in artificial intelligence is incredibly fast.  It is growing at a pace close to exponential. The risk of something seriously dangerous happening is in the five-year time frame. 10 years at most.”

Elon Musk wrote in a comment on edge.org.

That something seriously dangerous has happened.

What happened now?

Earlier this week, a video appeared of former American President Barack Obama calling Trump a “total and complete dipshit.”

His words seemed largely believable until, of course, 40 seconds into the video, Jordan Peele revealed that it was him voicing Obama, with the help of some AI face-swapping tools.

Jordan Peele’s production company made this Obama video in collaboration with BuzzFeed

According to The Verge, the video was made by Peele’s production company using a combination of old and new technology: Adobe After Effects and the AI face-swapping tool FakeApp. The video is very well made and could actually pass off as real unless you are specifically looking for foul play in it.

Labelled at the end as a ‘public service announcement’, the video warns us to be more careful about what we believe.

“Moving forward, we need to be more vigilant with what we trust from the internet. It’s a time when we need to rely on trusted news sources. It may sound basic, but how we move forward in the age of information is gonna be the difference between whether we survive or whether we become some sort of fu***d up dystopia.”

A powerful TED video on the same subject is going viral online at around the same time:

We can now create videos of people saying things they never said. This is how you can spot them.

Watch the full TED Talk here:

Fake videos of real people — and how to spot them | Supasorn Suwajanakorn

What is fake face?

Fake face is primarily a technique with which the face of anyone – from celebrities and politicians to your own family members – can be copy-pasted over the face of an adult star, or even of an ordinary person talking about random stuff, using openly available software to create highly dangerous and misleading content.

Advanced machine learning is used to create fake adult content featuring real actors and celebs by pasting their faces over those of existing performers in adult films.
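To see what “copy-pasting a face” even means, here is a minimal, illustrative sketch of the naive version in Python, using OpenCV’s stock face detector. The file names are hypothetical placeholders, and this is not what FakeApp or deepfakes’ software actually does – those rely on trained neural networks, as described further down.

```python
# A crude, illustrative "copy-paste" face swap – NOT what FakeApp does.
# Assumes two hypothetical image files exist on disk.
import cv2

source = cv2.imread("source_face.jpg")    # photo of the person whose face we take
target = cv2.imread("target_frame.jpg")   # frame of the video we paste it into

# Stock Haar-cascade face detector that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(image):
    """Return the bounding box (x, y, w, h) of the first detected face."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return faces[0]

sx, sy, sw, sh = first_face(source)
tx, ty, tw, th = first_face(target)

# Resize the source face to the target face's bounding box and paste it in.
face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))
target[ty:ty + th, tx:tx + tw] = face_patch

cv2.imwrite("swapped_frame.jpg", target)
```

A crude paste like this is easy to spot; the danger in the new machine-learning tools is precisely that the swap no longer looks like a paste.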

One can easily imagine the damage this technology can do. In a country where women are subject to harassment of all kinds, this can take it to another level.

Although you’d have to be quite the rogue to bother collecting loads of pictures of your desired target and then wait for hours while the machine trains itself, don’t be too surprised if you hear such a story in the coming months.


You’d also like to read: What Is Artificial Intelligence And How Will It Affect The Future Of Jobs?


How was it developed?

Artificial intelligence has been around for quite some time now and, of course, the technology has been advancing rapidly in the past few years. In what is being called a ‘contemporary AI boom’, AI has taken huge strides and has now become a tool with a lot of potential, albeit dark potential.

In December last year, a Reddit user who goes by the name ‘deepfakes’ found himself at the center of a storm over his posts showing celebrities’ faces swapped onto adult stars in explicit movies.

It wasn’t the first time someone tried making celebrity porn. But it was the quality and smoothness of his work that made us take a step back and question the power of artificial intelligence.

“This is no longer rocket science”

Artificial intelligence researcher Alex Champandard told Motherboard.

Singer Katy Perry’s face on the body of an adult star, created using FakeApp

As described by The Verge, all someone needs is a bunch of photographs that can be fed into an algorithm, which creates a convincing mask to replace anyone’s face in a video using the lookalike data. The software trains itself to improve over time.
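The article doesn’t spell out the model, but the approach most commonly described for deepfake-style swaps is an autoencoder with one shared encoder and a separate decoder per identity: train both on photos of each person, then encode one person’s face and decode it with the other person’s decoder. Below is a minimal PyTorch sketch of that idea under those assumptions; it is not FakeApp’s actual code, and random tensors stand in for the real face crops a tool would collect.

```python
# A minimal sketch of the shared-encoder / two-decoder autoencoder idea
# commonly used for deepfake-style face swaps. NOT FakeApp's code; random
# tensors stand in for the thousands of aligned 64x64 face crops per person.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # latent code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()      # shared: learns facial structure common to both people
decoder_a = Decoder()    # learns to paint person A's face
decoder_b = Decoder()    # learns to paint person B's face

faces_a = torch.rand(8, 3, 64, 64)   # placeholder crops of person A
faces_b = torch.rand(8, 3, 64, 64)   # placeholder crops of person B

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

# "The software trains itself to improve over time": each person's faces are
# reconstructed through the shared encoder and their own decoder.
for step in range(100):
    optimizer.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    optimizer.step()

# The swap: encode person A's frame, decode it with person B's decoder,
# producing B's face with A's pose and expression.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

The more (and more varied) photos of each person the model trains on, the more convincing the learned face becomes – which is why an attacker first has to collect loads of pictures of the target and then wait hours for training, as noted earlier.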

All of this has been made oh-so-easy by a user-friendly app known as FakeApp. The app was created by another Reddit user who goes by the name ‘deepfaceapps’, building on deepfakes’ original software and improving upon it. It has made it ridiculously easy for you and me to pull off this face-swap trick, and the implications can be huge.

What next?

It’s 2018 and technological advancements are the norm. But this particular potential of AI is legitimately scary. What this tool could do to the world of online media is simply beyond imagination.

It is easy to be fooled when tools like these are placed in our hands. And don’t forget, this is just the beginning. Imagining how this could evolve with time is frightening in itself.

Scientists are developing tools to spot fakes, and we pray that they do it quickly before the trend of fake news takes over and we are doomed.


Image Credits: Google Images, SlashGear

Sources: The Verge, Motherboard (Vice), The Guardian +more


Also read:

What Is Area 51 And Why Do 1.2 Million People Want To Raid It?
