

The Rise of Deepfakes: Why we should be concerned

Excel Chukwu

2019

Out of curiosity late one night, Noelle Martin decided to do a reverse Google image search of herself and was horrified by what she saw. Rather than finding her pictures on friends' social media timelines, she found explicit photos of herself engaging in acts she had never taken part in. As Noelle began to speak up, videos of her engaging in pornographic activities emerged. She was barely 18, and not a celebrity either. Although the creation of false images and the doctoring of videos have been around for a long time, the term 'deepfakes' was coined only recently and generally refers to the creation of false digital content using artificial intelligence. The rise of deepfakes has been caused not only by the ubiquity of technology but also by the increasing weaknesses in our social institutions, which have affected the values people hold. Information technology (IT) corporations should work to develop the technology needed to combat this menace, while individuals play their part in helping to strengthen our social institutions.

Until now, most people have not been concerned about image manipulation. The technology was not widely available (it was largely left to Hollywood and the movie industry in general), and the vast majority did not have to worry much about it being used against them. Now, however, everyone should be concerned, especially with the rise of artificial intelligence and related technology. What exactly are deepfakes? The term may sound like something from a science fiction movie, but the phenomenon is real, and individuals, corporations, and governments alike would do well to pay attention to it. According to Maras and Alexandrou (2019):

Deepfake videos provide the ability to swap one person's face onto another in a video clip or an image. The technology that creates these videos is designed to continuously improve its performance. Specifically, the algorithm that creates the fake videos learns, and improves the videos by continuing to mimic the individual's facial expressions, gestures, voice, and variations, making them more and more realistic. When starting with sufficient video and audio of a person, the algorithm can not only create the fake video but can also make the person say things they have not actually said. Eventually, these videos will be indistinguishable to the naked eye from authentic videos (pp. 255-256).

Even for someone who may never have heard of deepfakes, the definition above gives a good understanding of the significance of the concept.
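
To make the face-swapping mechanism described in the quotation more concrete, the sketch below outlines the shared-encoder, two-decoder autoencoder design popularized by early open-source face-swap tools. It is a minimal illustration written for this paper, not code taken from Maras and Alexandrou (2019); the 64x64 input size, layer sizes, and variable names are all assumptions chosen for brevity.

# Minimal sketch (illustrative only) of a face-swap autoencoder:
# one shared encoder learns pose and expression, while each person
# gets a private decoder that learns to reconstruct their identity.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # shared latent face code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity
params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)

# One training step (random tensors stand in for aligned face crops of A and B):
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)
loss = F.mse_loss(decoder_a(encoder(faces_a)), faces_a) + \
       F.mse_loss(decoder_b(encoder(faces_b)), faces_b)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# The "swap": encode a face of person A, then decode it with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))

The point mirrors the quotation above: nothing about either face is hand-coded; the networks become more convincing only by repeatedly reducing their reconstruction error on example images.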

Over the past few years, advances in technology have given humans the ability to do things formerly not thought possible. One such advance is artificial intelligence, portrayed as a technology that could take humanity to the next level of evolution. Deepfake videos are created primarily using artificial intelligence and a subset of it called machine learning. Artificial intelligence is "computational models of human behaviour and thought processes that are designed to operate rationally and intelligently (i.e., simulate human behaviour)" (Maras & Alexandrou, 2019, p. 256), and machine learning is "a branch of artificial intelligence that allows computer systems to learn directly from examples, data, and experience . . . [and] carry out complex processes by learning from data, rather than following pre-programmed rules" (Maras & Alexandrou, 2019, p. 256). One can now understand how deepfakes have been made possible, and hopefully, sufficient interest has been sparked.
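
A toy example may help clarify what "learning from data, rather than following pre-programmed rules" means in practice. The short Python sketch below was written for this paper (the numbers and names are invented): the program is never told the relationship between inputs and outputs, yet it recovers it by repeatedly adjusting a single parameter to reduce its error on the examples.

# Illustrative only: a one-parameter model "learns" the slope of a line
# purely from example (input, output) pairs, with no rule coded in advance.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]   # hypothetical examples

w = 0.0              # the model's single parameter, initially uninformed
learning_rate = 0.01
for _ in range(1000):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad                     # adjust w to reduce the error

print(f"learned slope: {w:.2f}")                  # close to 2, inferred from the data alone

Deepfake systems apply the same principle at an enormously larger scale, adjusting millions of parameters against large collections of example images and audio rather than one parameter against four made-up data points.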

As has been the case when similar issues arise, many have been quick to point out that the misuse of technology, and the lack of proper regulation around it, are the cause of this phenomenon. However, the rise of deepfakes might point to a cause that has been all but buried: the decay of our social institutions. While most people do not pay much attention to these institutions, some have worked hard to systematically break down the backbones that uphold societies, such as the family, the educational system, organized religion, and the community. Perhaps in an age of progressivism, humanity has discarded the very things that could have helped uphold its values. It is no wonder, then, that individuals would see nothing wrong with creating deepfakes, even of people who have done them no harm. Every technology has the potential to be used positively or negatively, but one must ask why it is so often used more negatively than positively, as is the case with deepfakes.

To address the issue of deepfakes, we must go beyond seeking more technological advancements in the hope that, somehow, technology is the panacea for our problems. IT corporations should still play their part in solving this problem, but 'non-experts' must also play theirs by working to strengthen these social institutions, a task that is not technological or technical at all. This strengthening could be done in several ways, such as fighting to preserve the traditional family structure. Individuals and groups must press for governments to give families more control over their finances, the services they receive, and who provides those services.

They must also strive to reform the educational system and put it back in the hands of the people, rather than leaving the education of children and youth to a select few. Another way to strengthen these institutions is by clamoring for more transparency from the media and other big businesses, because they have a significant say in what individuals believe to be true or false, right or wrong. History stands as evidence, to anyone who will pay attention, that treating technology as the solution to our problems is not only risky but outright dangerous. The Pew Research Center points out that the technological advancements the vast majority have believed in so much have largely become "tools of surveillance, behavioral manipulation, radicalization, and addiction" (Anderson & Rainie, 2018).

One might argue that there is no substantial evidence linking the misuse of technology (in this case, deepfakes) to the breakdown of our social institutions and the loss of values. However, the work of Joyce Hertzler, a professor of sociology at the University of Nebraska, suggests that such evidence exists. In Social Institutions, Hertzler points out that "institutions, being such important phases of the life of groups, are most intimately bound up with the values of groups. Through the institutions play great life values; they are in turn the causes of many values and the guarantors of other values" (Hertzler, 1929, p. 140). Going further, Messner, Rosenfeld, and Karstedt (2012) point out the relationship between decaying social institutions and rising crime rates in a chapter of The Oxford Handbook of Criminological Theory: "…criminological phenomena…can be productively understood with reference to institutional regulation or legitimacy. Trends in crime…are likely to reflect to an appreciable degree the operation of the basic institutions of a society. When institutions lose legitimacy, crime rates tend to increase" (p. 12). This ties in directly with the concept of deepfakes.

Because these institutions have broken down, an increasing number of people do not concern themselves with the moral implications of what they do. Thus, they can go on to make these videos without a thought for the effect they could have on individuals and on society at large. In an interview with a Wall Street Journal correspondent, a professional maker of deepfakes said he makes them because people want them. Although he admits the ethical questions deserve thought, he reasons that people will make this content regardless, and so he goes on making it (Schellmann, 2018). Evidence such as this should convince anyone of the state of our societal institutions and of the fact that, if the problem of deepfakes is to be tackled effectively, individuals and groups must work together to dig down to the root: our social institutions.

In conclusion, this paper raises awareness of the phenomenon of deepfakes and shows why it is worth the concern. It also helps the reader understand the technologies behind the phenomenon. Most importantly, however, it provokes the reader to think beyond the surface about what has caused the rise of deepfakes and what the potential solutions might be. It identifies the cause as the breakdown of our social institutions and a possible solution as the strengthening of those institutions. The rise of deepfakes seems to be a primarily technological phenomenon, but perhaps the cause is more profound, as pointed out in this paper. It is worth the thought and, ultimately, worth the solution as well.

As the world becomes more digitized, the technology used to make deepfakes will only grow more advanced, perhaps even faster than the technology that could counter it. Humanity is at a point where it must decide whether it will surrender to technology as the messiah that will solve its problems, or whether it will own its responsibility and act. It suffices to say that the 'artificial' in artificial intelligence is real; hence technology, no matter how advanced, will never solve the problem the way humans can. Everyone should be concerned about deepfakes because more than the reputations of individuals are at stake. The moral values of our society are too.



References

Anderson, J., & Rainie, L. (2018, April 17). The future of well-being in a tech-saturated world. Retrieved from https://www.pewresearch.org/internet/2018/04/17/concerns-about-the-future-of-peoples-well-being/

Hertzler, J. O. (1929). Social institutions. Retrieved from https://ia801604.us.archive.org/27/items/in.ernet.dli.2015.187187/2015.187187.Social-Institutions.pdf#page=148&zoom=auto,-36,516

Maras, M.-H., & Alexandrou, A. (2019). Determining the authenticity of video evidence in the age of artificial intelligence and in the wake of deepfake videos. The International Journal of Evidence & Proof, 23(3), 255-256. DOI: 10.1177/1365712718807226

Messner, S. F., Rosenfeld, R., & Karstedt, S. (2012). Social institutions and crime. In F. T. Cullen & P. Wilcox (Eds.), The Oxford Handbook of Criminological Theory. Retrieved from https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780199747238.001.0001/oxfordhb-9780199747238-e-21

Schellmann, H. (2018, October 15). Deepfakes are getting real and that's a problem [Video file]. Retrieved from https://www.wsj.com/articles/deepfake-videos-are-ruining-lives-is-democracy-next-1539595787
