Just last week, a relative of mine posted an image on Facebook that grabbed my attention. I teach juniors and seniors in a Media and Design class, and our first project was based on image editing. At first glance, the photo just didn’t look right.
The next day, I showed the image to my students, explaining that I had seen it on Facebook. Immediately, one of my students said, “It looks like someone put the front of someone on the back of someone!” They all squinted, studying the proportions of the body. Having spent time manipulating images earlier in the year, they had developed a trained eye for how easily images can be altered.
We then had a conversation about how images like this one, when politicized, feed into our beliefs, and how, when we feel strongly about something, especially if we agree with it, we tend to look past details like bad Photoshop.
On my own, I dug more into The Commander in Tweet account on Facebook (this is hard to do at school, where Facebook is blocked). A brief perusal of the page shows a number of altered images with a political bias; in fact, most of the images posted there are altered. Looking at the shares and likes, the one image I saw had been shared over 2,000 times. It’s not clear how many of the people who shared it simply found it funny and how many believed it was real (I’m thoroughly convinced it is not). Images like these can easily be used to polarize communities and influence the political climate; Facebook accounts like this one were used by Russia in its attempts to influence the 2016 election.
After showing my students this image and discussing it, I asked how many of them had heard of “deepfake” videos. Only one or two raised their hands, and they weren’t totally clear on what the term meant. I then showed them the University of Washington’s 2017 deepfake video of President Obama, a completely fabricated video created from clips of Obama recorded at different points in his life. I explained how the researchers fed hours of those clips to artificial intelligence software that learned his facial expressions and head positions while speaking, allowing it to emulate what his face would look like saying the words in any given audio track. I could see they were in awe of the video’s accuracy. I pointed out that it was made in 2017 and that the current technology is easier to use and more accessible to everyday people. They began thinking through the implications, and we discussed how videos like this could easily be used to start a war or provoke political or social unrest. In March 2019, a researcher at the University of Washington called this new realm of misinformation “Information Wars.” We discussed in class how this has become a new area of work for the Pentagon: our government is developing technologies that can detect when videos have been fabricated or altered.
Our young people will be on the front lines of these Information Wars, and many of them will soon be voting for the first time. They are also entrenched in the social media where much of this misleading content is shared. Not to sound alarmist, but if we are not having these conversations with our students and showing them these images and misinformation or disinformation campaigns, we are endangering society as we know it and putting our democracy at risk. This whole conversation took 10 to 15 minutes, and it had a huge impact. If we take 10 to 15 minutes even a few times a week, we help our students learn to be skeptical and teach them skills that will make them informed citizens, potentially keeping these kinds of media from distorting their perspective and polarizing them from their peers.
If you want to learn more about media literacy in the classroom, check out my new book, Digital and Media Literacy in the Classroom: Practical Classroom Applications.