https://www.thispersondoesnotexist.com/



If you're unaware of this site, it's a website where, every time it's refreshed, it loads a completely A.I.-generated photo. These photos look genuine, and it's scary to think the person you are looking at isn't real. Here's a link if you want to know a little more about the software involved with such a website: https://www.lyrn.ai/2018/12/26/a-style-based-generator-architecture-for-generative-adversarial-networks/
This is simply my personal opinion, but the technology for creating fake images, videos, and even voice recordings is already here. How can we distinguish between what's real and what isn't? The media or the government could easily use this kind of software to create a literal enemy who doesn't even exist.
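If you're curious how the "adversarial" part works, here is a minimal toy sketch of the idea underneath sites like that one: two networks trained against each other, one generating fakes and one trying to spot them. This is not StyleGAN or the code behind that site; it's an illustrative PyTorch example that learns to mimic a simple 1-D bell curve instead of faces, and the network sizes and settings are my own assumptions.

# Toy generator-vs-discriminator (GAN) sketch, not the actual site's code.
import torch
import torch.nn as nn

real_dist = lambda n: torch.randn(n, 1) * 1.5 + 4.0  # "real" data: a Gaussian around 4.0

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))                 # generator: noise -> fake sample
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())   # discriminator: sample -> "real?" score

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # Train the discriminator to tell real samples from the generator's fakes.
    real = real_dist(64)
    fake = G(torch.randn(64, 8)).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator to fool the discriminator into scoring fakes as real.
    fake = G(torch.randn(64, 8))
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())  # drifts toward ~4.0 as the fakes get convincing

Swap the 1-D Gaussian for millions of photos of faces and scale the networks up enormously, and you get images like the ones on that site: the generator keeps improving until the discriminator (and, eventually, you) can't tell the difference.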


In April of 2019 this story was published.

If you look at other sources, you'll see they use the same two images of the "suspect," and the images in question are very poor quality. Even the name given to the suspect sounds completely made up. Oh, and you guessed it: she was found dead. This is a good example of a case that may have just been a scare tactic. It took place around the anniversary of the Columbine High School shooting. Many school districts were on lockdown, and all of that just for the suspect to "take their own life," according to the articles.

Some things are just a little fishy, and my bullshit radar has been going off more and more. Fake pictures and videos are not impossible to make. If movies can make you suspend your disbelief, how do you know the media isn't doing the same with narratives that are more plausible and easier for many people to digest?
