
2025 Was the Year I Couldn’t Believe My Eyes

The internet is a fire hose of fake shit. And I’m not sure what to do about it.

I’ve been writing about fake images on the internet since 2013, when a photo of President Teddy Roosevelt riding a moose was going viral and I fired off a quick blog post to debunk it. The image was a pre-Photoshop fake, created as a humorous curiosity a century ago. It was just one of those old images that made you pause to wonder, “wait, could that be real?”

The following year, in 2014, I debunked a lot more fake images that had gone viral. And I’ve been writing year-end round-ups featuring viral fakes ever since. I typically enjoy fact-checking images that go viral because it feels like I’m getting a better understanding of what’s authentic in the world and sharing that with people who might want a similar understanding. But that feeling changed recently.

This year, I was stumped. Looking back at 2025, there were too many fake photos and videos to process. And doing a round-up of the biggest fakes of the past year felt like a futile effort. What happens when a significant percentage of posts on platforms like Instagram, Facebook, and X are just AI-generated garbage? It felt like this year was a tipping point for that question. So many of the images and videos popping up in my social media streams were just fake. I’ve never seen it so bad.

What examples would you include in an article about the fake images and videos of 2025? In 2023, there were images like the Pope in the puffer jacket created with AI that seemed to be everywhere. It was the early days of photorealistic AI image generation, and the fake Pope seemed to spread to every corner of the internet. Here in 2025, it’s difficult to focus on a select number of images and videos because there are just too many to debunk. People are drowning in fakes, and many have no idea. They just flick their finger and move on to the next photo or video.

Even the experts can’t tell anymore. Jeremy Carrasco has been doing amazing work this year on TikTok, YouTube, and Instagram, educating people about how to spot AI videos. But there was at least one video that he called fake in 2025, which was actually real.

That’s where we’re at. Even the people who are best equipped to spot the fakes aren’t sure of themselves.

Video after video and photo after photo on social media is just AI slop looking for engagement. There’s a financial incentive for people to post fake videos, like pets being rescued from a fire, dogs choosing their owners, or animals being saved from being trapped under ice. These videos can be compelling if you don’t know they’re fake, and content creators get paid for attracting millions of views. But they’re a lot less compelling once you know they’re not real.

What are the stakes when you watch a very short video of imaginary people doing imaginary things? Movies and TV shows tend to give us a reason to care about the characters that have been developed on screen before they’re tossed into a conflict they must overcome. None of that dramatic tension is present when you’re watching AI characters. So why should anyone care?

There was a time when I thought some people were inexcusably naive for falling for bad photoshops. But I rarely have that attitude anymore. The tools have progressed so much that anyone can fall for fake images. You really shouldn’t feel bad if you fall for something that’s AI in 2025.

The top image in this article was created in Google Gemini with Nano Banana Pro, released just last month. I typed nothing more than: “Create a photorealistic image of a man looking directly at the camera. Make it as realistic as possible.” If I wanted, I could’ve customized the image using all kinds of specific instructions to depict a scene in a particular place with particular people. But I created this image in 10 seconds just to demonstrate how trivially easy photorealism is to achieve now.

As someone who’s spent over a decade applying my skepticism to fake images, I can’t rely on visual clues or tracing a photo’s source anymore. Google has finally integrated its visual watermark detector into Gemini, but that only gets us so far: if Gemini doesn’t flag an image as AI-generated, that doesn’t mean it’s real. It just means Google had no hand in creating it. There are countless other AI image generators out there.

Powerful people like Donald Trump are taking full advantage of this information environment. The president has insisted several times in 2025 that real things are fake. Like when Trump claimed the BBC used AI to make him say something he didn’t (it didn’t use AI), or when he claimed that footage of furniture being thrown out of the White House in September, the month before the demolition of the East Wing, was fake. Trump is relying on what’s known as the liar’s dividend: once the information environment is sufficiently polluted with fakes, anyone can plausibly dismiss real evidence as misinformation.

Elon Musk, the billionaire owner of X, has said that he envisions a world where everything that people consume on a daily basis is AI. And while it’s too soon to say whether that prediction will come true, he’s working hard to make that happen.

Musk thinks this version of the future is what people really want—fake content that caters to your desires—because he’s so detached from humanity.

As he told Joe Rogan: “Most of what people consume in five or six years, maybe sooner than that, will be just AI-generated content. So music, videos…”

— Matt Novak (@paleofuture.bsky.social) November 8, 2025 at 8:27 AM

In some ways, it feels like we’re halfway there already. And when most things are fake, it seems much more efficient to just point out the things that are real.

Original Source: https://gizmodo.com/2025-was-the-year-i-couldnt-believe-my-eyes-2000699072

