Everything's fake until proven otherwise.
How marketers build trust when synthetic is the default.
There's been a huge shift in AI in recent weeks. Not many people have realized it. It's not really mainstream yet. But soon, everyone will notice. And it will affect every one of us, big time. Simply put: anything digital will be fake until proven otherwise.
In other words, AI has gotten so good at creating synthetic content that you simply can't tell the difference between real and AI-generated anymore.
This wasn't the case 3-6 months ago. You could either tell from the context (a random politician riding a zebra, half-naked and jacked) or from the quality (either too polished to be real, or just plain bad output).
Now, it doesn’t matter how hard you look. Just take a look at this girl:
The first one is clearly AI. Too polished; no human being has skin like this. Although I have to say, the shadows, the focus, and everything else still work perfectly. Some people could be fooled by it, for sure (just look at your current Facebook timeline…). But the second one looks so real. It's not perfect. It looks like a photo someone actually took. And yes, both of these photos are AI, of course.
We just crossed a threshold most marketers haven’t processed yet: visual verification is dead. Not dying. Not “something to watch.” Dead. The pixels lie, and there’s no coming back.
In this weekend edition, we will discuss how we (marketers) can create trust and value not just in a post-truth world, but in a post-real world.