An unfortunate couple excitedly traveled for hours to ride a mountaintop cable car called the Kuak Skyride. They’d seen it online, complete with smiling tourists gliding along and a TV journalist narrating the footage.

However, when the couple arrived, they found nothing but a small town and confused locals who had no idea what they were talking about. It turns out the whole thing was an AI-generated video they had believed was real. That story, detailed in a report by Fast Company, sounds like a one-off, but I suspect it’s something everyone will have to consider when perusing the internet for things to buy or places to visit.

A small logo in the corner indicates the video was made with Veo 3, Google’s newest AI video engine, and it’s hardly the only sign that the video is AI-made. The people and the structures all have that AI sheen of unreality. However, if you’re not well-versed in deepfakes or actively looking for the signs, you might not have noticed; it would seem silly to be suspicious of a well-made tourist video.

Apakah benar Kabel car di Pengkalan Hulu & Unbelievable cable car at Pengkalan Hulu Perak – YouTube

However, our new reality is that AI can now sell you not just a product, but a place, and that place might never have existed at all. Slightly-off spelling and suspicious URLs are practically quaint in comparison. It isn’t even clear whether the video was malicious or just someone’s misguided attempt at content creation. It’s easy to roll your eyes and say it would never happen to you. But we all have blind spots, and AI is getting very good at aiming for them.

This is obviously a far more problematic use of AI video than depicting cats as Olympic divers. Still, the need to pay close attention for the telltale signs of an AI creation is universal.

AI travel tricks

We’re past the age of visual trust. In the AI era, seeing is just the beginning of the vetting process. That doesn’t mean you should abandon all travel plans, but it does mean the average person now needs a new kind of consumer savvy, calibrated not just for Nigerian princes and surprise crypto pitches, but for video illusions and AI travel influencers who can go places no human can follow.

And that’s before considering real places whose review sections are flooded with fake, AI-written testimonials, almost certainly extolling attractions that don’t exist outside the model’s own hallucinations.

Dealing with it means being suspicious of things that look too good to be true. You might need to cross-check multiple sources to confirm something is real, or run a reverse image search or a search of public social media posts. And when it comes to images and videos, watch for anything that’s too perfect: if no one is frowning or sneezing in a crowd shot, I’d be wary of its authenticity.

It’s unfortunate. I don’t like the idea of seeing a beautiful location in a video and doubting its reality instead of planning a trip there. But maybe that’s the price of living in a world where anyone can conjure realistic illusions of almost-real places. Either way, you’ll need to do more to ensure you’re headed somewhere with a foundation of more than pixels and algorithms.
