Two hikers were rescued this spring from the fittingly named Unnecessary Mountain near Vancouver, Canada, after using ChatGPT and Google Maps to plan their route and finding themselves trapped by unexpected snow partway up. The incident has served as a reminder of the limitations of artificial intelligence when it comes to real-world outdoor navigation.
The pair, wearing only flat-soled sneakers, were caught off guard by the lingering snowpack on the mountain’s upper slopes, a common condition in the Vancouver region well into spring. The situation prompted an emergency response from Lions Bay Search and Rescue, whose members climbed the mountain carrying boots and ski poles for the stranded hikers.
Speaking to the Vancouver Sun, Lions Bay Search and Rescue chief Brent Calkin emphasized the dangers of information overload and unchecked digital advice. This incident is not isolated, with rescue organizations noting an increase in operations linked to inexperienced hikers being led into dangerous situations by social media and navigation apps.
“This call was a good reminder that AI tools like ChatGPT and Google Maps are not always the best for backcountry navigation,” said Lions Bay Search and Rescue in a press release relating to the incident, before encouraging hikers to rely on trusted local sources for route-planning and carry proper navigational tools such as a map and compass or GPS device.
The Perils of AI for Hiking Route Planning
While AI models like ChatGPT can synthesize vast amounts of information, their application in dynamic environments like backcountry navigation comes with significant risks. Current AI systems cannot provide real-time updates on factors such as weather changes, trail closures, or snow conditions. As demonstrated in this incident, seasonal conditions in mountainous regions can change rapidly and drastically, rendering past information obsolete and dangerous.
Especially for less-traveled paths, or when specific, nuanced outdoor knowledge is required, AI can become dangerously inaccurate or even “hallucinate” information, suggesting non-existent trails or unsafe routes. The accuracy of AI responses also relies heavily on the user’s ability to ask precise and comprehensive questions. Newer hikers, often unaware of the information needed for safe outdoor excursions, may receive incomplete or misleading guidance.
With very little prompting, I was able to get Gemini to assure me that I didn’t need to bring microspikes on an early spring summit attempt of a Colorado 14er. Please don’t climb a 14er in April without traction, guys!
Heading out on a hike — whether you’re new to hiking or fairly experienced — requires a level of respect, preparation, and understanding that AI simply cannot grasp. By all means, use ChatGPT for a little inspiration for your next trip, but do not let it replace the human expertise and traditional navigation methods that are essential for safe travel outside.
Featured image: Rescuers ascended Unnecessary Mountain on foot to extract the stranded hikers. Photo: Lions Bay Search and Rescue.
Affiliate Disclosure
This website contains affiliate links, which means The Trek may receive a percentage of any product or service you purchase using the links in the articles or advertisements. The buyer pays the same price as they would otherwise, and your purchase helps to support The Trek's ongoing goal to serve you quality backpacking advice and information. Thanks for your support!
To learn more, please visit the About This Site page.