Mitchell Boyer of Glastonbury is sixteen years old, ranked second in the nation for youth indoor rock climbing, and just returned from Finland, where he placed seventh in the world for speed climbing. He has also been completely locked out of his Instagram account, where he posted pictures of his competitions and used his climbing success to land sponsorship contracts that helped with travel and equipment.
Now those sponsors – ranging from climbing shoe brands to protein bars – are starting to ask questions, and neither Mitchell nor his mother, Clarissa, who essentially manages his social media presence, has any good answers: no one at Meta will answer the phone or respond to their repeated emails and appeals to get the account back online.
Mitchell’s ban from Instagram and Facebook, however, is not for anything he did or posted – he used the platforms to promote himself as an athlete – but rather because Meta’s artificial intelligence wrongly flagged his account for “child exploitation and sexual content,” according to Clarissa, who likewise has been locked out of her personal and small business accounts because they were connected to Mitchell’s. She says she has lost decades of pictures and memories, along with her ability to administer social media for some rock-climbing organizations.
Mitchell says the loss of his Instagram account, which he built over seven years, has created difficulties with his sponsors and hampered his ability to connect with other athletes at international events, where Instagram is used in place of texting or phone calls as a means of communication.
“One of the main criteria for being sponsored was to have a social media presence in order to support the brands,” Mitchell said. “It was also really important for my international connections because not all countries will use messaging apps, so we use Instagram to connect. So, when I’m going to train in these various places, we always use Instagram to connect.”
As a child athlete, Mitchell can’t accept money as payment, but sponsors supported him by helping with travel and providing equipment like climbing shoes, which can cost upwards of $250 a pair; he then showcased the gear on his account.
“One of those sponsors came to us and was like, ‘what’s going on Mitchell? This summer we didn’t get anything,’” Clarissa said. “And people don’t know what happened because the account was literally just wiped out.”
It was July when Mitchell woke one morning to find he was locked out of his account. He and his mother were on an international trip, stopping in Austria for training, Finland for the youth international championships, then to China, and finally Singapore. The fact that they were out of the country at the time made it more difficult to reach Meta, and more difficult to explain to Mitchell’s sponsors what was happening.
While their Instagram and Facebook accounts had been disabled under the claim they were engaged in “child exploitation and sexual content,” neither of them has any idea what Meta is talking about. Clarissa says they only posted pictures of Mitchell’s climbing. Clarissa tried in vain multiple times to go through Meta’s appeals process. She even went so far as to file a police report and visit Meta’s headquarters in Singapore – where she was ignored and had the police called on her.
Clarissa and Mitchell believe they have been the victims of Meta’s artificial intelligence run amok, and they’re not the only ones who have been affected. A nonprofit organization was started precisely because of this problem. People Over Platforms Worldwide describes itself as “A global nonprofit movement demanding digital justice, transparency, and accountability in an increasingly automated online world.”
“This began with thousands of stories — business owners, creators, parents, advocates, and everyday people — locked out of their digital lives overnight,” the nonprofit states on its website. “What outsiders called ‘just an account’ was, for many, a lifeline: a source of income, cherished memories, vital communities, and irreplaceable connections.”
A petition started by the group calling for Meta to be held accountable for the lockouts and to “provide real support” has roughly 43,000 signatures. According to the petition, “In 2023, Meta rolled out massive AI-driven moderation changes,” and “by 2024-2025, these systems were flagging real people at unprecedented scale – especially on Instagram.”
“This is happening to a lot of people around the world. Meta is using AI bots to indiscriminately, randomly pick out accounts to shut them down. I learned we were not the only victims,” Clarissa said. “Meta hasn’t been responsive. We’ve hit a wall every time we try to email somebody.”
News outlets around the world have reported stories similar to Mitchell’s: a photographer in Iowa was locked out of her account for “child exploitation” and was unable to appeal, and a teacher in Columbus, Indiana, had her Instagram account locked for child exploitation and nudity, according to WRTV.
Attempts at both the national and state levels to regulate the rapid, massive growth of the unwieldy new technology have been quickly quashed, as national leaders fear rival countries like China getting ahead of the United States in what has occasionally been likened to a new “space race” to develop and perfect artificial intelligence.
President Donald Trump laid out his “AI Action Plan,” meant to turbo-charge AI development in the United States and “achieve global dominance.” Congress, however, stripped a proposed moratorium on state regulation of AI out of the Big, Beautiful Bill at the last minute. While that ostensibly leaves AI regulation to the states, implementing patchwork regulation for technology can be difficult, and many expect the moratorium language to return, according to Politico.
In Connecticut, legislative attempts to pass AI regulation have been shot down by Gov. Ned Lamont’s administration, which has favored statutory language placing development of AI policies in the hands of the Office of Policy and Management’s chief data officer. In previous comments to Inside Investigator, Lamont spokesperson Julia Bergman said, “The Governor believes the federal government could provide a national framework for regulation versus a patchwork of rules across states.”
The legislature’s bill – Senate Bill 2, championed by Sen. James Maroney, D-Milford, who is considered the General Assembly’s AI expert – would have allowed for a “regulatory sandbox” for the testing of AI technology. It also would have required companies to use “reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination” – although that term generally refers to AI algorithms discriminating against racial and ethnic minorities, not locking people out of accounts for false reasons.
While Senate Bill 2 passed out of committee, it never made it to the floor for a vote.
Reached for comment, Sen. Maroney said he is working with a Rotary club in Connecticut that has experienced the same issue as Boyer, and he has reached out to his contacts at Meta to see if they can reinstate the club’s account.
“This is a great example of how AI isn’t quite ready for prime time yet, and we need to ensure human review,” said Maroney, who added that states including California, Maine, New York, and Illinois have all recently passed AI regulation bills, including bills addressing companion chatbots, which have encouraged homicide and suicide in some cases.
“We are seeing states slowly start to act on these harms,” Maroney said. “There’s so much good that can come from AI, it’s just we need to make sure there are some guardrails in place.”
Meta did not respond to a request for comment, nor has it responded to the Boyers’ emails, phone calls, and requests to have their accounts reinstated.
Left with little recourse, Mitchell has started a new Instagram account, rebuilding from the ground up. At the time of Inside Investigator’s interview, he said he had 118 followers, pictures and video of his climbing in Austria and at competitions, and a note saying his previous account was “hacked.”
“The sponsors look at all the content over the years for continuity of what he has done,” Clarissa said. “It’s like a job to us, and so this is very cruel, very painful.”