Karen Hao on how the AI boom became a new imperial frontier


The author of “Empire of AI” traces how OpenAI’s global reach is reshaping labor, energy and power — and why she sees echoes of empire in its rise.

July 3, 2025, 10:00 AM UTC

Karen Hao, the Hong Kong-based American journalist and author of the book "Empire of AI". Penguin Books/Handout via REUTERS

When journalist Karen Hao first profiled OpenAI in 2020, it was a little-known startup. Five years and one very popular chatbot later, the company has transformed into a dominant force in the fast-expanding AI sector — one Hao likens to a “modern-day colonial world order” in her new book, “Empire of AI: Dreams and Nightmares in Sam Altman’s OpenAI.”

Hao tells Reuters this isn’t a comparison she made lightly. Drawing on years of reporting in Silicon Valley and further afield to countries where generative AI’s impact is perhaps most acutely felt — from Kenya, where OpenAI reportedly outsourced workers to annotate data for as little as $2 per hour, to Chile, where AI data centers threaten the country’s precious water resources — she makes the case that, like empires of old, AI firms are building their wealth off of resource extraction and labor exploitation. This critique stands in stark contrast to the vision promoted by industry leaders like Altman (who declined to participate in Hao's book), who portray AI as a tool for human advancement — from boosting productivity to improving healthcare. Empires, Hao contends, cloaked their conquests in the language of progress too.

The following conversation has been edited for length and clarity.

Reuters: Can you tell us how you came to the AI beat?

Karen Hao: I studied mechanical engineering at MIT, and I originally thought I was going to work in the tech industry. But I quickly realized once I went to Silicon Valley that it was not necessarily the place I wanted to stay because the incentive structures made it such that it was really hard to develop technology in the public interest. Ultimately, the things I was interested in — like building technology that facilitates sustainability and creates a more sustainable and equitable future — were not things that were profitable endeavors. So I went into journalism to cover the issues that I cared about and ultimately started covering tech and AI.

That work has culminated in your new book “Empire of AI.” What story were you hoping to tell?

Once I started covering AI, I realized that it was a microcosm of all of the things that I wanted to explore: how technology affects society, how people interface with it, the incentives (and) misaligned incentives within Silicon Valley. I was very lucky in getting to observe AI and also OpenAI before everyone had their ChatGPT moment, and I wanted to add more context to that moment that everyone experienced — to show them that this technology comes from a specific place and a specific group of people, and to help them understand its trajectory and how it's going to impact us in the future. And, in fact, the human choices that have shaped ChatGPT and generative AI today (are) something that we should be alarmed by, and we collectively have a role to play in starting to shape the technology.

Sam Altman, CEO of OpenAI, attends the Asia-Pacific Economic Cooperation (APEC) CEO Summit in San Francisco, California, U.S. November 16, 2023. REUTERS/Carlos Barria/File Photo


You’ve mentioned drawing inspiration from the Netflix drama “The Crown” for the structure of your book. How did it influence your storytelling approach?

The title "Empire of AI" refers to OpenAI and this argument that (AI represents) a new form of empire, and the reason I make this argument is because there are many features of empires of old that empires of AI now check off. They lay claim to resources that are not their own, including the data of millions and billions of people who put their data online without understanding that it could be taken and used to train AI models. They exploit a lot of labor around the world — meaning they contract workers, whom they pay very little, to do the data annotation and content moderation for these AI models. And they do it all under a civilizing mission, this idea that they're bringing benefit to all of humanity.

It took me a really long time to figure out how to structure a book that goes back and forth between all these different communities and characters and contexts. I ended up thinking a lot about “The Crown" because every episode, no matter who it's about, is ultimately profiling this global system of power.

Does that make CEO Sam Altman the monarch in your story?

People will either see (Altman) as the reason why OpenAI is so successful or as the massive threat to the current paradigm of AI development. But in the same way that when Queen Elizabeth II passed away people suddenly were like, "Oh, right, this is still just the royal family and now we have another monarch," it's not actually about the individual. It's about the fact that this global hierarchy, a vestige of an old empire, is still in place.

Sam Altman is like Queen Elizabeth (in the sense that) whether he's good or bad or he has this personality or that personality is not as important as the fact that he sits at the top of this hierarchy — even if he were swapped out, he would be swapped out for someone who still inherits this global power hierarchy.

In the book, you depict OpenAI’s transition from a culture of transparency to secrecy. Was there a particular moment that symbolized that shift?

I was the first journalist to profile OpenAI and embedded within the company in 2019, and the reason why I wanted to profile them at the time was because there was a series of moments in 2018 and 2019 that signaled that there was some dramatic shift underway at the organization.

OpenAI was co-founded as a nonprofit at the end of 2015 by Elon Musk and Sam Altman and a cast of other people. But in 2018, Musk leaves; OpenAI starts withholding some research and announces to the world that it's withholding this research for the benefit of humanity; it restructures and nests a for-profit within the nonprofit; and Sam Altman becomes CEO. Those were the four things that made me wonder what was going on at this organization, which had used its nonprofit status to differentiate itself from the whole crop of other companies within Silicon Valley working on AI research.

Right before I got to the offices, they had another announcement that solidified that some transformation was afoot: Microsoft was going to partner with OpenAI and give the company a billion dollars. All of those things culminated in my realizing that what they professed publicly was not actually what was happening.

You emphasize the human stories behind AI development. Can you share an example that highlights the real-world consequences of its rise?

One of the things that people don't really realize is that AI is not magic; it actually requires an extremely large amount of human labor and human judgment to create these technologies. These AI companies will go to Global South countries to contract workers for very low wages, where they will either annotate the data that goes into training these models, or perform content moderation, or converse with the models and then upvote and downvote their answers, slowly teaching them to say more helpful things.

I went to Kenya to speak with workers that OpenAI had contracted to build a content moderation filter for their models. These workers were completely traumatized and ended up with PTSD for years after this project, and it didn't just affect them as individuals; that affected their communities and the people that depended on them. (Editorial note: OpenAI declined to comment, referring Reuters to an April 4 post by Altman on X.)

"Empire of AI" by Hong Kong-based American journalist and author Karen Hao. Penguin Books/Handout via REUTERS

Your reporting has highlighted the environmental impact of AI. How do you see the industry's growth balancing with sustainability efforts?

These data centers and supercomputers, the size that we're talking about is something that has become unfathomable to the average person. There are data centers being built that will be 1,000 to 2,000 megawatts, which is around one-and-a-half to two-and-a-half times the energy demand of San Francisco. OpenAI has even drafted plans for supercomputers that would be 5,000 megawatts, which would be the average demand of the entire city of New York.

Based on the current pace of computational infrastructure expansion, the amount of energy that we will need to add to the global grid by the end of this decade will be like slapping two to six new Californias onto it. There's also water: these data centers are often cooled with fresh water resources.

How has your perspective on AI changed, if at all?

Writing this book made me even more concerned because I realized the extent to which these companies have a controlling influence over everything now. Before I was worried about the labor exploitation, the environmental impacts, the impact on the job market. But through the reporting of the book, I realized the horizontal concern that cuts across all this is if we return to an age of empire, we no longer have democracy. Because in a world where people no longer have agency and ownership over their data, their land, their energy, their water, they no longer feel like they can self-determine their future.


Edited by Aurora Ellis; Video by Tristan Werkmeister; Photo editing by Simon Newman

Our Standards: The Thomson Reuters Trust Principles.
