Author: Sam Altman
Compiled by: Deep Tide TechFlow
Deep Tide Insight: Someone threw a Molotov cocktail at Sam Altman's home at 3:45 a.m. He rarely shares family photos publicly, but did so this time in the hope of deterring the next person from taking such a step. This article is not just a response to the attack; it is also his first comprehensive statement of his beliefs about AI: AI must be democratized, a few labs should not decide humanity's future, and the temptation captured by "once you see AGI, you can't unsee it" fills this field with Shakespearean drama.
This is a photo of my family. I love them more than anything.
I hope images have power. We usually maintain a fair amount of privacy, but in this case, I'm sharing a photo in the hope that it might discourage the next person from throwing a Molotov cocktail at our home, no matter how they feel about me.
The first person did so last night at 3:45 a.m. Fortunately, it bounced off the house, and no one was hurt.
Words also have power. A few days ago, an inflammatory article about me was published. Yesterday, someone told me they thought it had appeared at a moment of extreme anxiety about AI, putting me in greater danger. I didn't take it seriously at the time.
Now I'm awake in the middle of the night, angry, and realizing I underestimated the power of words and narratives. This seems like a good time to talk about a few things.
First, my beliefs.
Striving for everyone's prosperity, empowering all people, and advancing science and technology are moral imperatives to me.
AI will be the most powerful tool to expand human capabilities and potential. The demand for this tool is essentially limitless; people will do incredible things with it. The world should have an abundance of AI, and we must figure out how to make that happen.
Not everything will go smoothly. Fear and anxiety about AI are justified; we are witnessing the greatest societal change in a long time, perhaps ever. We must get safety right—it's not just about aligning a model. We urgently need a societal response to address new threats. This includes new policies to help navigate difficult economic transitions and reach a better future.
AI must be democratized; power cannot become too concentrated. Control over the future belongs to all people and their institutions. AI needs to empower individuals, and we need to collectively decide our future and new rules. I don't think it's right for a few AI labs to make the most important decisions about the shape of humanity's future.
Adaptability is crucial. We are all learning new things very quickly; some of our beliefs will be right, some will be wrong, and sometimes we need to change our minds rapidly as technology evolves and society changes. No one yet understands the implications of superintelligence, but they will be enormous.
Second, some personal reflections.
When I look back at my work at OpenAI over the first decade, I can point to many things I'm proud of and a bunch of mistakes.
I was thinking about our upcoming trial with Elon and remembering how adamant I was about not agreeing to the unilateral control he wanted over OpenAI. I'm proud of that, proud of the narrow path we walked at the time that allowed OpenAI to continue to exist and all the achievements that followed.
I'm not proud of my avoidance of conflict, which has caused immense pain for me and for OpenAI. I'm not proud of how I mishandled the conflict with the previous board, which caused great chaos for the company. I've made many other mistakes along OpenAI's crazy trajectory; I am a flawed person at the center of an exceptionally complex situation, trying to get a little better each year, always working for the mission. We knew from the start how high the stakes of AI were, and how immensely personal disagreements among well-intentioned people I care about could be magnified. But experiencing these intense clashes firsthand, and often having to arbitrate them, is another matter, and the toll has been serious. I am sorry to those I have hurt, and I wish I could learn faster.
I am also very aware that OpenAI is now a major platform, not a small startup, and we need to operate in a more predictable way now. The past few years have been extremely intense, chaotic, and high-pressure.
Still, what I am most proud of is that we are achieving our mission, which seemed extremely unlikely when we started. Against all odds, we figured out how to build very powerful AI, figured out how to accumulate enough capital to build the infrastructure to deliver it, figured out how to build a product company and business, figured out how to deliver fairly safe and robust services at scale, and more. Many companies say they will change the world; we actually did.
Third, some thoughts about this industry.
My personal takeaway from the past few years, and my view on why there is so much Shakespearean drama between companies in our field, boils down to this: "Once you see AGI, you can't unsee it." It has a real "Ring of Power" dynamic that makes people do crazy things. I don't mean that AGI itself is the ring, but rather the totalitarian philosophy of "being the one who controls AGI."
The only solution I can think of is to move in the direction of widely sharing the technology with people, and no one owning the ring. The two obvious ways to do this are individual empowerment and ensuring democratic systems remain in control.
It is important that the democratic process remains more powerful than companies. Laws and norms will change, but we must work within the democratic process, even if it is messy and slower than we would like. We want to be a voice and a stakeholder, but not own all the power.
Much of the criticism of our industry comes from sincere concerns about the extremely high risks of this technology. This is very valid, and we welcome good-faith criticism and debate. I empathize with anti-tech sentiment; obviously, technology is not always good for everyone. But overall, I believe technological progress can make the future unbelievably good, for your family and mine.
As we have that debate, we should de-escalate the rhetoric and tactics and try to have fewer explosions in fewer homes, both metaphorically and literally.