AWS boss explains why investing billions in both Anthropic and OpenAI is an OK conflict

AWS CEO Matt Garman said Amazon’s recent $50 billion investment in OpenAI, which comes on top of the $8 billion it has invested in longtime partner Anthropic, is the kind of conflict of interest the cloud giant is used to.
Garman has worked at Amazon since joining as a business-school intern in 2005, the year before AWS launched, he told the audience at the HumanX conference taking place this week in San Francisco.
When asked about the inherent conflict of working closely with two AI model companies that are fierce (and perhaps sometimes petty) competitors, he said it’s not a problem. Because AWS itself often competes with its partners, it has a lot of direct experience with this kind of competition, he explained.
In AWS’s early years, it knew it couldn’t build every cloud offering itself, so the unit partnered with others.
“We also knew we would have to compete with our partners because technology is interconnected,” said Garman. “So for a long time, we’ve been building this strength in the way we go to market with our partners,” he continued. “But we may also have our own products that compete with them, and that’s OK, and we’ve promised them that we won’t give ourselves an unfair competitive advantage.”
Today, the world is used to Amazon competing with the partners who sell through its cloud. Even one of AWS’s biggest rivals, Oracle, sells its database and other services on AWS. But it was a radical idea in 2006, when technology vendors did their best never to compete with the partners who helped them succeed.
Yet Amazon is hardly a pioneer in putting aside investor loyalty and conflicts of interest in the wild, money-grubbing world of AI. When Anthropic announced its latest $30 billion funding round in February, it included at least a dozen investors who also backed OpenAI. This included OpenAI’s main cloud partner, Microsoft.
For AWS, making a huge investment in OpenAI to bring its models to AWS customers (and to gain a technology development partner) was almost a matter of survival. Both companies’ models were already available on the cloud of Microsoft, AWS’s biggest rival.
The cloud giants are also trying to keep themselves front and center by offering AI model routing services. These services allow their customers to automatically use different models for different tasks to maximize performance and reduce costs. As Garman explained, one model may be ideal for planning, another for reasoning, and a cheaper model for simpler tasks such as code completion. “I think this is where the world is going to go,” Garman said.
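The routing pattern Garman describes can be sketched in a few lines. This is a toy illustration, not an AWS API: the model names, prices, and task labels below are all hypothetical placeholders.

```python
# A minimal sketch of task-based model routing: each request is sent to a
# different model depending on the kind of task, trading capability against
# cost. Model names and per-token prices are invented for illustration.

from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # hypothetical pricing, in dollars

# Hypothetical catalog: a strong planner, a reasoning specialist,
# and a cheap model for simple jobs like code completion.
CATALOG = {
    "planning": Model("planner-xl", 0.015),
    "reasoning": Model("reasoner-large", 0.010),
    "code_completion": Model("fast-small", 0.001),
}

def route(task_type: str) -> Model:
    """Pick the model for a task, falling back to the cheap default."""
    return CATALOG.get(task_type, CATALOG["code_completion"])

# Planning goes to the expensive model; unknown or simple tasks go cheap.
assert route("planning").name == "planner-xl"
assert route("autocomplete this line").name == "fast-small"
```

In practice a router would inspect the prompt itself (or metadata on the request) rather than take an explicit task label, but the economics are the same: reserve the costly models for the work that needs them.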
That’s also how Amazon, and Microsoft for that matter, can steer customers toward their own homegrown models: that old situation of competing with your partners again.
Nowadays, all is fair in love and AI.




