Sam Altman throws shade at Anthropic’s cyber model, Mythos: ‘fear-based marketing’
OpenAI and Anthropic continue to take swipes at each other. This week, during a podcast appearance, OpenAI CEO Sam Altman called out his competitor’s latest cybersecurity model, arguing that the company was using fear to make its product sound more impressive than it actually is.
Anthropic announced Mythos earlier this month, releasing the model to a small cohort of enterprise customers. The company has claimed that Mythos is too powerful to be released to the public, out of concern that cybercriminals would weaponize it. Critics have argued this rhetoric is overblown.
During an appearance on the podcast Core Memory, Altman suggested that Anthropic’s “fear-based marketing” was a convenient way to keep AI in the hands of a small and exclusive elite. “There are people in the community who, for a long time, have wanted to keep AI in the hands of a smaller group of people,” he said. “You can justify that in a lot of different ways.”
“It is clearly incredible marketing to say, ‘We have built a bomb, we are about to drop it on your head. We will trade you a bomb shelter for $100 million,’” he added.
Anthropic did not invent fear-based marketing. Arguably, much of the AI industry has leveraged scare tactics and hyperbole to make its tools sound more powerful. Ongoing rhetoric about how AI may lead to the end of the world hasn’t just come from Luddite doomer activists; it has also come from the people selling this tech to the public, Altman included.