The chatbot, developed by Amazon’s cloud computing division, is focused on workplaces and not intended for consumers. Amazon Q aims to help employees with daily tasks, such as summarizing strategy documents, filling out internal support tickets and answering questions about company policy. It will compete with other corporate chatbots, including Microsoft’s Copilot, Google’s Duet AI and OpenAI’s ChatGPT Enterprise.
“We think Q has the potential to become a work companion for millions and millions of people in their work life,” Adam Selipsky, the chief executive of Amazon Web Services, said in an interview.
Amazon has been racing to shake off the perception that it is lagging behind in the A.I. competition. In the year since OpenAI released ChatGPT, Google, Microsoft and others have jumped into the frenzy by unveiling their own chatbots and investing heavily in A.I. development.
Amazon was quieter about its A.I. plans until recently. In September, it announced that it would invest up to $4 billion in Anthropic, an A.I. start-up that competes with OpenAI, and that the two companies would develop advanced computing chips together. Amazon also introduced a platform this year that allows customers to access different A.I. systems.
As the leading provider of cloud computing, Amazon already has business customers storing vast amounts of information on its cloud servers. Companies were interested in using chatbots in their workplaces, Mr. Selipsky said, but they wanted to make sure the assistants would safeguard those hoards of corporate data and keep their information private.
Many companies “told me that they had banned these A.I. assistants from the enterprise because of the security and privacy concerns,” he said.
In response, Amazon built Q to be more secure and private than a consumer chatbot, Mr. Selipsky said. Amazon Q, for example, can have the same security permissions that business customers have already set up for their users. At a company where an employee in marketing may not have access to sensitive financial forecasts, Q can mirror that restriction by withholding such financial data when the employee asks for it.
Companies can also give Amazon Q permission to work with corporate data that isn’t on Amazon’s servers, such as by connecting it to services like Slack and Gmail.
Unlike ChatGPT and Bard, Amazon Q is not built on a specific A.I. model. Instead, it uses an Amazon platform known as Bedrock, which connects several A.I. systems together, including Amazon’s own Titan as well as ones developed by Anthropic and Meta.
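The idea of one platform fronting several providers’ models can be sketched roughly in code. The sketch below is illustrative only: the model identifiers and request formats are assumptions in the general style of Bedrock’s catalog, not details reported in this article or taken from Amazon Q.

```python
import json

# Illustrative model identifiers in the style of Bedrock's catalog;
# the specific IDs and request shapes below are assumptions, not
# taken from the article or from Amazon Q itself.
TITAN = "amazon.titan-text-express-v1"
CLAUDE = "anthropic.claude-v2"
LLAMA = "meta.llama2-70b-chat-v1"

def build_request(model_id: str, prompt: str) -> dict:
    """Map one prompt to the per-provider JSON body each model expects.

    A platform that connects several A.I. systems needs a thin adapter
    like this, because each provider defines its own request format
    even though callers reach every model through a single endpoint.
    """
    provider = model_id.split(".", 1)[0]
    if provider == "amazon":
        return {"inputText": prompt}
    if provider == "anthropic":
        return {"prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
                "max_tokens_to_sample": 300}
    if provider == "meta":
        return {"prompt": prompt, "max_gen_len": 300}
    raise ValueError(f"unknown provider for model id {model_id!r}")

# With real AWS credentials, a body like this would be sent via
#   boto3.client("bedrock-runtime").invoke_model(
#       modelId=CLAUDE, body=json.dumps(build_request(CLAUDE, prompt)))
for model in (TITAN, CLAUDE, LLAMA):
    print(model, "->", json.dumps(build_request(model, "Summarize this memo.")))
```

The design choice this sketch highlights is the one the article describes: callers pick a model by identifier rather than committing to a single underlying system.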
The name Q is a play on the word “question,” given the chatbot’s conversational nature, Mr. Selipsky said. It is also a play on the character Q in the James Bond novels, who makes stealthy, helpful tools, and on a powerful “Star Trek” figure, he added.
Pricing for Amazon Q starts at $20 per user each month. Microsoft and Google both charge $30 a month for each user of the enterprise chatbots that work with their email and other productivity applications.
Amazon Q was one of a slew of announcements that the company made at its annual cloud computing conference in Las Vegas. It also shared plans to beef up its computing infrastructure for A.I. And it expanded a longtime partnership with Nvidia, the dominant supplier of A.I. chips, including by building what the companies called the world’s fastest A.I. supercomputer.
Most such systems use standard microprocessors along with specialized chips from Nvidia called GPUs, or graphics processing units. Instead, the system announced on Tuesday will be built with new Nvidia chips that include processor technology from Arm, the company whose technology powers most mobile phones.
The shift is a troubling sign for Intel and Advanced Micro Devices, the dominant microprocessor suppliers. But it is positive news for Arm in its long-running effort to break into data center computers.
Don Clark contributed reporting from San Francisco.