Amazon today announced the general availability of Bedrock, its service that offers a choice of generative AI models from Amazon itself and third-party partners through an API.

Bedrock, which was unveiled in early April, allows AWS customers to build apps on top of generative AI models and customize them with their proprietary data. Leveraging these models, brands and developers can also create AI “agents” that automatically execute tasks like booking travel, managing inventory and processing insurance claims.

In the coming weeks, Llama 2, the open source large language model from Meta, will come to Bedrock, Amazon says — joining models from AI21 Labs, Anthropic, Cohere and Stability AI.

Amazon claims Bedrock will be the first “fully managed generative AI service” to offer Llama 2, specifically the 13-billion- and 70-billion-parameter flavors. (Parameters are the parts of a model learned from historical training data and essentially define the skill of the model on a problem, such as generating text.) However, it’s worth noting that Llama 2 has been available on other cloud-hosted generative AI platforms for some time, including Google’s Vertex AI.

Speaking of Vertex AI, Bedrock is in many ways comparable to Google's platform, which offers its own library of fine-tunable first- and third-party models on which customers can build generative AI apps. But Swami Sivasubramanian, VP of data and AI at AWS, argues that Bedrock has an advantage in that it plays nicely with existing AWS services, like AWS PrivateLink, which establishes a secure connection between Bedrock and a company's virtual private cloud.

To be fair to Google, I’d argue that’s more of a perceived advantage than an objective one, seeing as it’s dependent on the customer in question and the cloud infrastructure they’re using. Of course, you won’t hear Sivasubramanian acknowledge that.
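
As an illustration of the kind of integration Sivasubramanian is pointing to, here's a rough sketch of wiring Bedrock into a VPC via a PrivateLink interface endpoint with boto3. The VPC, subnet and security group IDs are placeholders, and the `com.amazonaws.us-east-1.bedrock-runtime` service name is an assumption based on AWS's usual naming convention, so verify it against the PrivateLink documentation before relying on it.

```python
# Rough sketch: creating a PrivateLink interface endpoint so Bedrock traffic
# stays inside a VPC. All resource IDs are placeholders; the Bedrock runtime
# service name is an assumption based on AWS's typical naming pattern.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                          # placeholder VPC ID
    ServiceName="com.amazonaws.us-east-1.bedrock-runtime",  # assumed service name
    SubnetIds=["subnet-0123456789abcdef0"],                 # placeholder subnet
    SecurityGroupIds=["sg-0123456789abcdef0"],              # placeholder security group
    PrivateDnsEnabled=True,
)

print(endpoint["VpcEndpoint"]["VpcEndpointId"])
```

With an endpoint like that in place, calls to Bedrock from inside the VPC resolve to private IPs rather than traversing the public internet, which is the selling point Sivasubramanian is leaning on.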

“Over the last year, the proliferation of data, access to scalable compute, and advancements in machine learning have led to a surge of interest in generative AI, sparking new ideas that could transform entire industries and reimagine how work gets done,” Sivasubramanian said in a press release. “Today’s announcement is a major milestone that puts generative AI at the fingertips of every business, from startups to enterprises, and every employee, from developers to data analysts.”

In related news this morning, Amazon announced the rollout of its Titan Embeddings model, a first-party model that converts text to numerical representations called embeddings to power search and personalization applications. The Titan Embeddings model supports around 25 languages and chunks of text — or whole documents — up to 8,192 tokens (equivalent to ~6,000 words) in length, on par with the latest embeddings model from OpenAI.
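
For developers wondering what that looks like in practice, here's a minimal sketch of generating an embedding through Bedrock's runtime API with boto3. The `amazon.titan-embed-text-v1` model ID and the `inputText`/`embedding` JSON fields are assumptions drawn from AWS's published examples, so check the Bedrock docs for the current request and response shapes.

```python
# Minimal sketch: generating a text embedding with Bedrock's Titan Embeddings model.
# Assumes the boto3 "bedrock-runtime" client; the model ID and JSON field names
# ("inputText", "embedding") follow AWS's published examples and may change.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> list[float]:
    """Return the embedding vector for a piece of text (up to ~8,192 tokens)."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # assumed Titan Embeddings model ID
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    payload = json.loads(response["body"].read())
    return payload["embedding"]

vector = embed("How do I file a travel insurance claim?")
print(len(vector))  # dimensionality of the Titan embedding
```

In a search or personalization pipeline, those vectors would typically be stored in a vector index and ranked by cosine similarity against an embedded query, which is the use case Amazon is pitching the model for.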

Bedrock had a rocky start. Bloomberg reported in May that, six weeks after Amazon demoed the tech with an unusually vague presser and just one testimonial, most cloud customers still didn’t have access. With today’s announcements — and its recent, multi-billion-dollar investment in AI startup Anthropic — Amazon’s clearly looking to make waves in the growing and lucrative market for generative AI.


