
AWS provides comprehensive solutions for the implementation of generative AI

WBOY
2023-11-30 20:41:50

We have previously covered the series of new technologies that Amazon Web Services (AWS) announced at re:Invent 2023 to accelerate the practical application of generative artificial intelligence.


These include, among others, a deeper strategic partnership with NVIDIA, the first computing cluster built on the GH200 superchip, and new in-house general-purpose processors and AI inference chips.


As is well known, however, generative AI depends not only on powerful hardware but also on good AI models. In the current landscape, developers and enterprise users face a great many choices, and because different models excel at different generative tasks, selecting a model, tuning its parameters, and even evaluating its output has become a real headache for many users, greatly increasing the difficulty of bringing generative AI into real application scenarios.


So how can the difficulties in putting generative AI into practice be resolved, and the productivity of these new technologies truly unlocked? In the early hours of November 30, 2023, Beijing time, AWS gave a series of answers.

Bringing more model choices together in one place

First, AWS announced a further expansion of the Amazon Bedrock service. The service already offered industry-leading large language models from multiple providers, including AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon itself. Through this managed service, users can conveniently work with a variety of large language models on a single platform without having to go anywhere else.
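To give a concrete sense of what "one platform" means in practice, here is a minimal sketch using the AWS SDK for Python (boto3) to enumerate the foundation models available to an account. The region and the output-modality filter are illustrative choices, not values from the announcement.

```python
import boto3

# Control-plane client for Amazon Bedrock (model management, not inference).
# The region is illustrative; Bedrock is available in a subset of AWS regions.
bedrock = boto3.client("bedrock", region_name="us-east-1")

# List the foundation models available to this account,
# optionally filtered by output modality (TEXT, IMAGE, or EMBEDDING).
response = bedrock.list_foundation_models(byOutputModality="TEXT")

for model in response["modelSummaries"]:
    print(model["providerName"], model["modelId"])
```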


During today's keynote, AI safety and research company Anthropic announced that it has brought the latest version of its model, Claude 2.1, to Amazon Bedrock. Claude 2.1 excels at summarization, Q&A, and comparison over large volumes of text, making it particularly suitable for working with financial statements and internal data sets. According to Anthropic, Claude 2.1 offers significant improvements in honesty over the previous model, with 2x fewer false statements.
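As an illustration of what calling Claude 2.1 through Bedrock can look like, the sketch below uses boto3's runtime client. The prompt, region, and generation parameters are placeholders chosen for the example, not values from the keynote.

```python
import json
import boto3

# Runtime client for Amazon Bedrock inference.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude 2.x on Bedrock uses Anthropic's text-completions prompt format.
body = json.dumps({
    "prompt": "\n\nHuman: Summarize the key risks in this financial statement: ...\n\nAssistant:",
    "max_tokens_to_sample": 500,
    "temperature": 0.2,
})

response = runtime.invoke_model(
    modelId="anthropic.claude-v2:1",   # model ID for Claude 2.1 on Bedrock
    contentType="application/json",
    accept="application/json",
    body=body,
)

print(json.loads(response["body"].read())["completion"])
```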


At the same time, the well-known Llama 2 family has also arrived on Amazon Bedrock in a new version with up to 70 billion parameters. As Meta's next-generation large language model, Llama 2 was trained on 40% more data than its predecessor and doubles the context length. The latest version has been fine-tuned on an instruction data set with over one million human annotations and optimized for conversational use cases.
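One practical detail worth noting: each provider on Bedrock defines its own request payload, even though the `invoke_model` call is the same. The sketch below shows what a Llama 2 chat request can look like; the model ID for the 70B chat variant and the generation parameters are assumptions for illustration.

```python
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Llama 2 chat models on Bedrock expect a "prompt" plus generation
# parameters such as "max_gen_len", unlike Anthropic's schema above.
body = json.dumps({
    "prompt": "Explain retrieval-augmented generation in two sentences.",
    "max_gen_len": 256,
    "temperature": 0.5,
})

response = runtime.invoke_model(
    modelId="meta.llama2-70b-chat-v1",  # assumed ID for the 70B chat variant
    body=body,
)

print(json.loads(response["body"].read())["generation"])
```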


More importantly, AWS has also developed its own family of foundation models, Amazon Titan. In addition to the previously released Amazon Titan Text Embeddings and Amazon Titan Text models for text generation, the image-focused Amazon Titan Image Generator and Amazon Titan Multimodal Embeddings were officially announced today. Compared with traditional image-generation models, AWS's own models also embed watermarking technology to support copyright protection, and they can embed image and text information together so that future searches return more accurate results.
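The multimodal embedding use case is the easiest to picture in code: index an image alongside a text caption as a single vector for later search. The sketch below is an assumption-laden illustration; the model ID, input file, caption, and the "inputText"/"inputImage" field names reflect the Titan Multimodal Embeddings request format as I understand it, not text from the announcement.

```python
import base64
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Embed an image together with a short text description so both can be
# indexed in a vector store and retrieved with multimodal search queries.
with open("product_photo.png", "rb") as f:            # placeholder input file
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

body = json.dumps({
    "inputText": "red trail-running shoe, side view",  # illustrative caption
    "inputImage": image_b64,
})

response = runtime.invoke_model(
    modelId="amazon.titan-embed-image-v1",  # assumed ID for Titan Multimodal Embeddings
    body=body,
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding))  # length of the returned vector
```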


In addition, AWS has introduced a copyright indemnification policy for model-generated content: AWS will defend customers against claims that the generally available Amazon Titan models, or their output, infringe third-party copyrights.

Using large language models accurately and safely is now easier

Traditionally, enterprises might need to spend a long time defining benchmarks, setting up evaluation tools, and assessing different models with deep domain expertise before they could choose the model that best suits them.


Now, with the model evaluation capability in Amazon Bedrock, all of that trouble can be avoided. Users simply select preset evaluation criteria (such as accuracy or robustness) in the console, then upload their own test data set or choose from built-in data sets, to run a fully automated model evaluation process.
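Bedrock's evaluation feature runs this kind of comparison for you from the console. Purely to illustrate the underlying idea, here is a rough sketch that scores a single candidate model against a tiny hand-written test set using exact-match accuracy. The test prompts, the metric, and the harness itself are hypothetical and are not Bedrock's built-in evaluation API.

```python
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical test set: (prompt, expected answer) pairs supplied by the user.
test_set = [
    ("\n\nHuman: What is the capital of France? Answer in one word.\n\nAssistant:", "Paris"),
    ("\n\nHuman: What is 12 * 12? Answer with the number only.\n\nAssistant:", "144"),
]

def evaluate(model_id: str) -> float:
    """Return a naive exact-match accuracy score for one candidate model."""
    hits = 0
    for prompt, expected in test_set:
        body = json.dumps({"prompt": prompt, "max_tokens_to_sample": 20})
        response = runtime.invoke_model(modelId=model_id, body=body)
        completion = json.loads(response["body"].read())["completion"]
        hits += int(expected.lower() in completion.lower())
    return hits / len(test_set)

print(evaluate("anthropic.claude-v2:1"))
```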


Even when human evaluation is required, an AWS-managed expert team can produce detailed evaluation reports against customer-defined criteria (such as relevance, style, or brand alignment). This saves a great deal of time while significantly lowering the technical bar for enterprises adopting generative AI.


Beyond that, multiple large language models in Amazon Bedrock, including Cohere Command, Meta Llama 2, Amazon Titan, and soon Anthropic Claude 2.1, support fine-tuning to users' own needs. In addition, the Knowledge Bases for Amazon Bedrock feature lets large models connect to an enterprise's proprietary data sources, delivering more accurate, company-specific responses for use cases such as chatbots and question-answering systems.
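For a sense of how a knowledge-base-backed query can look from application code, here is a minimal sketch against the `bedrock-agent-runtime` API in boto3. The knowledge base ID, the question, and the model ARN are placeholders; a real knowledge base would already have been created and synced with the enterprise's data sources.

```python
import boto3

# Runtime client for Knowledge Bases for Amazon Bedrock (retrieval-augmented generation).
agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.retrieve_and_generate(
    input={"text": "What is our refund policy for enterprise customers?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "EXAMPLEKBID",  # placeholder knowledge base ID
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2:1",
        },
    },
)

print(response["output"]["text"])         # grounded answer
for citation in response["citations"]:    # pointers back to the source documents
    print(citation["retrievedReferences"])
```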


At the same time, for safeguards around the use of generative AI, Guardrails for Amazon Bedrock now lets enterprises customize how their generative AI applications respond. They can define which topics should be refused, and configure thresholds for hate speech, insults, sexual language, and violence to filter harmful content to the level they require. In the future, Guardrails for Amazon Bedrock will also add word filtering, and the same or different guardrails can be applied across multiple model use cases.
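As a sketch of how a configured guardrail can be attached to an inference call, the snippet below passes a guardrail identifier to `invoke_model`. Note the assumptions: the feature was announced in preview, so the `guardrailIdentifier`/`guardrailVersion` parameters shown here reflect later SDK versions, and both values are placeholders for a guardrail created separately in the console or API.

```python
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "prompt": "\n\nHuman: Tell me about our new savings product.\n\nAssistant:",
    "max_tokens_to_sample": 300,
})

# Attach a previously configured guardrail (denied topics, content filter
# thresholds, etc.) to this inference call. Identifier and version are placeholders.
response = runtime.invoke_model(
    modelId="anthropic.claude-v2:1",
    body=body,
    guardrailIdentifier="example-guardrail-id",
    guardrailVersion="1",
)

print(json.loads(response["body"].read())["completion"])
```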

Having earned the trust of a large user base, AWS is pushing generative AI into real-world use

Beyond dramatically simplifying how generative AI models are selected and used, AWS's widely praised Amazon SageMaker service is being used by customers such as Hugging Face, Perplexity, Salesforce, Stability AI, and Vanguard to continuously train and improve their large language models. Compared with running their own compute infrastructure, AWS's enormous hardware resources and flexible business model make the evolution of these large models faster and simpler.
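To make the "train on managed infrastructure" point concrete, here is a minimal sketch of launching a fine-tuning job with the SageMaker Python SDK's Hugging Face estimator. The training script, hyperparameters, IAM role ARN, S3 path, and instance type are all placeholders; none of them come from the keynote.

```python
from sagemaker.huggingface import HuggingFace

# Illustrative SageMaker training job for fine-tuning an open model on managed
# GPU instances; script, role, instance type, and hyperparameters are placeholders.
estimator = HuggingFace(
    entry_point="train.py",              # your fine-tuning script
    source_dir="scripts",
    instance_type="ml.g5.12xlarge",
    instance_count=1,
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    transformers_version="4.28",
    pytorch_version="2.0",
    py_version="py310",
    hyperparameters={"model_id": "meta-llama/Llama-2-7b-hf", "epochs": 1},
)

# Launch the training job against data already staged in Amazon S3.
estimator.fit({"train": "s3://my-bucket/train/"})
```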


Furthermore, a long list of companies, including Alida, Automation Anywhere, Blueshift, BMW Group, Clariant, Coinbase, Cox Automotive, dentsu, Druva, Genesys, Gilead, GoDaddy, Hellmann Worldwide Logistics, KONE, LexisNexis Legal & Professional, Lonely Planet, and NatWest, have chosen to keep their data on AWS and use it to privately "customize" their own generative AI services, without worrying that the data will be leaked or used by competitors. That is because no input to or output from Amazon Bedrock is used to train the underlying foundation models; this is not only AWS's own commitment, but also a technical constraint it imposes on third-party model providers.


In fact, if you list the AWS partners that appeared in today's keynote, you will find they cover almost every link in today's generative AI industry chain: foundation model builders accelerate training and iteration on AWS; model providers host their services on AWS to reach more users; and users of large models also favor AWS offerings, because the platform greatly lowers the bar for applying AI to improve service quality and operational efficiency, while offering excellent cost-performance and very high reliability.


A few months ago, many people were still wondering how "generative artificial intelligence" could truly be applied to real enterprises and users and deliver value. After today's AWS re:Invent 2023 keynote, the answer is obvious.

