The "big brother of cloud computing" makes a big move late at night: build an app in minutes with nothing but a prompt!
How long does it take to develop an app these days?
Amazon Cloud Technology, the "big brother of cloud computing", has just set a new benchmark late at night:
three steps and a few minutes, using nothing but natural language and a few mouse clicks.
Enough talk; let's see it in action.
Step 1: Describe your idea
First, describe the app you want to build directly in natural language, for example:
Create an application for my team to submit project approval requests through a form. The form should accept the relevant details and let users upload supporting documents.
Within a few seconds, the AI analyzes your requirements and summarizes the app's use cases, workflows, key features and other details.
Once everything looks right, click the "Generate App" button in the lower right corner.
Step 2: Edit the app
In the visual editor, you can work on the user interface, data objects, automations and more.
Whether you are adding new elements or new pages to the app, a single drag and drop is all it takes: just pull in a component from the panel on the right.
All changes take effect in the app in real time and can be previewed at any time.
Step 3: Put it to work
Once the app is fully configured, deploy it to a test environment for testing.
After verification, it can go live and be put to use.
Future updates to content or functionality follow the same steps as above: what you see is what you get.
This is App Studio, a new offering officially announced by Amazon Cloud Technology at its late-night New York Summit.
Matt Wood, Vice President of AI Products at Amazon Cloud Technology, said on stage:
App Studio is the fastest and easiest way to build applications.
△Matt Wood, Vice President of AI Products at Amazon Cloud Technology
Looking across the entire event, though, App Studio is only one corner of what Amazon Cloud Technology announced.
The most striking impression is that everything revolved around generative AI:
the application layer, the model layer and the compute layer all saw new moves.
Programming through chat
At the generative AI application layer, besides the App Studio just mentioned, the most representative product is Amazon Q, which Amazon Cloud Technology built for enterprises and developers.
At today's event, the developer edition, Amazon Q Developer, showed off just how capable its AI side has become.
For example, given a piece of code, you now only need to click "Explain" in the IDE and it will quickly interpret the code for you.
Select a specific code fragment, click "Fix", and it will suggest corrections as well.
If anything in the code is unclear, you can also ask questions in natural language and Amazon Q Developer will answer them.
In addition, Amazon Q Developer also gained a new customization capability today.
Simply put, developers can now get code recommendations that draw on their own internal libraries, APIs, packages, classes and methods.
For example, if a programmer at a financial company wants to write a function that calculates a customer's total portfolio value, they only need to describe the intent in a comment or type the function name (such as computePortfolioValue(customerId: String)).
Amazon Q Developer will then suggest an implementation learned from the private codebase, one that fits the company's own code conventions far better.
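As a rough illustration of that workflow (a Python analogue of the article's example; the internal helper, data types and suggested body below are entirely hypothetical, not actual Q Developer output):

```python
from dataclasses import dataclass
from typing import List

# Stand-ins for internal company code that Q Developer would have learned
# from; these names are purely illustrative.
@dataclass
class Holding:
    quantity: float
    current_price: float

def get_holdings(customer_id: str) -> List[Holding]:
    # In the real scenario this would call an internal service.
    return [Holding(quantity=10, current_price=152.3),
            Holding(quantity=5, current_price=87.1)]

# The developer writes only the comment and the signature;
# the body is the kind of suggestion the assistant might complete.
# Calculate the total portfolio value for a customer
def compute_portfolio_value(customer_id: str) -> float:
    holdings = get_holdings(customer_id)
    return sum(h.quantity * h.current_price for h in holdings)

print(compute_portfolio_value("cus-123"))  # 1958.5
```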
This makes generative AI closer to the demands of programmers:
Generate more personalized code
Better understand private documents
Can master internal software packages
So do programmers actually buy into this batch of new features?
Matt Wood shared real-world numbers from the UK telecom company BT after it adopted Amazon Q Developer:
The code acceptance rate reached 37%.
Successfully converted and deployed more than 200,000 lines of code to the production environment.
Clearly, Amazon Cloud Technology's generative AI has lowered the bar for programming.
In addition, Amazon Q Apps also became officially available today, letting you create enterprise-grade apps with nothing but natural language.
Amazon Bedrock: Smarter and safer
At the model layer, Amazon Cloud Technology delivers its large-model capabilities to customers mainly through Amazon Bedrock.
Put simply, Amazon Bedrock is a platform that brings together a wide range of advanced large AI models.
Through a single API, it offers the capabilities of more than 30 models, including Claude, Mistral, Llama, Stable Diffusion and the in-house Titan series.
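To give a sense of what that single API looks like in practice, here is a minimal sketch using the Python SDK's Converse call (the region, model ID and parameters are assumptions to check against your own account and the current documentation):

```python
import boto3

# One runtime API fronts many models; switching models is mostly
# a matter of changing the modelId string.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize what Amazon Bedrock is in one sentence."}],
    }],
    inferenceConfig={"maxTokens": 200},
)

print(response["output"]["message"]["content"][0]["text"])
```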
One popular way to make large models smarter right now is Retrieval-Augmented Generation (RAG), which Amazon Bedrock already supported.
And just today, Amazon Cloud Technology pushed further updates on the data side:
besides the original Amazon S3, knowledge base data sources can now also pull data from web crawlers and third-party applications.
Take Confluence as a data source, for example: the setup takes just four simple steps,
including giving the data source a name and description, choosing a hosting method, entering the Confluence URL, and so on.
After the Confluence data source is configured, you complete the knowledge base setup by selecting an embedding model and configuring the chosen vector store.
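Once the knowledge base is in place, querying it over the new data source is a single call. Below is a minimal sketch with the Python SDK, assuming Bedrock's RetrieveAndGenerate API (the knowledge base ID and model ARN are placeholders):

```python
import boto3

# Retrieval and generation happen in one call against the knowledge base.
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = client.retrieve_and_generate(
    input={"text": "What is our project approval process?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB_ID_PLACEHOLDER",
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/"
                        "anthropic.claude-3-haiku-20240307-v1:0",
        },
    },
)

print(response["output"]["text"])    # generated answer
for citation in response.get("citations", []):
    print(citation)                  # retrieved source passages
```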
But as data sources multiply, security naturally becomes a concern.
To address this, Amazon Cloud Technology has added a stronger "lock" to Amazon Bedrock: Guardrails.
Specifically, on top of the original four safeguards, namely topic filters, content filters, sensitive-information filters and word filters, two new "security bolts" have been added:
Contextual grounding check
ApplyGuardrail API
The contextual grounding check detects hallucinations in model responses against the reference sources and the user's query. It produces two scores, Grounding and Relevance, and the threshold for each can be adjusted during setup.
The ApplyGuardrail API evaluates input prompts and model responses for any foundation model, enabling centralized governance of all generative AI applications.
It brings standardized, consistent safeguards to every generative AI application, whether it is built on a custom model or a third-party foundation model.
With these in place, Guardrails blocks up to 85% of harmful content and filters out more than 75% of hallucinated responses in RAG scenarios.
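For a concrete picture of the standalone evaluation path, here is a hedged sketch of calling ApplyGuardrail through the Python SDK (the guardrail ID, version and sample content are placeholders; verify the exact request shape against the current documentation):

```python
import boto3

# Evaluate content against a pre-configured guardrail without invoking a
# model; the same guardrail can sit in front of any foundation model.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

result = client.apply_guardrail(
    guardrailIdentifier="GUARDRAIL_ID_PLACEHOLDER",
    guardrailVersion="1",
    source="OUTPUT",  # check a model response; use "INPUT" for user prompts
    content=[{"text": {"text": "Model response to be checked goes here."}}],
)

# "GUARDRAIL_INTERVENED" means the guardrail blocked or rewrote the content.
print(result["action"])
```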
On the Agents side, Amazon Cloud Technology has also added a new memory retention capability to Amazon Bedrock.
Simply put, agents can now retain a summary of their conversations with each user, delivering a smoother, more adaptive experience.
This can play an important role in complex multi-step tasks such as user interaction and enterprise automation solutions.
Notably, each user’s conversation history and context is securely stored under a unique memory identifier and retained for 30 days by default.
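A rough sketch of what an agent call with memory retention might look like in the Python SDK follows; the memoryId parameter reflects my reading of the new feature and, along with the agent identifiers, should be treated as an assumption to verify against the current API reference:

```python
import boto3

# Invoke a Bedrock agent; reusing the same memoryId across sessions lets the
# agent draw on a stored summary of this user's earlier conversations
# (assumed parameter, see the note above).
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

stream = client.invoke_agent(
    agentId="AGENT_ID_PLACEHOLDER",
    agentAliasId="ALIAS_ID_PLACEHOLDER",
    sessionId="session-001",
    memoryId="user-42",  # unique memory identifier per end user
    inputText="Continue the onboarding request we discussed yesterday.",
)

# The response is streamed; collect and print the text chunks.
for event in stream["completion"]:
    if "chunk" in event:
        print(event["chunk"]["bytes"].decode("utf-8"), end="")
```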
Beyond the longer memory, the code interpretation capabilities of agents on Amazon Bedrock have also been enhanced.
The goal is to better handle complex use cases such as data analysis, data visualization, text processing, equation solving and optimization problems.
As for the models on Amazon Bedrock themselves, Amazon Cloud Technology has added fine-tuning support for Claude 3 Haiku.
Reportedly, Amazon Bedrock is the first place where Anthropic has made this capability available.
After fine-tuning, Claude 3 Haiku performs like this:
The classification accuracy increased from 81.5% to 99.6%, while reducing the tokens per query by 89%.
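For orientation, fine-tuning on Bedrock is typically launched as a model customization job. A hedged sketch with the Python SDK (the role ARN, S3 paths, hyperparameters and the exact fine-tunable Haiku model identifier are all placeholders or assumptions):

```python
import boto3

# Kick off a fine-tuning (model customization) job on Bedrock.
client = boto3.client("bedrock", region_name="us-west-2")

job = client.create_model_customization_job(
    jobName="haiku-finetune-demo",
    customModelName="claude-3-haiku-classifier",
    roleArn="arn:aws:iam::123456789012:role/BedrockFineTuneRole",  # placeholder
    baseModelIdentifier="anthropic.claude-3-haiku-20240307-v1:0",  # verify the fine-tunable ID
    customizationType="FINE_TUNING",
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},    # placeholder bucket
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
    hyperParameters={"epochCount": "2"},                           # illustrative value
)

print(job["jobArn"])
```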
Finally, on the compute and energy front, Amazon Cloud Technology also brought good news at today's event:
it has reached its 100% renewable energy goal seven years ahead of schedule!
In short, across the entire launch event, everything Amazon Cloud Technology announced around generative AI came down to "faster and more economical".
So how should we rate all this?
First of all, it is fast enough.
Take Amazon Bedrock as an example: only a little over a year has passed since its release in April 2023, yet it has shipped updates roughly every one to two months:
June 2023: Invested US$100 million to establish a generative AI innovation center.
July 2023: Announced support for Meta's newly released Llama 2 base model and shipped seven new generative AI features.
September 2023: Announced a strategic partnership with Anthropic.
November 2023: Introduced the three-layer generative AI architecture and released the enterprise generative AI assistant Amazon Q.
February 2024: Mistral AI models arrived on Amazon Bedrock.
March 2024: Claude 3 arrived on Amazon Bedrock.
April 2024: Major Amazon Bedrock upgrades: custom model import and the model evaluation feature became officially available.
May 2024: Amazon Bedrock Studio released in preview.
Especially when the major models ship big updates, such as Claude 3, Amazon Bedrock brings them on board almost immediately.
In turn, these mainstream model providers have chosen Amazon Bedrock as the first place to roll out new capabilities, as with Claude 3 Haiku's new fine-tuning this time.
Secondly, the technology is strong enough.
Amazon Cloud Technology differs slightly from many of the mainstream large-model players of the AIGC era: it is a full-stack player.
Because of this, every iteration has to account for the whole pipeline across the compute layer, the model layer and the application layer.
Judging from this release, Amazon Cloud Technology has indeed optimized multiple features and capabilities at every layer, steadily expanding its generative AI map piece by piece, like a jigsaw puzzle.
According to figures shared by Matt Wood, over the past 18 months Amazon Cloud Technology has shipped more than twice as many generative AI features as the second and third place providers combined.
That volume alone is a testament to its technical strength.
Finally, there is plenty of market recognition.
This can also be seen from a set of data released by Matt Wood:
96% of AI/ML unicorns are using Amazon Cloud Technology
Of the companies on the 2024 Forbes AI 50 list, 90% are using Amazon Cloud Technology
With the technology, the strength, the speed and the market behind it, it is worth looking forward to what surprises Amazon Cloud Technology will bring next in the AIGC era.
—End—