Developers can now swiftly construct and improve bots with the latest Amazon Lex AI advancements

Developers can now use simple natural language to build or enhance chatbots with Amazon Lex, a tool for crafting conversational interfaces. Using new generative AI features, programmers can describe tasks they want the service to perform, like “organize a hotel booking including guest details and payment method,” as highlighted in a recent blog post by the company.

“Without generative AI, the bot developer would have to manually design each element of the bot — intents or possible paths, utterances that would trigger a path, slots for information to capture, and prompts or bot responses, among other elements,” Sandeep Srinivasan, a senior product manager for Amazon Lex at AWS, said in an interview. “With this approach, you get started easily.”
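The elements Srinivasan lists map onto the building blocks a developer would otherwise assemble by hand. As an illustrative sketch, the dicts below approximate the shape of the Lex V2 model-building API (boto3's `lexv2-models` `create_intent` and `create_slot` requests); the hotel-booking intent, utterances, and slot names are hypothetical examples, not Amazon's:

```python
# Illustrative sketch of the elements a developer would otherwise define by
# hand for a hotel-booking bot. Field names approximate the Lex V2
# model-building API (boto3 "lexv2-models" client); the specific intent,
# utterances, and slots are hypothetical.

booking_intent = {
    "intentName": "BookHotel",                      # an intent: one possible path
    "sampleUtterances": [                           # utterances that trigger the path
        {"utterance": "I want to book a hotel room"},
        {"utterance": "Reserve a room for two nights"},
    ],
}

guest_name_slot = {                                 # a slot: information to capture
    "slotName": "GuestName",
    "slotTypeId": "AMAZON.FirstName",
    "valueElicitationSetting": {
        "slotConstraint": "Required",
        "promptSpecification": {                    # the prompt: the bot's response
            "messageGroups": [{
                "message": {"plainTextMessage": {"value": "Who is the reservation for?"}}
            }],
            "maxRetries": 2,
        },
    },
}
```

With a real client, these dicts would be passed to `create_intent` and `create_slot` along with the bot, version, and locale identifiers; the generative AI feature described above produces this scaffolding from the plain-language task description instead.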

Lex can also help with tricky human-bot interactions: if Amazon Lex can’t figure out part of a conversation, it asks a foundation large language model (LLM) chosen by the bot maker for help.
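The hand-off can be pictured as a simple routing rule: when intent recognition is confident, the bot handles the turn itself; otherwise the utterance is forwarded to the builder-chosen foundation model. The threshold value and the `ask_llm` stub below are assumptions for illustration, not Lex internals:

```python
# Minimal sketch of the LLM-assisted hand-off: defer to a foundation model
# only when the intent classifier is unsure. The threshold and the LLM call
# are hypothetical stand-ins, not how Lex is actually implemented.

CONFIDENCE_THRESHOLD = 0.6  # assumed cut-off for "Lex can't figure it out"

def ask_llm(utterance: str) -> str:
    """Stand-in for a call to the bot maker's chosen foundation model."""
    return f"[LLM interpretation of: {utterance!r}]"

def route(utterance: str, intent: str, confidence: float) -> str:
    if confidence >= CONFIDENCE_THRESHOLD:
        return intent              # confident: Lex handles the turn itself
    return ask_llm(utterance)      # unsure: defer to the LLM for help
```

For example, `route("book a room", "BookHotel", 0.92)` resolves to the intent directly, while a low-confidence utterance such as `route("umm the one near the lake?", "BookHotel", 0.31)` is passed to the model.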

Another new Amazon Lex feature simplifies creating chatbots by automatically handling frequently asked questions (FAQs). Developers set up the bot’s primary functions, and a built-in AI pulls answers to users’ questions from a provided source, such as a company knowledge base.
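Conceptually, this is a retrieval step layered on top of the bot’s primary functions: questions that match no configured intent are answered from the provided source. The toy sketch below illustrates the retrieval idea only; the knowledge base is made up, and the naive word-overlap scoring stands in for the semantic search a real system would use:

```python
# Toy retrieval sketch: answer a user question from a provided source by
# picking the FAQ entry sharing the most words with the question. Purely
# illustrative; Lex's built-in AI is not naive word overlap.

KNOWLEDGE_BASE = {  # hypothetical company knowledge base
    "What time is check-in?": "Check-in opens at 3 PM.",
    "Do you allow pets?": "Pets under 25 lbs are welcome.",
    "Is parking free?": "Parking is free for all guests.",
}

def answer_faq(question: str) -> str:
    """Return the answer whose FAQ entry best overlaps the question."""
    q_words = set(question.lower().split())
    best = max(KNOWLEDGE_BASE,
               key=lambda k: len(q_words & set(k.lower().split())))
    return KNOWLEDGE_BASE[best]
```

The appeal of the managed feature is that developers supply only the source; matching, ranking, and answer generation are handled by the built-in AI.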

Amazon is also introducing a built-in QnAIntent feature for Lex, which incorporates the question-and-answer process directly into the intent structure. The feature uses an LLM to search an approved knowledge base and return a relevant answer. Available in preview, it relies on foundation models hosted on Amazon Bedrock, a service that offers a choice of FMs from various AI companies. Currently the feature lets you switch between Anthropic models, and “we are working to expand to other LLMs in the future,” Srinivasan said.
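Wiring this up amounts to pointing the built-in intent at a Bedrock-hosted model and a knowledge source. The sketch below is a hedged approximation of what such a configuration looks like; the field names follow the general shape of the Lex V2 QnAIntent configuration, and both ARNs are placeholders, not real resources:

```python
# Illustrative QnAIntent configuration. Field names approximate the Lex V2
# QnAIntent configuration structure; the model and knowledge-base ARNs are
# placeholder values for illustration only.

qna_intent = {
    "intentName": "KnowledgeBaseQnA",
    "parentIntentSignature": "AMAZON.QnAIntent",   # the built-in intent
    "qnAIntentConfiguration": {
        "bedrockModelConfiguration": {
            # an Anthropic model hosted on Amazon Bedrock (placeholder ARN)
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
        },
        "dataSourceConfiguration": {
            # the approved knowledge base the LLM searches (placeholder ARN)
            "bedrockKnowledgeStoreConfiguration": {
                "bedrockKnowledgeBaseArn":
                    "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/EXAMPLE",
            },
        },
    },
}
```

Because the model is named in configuration rather than code, swapping between the supported Anthropic models, or future LLMs as Srinivasan describes, would be a one-field change.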

Amazon Lex can be thought of as a system of systems, and many of those subsystems employ generative AI, Kathleen Carley, a professor at the CyLab Security and Privacy Institute at Carnegie Mellon University, said in an interview.

“The key is that putting a large language model into Lex means that if you build or interact with an Amazon Lex bot, it will be able to provide more helpful, more natural human-sounding, and possibly more accurate responses to standard questions,” Carley added. “Unlike the old-style analytic system, these bots are not task focused and so can do things other than follow a few preprogrammed steps.”

Lex is part of Amazon’s broader AI strategy, which includes building its own LLM. The model, codenamed “Olympus,” is customized to Amazon’s needs and reportedly has 2 trillion parameters, roughly twice the size of OpenAI’s GPT-4, which is estimated to have over 1 trillion parameters.

“Amazon’s LLM is likely to be more flexible than GPT-4, better able to handle nuance, and may do a better job with linguistic flow,” Carley added. “But it is too early to really see the practical differences. The differences will depend on both differences in…

2023-12-03 18:41:03
Link from www.infoworld.com
