- In mid-May, AWS highlighted its portfolio of AI tools and solutions during its AWS Summit Online for the Americas region and announced the general availability of Amazon Kendra for enterprises.
- Tools that support AI model development and management and pre-built solutions that can be easily deployed by developers who aren’t AI experts help streamline AI adoption.
AWS understands the challenges enterprises face when building their own machine learning models. The company notes that when scaling AI adoption, enterprises face wide-ranging complexities that can start as early as the data collection stage and continue throughout the model management lifecycle. At the beginning of a project, organizations face challenges related to data identification, storage, and curation as they pull together disparate data sources.

Later, while building and training models, they need to manage numerous other complexities, such as sharing notebooks and pre-trained models. They need to ensure effective collaboration among what can be a growing number of individuals or teams, each with their own specialization. And since machine learning models aren't usually perfect the first time, team members need to communicate during model tuning and optimization, manage multiple versions of models, run experimental models in real time, and compare results.

Even after deployment, machine learning algorithms need to be managed and monitored for concerns such as data drift, with newer versions deployed as additional data is collected or as the factors that impact model results change. Managing these tasks can be challenging, and as AWS rightly points out, tools that help manage the complexities do much to streamline and speed AI deployments.
In mid-May, AWS highlighted its portfolio of AI tools and solutions during its AWS Summit Online for the Americas region and announced the general availability of Amazon Kendra for enterprises.
AWS Summit Online for the U.S. and Canada
AI played a starring role during the AWS Summit; two of the conference tracks were dedicated to the technology, with tools for machine learning being a key topic. Speakers highlighted Amazon SageMaker Studio, which provides full lifecycle management for machine learning models.
Amazon SageMaker Studio is an integrated environment that includes tools for building, training, deploying, and monitoring models. For the build stage, Amazon SageMaker notebooks provide a pre-built environment that promotes collaboration by allowing team members to easily share work with colleagues. Amazon SageMaker Experiments helps team members capture the metadata used when building an experiment and allows them to organize, compare, and evaluate experiments and trials. Amazon SageMaker Autopilot supports regression and classification and automatically runs training trials, speeding up the model development process.

During the training process, Amazon SageMaker tools help optimize infrastructure: SageMaker training jobs allow users to specify infrastructure configurations, and Amazon SageMaker automatic model tuning finds the best values for hyperparameters. A new feature, SageMaker Debugger, analyzes the state of a model while it is being trained and identifies potential concerns that arise during training jobs.

To support the deployment phase, AWS offers Amazon SageMaker Hosting Service and Amazon SageMaker batch transform. For the final phase of the model lifecycle, the company offers Amazon SageMaker Model Monitor, which automatically manages analytics and infrastructure and detects data drift. As new data becomes available, Amazon SageMaker Ground Truth assists with data labeling: humans label data to start, after which a machine takes over, and data with low confidence scores is sent back to the human labelers for further review.
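The automatic model tuning described above is driven by a hyperparameter tuning job definition. As a rough illustration of the idea, the sketch below builds the configuration portion of a request for the boto3 `create_hyper_parameter_tuning_job` call; the job name, metric, and hyperparameter names are hypothetical examples, not taken from the source.

```python
# Sketch of a SageMaker automatic model tuning configuration.
# All specific names (job name, metric, hyperparameters, limits)
# are illustrative assumptions, not AWS recommendations.

def build_tuning_request(job_name: str) -> dict:
    """Build a CreateHyperParameterTuningJob config that searches
    over two hyperparameters using Bayesian optimization."""
    return {
        "HyperParameterTuningJobName": job_name,
        "HyperParameterTuningJobConfig": {
            "Strategy": "Bayesian",
            "HyperParameterTuningJobObjective": {
                "Type": "Minimize",
                "MetricName": "validation:error",
            },
            "ResourceLimits": {
                "MaxNumberOfTrainingJobs": 20,
                "MaxParallelTrainingJobs": 2,
            },
            "ParameterRanges": {
                "ContinuousParameterRanges": [
                    {"Name": "eta", "MinValue": "0.01", "MaxValue": "0.3"},
                ],
                "IntegerParameterRanges": [
                    {"Name": "max_depth", "MinValue": "3", "MaxValue": "10"},
                ],
            },
        },
    }

request = build_tuning_request("example-tuning-job")
# A complete request (with a TrainingJobDefinition added) would be
# submitted via boto3.client("sagemaker").create_hyper_parameter_tuning_job
```

SageMaker then launches up to `MaxNumberOfTrainingJobs` training runs, choosing hyperparameter values within the declared ranges to optimize the objective metric.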
A few days prior to the AWS Americas Summit, Amazon announced the general availability of Amazon Kendra, which provides AI-enabled enterprise search. Amazon Kendra is designed to make it easier for employees to find information, including corporate information such as legal documents and operating manuals.
Amazon Kendra uses natural language processing, both for query interpretation and to analyze documents. The service includes reading comprehension of unstructured data, a 'frequently asked question' algorithm, and document ranking. Amazon Kendra was initially launched with expertise in seven domains; seven more were added at the general availability launch, so Kendra is now optimized to understand complex language from the IT, financial services, insurance, pharmaceuticals, manufacturing, energy, legal, media and entertainment, travel and hospitality, healthcare, HR, news, telecom, and automotive sectors. (More domains will be added throughout 2020.) Amazon Kendra also supports relevance tuning, which allows users to tell Kendra how to prioritize content, fine-tuning Kendra's accuracy. Later this year, Kendra will be augmented with incremental learning, which means the model will improve by analyzing user actions and feedback. Built-in connectors allow Kendra to connect easily with data sources, providing an out-of-the-box experience for customers. Amazon notes that customers can quickly be up and running with Kendra: the most popular connectors are built in and the models are pre-trained, so additional coding isn't required. Customers are billed by usage, starting at $7.00 per hour for 500,000 documents and 40,000 queries per day. A lower-cost, more limited developer version was launched last week to support proof-of-concept deployments and exploration.
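Those entry-tier figures translate into a simple back-of-the-envelope cost estimate. The sketch below annualizes nothing official; it assumes an always-on index and a 30-day month, which are simplifying assumptions for illustration only.

```python
# Back-of-the-envelope cost estimate for the Kendra entry tier
# ($7.00/hour, covering up to 500,000 documents and 40,000 queries/day).
# Assumes the index runs continuously; actual AWS billing may differ.

HOURLY_RATE = 7.00
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30          # simplifying assumption
QUERIES_PER_DAY = 40_000

monthly_cost = HOURLY_RATE * HOURS_PER_DAY * DAYS_PER_MONTH
cost_per_query = monthly_cost / (QUERIES_PER_DAY * DAYS_PER_MONTH)

print(f"Monthly cost: ${monthly_cost:,.2f}")              # $5,040.00
print(f"Cost per query at full usage: ${cost_per_query:.4f}")
```

At full utilization this works out to roughly $5,040 per month, or well under a cent per query, which helps explain why AWS also offers the lower-cost developer version for proof-of-concept work.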
AWS Market Positioning
Many organizations are eager to reap the benefits of artificial intelligence, but building machine learning models from the ground up can be time-consuming, and obtaining the necessary talent can be expensive. Tools that support model development and management, as well as pre-built solutions that can be easily deployed by developers who aren't AI experts, help streamline AI adoption.
With its full portfolio of AI and data management tools, AWS is without doubt a very strong competitor in the market for cloud-based AI solutions. The company's impressive set of tools and solutions, combined with a leading go-to-market strategy, has created enviable momentum.
As it moves forward, AWS should consider offering more industry-specific AI solutions, which can help smaller organizations better understand the applicability of AI to their business as well as streamline adoption. Furthermore, it should ensure that it is considered a thought leader in 'responsible AI,' a topic that is becoming increasingly relevant as organizations look to expand their use of AI technology beyond the contact center and chatbots, and as automation and predictive analytics gain traction in a post-COVID-19 world. Organizations will need help not only with the logistics of managing AI technology, but also with tackling the ethical issues raised by broader use and with fostering a more data-driven corporate culture.