This content is part of the Conference Coverage: A complete guide to AWS re:Invent 2018 news and analysis

For AI on AWS, it all starts with experimentation

For enterprises, the term artificial intelligence comes with a lot of baggage. Start from scratch with these expert tips to get your Amazon AI project off the ground.

AI and machine learning have been hyped to no end, but don't get distracted by the noise, which can come from zealots and skeptics alike. While these technologies aren't magic elixirs, they can prove useful when applied correctly. Enterprises have already found real-world applications for them -- and yours can, too.

You don't need to be a data scientist to incorporate AI products, though it certainly doesn't hurt. AWS' tools cover nearly the entire spectrum of AI, which means experts can build, train and tweak their own models on the platform, while beginners can get their feet wet with AI on AWS and incorporate pretrained models into their existing applications.

Regardless of your experience, there are a few things to keep in mind before you dabble with AI on AWS. Here are five expert tips to get you started.

Practice your 'skills' set

Amazon Lex is based on the same deep learning technology that underpins Alexa, Amazon's popular virtual assistant. Users can interact with applications through Lex's natural language processing, which opens the door to all sorts of use cases.

Developers build chatbots from intents, sample utterances and slots -- the basic elements that define what the bot can do and how it responds to commands. These chatbots integrate with AWS Lambda -- though only as a fulfillment mechanism -- and should generally be treated as code: developers can define them in a JSON format and update them through the Lex model-building API.
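Because a Lex bot is plain data, it can be kept in version control and pushed through the model-building API like any other artifact. The sketch below shows what such an intent definition might look like; the intent name, slot and Lambda ARN are hypothetical placeholders, not real resources.

```python
import json

# Hypothetical intent for an order-tracking chatbot. Every name and ARN
# here is a placeholder for illustration only.
intent = {
    "name": "CheckOrderStatus",
    "sampleUtterances": [
        "Where is my order",
        "Check the status of order {OrderNumber}",
    ],
    "slots": [
        {
            "name": "OrderNumber",
            "slotType": "AMAZON.NUMBER",
            "slotConstraint": "Required",
        }
    ],
    # Lambda serves only as the fulfillment mechanism, as noted above.
    "fulfillmentActivity": {
        "type": "CodeHook",
        "codeHook": {
            "uri": "arn:aws:lambda:us-east-1:123456789012:function:order-status",
            "messageVersion": "1.0",
        },
    },
}

# Treating the bot as code: serialize the definition for version control.
definition = json.dumps(intent, indent=2)

# Pushing it to Lex would use the model-building API, roughly:
#   import boto3
#   boto3.client("lex-models").put_intent(**intent)
```

Keeping the definition as data like this makes it easy to diff, review and redeploy chatbot changes the same way as application code.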

Lex includes prebuilt integrations with popular apps -- such as Facebook and Slack -- which saves development time but has some formatting limitations. Developers should watch for system latency and other back-end issues.

Lex can be a good first step for those who want to use AI on AWS, but keep the focus on specific use cases that help your business. The market is already flooded with skills -- voice-based applications for smart homes -- and it isn't yet mature enough for developers to envision themselves as future chatbot moguls.

Rekognize where to begin

Though often intrigued by AI, enterprises struggle to see the practical applications of these technologies. That gap between interest and execution is understandable, but enterprises needn't be daunted by it.

Sometimes, it's best to experiment. Use Lex with a Raspberry Pi single-board computer to build an IoT application. Or test Amazon Polly to generate voice prompts from text for real-time home monitoring alerts. You could even use Amazon Rekognition to identify images of celebrities -- or build a face collection of other, less famous people you happen to know.
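The Rekognition experiment above needs only a few lines. This is a minimal sketch assuming AWS credentials are configured; the injectable `client` parameter is a convenience added here so the function can be exercised locally without calling the service.

```python
# Sketch of the Rekognition celebrity experiment described above.
# The real call path requires the AWS SDK (boto3) and valid credentials.

def recognize_faces(image_bytes, client=None):
    """Return the celebrity names Rekognition finds in raw image bytes."""
    if client is None:
        import boto3  # imported lazily; only needed for the real service call
        client = boto3.client("rekognition")
    response = client.recognize_celebrities(Image={"Bytes": image_bytes})
    return [face["Name"] for face in response.get("CelebrityFaces", [])]

# Usage on a machine with credentials (file name is a placeholder):
#   with open("photo.jpg", "rb") as f:
#       print(recognize_faces(f.read()))
```

Swapping in Polly is similarly small: the `synthesize_speech` call on a `polly` client turns text into an audio stream for those home monitoring alerts.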

The major cloud providers have invested heavily in AI, both internally and with their customer-facing services. They've placed big bets on these technologies being integral to the future of IT, so it's in AWS' interest to make its users as comfortable with the services as possible. And these are just a few of the examples enterprises can find as they familiarize themselves with AWS' AI toolkit. Sample experiments range from simple to complex, so businesses have ample opportunity to test the tools that fit their needs and skill sets.

Sage advice on a machine learning tool

Once IT teams acclimate to the suite of machine learning application services, they can go deeper down the rabbit hole with AWS' platform-level services. AWS gears its Amazon SageMaker product toward data scientists. AWS wants its machine learning platform to be the springboard that expands the number of IT professionals who can build AI-infused applications, but, as yet, novice analysts might find themselves unable to navigate its intricacies -- and, as a result, unlikely to get productive results from SageMaker.

But, for those familiar with machine learning, SageMaker's appeal is its simplicity. The service covers the full development lifecycle. Data scientists can use it to build, optimize, validate and deploy machine learning models. It has 11 preconfigured algorithms to address a range of problems, or data scientists can use custom TensorFlow or MXNet code.
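A training run with one of those preconfigured algorithms is short to express. The sketch below uses XGBoost via the sagemaker Python SDK; the role ARN, bucket paths and instance type are placeholders, and exact SDK signatures vary by version, so the service calls are shown as comments rather than presented as the definitive API.

```python
# Hedged sketch of a SageMaker built-in-algorithm training job.
# All ARNs, buckets and values below are illustrative placeholders.
job_config = {
    "role": "arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    "instance_count": 1,
    "instance_type": "ml.m5.xlarge",
    "output_path": "s3://my-bucket/models/",                 # placeholder
}

# With the sagemaker SDK installed and credentials in place, the run
# would look roughly like this (signatures vary by SDK version):
#   import sagemaker
#   from sagemaker import image_uris
#   session = sagemaker.Session()
#   container = image_uris.retrieve("xgboost", session.boto_region_name)
#   estimator = sagemaker.estimator.Estimator(container, **job_config)
#   estimator.set_hyperparameters(objective="reg:squarederror", num_round=100)
#   estimator.fit({"train": "s3://my-bucket/data/train/"})
```

The point of the sketch is the division of labor: the data scientist picks the algorithm, hyperparameters and data, while SageMaker provisions and tears down the training infrastructure.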

An enterprise should have a data scientist or data analyst on staff who is familiar with SQL, Python, R, Jupyter or TensorFlow. With the right personnel in place, SageMaker can ease the workload on those employees because it offloads the infrastructure management responsibility to AWS. It could even open new opportunities for them, as they can experiment with the platform and avoid the costly, time-consuming process needed to procure on-premises resources.

A lens to picture even more AI experimentation

AWS' other platform-level AI service is more than just software; it's also a camera. AWS DeepLens offers a mix of pretrained models for deep learning and code that links to an AWS-supplied video camera. The video camera is fully programmable, and AWS provides an array of tutorials and training materials.

This service is still new, so it's a work in progress. But it's also a tool that can bring together much of the ecosystem for AI on AWS. Developers and data scientists can use it to learn more about the aforementioned AI services, as well as other native AWS tools, such as Greengrass, Lambda and DynamoDB.
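On the device, those pieces meet in a Lambda function that grabs camera frames, runs them through the pretrained model and publishes the result. The device-only `awscam` calls below are shown as comments since they run solely on DeepLens hardware, and `parse_top_label` is a hypothetical helper written here for illustration.

```python
# Rough sketch of how a DeepLens inference Lambda ties the pieces together.
# parse_top_label is a hypothetical helper, not part of any AWS SDK.

def parse_top_label(parsed_results, labels):
    """Pick the highest-probability class from parsed inference output."""
    best = max(parsed_results, key=lambda r: r["prob"])
    return labels[best["label"]], best["prob"]

# On the DeepLens device, the loop would look roughly like:
#   import awscam
#   model = awscam.Model(model_path, {"GPU": 1})
#   while True:
#       _, frame = awscam.getLastFrame()
#       raw = model.doInference(frame)
#       results = model.parseResult("classification", raw)
#       label, prob = parse_top_label(results["classification"], LABELS)
#       # publish the label via Greengrass/IoT, or log it to DynamoDB
```

The helper is where the other services plug in: the label it returns is what a Greengrass message or DynamoDB write would carry downstream.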

This was last published in September 2018

