AWS was built around the concept of renting space inside its data centers, but market demand has pushed the public cloud behemoth outside its own four walls.
AWS Greengrass is newly available software that caters to client needs that can't necessarily be met inside Amazon's vast collection of public cloud data centers. It brings compute, messaging and data caching to local devices, tailored to the emerging trends around edge computing and the internet of things (IoT).
About 35% of companies that build production IoT applications want edge capabilities, according to a study from Forrester Research. That reflects the evolution of more complex uses in IoT, and AWS' recent moves reflect its serious intent to be a player in this market.
"When you look at what Amazon is doing and why they're doing it, you always have to look at what feedback they're getting from their customers," said Jeffrey Hammond, an analyst at Forrester in Cambridge, Mass. "They've started to go to the edge because customers are pushing them to the edge."
Two fronts have emerged in IoT, Hammond said. The first is connected devices, which tend to be simpler and rely on either a sensor or a mobile device. The second one -- the one AWS has targeted with Greengrass -- is the industrial internet, where a company that has invested $10 million in machinery won't flinch at spending a few extra dollars on higher-power CPU edge devices.
AWS isn't alone in this market, however. Cisco, GE and PTC are all angling to use their particular skill sets to gain an advantage. And while Amazon can lean on its cloud expertise, competitors like Microsoft and Google have made their own plays in this market.
"What makes Amazon different is approaching it from the perspective of the mainstream developer, using AWS Lambda at the heart of Greengrass," Hammond said. "It's an extension of what we see web and mobile developers doing, as opposed to traditional tech and embedded [development] to solve edge."
For years, AWS executives portrayed the rise of public cloud as a one-way street -- all workloads would eventually move there, never to return to customers' corporate offices. The cloud pioneer has since softened its stance to accommodate hybrid deployments. AWS Greengrass takes that a step further, to make AWS compute resources available outside Amazon's cloud data centers. It's the closest Amazon has come to true hybrid capabilities based on the same AWS software in the cloud and on premises.
Greengrass was announced on the last day of November and became generally available in June in four of the 14 AWS regions.* It relies on Lambda functions to execute jobs on customers' own machines, and enterprises can use it to track sensor and machine data across internal networks. It integrates with the AWS IoT platform and AWS IoT Button, and it's built into Snowball Edge, which adds compute capabilities to the popular data transfer devices.
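To make the model concrete: a Greengrass-deployed function is written like an ordinary Lambda handler but runs on the customer's own hardware. The sketch below is illustrative only -- on a real Greengrass core, publishing goes through the Greengrass SDK's local "iot-data" client, which the stub class here stands in for, and the topic name and payload fields are hypothetical.

```python
import json

# Stand-in for the Greengrass SDK's iot-data client; on an actual
# Greengrass core, publish() calls are routed over the device's
# local MQTT broker rather than straight to the cloud.
class StubIoTDataClient:
    def __init__(self):
        self.published = []

    def publish(self, topic, payload):
        self.published.append((topic, payload))

client = StubIoTDataClient()

def handler(event, context):
    # Runs on the edge device itself, not in an AWS data center,
    # so it keeps working even when cloud connectivity drops.
    reading = {"sensor": event["sensor"], "value": event["value"]}
    client.publish(topic="factory/telemetry", payload=json.dumps(reading))
    return {"status": "published"}
```

The point of the design is that the same handler signature and deployment tooling developers already use for cloud-side Lambda carry over unchanged to the edge.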
Bringing snowballs out to sea
AWS Greengrass has pushed Amazon into new terrain, and its integration in Snowball Edge is one of the first examples of its potential to influence workloads beyond Amazon's own data centers.
The Hatfield Marine Science Center at Oregon State University collects image data on plankton as part of its studies into coastal ecosystems in the Pacific Ocean. Scientists would return with as much as 60 terabytes of data on hard drives each week, creating a huge burden with the time needed to transfer and process that data, said Chris Sullivan, assistant director of biocomputing for the Center for Genome Research and Biocomputing at OSU.
"The [scientists'] goal is to get the end result as quickly as possible, and if I'm in the middle of transferring hard drives," Sullivan said, "if I'm doing that for a month after they come off the ship, how am I ever going to get them that data faster to get that end result?"
The university approached AWS about this problem and became one of the earliest users of Snowball Edge, the latest version of Amazon's data transfer device that also includes Greengrass.
Sullivan would like to see a 40 Gbps port rather than the 10 Gbps one currently found on the Snowball, but beyond that, he said, the durable devices have dramatically reduced the time and effort needed to move the data. Lambda functions are even set up to kick off jobs as soon as the data lands inside AWS.
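The pattern Sullivan describes -- jobs that fire as soon as data arrives -- is typically wired up with an S3 event notification invoking a Lambda function. The sketch below only parses the standard S3 event record shape and notes each uploaded object; the actual processing step (submitting an analysis job, queuing a message) is left as a comment because the university's pipeline isn't described in detail.

```python
def handler(event, context):
    # S3 invokes this function as soon as an object lands in the
    # bucket; each record in the event identifies one new object.
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Real code would kick off the analysis job here (e.g. submit
        # a batch job or enqueue a message); we just note the object.
        processed.append(f"s3://{bucket}/{key}")
    return processed
```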
There's even room to use these devices in ways AWS may have never envisioned. Sullivan, for example, has been in contact with other universities about ordering a Snowball and using it to ship terabytes of data between educational institutions before it's sent back to Amazon.
"Right now, the FedEx network is our limit," he said. "The hard drive is no longer the limit."
Not everything can live in the public cloud
Stanley Black & Decker's Digital Accelerator serves all the company's various business units. One group has worked to build IoT capabilities into individual tools and devices, while another wants to use IoT devices to create digital workspaces in nontraditional environments, such as construction sites.
In the latter scenario, edge devices are essential to overcome limitations around connectivity to the cloud.
"That response time has hundreds of milliseconds of delay based on the laws of physics, but it's a destabilizing factor in a control system," said Hamid Montazeri, director of software engineering at Stanley Black & Decker Digital Accelerator.
AWS' move to address edge computing is a prime example of why the cloud doesn't work for every scenario. It shouldn't be a surprise, though, because every infrastructure provider must plan for the next big frontier of the tech market, said Laz Vekiarides, CTO and co-founder of ClearSky Data, a Boston-based storage startup and AWS partner.
Still, AWS Greengrass doesn't truly solve the connection issue. As long as Amazon and other cloud vendors predicate their business on huge data centers in the middle of nowhere, this networking issue will persist, he said.
Companies that go to the cloud need to understand what can go in the cloud and what can't, Vekiarides said. They may put point-of-sale and transactional processing in the cloud, for example, but many IoT applications that require instantaneous response and user interactivity can't go there.
Edge devices have become more sophisticated amid rising demand for compute power and challenges with the speed of light and other latency factors, said Steven Martin, vice president and chief digital officer at GE Energy Connections, which works with multiple public cloud providers.
Amazon, so far, has addressed the easiest part of the problem around more compute power at the edge, said Martin, who helped build Microsoft Azure before joining GE last November. The bigger challenges are in areas such as how much business logic is put on a system, how to manage the interactions between the two, and how much autonomy to give an edge device.
"Any developer worth their salt can bridge between two end points," Martin said. "The real value is managing sophisticated business logic and that gets into application design or orchestration between models at a central control versus the edge."
It's also notable, architecturally, that AWS based this service on Lambda instead of bare metal, Vekiarides said. Rather than just offer a box with the same price tag as any other physical infrastructure, Amazon needs to differentiate.
"If they offer a whole core and have Intel core outside of that managed environment in Ashburn or Portland, there's not much to distinguish that versus something I buy from Dell or Quanta or any of the big OEMs," Vekiarides said. "You'll always see this stuff wrapped around compute infrastructure."
But for Stanley Black & Decker, AWS Greengrass solves multiple problems. It's important when a client doesn't want certain data to leave the factory, and it can even act as a filter of sorts.
These sensors can collect an avalanche of data, but not all of that has to go to the cloud, Montazeri said. For example, if a sensor checks every 60 seconds for a change, the administrator could direct the edge device to only send data that indicates those changes, rather than all 60 data points collected over the course of an hour. That ability to effectively prune data reduces storage demands and lowers costs.
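The pruning Montazeri describes amounts to a simple change filter running on the edge device. A minimal sketch, with an illustrative threshold (the real criteria for what counts as a change would be application-specific):

```python
def changed_readings(readings, threshold=0.5):
    """Return only the readings that differ from the last forwarded
    value by more than `threshold`, so unchanged samples never leave
    the edge device for the cloud."""
    forwarded = []
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            forwarded.append(value)
            last = value
    return forwarded
```

Under this filter, an hour of identical per-minute samples collapses to a single upload, while any excursion beyond the threshold still reaches the cloud -- which is exactly the storage and cost reduction described above.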
Perhaps more importantly, the ability to bring Lambda functions to the factory means the same methodology is applied across environments. That includes the same rules engine and function-as-a-service model used to build the applications on AWS, as well as many of the same authentication and authorization mechanisms, applied inside the cloud before the software is deployed locally.
"It creates uniformity of development in a new environment we are used to in the cloud; now we can apply that knowledge to the edge as well," Montazeri said. "It's much better than learning a completely new platform and learning all the nuances of that platform."
* Information changed after publication
Trevor Jones is a news writer with SearchCloudComputing and SearchAWS. Contact him at firstname.lastname@example.org.