AWS Data Pipeline helps customers manage and automate data-driven workflows. The service is useful for customers whose data is spread across many AWS services and who want to orchestrate it from a single place for configuration and management. For example, a data scientist could schedule a job in Data Pipeline that reads log data from a storage service such as Amazon S3 every hour and then transfers it to a relational database for later analysis. While the service is particularly suited to workflows already built on AWS, Data Pipeline can also connect to a customer's on-premises data sources, as well as third-party data sources.
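A workflow like the hourly log copy above is described to the service as a set of pipeline objects, each with an id and a list of key/value fields, where some fields reference other objects by id. The sketch below builds such a definition in the JSON object/field format the Data Pipeline API accepts; the bucket path, table name and schedule values are hypothetical examples, not part of the article.

```python
import json

def field(key, value, ref=False):
    """Build one Data Pipeline field; refValue points at another object's id."""
    return {"key": key, ("refValue" if ref else "stringValue"): value}

# Hypothetical hourly S3 -> relational-database copy, expressed as
# Data Pipeline objects (the format put-pipeline-definition consumes).
pipeline_objects = [
    {"id": "Default", "name": "Default",
     "fields": [field("scheduleType", "cron"),
                field("schedule", "HourlySchedule", ref=True)]},
    {"id": "HourlySchedule", "name": "HourlySchedule",
     "fields": [field("type", "Schedule"),
                field("period", "1 hour"),
                field("startDateTime", "2024-01-01T00:00:00")]},
    {"id": "LogInput", "name": "LogInput",
     "fields": [field("type", "S3DataNode"),
                field("directoryPath", "s3://example-bucket/logs/")]},
    {"id": "DbOutput", "name": "DbOutput",
     # A real definition would also reference a database object; omitted here.
     "fields": [field("type", "SqlDataNode"),
                field("table", "log_events")]},
    {"id": "HourlyCopy", "name": "HourlyCopy",
     "fields": [field("type", "CopyActivity"),
                field("input", "LogInput", ref=True),
                field("output", "DbOutput", ref=True),
                field("schedule", "HourlySchedule", ref=True)]},
]

print(json.dumps(pipeline_objects, indent=2))
```

The `CopyActivity` object ties the pieces together: it points at the input node, the output node and the schedule, and the service runs it once per period.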
Using a Data Pipeline template, customers can read data from a source, process it and automatically transfer the result to another system or service. Developers can access Data Pipeline through the AWS Management Console, the AWS Command Line Interface or the service APIs. Pricing depends on usage and can be free for low-frequency activities.
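Programmatic access typically follows three API calls: create the pipeline, upload its definition, then activate it. A minimal sketch using the boto3 `datapipeline` client's `create_pipeline`, `put_pipeline_definition` and `activate_pipeline` operations might look like this; the pipeline name and objects passed in are the caller's own.

```python
def deploy_pipeline(client, name, unique_id, objects):
    """Create, define and activate a pipeline.

    client is expected to behave like boto3.client("datapipeline");
    objects is a list of pipeline objects in the API's id/name/fields format.
    """
    # Create an empty pipeline; uniqueId makes the call idempotent on retries.
    pipeline_id = client.create_pipeline(name=name, uniqueId=unique_id)["pipelineId"]
    # Attach the workflow definition (data nodes, activities, schedule).
    client.put_pipeline_definition(pipelineId=pipeline_id, pipelineObjects=objects)
    # Start scheduling runs.
    client.activate_pipeline(pipelineId=pipeline_id)
    return pipeline_id
```

Usage, assuming AWS credentials are configured: `deploy_pipeline(boto3.client("datapipeline"), "hourly-log-copy", "hourly-log-copy-1", objects)`. Taking the client as a parameter also makes the function easy to exercise with a stub in tests.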