
AWS EventBridge Pipes

What is Amazon EventBridge Pipes?

✓ It's a feature AWS released at re:Invent 2022.
✓ This service is available in the Amazon EventBridge console.
✓ EventBridge Pipes helps you connect your source and target seamlessly, without writing any integration code.
✓ When you are creating an event-driven architecture, give EventBridge Pipes a chance in your design and see the magic.
✓ You can set up a pipe easily: choose your source, add filtering (optional), add an enrichment step (optional), and add your target. That's all, you are done! A minimal sketch follows below.
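To illustrate how little setup is involved, here is a minimal sketch of creating a pipe with boto3. The names and ARNs are placeholders, and the IAM role is assumed to already allow the pipe to read from the source and invoke the target:

```python
# Minimal pipe: SQS queue -> Lambda function. All ARNs are placeholders,
# and the role is assumed to grant the pipe the needed permissions.
import boto3

pipes = boto3.client("pipes")

response = pipes.create_pipe(
    Name="orders-pipe",
    Source="arn:aws:sqs:us-east-1:123456789012:orders-queue",
    Target="arn:aws:lambda:us-east-1:123456789012:function:process-orders",
    RoleArn="arn:aws:iam::123456789012:role/orders-pipe-role",
)
print(response["Arn"], response["CurrentState"])
```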

Image source: Amazon Web Services

How does it work? What are the sources and targets?

Image source: Amazon Web Services

✓ Amazon SQS, Amazon Kinesis Data Streams, Amazon DynamoDB Streams, Amazon MSK, and Amazon MQ are currently available as sources for pipes.
✓ Optionally, in the filtering step you can select or refine your input data and send it to the next step seamlessly.
✓ Additionally, an enrichment step is available before the pipe delivers to its target. This step lets you manipulate the input data, retrieve data from other sources, or perform other work on the events before sending them to the target.
✓ The enrichment step can invoke AWS Lambda, AWS Step Functions, API Gateway, or an API destination.
✓ Here comes the final step: targets. Connect the pipe to any of the available targets and you are done. A sketch of a complete pipe follows below.
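Putting the full flow together, here is a hedged sketch of a pipe with all four stages: a Kinesis stream source, a filter, a Lambda enrichment, and an SQS target. The ARNs, the filter pattern, and the assumption that the stream records are JSON with an eventType field are all illustrative:

```python
# Full pipe: Kinesis source -> filter -> Lambda enrichment -> SQS target.
# ARNs and the filter pattern are placeholders for illustration.
import json

import boto3

pipes = boto3.client("pipes")

pipes.create_pipe(
    Name="clickstream-pipe",
    Source="arn:aws:kinesis:us-east-1:123456789012:stream/clickstream",
    SourceParameters={
        "KinesisStreamParameters": {"StartingPosition": "LATEST"},
        # Keep only events whose (JSON) record data matches this pattern.
        "FilterCriteria": {
            "Filters": [{"Pattern": json.dumps({"data": {"eventType": ["click"]}})}]
        },
    },
    # The enrichment function receives the batch of events, and its return
    # value becomes the payload delivered to the target.
    Enrichment="arn:aws:lambda:us-east-1:123456789012:function:enrich-clicks",
    Target="arn:aws:sqs:us-east-1:123456789012:clicks-queue",
    RoleArn="arn:aws:iam::123456789012:role/clickstream-pipe-role",
)
```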

When and where to use EventBridge Pipes?

My Thoughts 1:
✓ Say you have Amazon SQS connected to AWS Lambda. You can introduce a pipe between SQS and Lambda and filter out unwanted data before it ever reaches your function; by adding filter patterns in the pipe you can also reject false or malformed payloads. That saves cost, lines of code, time, and performance in your Lambda. A sketch of such a filter follows below.
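A sketch of what that filter could look like, assuming the SQS message bodies are JSON with a type field. The pattern below forwards only "order" messages and drops everything else before your Lambda is ever invoked:

```python
# Hypothetical filter on an SQS-sourced pipe: assuming message bodies like
# {"type": "order", "amount": 42}, forward only "order" messages.
import json

import boto3

pipes = boto3.client("pipes")

pipes.update_pipe(
    Name="orders-pipe",
    RoleArn="arn:aws:iam::123456789012:role/orders-pipe-role",
    SourceParameters={
        "FilterCriteria": {
            "Filters": [{"Pattern": json.dumps({"body": {"type": ["order"]}})}]
        }
    },
)
```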

My Thoughts 2:
✓ Suppose you have data in DynamoDB and you write it to Redshift clusters using a Glue job whose only work is to pull the data, rename some columns, and load it into Redshift. For this kind of use case you can eliminate the Glue job entirely: point a pipe at your DynamoDB stream, rename the fields with an input transformer on the target (the filter step only selects events, it does not reshape them), and connect Redshift directly from the pipe. See the magic, and watch your AWS bill drop, since the pipe has replaced the Glue job. A sketch follows below.
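A sketch of that pipe, with one assumption worth flagging: here the target is a Firehose delivery stream already configured to load into Redshift (Pipes can also run Amazon Redshift Data API statements directly as a target). The table name, stream ARN, and field names are hypothetical:

```python
# Replacing the Glue job: DynamoDB stream -> input transformer -> Firehose
# delivery stream that loads into Redshift. All names are placeholders.
import boto3

pipes = boto3.client("pipes")

pipes.create_pipe(
    Name="ddb-to-redshift-pipe",
    Source=(
        "arn:aws:dynamodb:us-east-1:123456789012:"
        "table/orders/stream/2024-01-01T00:00:00.000"
    ),
    SourceParameters={
        "DynamoDBStreamParameters": {"StartingPosition": "TRIM_HORIZON"}
    },
    Target="arn:aws:firehose:us-east-1:123456789012:deliverystream/orders-to-redshift",
    TargetParameters={
        # Rename the stream's attribute names to the Redshift column names.
        "InputTemplate": (
            '{"customer_id": "<$.dynamodb.NewImage.custId.S>",'
            ' "order_total": "<$.dynamodb.NewImage.amt.N>"}'
        )
    },
    RoleArn="arn:aws:iam::123456789012:role/ddb-redshift-pipe-role",
)
```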

My Thoughts 3:
✓ You have event-based operations: some items in your DynamoDB table are updated behind the scenes, and you want to validate that each updated or newly entered item is correct. Connect a pipe to the table's stream and add an enrichment step that validates every change. If an entry is wrong, send its details on to the pipe's target, for example a Step Functions workflow that emails the responsible team: the recent entry in the source table is wrong, please check and act. A sketch of the enrichment function follows below.
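A sketch of that enrichment function, assuming a simple hypothetical rule (an amount attribute must be a positive number). Pipes invokes the function with the batch of stream events, and whatever it returns becomes the payload sent to the target:

```python
# Hypothetical enrichment Lambda for the validation pipe above. The
# validation rule and field names are assumptions for illustration.

def handler(events, context):
    suspicious = []
    for event in events:
        new_image = event.get("dynamodb", {}).get("NewImage", {})
        amount = new_image.get("amount", {}).get("N")
        # Flag updates whose amount is missing or not a positive number.
        if amount is None or float(amount) <= 0:
            suspicious.append(
                {
                    "eventId": event.get("eventID"),
                    "keys": event.get("dynamodb", {}).get("Keys"),
                    "reason": "amount missing or non-positive",
                }
            )
    # Returning only the bad records means the target (and the email)
    # is triggered just for entries that failed validation.
    return suspicious
```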


Tags:
#aws #architecture #eventbridge #eventbridgepipes #cloud

Have a Great Day
:)
