Integration with AWS SQS

I’m running a Tyk POC and one of my requirements is to take a request and write the JSON payload directly to an AWS SQS queue. Is this possible through a custom plugin? Has anyone tried this with SQS or any other queue?

Hi @Jonathan_Gurwitz

This is definitely possible with a custom Tyk plugin.

I’m not certain of your development stack, and I’ve never used SQS myself, but here are some pointers.

If you are a Go developer, you should be able to create a plugin using this library: sqs - Amazon Web Services - Go SDK

If you need another language, you can write your plugin using gRPC.

Here is an example gRPC plugin blog post I wrote about 5-6 years ago which talks to RabbitMQ: Decoupling micro-services using Message-based RPC | by Ahmet Soormally | Medium

Here is an example Python plugin which uses Kombu as the messaging library: GitHub - TykTechnologies/tyk-plugin-queue: Send messages to a queue server.

Here is a list of example plugins, each doing something different: GitHub - TykTechnologies/tyk-awesome-plugins


Thanks @ahmet, we have a golang developer so that could work. I want to create a generic plugin which receives the queue name and payload as parameters. The plugin should then write the payload as a message to an SQS queue. I want this to work directly from the plugin and not via my own service which does the sending. Does this sound reasonable?
Have you dealt with AWS SDK authentication in a plugin before? The Tyk Kubernetes pod would run under an IAM role which has the necessary permissions to write to the queue. This is much more manageable than working with API keys.

Yes that sounds reasonable.

Have you dealt with AWS SDK authentication in a plugin before?

Yes - I use AWS SDK authentication to interact with Lambda from a Go plugin. See Configuring the AWS SDK for Go V2 | AWS SDK for Go V2
