I have backend worker servers that receive a request, process it, and return a result.
Each server can only handle around 100 requests in parallel, so I currently run 2 separate servers.
My guess is that I need a queue management system that queues requests from users, checks whether any server has the resources to process a request (via api.example.com/status), and then either forwards the request or puts it into a queue.
Is it possible to do this with the Tyk gateway?
I'm trying to avoid creating another instance just to manage the queue.
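To make the intended routing logic concrete, here is a rough sketch of what I have in mind (just pseudocode-level Python, not Tyk configuration; the server URLs, the capacity limit, and the `get_load` callback standing in for a GET to api.example.com/status are all placeholders for my setup):

```python
import queue

# Assumed capacity limit per worker (~100 parallel requests in my setup)
MAX_PARALLEL = 100

# Placeholder URLs for the two worker servers I run now
SERVERS = ["https://worker1.example.com", "https://worker2.example.com"]

def pick_server(get_load):
    """Return the first server with spare capacity, or None.

    `get_load(server)` stands in for a call to the status endpoint;
    it should return the number of in-flight requests on that server.
    """
    for server in SERVERS:
        if get_load(server) < MAX_PARALLEL:
            return server
    return None

def dispatch(request, pending, get_load):
    """Forward `request` to a free server, or park it on the queue."""
    server = pick_server(get_load)
    if server is None:
        pending.put(request)  # no capacity anywhere: queue it for later
        return None
    return server  # caller would proxy the request to this server

# Tiny usage example with a fake load lookup instead of real HTTP calls
pending = queue.Queue()
loads = {"https://worker1.example.com": 100,  # full
         "https://worker2.example.com": 42}   # has room
print(dispatch("req-1", pending, loads.get))  # picks the server with room
```

In other words, the gateway (or whatever sits in front) would poll each server's status, route to the first one with capacity, and only queue when everything is saturated.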