Hey all,
I’m building a project called Rately. It’s a rate-limiting service that runs on Cloudflare Workers (so at the edge, close to your clients).
The idea is simple: instead of only limiting by IP, you can set rules based on your own data — things like:
- URL params (/users/:id/posts → limit per user ID)
- Query params (?api_key=123 → limit per API key)
- Headers (X-Org-ID, Authorization, etc.; see the sketch after this list)
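To make that concrete, here's a rough sketch of the kind of rules you'd define (pseudo-config; the field names and exact shape are illustrative, not the final schema):

```ts
// Pseudo-config: illustrative shape only, not the exact rule schema.
const rules = [
  // limit per user ID taken from the URL path
  { match: "/users/:id/posts", key: "param:id", limit: 100, window: "1m" },
  // limit per API key taken from the query string
  { match: "/v1/*", key: "query:api_key", limit: 1000, window: "1h" },
  // limit per organization taken from a header
  { match: "/*", key: "header:X-Org-ID", limit: 5000, window: "1h" },
];
```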
Example:
Say your API has an endpoint /users/42/posts. With Rately you can tell it: “apply a limit of 100 requests/min per userId”.
So user 42 and user 99 each get their own bucket automatically. No custom nginx or middleware needed.
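Conceptually, the per-user buckets fall out of keying the counter by the extracted value. Here's a tiny, simplified fixed-window sketch just to show the idea (the real service is more involved than this):

```ts
// Simplified fixed-window counter, only to show how separate buckets per key work.
type Bucket = { count: number; resetAt: number };
const buckets = new Map<string, Bucket>();

function allow(ruleId: string, keyValue: string, limit: number, windowMs: number): boolean {
  const bucketKey = `${ruleId}:${keyValue}`; // "posts-per-user:42" vs "posts-per-user:99"
  const now = Date.now();
  const b = buckets.get(bucketKey);
  if (!b || now >= b.resetAt) {
    // start a fresh window for this key
    buckets.set(bucketKey, { count: 1, resetAt: now + windowMs });
    return true;
  }
  b.count += 1;
  return b.count <= limit; // user 42 never eats into user 99's quota
}
```

So allow("posts-per-user", "42", 100, 60_000) and allow("posts-per-user", "99", 100, 60_000) track completely separate counts.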
It has two modes:
- Proxy mode – you point your API domain (CNAME) to Rately. Requests come in, Rately enforces your limits, then forwards to your origin. Easiest drop-in.
```
Client ---> Rately (enforce limits) ---> Origin API
```
- Control plane mode – you keep running your own API as usual, but your code or middleware can call Rately’s API to ask “is this request allowed?” before handling it. Gives you more flexibility without routing all traffic through Rately.
```
Client ---> Your API ---> Rately /check (allow/deny) ---> Your API logic
```
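Proxy mode needs no code on your side (just the CNAME). For control plane mode, the integration is basically one extra call before your handler runs. A rough sketch of what that could look like (the URL and request/response fields here are placeholders, not the final API):

```ts
// Rough sketch of calling a /check endpoint from your own API.
// The URL, request body, and response fields are placeholders, not the final API.
interface CheckResponse {
  allowed: boolean;
  retryAfter?: number; // seconds until the next allowed request
}

async function checkLimit(rule: string, key: string): Promise<CheckResponse> {
  const res = await fetch("https://rately.example.com/check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ rule, key }),
  });
  return (await res.json()) as CheckResponse;
}

// In a handler: deny early if the bucket for this user is exhausted.
async function handleGetPosts(userId: string): Promise<Response> {
  const { allowed, retryAfter } = await checkLimit("posts-per-user", userId);
  if (!allowed) {
    return new Response("Too Many Requests", {
      status: 429,
      headers: { "Retry-After": String(retryAfter ?? 60) },
    });
  }
  // ...your normal API logic here...
  return new Response(JSON.stringify({ userId, posts: [] }), {
    headers: { "Content-Type": "application/json" },
  });
}
```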
I’m looking for a few developers with APIs who want to test it out. I’ll help with setup.