Adding 'Additional Information for Rate Limiting' sub section.

This commit is contained in:
Maxime Delporte
2025-11-16 19:35:28 +01:00
parent 116cdb099f
commit 7de9ef89b7


@@ -388,7 +388,13 @@ Maps are not safe for concurrent use: it's not defined what happens when you rea
So, to get around this, we'll need to synchronize access to the map of rate limiters using a [sync.Mutex](http://golang.org/pkg/sync/#Mutex) (a mutual exclusion lock), so that only one goroutine is able to read or write to the map at any moment in time.
**Important**: How mutexes work, and how to use them, can be quite confusing if you haven't encountered them before and it's impossible to fully explain in a few short sentences. A much more detailed article is available here - [Understanding Mutexes](https://www.alexedwards.net/blog/understanding-mutexes) - which provides a proper explanation. If you're not already confident with mutexes, I highly recommend reading this before you continue.
#### Additional Information for Rate Limiting
**Distributed applications**: Using this pattern for rate limiting will only work if your API application is running on a single machine. If your infrastructure is distributed, with your application running on multiple servers behind a load balancer, then you'll need to use an alternative approach.
If you're using HAProxy or Nginx as a load balancer or reverse proxy, both of these have built-in functionality for rate limiting that it would probably be sensible to use. Alternatively, you could use a fast database like Redis to maintain a request count for clients, running on a server which all your application servers can communicate with.
### Performance