
How To Limit Download Rate Of Http Requests In Requests Python Library?

Is it possible to limit the download rate of GET requests using the requests Python library? For instance, with a command like this: r = requests.get('https://stackoverflow.com/')

Solution 1:

There are several approaches to rate limiting; one of them is the token bucket algorithm, for which various recipes are available online.
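As a rough illustration of the token bucket idea (a minimal sketch of my own, not any particular recipe): the bucket refills at a fixed rate, and each transfer must first "consume" as many tokens as bytes it wants to move, blocking when the bucket runs dry.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: refills at `rate` tokens
    (e.g. bytes) per second, allowing bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity          # start with a full bucket
        self.last = time.monotonic()

    def consume(self, n):
        """Block until `n` tokens are available, then take them."""
        while True:
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= n:
                self.tokens -= n
                return
            # Sleep just long enough for the deficit to refill.
            time.sleep((n - self.tokens) / self.rate)
```

Calling `bucket.consume(len(chunk))` before processing each downloaded chunk would then cap the average transfer rate at `rate` bytes per second.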

Usually you would want to do throttling or rate limiting on socket.send() and socket.recv(). You could play with socket-throttle and see if it does what you need.


This is not to be confused with the x-ratelimit family of response headers, which concern the number of requests a server will accept, not the download/transfer rate.

Solution 2:

There is no built-in support, but it is possible to use the streaming API.

>>> import requests
>>> import time
>>> req = requests.request('GET', 'https://httpbin.org/get', stream=True)
>>> for data in req.iter_content(chunk_size=1024):
...     time.sleep(0.001)
...

The advanced usage documentation notes that this allows you to retrieve smaller quantities of the response at a time.

On my network, the example above (pointed at a file several GB in size) achieved 17.4 MB/s without the sleep and 2.5 MB/s with a 1 ms sleep per 1024-byte chunk.
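The fixed-sleep approach above gives a rate that depends on chunk size and network speed. A sleep computed from a target bandwidth is more predictable; here is a minimal sketch (the helper name `throttled` and the target rate are my own choices) that wraps any iterable of chunks, such as the one returned by `iter_content`:

```python
import time

def throttled(chunks, rate_bytes_per_sec):
    """Yield chunks from an iterable, sleeping as needed so the
    average throughput does not exceed rate_bytes_per_sec."""
    start = time.monotonic()
    sent = 0
    for chunk in chunks:
        sent += len(chunk)
        # How long the transfer *should* have taken so far at the target rate.
        expected = sent / rate_bytes_per_sec
        elapsed = time.monotonic() - start
        if expected > elapsed:
            time.sleep(expected - elapsed)
        yield chunk

# Usage with requests (assumes network access, so commented out):
# import requests
# r = requests.get('https://httpbin.org/get', stream=True)
# with open('out.bin', 'wb') as f:
#     for chunk in throttled(r.iter_content(chunk_size=8192), 1_000_000):
#         f.write(chunk)   # capped at ~1 MB/s
```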
