Does Bottle Handle Requests With No Concurrency?
At first, I thought Bottle would handle requests concurrently, so I wrote the test code below:

```python
import json
import time

from bottle import Bottle, run, request, response, get, post

app =
```
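The rest of the snippet is cut off above. A minimal sketch of a test app along these lines -- the `/slow` route and the 5-second sleep are my own assumptions, not the original code:

```python
import json
import time

from bottle import Bottle, run, response

app = Bottle()

@app.get('/slow')                      # hypothetical route; the original handler is missing
def slow_handler():
    time.sleep(5)                      # simulate slow work so overlapping requests are easy to observe
    response.content_type = 'application/json'
    return json.dumps({'done': True})

if __name__ == '__main__':
    # No `server` argument: Bottle falls back to the single-threaded wsgiref server.
    run(app, host='localhost', port=8080)
```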
Solution 1:
Concurrency isn't a function of your web framework -- it's a function of the web server you use to serve it. Since Bottle is WSGI-compliant, you can serve a Bottle app through any WSGI server:
- wsgiref (reference server in the Python stdlib) will give you no concurrency.
- CherryPy dispatches through a thread pool (number of simultaneous requests = number of threads it's using).
- nginx + uwsgi gives you multiprocess dispatch and multiple threads per process.
- Gevent gives you lightweight coroutines that can easily reach C10K+ with very little CPU load if your app is mostly IO- or database-bound (on Linux, that is -- on Windows it can only handle 1024 simultaneous open sockets).
The latter two can serve massive numbers of simultaneous connections.
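As a concrete illustration (my own sketch, not from the original answer): Bottle's `run()` accepts a server adapter name, so switching the same app from the default wsgiref server to gevent is essentially a one-line change, provided gevent is installed and the standard library is monkey-patched before anything else is imported. The `/slow` route is the same hypothetical handler as above.

```python
# Sketch: serving the app with gevent instead of the default wsgiref server.
# Assumes gevent is installed (pip install gevent).
from gevent import monkey
monkey.patch_all()          # must run before other imports so sockets/sleep become cooperative

import time
from bottle import Bottle, run

app = Bottle()

@app.get('/slow')
def slow_handler():
    time.sleep(5)           # with monkey-patching this yields to other greenlets instead of blocking
    return {'done': True}   # Bottle serializes dicts to JSON automatically

if __name__ == '__main__':
    # server='gevent' selects Bottle's gevent adapter, so many slow
    # requests can be in flight at once.
    run(app, host='localhost', port=8080, server='gevent')
```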
According to http://bottlepy.org/docs/dev/api.html, when given no specific instructions, bottle.run uses wsgiref to serve your application, which explains why it only handles one request at a time.
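One way to see this for yourself (my own sketch; the `/slow` URL and 5-second sleep match the hypothetical test app above): fire two requests at the same moment and compare the elapsed time. Against the default wsgiref server the total is roughly twice the handler's sleep; against a concurrent server it is roughly one sleep.

```python
# Sketch: measure whether two overlapping requests are served concurrently.
# Assumes the test app above is running on localhost:8080 with a /slow route.
import threading
import time
from urllib.request import urlopen

def hit():
    urlopen('http://localhost:8080/slow').read()

start = time.time()
threads = [threading.Thread(target=hit) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# ~10s total means the requests were serialized (wsgiref); ~5s means they overlapped.
print('elapsed: %.1fs' % (time.time() - start))
```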