@stalwartcoder
Abhishek
👨‍💻 Software Eng. at Essentia SoftServ
🐍 Pythonista
👨‍👩‍👧‍👦 Community-first person 😄
connect with me:
🚫 not a "10x Engineer"
Manages background work outside the usual HTTP request-response cycle.
For example, a web app polls an API every 10 minutes to collect the names of the top cryptocurrencies.
A task queue would handle invoking the code to call the API, process the results, and store them in a persistent DB for
later use.
The input to a task queue is a unit of work: a separate piece of code.
A task queue is a system for the parallel execution of tasks.
✅ Useful in certain situations.
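The idea can be sketched with the standard library alone. This is a minimal, illustrative task queue (not Celery itself): each unit of work is just a callable plus its arguments, and the `fetch_prices` function and coin names are made up to mirror the crypto-polling example above.

```python
import queue
import threading

task_queue = queue.Queue()
results = []

def fetch_prices(coin):
    # stand-in for "call the API and store the result in a DB"
    results.append(f"stored price for {coin}")

def worker():
    # each worker pulls units of work off the queue and executes them
    while True:
        func, args = task_queue.get()
        if func is None:          # sentinel: shut this worker down
            task_queue.task_done()
            break
        func(*args)
        task_queue.task_done()

# start two workers so tasks execute in parallel
threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()

# the client enqueues units of work
for coin in ["bitcoin", "ethereum", "dogecoin"]:
    task_queue.put((fetch_prices, (coin,)))

# one sentinel per worker, then wait for them to finish
for _ in threads:
    task_queue.put((None, ()))
for t in threads:
    t.join()
```

Real task queues add persistence, retries, and distribution across machines; the sketch only shows the core loop of "client enqueues, worker executes".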
👉 General guidelines:
[Diagram: a Client sends tasks to a Broker, which distributes the tasks across multiple Workers]
Some of the response-time issues can be solved:
You need a queue:
🔹 RQ (Redis Queue)
🔹 Huey
🔹 CloudAMQP
Blah....blahh...blaahhh
Celery is an asynchronous task queue/job queue based on distributed message passing.
It is focused on real-time operation, and scheduling as well.
Tasks can execute asynchronously (in the background) or synchronously (wait until ready).
Easy to integrate, with multi-broker support.
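The asynchronous/synchronous distinction above maps cleanly onto the standard library's `concurrent.futures`. This is an analogy, not Celery's API: `submit()` returns immediately with a future (much like Celery's `AsyncResult`), while `.result()` blocks until the value is ready.

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

with ThreadPoolExecutor(max_workers=2) as pool:
    # asynchronous: returns a Future at once, task runs in the background
    future = pool.submit(add, 4, 4)
    # ... do other work here while the task runs ...
    # synchronous: block until the result is ready
    answer = future.result()

print(answer)  # 8
```

Celery does the same thing, except the "pool" is a set of worker processes that can live on other machines, reached through a message broker.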
Key concept
It's that simple!
Install Celery:

$ pip install celery

Define a task:

from celery import Celery

app = Celery('tasks', broker='pyamqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

Run the Celery worker:

$ celery -A tasks worker --loglevel=info

Execute the task:

>>> from tasks import add
>>> add.delay(4, 4)
<AsyncResult: 889143a6-39a2-4e52-837b-d80d33efb22d>
[Diagram: Celery architecture. Client 1 and client 2 send tasks to a Broker (RabbitMQ, Redis, or AWS SQS) holding Task Queue 1, Task Queue 2, ... Task Queue N; the broker distributes tasks to CeleryWorker 1 and CeleryWorker 2, which store task results in a Task Result Storage that clients query to get task results]
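The result-storage piece of the diagram can be mimicked in a few lines: each task gets an id, the worker stores the return value under that id, and the client fetches it later, which is roughly what Celery's result backend does behind an `AsyncResult`. This is a toy stdlib sketch; `broker`, `send_task`, and `result_storage` are illustrative names, not Celery internals.

```python
import queue
import threading
import uuid

broker = queue.Queue()   # stands in for RabbitMQ / Redis / SQS
result_storage = {}      # stands in for the task result backend

def worker():
    while True:
        task = broker.get()
        if task is None:                         # shutdown sentinel
            break
        task_id, func, args = task
        result_storage[task_id] = func(*args)    # store task result
        broker.task_done()

def send_task(func, *args):
    # client side: enqueue a task and hand back its id (like an AsyncResult id)
    task_id = str(uuid.uuid4())
    broker.put((task_id, func, args))
    return task_id

t = threading.Thread(target=worker)
t.start()

tid = send_task(lambda x, y: x + y, 4, 4)
broker.join()                     # wait until the queue drains
print(result_storage[tid])        # get task result -> 8

broker.put(None)
t.join()
```

In the real architecture the broker and the result storage are separate services, so many clients and many workers can share them across machines.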
More details: