Preface
Everyone knows Celery is a distributed task queue, but while learning it I found that most material only covers how to use Celery for asynchronous calls and rarely explains how to actually run it distributed. So here are the pitfalls I ran into, shared for your reference.
I. Preparation
1. Choose a message broker
The officially recommended brokers are RabbitMQ and Redis; install either one. For installation details see
http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html#first-steps
I chose Redis as the broker.
2. Prepare a virtual machine
Make sure both the host machine and the VM can reach Redis. Any other machine with network connectivity to the host works just as well.
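A quick way to verify connectivity from both machines is redis-cli (assuming it is installed, and that Redis is listening on 10.211.55.2, the host IP used throughout this post):
redis-cli -h 10.211.55.2 -p 6379 ping
# should print PONG on both the host and the VM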
二幕庐、簡單的demo
1.創(chuàng)建項(xiàng)目和python環(huán)境
mkdir myCelery
cd myCelery
virtualenv -p python2 envp2
source envp2/bin/activate
2. Install Celery and Redis
pip install celery==4.0
pip install redis
Note that version 4.0 is installed here. According to the official docs, 4.0 is the release that supports both Python 2 and Python 3. The official statement reads:
Version Requirements
Celery version 4.0 runs on
Python (2.7, 3.4, 3.5)
PyPy (5.4, 5.5)
This is the last version to support Python 2.7, and from the next version (Celery 5.x) Python 3.5 or newer is required.
If you’re running an older version of Python, you need to be running an older version of Celery:
Python 2.6: Celery series 3.1 or earlier.
Python 2.5: Celery series 3.0 or earlier.
Python 2.4 was Celery series 2.2 or earlier.
Celery is a project with minimal funding, so we don’t support Microsoft Windows. Please don’t open any issues related to that platform.
3. Write the code
vim tasks.py
Contents of tasks.py:
from celery import Celery

# point the broker at the Redis instance running on the host machine
app = Celery('tasks', broker='redis://10.211.55.2:6379/0')

@app.task
def add(x, y):
    return x + y
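This demo only watches the worker log for results, so no result backend is configured. If you also want to read results back in the caller, Celery accepts a backend URL as well; a minimal sketch, assuming the same Redis instance (db 1) is reused as the backend:
from celery import Celery

# hypothetical variant of tasks.py with a Redis result backend (db 1)
app = Celery('tasks',
             broker='redis://10.211.55.2:6379/0',
             backend='redis://10.211.55.2:6379/1')

@app.task
def add(x, y):
    return x + y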
4. Run it
Start tasks as a worker:
celery -A tasks worker -l info
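By default the prefork pool starts one worker process per CPU core; if you want to control that explicitly, the worker also accepts a --concurrency option, for example:
celery -A tasks worker -l info --concurrency=4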
5. Send a task
Open another terminal:
cd myCelery
envp2/bin/python
# in the Python shell
>>> from tasks import add
>>> add.delay(4, 4)
Watch the terminal running the worker: if the result 8 appears there, everything works. Now that the basic flow is clear, let's build on it.
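As an aside, add.delay() returns an AsyncResult. If you configured a result backend as sketched above, you could also read the value directly in the Python shell instead of watching the worker log:
>>> result = add.delay(4, 4)
>>> result.get(timeout=10)  # requires a result backend
8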
三冤寿、分布式celery
開啟虛擬機(jī)歹苦,把tasks.py拷貝過去,python環(huán)境配置完成后督怜,打開terminal殴瘦。運(yùn)行tasks
celery -A tasks worker -l info
Now look at the worker terminal on the host machine; it prints an extra line like this:
[2018-01-24 21:03:10,114: INFO/MainProcess] sync with celery@ubuntu
而虛擬機(jī)上的terminal將會(huì)輸出如下內(nèi)容:
[2018-01-24 21:03:09,007: INFO/MainProcess] Connected to redis://10.211.55.2:6379/0
[2018-01-24 21:03:09,017: INFO/MainProcess] mingle: searching for neighbors
[2018-01-24 21:03:10,036: INFO/MainProcess] mingle: sync with 1 nodes
[2018-01-24 21:03:10,037: INFO/MainProcess] mingle: sync complete
[2018-01-24 21:03:10,052: INFO/MainProcess] celery@ubuntu ready.
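To double-check which workers are online, Celery also ships a status command you can run from either machine (the node names depend on your hostnames, so the output below is only indicative):
celery -A tasks status
# celery@ubuntu: OK
# celery@<your-host>: OK
# 2 nodes online.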
With the cluster connected, let's run a test. On the host, create a file test_tasks.py next to tasks.py with the content below, then run it in a new terminal:
from tasks import add

# call the add task 10 times
for i in range(10):
    add.delay(i, i)
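As an aside, if you would rather dispatch the 10 tasks and collect their results in one place, Celery's group primitive can do that; a minimal sketch, again assuming a result backend is configured:
from celery import group
from tasks import add

# send 10 add tasks as one group and wait for all of them
job = group(add.s(i, i) for i in range(10))
result = job.apply_async()
print(result.get(timeout=10))  # e.g. [0, 2, 4, ..., 18]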
Now look at the host worker terminal:
[2018-01-24 21:37:18,360: INFO/MainProcess] Received task: tasks.add[33586201-082a-4e8f-8153-8b9d757990af]
[2018-01-24 21:37:18,361: INFO/ForkPoolWorker-7] Task tasks.add[33586201-082a-4e8f-8153-8b9d757990af] succeeded in 0.0004948119749315083s: 2
[2018-01-24 21:37:18,362: INFO/MainProcess] Received task: tasks.add[a6e363cf-fd25-4b0d-9da4-04bac1c5476c]
[2018-01-24 21:37:18,363: INFO/MainProcess] Received task: tasks.add[7fd2b545-f87b-49d3-941f-b8723bd1b039]
[2018-01-24 21:37:18,365: INFO/ForkPoolWorker-2] Task tasks.add[7fd2b545-f87b-49d3-941f-b8723bd1b039] succeeded in 0.000525131996255368s: 8
[2018-01-24 21:37:18,365: INFO/ForkPoolWorker-8] Task tasks.add[a6e363cf-fd25-4b0d-9da4-04bac1c5476c] succeeded in 0.000518032000400126s: 6
[2018-01-24 21:37:18,366: INFO/MainProcess] Received task: tasks.add[79d1ead3-1077-4bfd-8300-3a335f533b74]
[2018-01-24 21:37:18,368: INFO/MainProcess] Received task: tasks.add[0d0eefab-c6f0-4fa6-945c-6c7931b74e7b]
[2018-01-24 21:37:18,368: INFO/ForkPoolWorker-4] Task tasks.add[79d1ead3-1077-4bfd-8300-3a335f533b74] succeeded in 0.00042340101208537817s: 12
[2018-01-24 21:37:18,369: INFO/MainProcess] Received task: tasks.add[230eb9d1-7fa5-4f18-8fd4-f535e4c190d2]
[2018-01-24 21:37:18,370: INFO/ForkPoolWorker-6] Task tasks.add[0d0eefab-c6f0-4fa6-945c-6c7931b74e7b] succeeded in 0.00048609700752422214s: 16
[2018-01-24 21:37:18,370: INFO/ForkPoolWorker-7] Task tasks.add[230eb9d1-7fa5-4f18-8fd4-f535e4c190d2] succeeded in 0.00046275201020762324s: 18
And the terminal on the VM:
[2018-01-24 21:37:17,261: INFO/MainProcess] Received task: tasks.add[f95a4a20-e245-4cdc-b48a-4b79416b14c1]
[2018-01-24 21:37:17,263: INFO/ForkPoolWorker-1] Task tasks.add[f95a4a20-e245-4cdc-b48a-4b79416b14c1] succeeded in 0.00107001400102s: 0
[2018-01-24 21:37:17,264: INFO/MainProcess] Received task: tasks.add[3ddfbcda-7d75-488c-bc69-243f991bb49a]
[2018-01-24 21:37:17,267: INFO/MainProcess] Received task: tasks.add[b0a36bfe-87c4-43ef-9e6e-fb8740dd26d0]
[2018-01-24 21:37:17,267: INFO/ForkPoolWorker-1] Task tasks.add[3ddfbcda-7d75-488c-bc69-243f991bb49a] succeeded in 0.00107501300226s: 4
[2018-01-24 21:37:17,270: INFO/ForkPoolWorker-2] Task tasks.add[b0a36bfe-87c4-43ef-9e6e-fb8740dd26d0] succeeded in 0.00111001500045s: 10
[2018-01-24 21:37:17,272: INFO/MainProcess] Received task: tasks.add[7bcec842-65e5-407d-9e7d-99183956ef3e]
[2018-01-24 21:37:17,277: INFO/ForkPoolWorker-1] Task tasks.add[7bcec842-65e5-407d-9e7d-99183956ef3e] succeeded in 0.000870012001542s: 14
Looking only at the values after "succeeded in", the host printed: 2 8 6 12 16 18,
while the VM printed: 0 4 10 14.
This confirms that the 10 add tasks were distributed across the two workers.
四姨蟋、結(jié)束語
理解了celery的運(yùn)行規(guī)則后屉凯,可以很方便的搭建分布式系統(tǒng)。后續(xù)將會(huì)寫一篇celery如何和flask系統(tǒng)集成眼溶,介紹celery在Python2和Python3中使用的坑悠砚。