
Celery needs no lengthy introduction, so let's go straight to the code (Django 2.0 / Celery 4.1). [Many examples found online are for django-celery, but the Celery docs say that from version 4.0 onward Celery can be configured directly inside a Django project.]

$ mkdir proj
$ cd proj
$ pipenv --three
$ head -2 Pipfile
[[source]]
url = "https://pypi.org/simple"
$ sed -i 's|pypi.org/simple|mirrors.aliyun.com/pypi/simple|g' Pipfile
$ head -2 Pipfile
[[source]]
url = "https://mirrors.aliyun.com/pypi/simple"
$ pipenv shell
(proj-fcaE2KAe) $ pipenv install django celery
(proj-fcaE2KAe) $ pipenv graph
celery==4.1.1
- billiard [required: <3.6.0,>=3.5.0.2, installed: 3.5.0.3]
- kombu [required: >=4.2.0,<5.0, installed: 4.2.0]
- amqp [required: <3.0,>=2.1.4, installed: 2.2.2]
- vine [required: >=1.1.3, installed: 1.1.4]
- pytz [required: >dev, installed: 2018.4]
Django==2.0.5
- pytz [required: Any, installed: 2018.4]
(proj-fcaE2KAe) $ django-admin startproject proj .
(proj-fcaE2KAe) $ vi proj/celery.py
(proj-fcaE2KAe) $ cat proj/celery.py
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
(proj-fcaE2KAe) $ vi proj/__init__.py
(proj-fcaE2KAe) $ cat proj/__init__.py
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
(proj-fcaE2KAe) $ python manage.py startapp app1
(proj-fcaE2KAe) $ vi proj/settings.py
(proj-fcaE2KAe) $ cat -n proj/settings.py|head -41|tail -9
33  INSTALLED_APPS = [
34      'django.contrib.admin',
35      'django.contrib.auth',
36      'django.contrib.contenttypes',
37      'django.contrib.sessions',
38      'django.contrib.messages',
39      'django.contrib.staticfiles',
40      'app1',
41  ]
(proj-fcaE2KAe) $ vi app1/tasks.py
(proj-fcaE2KAe) $ cat app1/tasks.py
from celery.schedules import crontab
from celery.decorators import periodic_task
from django.utils import timezone
@periodic_task(run_every=crontab())
def run_every_minute():
    # do something
    print(timezone.now())
(proj-fcaE2KAe) $ celery -A proj worker -l info -B
-------------- [email protected] v4.1.1 (latentcall)
---- **** -----
--- * ***  * -- Linux-4.15.0-20-generic-x86_64-with-Ubuntu-18.04-bionic 2018-05-22 03:48:12
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         proj:0x7fcfa57004a8
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery           exchange=celery(direct) key=celery
[tasks]
. app1.tasks.run_every_minute
[2018-05-22 03:48:12,091: INFO/Beat] beat: Starting...
[2018-05-22 03:48:12,147: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2018-05-22 03:48:12,158: INFO/MainProcess] mingle: searching for neighbors
[2018-05-22 03:48:13,186: INFO/MainProcess] mingle: all alone
[2018-05-22 03:48:13,208: WARNING/MainProcess] /home/vagrant/.local/share/virtualenvs/proj-fcaE2KAe/lib/python3.6/site-packages/celery/fixups/django.py:200: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2018-05-22 03:48:13,210: INFO/MainProcess] [email protected] ready.
[2018-05-22 03:49:00,026: INFO/Beat] Scheduler: Sending due task app1.tasks.run_every_minute (app1.tasks.run_every_minute)
[2018-05-22 03:49:00,036: INFO/MainProcess] Received task: app1.tasks.run_every_minute[b0c142f5-adaa-43b2-8efc-6ee38eb1f191]
[2018-05-22 03:49:00,039: WARNING/ForkPoolWorker-3] 2018-05-22 03:49:00.039156+00:00
[2018-05-22 03:49:00,040: INFO/ForkPoolWorker-3] Task app1.tasks.run_every_minute[b0c142f5-adaa-43b2-8efc-6ee38eb1f191] succeeded in 0.0017904169999383157s: None

File structure:

$ tree
.
├── Pipfile
├── Pipfile.lock
├── app1
│   ├── __init__.py
│   ├── admin.py
│   ├── apps.py
│   ├── migrations
│   │   └── __init__.py
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── celerybeat-schedule.db
├── manage.py
└── proj
    ├── __init__.py
    ├── celery.py
    ├── settings.py
    ├── urls.py
    └── wsgi.py
3 directories, 17 files

The Python source files:

# proj/celery.py
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
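Because `config_from_object` is called with `namespace='CELERY'`, any Celery setting can live in `proj/settings.py` under a `CELERY_` prefix. A sketch (the values are assumptions that simply mirror the defaults visible in the worker banner above):

```python
# proj/settings.py (sketch -- assumed values, adjust to your broker)
CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'  # maps to broker_url
CELERY_RESULT_BACKEND = 'rpc://'                           # maps to result_backend
CELERY_TIMEZONE = 'UTC'                                    # maps to timezone
```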
# app1/tasks.py
from celery.schedules import crontab
from celery.decorators import periodic_task
from django.utils import timezone
@periodic_task(run_every=crontab())
def run_every_minute():
    # do something
    print(timezone.now())
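Note that `celery.decorators.periodic_task` is deprecated (and removed in Celery 5.x). The equivalent setup keeps the task plain and declares the schedule on the app instead; a sketch, assuming it is added to `proj/celery.py` after `app = Celery('proj')`:

```python
from celery.schedules import crontab

# Declare the schedule on the app instead of on the task itself.
app.conf.beat_schedule = {
    'run-every-minute': {
        'task': 'app1.tasks.run_every_minute',  # a plain @app.task / @shared_task
        'schedule': crontab(),                  # every minute
    },
}
```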
# proj/__init__.py
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']

References: http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html

https://medium.com/@yehandjoe/celery-4-periodic-task-in-django-9f6b5a8c21c7

http://yshblog.com/blog/164