Introduction to the use of Celery in Django

Keywords: Python Celery Redis Django crontab

Celery introduction

Celery is a simple, flexible, and reliable distributed system for processing large numbers of messages, and it provides the tools needed to operate and maintain such a system.

It is a task queue that focuses on real-time processing and supports task scheduling.

What is a task queue

A task queue is a mechanism for distributing work across threads and machines.

Three parts of Celery

worker

Task execution unit. The worker is the unit of task execution provided by Celery; workers run concurrently on the nodes of the distributed system.

broker (warehouse for tasks)

Message middleware: Celery itself does not provide a message service, but it integrates easily with third-party message middleware such as RabbitMQ and Redis.

backend (warehouse for results)

Task result store: used to store the results of tasks executed by workers. Celery supports storing task results in different backends, including AMQP, Redis, etc.
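
Before the Django-specific setup below, here is a minimal sketch that ties the three parts together. The module name demo_task and the Redis database numbers are illustrative assumptions, not part of the original project:

# demo_task.py - a minimal Celery app, assuming a local Redis instance
from celery import Celery

broker = "redis://127.0.0.1:6379/1"    # broker: the warehouse tasks are queued in
backend = "redis://127.0.0.1:6379/2"   # backend: the warehouse results are stored in

app = Celery('demo_task', broker=broker, backend=backend)

@app.task
def add(x, y):
    return x + y

# A worker started with "celery worker -A demo_task -l info" will execute add() calls.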

Usage scenarios

  • Asynchronous tasks: submit time-consuming operations to Celery for asynchronous execution, such as sending SMS, sending email, message pushes, audio and video processing, etc. (see the sketch after this list).
  • Scheduled tasks: run something on a schedule, such as daily data statistics.
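
As a sketch of the asynchronous case, reusing the hypothetical add() task from the example above, a task is submitted with delay() and its result fetched later:

from demo_task import add

result = add.delay(2, 3)         # enqueue the task and return an AsyncResult immediately
print(result.id)                 # the task id; it can be stored and used to fetch the result later
print(result.get(timeout=10))    # block until a worker has produced the result -> 5

The scheduled case is configured through beat_schedule, which is shown in the Django example below.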

Basic command

#1. Start the Celery service:
#Non-Windows:
#Command: celery worker -A celery_task -l info   (celery_task is the Celery project package)
#Windows: install the eventlet module first: pip install eventlet
#Command: celery worker -A celery_task -l info -P eventlet

#2. Add tasks: add them manually with a custom add-task script, or have them added automatically by configuring them in celery.py

#3. Get results: fetch them manually with a custom get-result script

The use of Celery in a Django project

Celery directory structure

project
    |---celery_task
        |---celery.py # If you want tasks added automatically, configure them here in celery.py
        |---tasks.py  # All task functions
    |---add_task.py   # Add task manually: immediate task, delayed task, scheduled task;
    |---get_result.py # Get results

The last two files (add_task.py and get_result.py) are optional, depending on your requirements; sketches of both are shown after tasks.py below.

Use

celery.py

from celery import Celery
# Import time-related helpers; their usage is shown below
from datetime import timedelta
from celery.schedules import crontab

# Because the tasks call models from the Django project, the Django environment must be set up first
import os, django
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'bookapi.settings.dev')
django.setup()

# The warehouse for tasks (the broker); Redis is used here
broker = "redis://127.0.0.1:6379/11"
# The warehouse for results (the backend)
backend = "redis://127.0.0.1:6379/12"
# The modules that contain the task functions
include = ['celery_task.tasks']
app = Celery(broker=broker, backend=backend, include=include)

# Configure task time zone
app.conf.timezone = 'Asia/Shanghai'
app.conf.enable_utc = False

# Configure scheduled tasks
app.conf.beat_schedule = {
    'recommend-task': {
        'task': 'celery_task.tasks.recommend_num',
        # 'schedule': timedelta(seconds=20),
        'schedule': crontab(hour=0, minute=0),  # every day at midnight (crontab hours are 0-23)
        'args': ()
    },
    'monthly-task': {
        'task': 'celery_task.tasks.monthly_num',
        # 'schedule': timedelta(seconds=60),
        'schedule': crontab(day_of_month=1, hour=0, minute=0),  # 00:00 on the 1st of every month
        'args': ()
    }
}
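
The beat_schedule above only takes effect if the beat scheduler process runs alongside the worker. A sketch of the two commands, in the Celery 4.x style used elsewhere in this post and assuming the package is named celery_task:

celery worker -A celery_task -l info   # executes the tasks
celery beat -A celery_task -l info     # dispatches the scheduled tasks to the broker at the configured times

(In Celery 5 and later the app option must precede the sub-command, e.g. celery -A celery_task beat -l info.)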

tasks.py

from .celery import app
from bookapi.apps.user import models


@app.task
def recommend_num():
    user_list = models.User.objects.all()
    # print(user_list)   ## <QuerySet [<User: admin>, <User: 18700022899>]>
    for user in user_list:
        models.User.objects.filter(username=user.username).update(recommend_nums=3)


@app.task
def monthly_num():
    user_list = models.User.objects.all()
    # print(user_list)   ## <QuerySet [<User: admin>, <User: 18700022899>]>
    for user in user_list:
        models.User.objects.filter(username=user.username).update(monthly_nums=2)
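
The directory structure above also lists add_task.py and get_result.py, which are not shown in the original code. The following are minimal sketches of what they might contain; the choice of recommend_num and the way the task id is passed around are illustrative assumptions.

add_task.py

# add_task.py - submit tasks manually
from celery_task.tasks import recommend_num

# Immediate task: enqueue right away
result = recommend_num.delay()
print(result.id)   # keep this id to fetch the result later

# Delayed task: run about 10 seconds from now
recommend_num.apply_async(args=(), countdown=10)

get_result.py

# get_result.py - fetch a result by task id
from celery.result import AsyncResult
from celery_task.celery import app

task_id = '...'    # the id printed by add_task.py
result = AsyncResult(id=task_id, app=app)
if result.successful():
    print(result.get())
elif result.failed():
    print('task failed')
else:
    print('task state:', result.state)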
