Not Getting Any Output from Dockerized Celery

  celery, celerybeat, django, docker, python

I have been working from two similar tutorials (here and here), trying to set up a scheduled task for my Django project, but I can't get any output from the Celery services.

I’ve added Celery and Redis to the requirements.txt.

Django==3.0.7
celery==4.4.7
redis==3.5.3

Added the Celery worker (queue), Celery beat (scheduler), and Redis (cache) services to docker-compose.yaml.

version: "3.8"

services:
  cache:
    image: redis:alpine

  database:
    image: "postgis/postgis"
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: ${DATABASE_DEFAULT_USERNAME}
      POSTGRES_PASSWORD: ${DATABASE_DEFAULT_PASSWORD}

  web:
    build: .
    ports:
      - "8000:8000"
    volumes:
      - .:/app
    depends_on:
      - cache
      - database
    env_file:
      - .env

  queue:
    build: .
    command: celery -A api worker -l info
    volumes:
      - .:/app
    links:
      - cache
    depends_on:
      - cache
      - database

  scheduler:
    build: .
    command: celery -A api beat -l info
    volumes:
      - .:/app
    links:
      - cache
    depends_on:
      - cache
      - database
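
One difference I wasn't sure about: web loads .env through env_file, but queue and scheduler don't (and links should be redundant next to depends_on, as far as I know). If anything in settings reads from the environment, I assume the Celery services would need the same stanza (untested sketch):

  queue:
    build: .
    command: celery -A api worker -l info
    volumes:
      - .:/app
    env_file:
      - .env
    depends_on:
      - cache
      - database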

Added the celery.py file to the api/api/ directory.

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'api.settings')

app = Celery('api')

# Read config from Django settings; all Celery settings use the CELERY_ prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
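
For reference, my understanding is that once a worker is running, debug_task can be queued from the Django shell as a sanity check; the request info should then show up in the worker's own log (hypothetical session):

>>> from api.celery import debug_task
>>> debug_task.delay()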

Added the following to api/api/__init__.py.

# Make sure the app is imported when Django starts, so shared_task uses it.
from .celery import app as celery_app

__all__ = ('celery_app',)
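
Since everything hinges on the broker, a connection check that I believe works from the Django shell (sketch; it should raise if Redis at cache:6379 is unreachable):

>>> from api import celery_app
>>> celery_app.connection().ensure_connection(max_retries=1)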

Created a task in api/server/tasks.py.

import logging

from celery import shared_task

logger = logging.getLogger(__name__)


@shared_task
def sample_task():
    logger.debug('The sample task just ran.')
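
One thing I'm second-guessing here: the worker runs with -l info, so even if the task fires, I believe this logger.debug(...) line would be filtered out of the worker log. A more visible variant I could switch to (sketch):

@shared_task
def sample_task():
    # INFO level so the message is visible under `celery ... -l info`
    logger.info('The sample task just ran.')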

And added the following to api/api/settings/base.py (base.py is imported into api/api/settings/__init__.py).

from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://cache:6379'
CELERY_RESULT_BACKEND = 'redis://cache:6379'

CELERY_BEAT_SCHEDULE = {
    'sample_task': {
        'task': 'server.tasks.sample_task',
        'schedule': crontab(),  # every minute
    },
}
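
As I understand it, a bare crontab() means "run every minute". A quick way to confirm the schedule is actually being picked up from Django settings, I believe, is to inspect the app's config from the Django shell (sketch):

>>> from api import celery_app
>>> celery_app.conf.beat_schedule  # I'd expect to see 'sample_task' here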

The cache service starts up fine (logging a screenful of output that ends with "Ready to accept connections"), but the queue and scheduler services don't output anything at all. Running either docker-compose logs -f 'queue' or docker-compose logs -f 'scheduler' prints nothing but the bare attach line, which as far as I can tell suggests no containers exist for those services:

Attaching to

I can import and run sample_task() from the Python console just fine, both directly and through .delay().

>>> from server.tasks import sample_task
>>> sample_task()
2020-09-10 13:47:15,020 server.tasks DEBUG    The sample task just ran.
>>> sample_task.delay()
<AsyncResult: 1594ed42-23d0-4302-8754-ba9097c170bd>
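
(My understanding is that .delay() returning an AsyncResult only proves the message was published to Redis, not that any worker consumed it; a sketch of what I'd check:)

>>> result = sample_task.delay()
>>> result.status  # I assume this stays 'PENDING' if no worker ever picks the task up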

So what am I doing wrong? What did I miss?
