
I run: celeryd --loglevel=INFO

/usr/local/lib/python2.7/dist-packages/celery/loaders/default.py:64: NotConfigured: No 'celeryconfig' module found! Please make sure it exists and is available to Python.
  "is available to Python." % (configname, )))
[2012-03-19 04:26:34,899: WARNING/MainProcess]  
 -------------- celery@ubuntu v2.5.1
---- **** -----
--- * ***  * -- [Configuration]
-- * - **** ---   . broker:      amqp://guest@localhost:5672//
- ** ----------   . loader:      celery.loaders.default.Loader
- ** ----------   . logfile:     [stderr]@INFO
- ** ----------   . concurrency: 4
- ** ----------   . events:      OFF
- *** --- * ---   . beat:        OFF
-- ******* ----
--- ***** ----- [Queues]
 --------------   . celery:      exchange:celery (direct) binding:celery

tasks.py:

# -*- coding: utf-8 -*-
from celery.task import task
@task
def add(x, y):
    return x + y

run_task.py:

# -*- coding: utf-8 -*-
from tasks import add
result = add.delay(4, 4)
print (result)
print (result.ready())
print (result.get())

In the same folder, celeryconfig.py:

CELERY_IMPORTS = ("tasks", )
CELERY_RESULT_BACKEND = "amqp"
BROKER_URL = "amqp://guest:guest@localhost:5672//"
CELERY_TASK_RESULT_EXPIRES = 300

When I run "run_task.py":

On the python console:

eb503f77-b5fc-44e2-ac0b-91ce6ddbf153
False

Errors on the celeryd server:

[2012-03-19 04:34:14,913: ERROR/MainProcess] Received unregistered task of type 'tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://bit.ly/gLye1c for more information.
The full contents of the message body was:
{'retries': 0, 'task': 'tasks.add', 'utc': False, 'args': (4, 4), 'expires': None, 'eta': None, 'kwargs': {}, 'id': '841bc21f-8124-436b-92f1-e3b62cafdfe7'}
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/worker/consumer.py", line 444, in receive_message
    self.strategies[name](message, body, message.ack_log_error)
KeyError: 'tasks.add'

Please explain what the problem is.

Hi, could you please share what the problem was and how you resolved it? The accepted answer doesn't make it clear how others could solve this problem. Thanks. – Jordan Reiter Apr 17, 2012 at 22:30

This fixed it for me. If you're using the celeryd scripts, the worker imports your task module(s) at startup. Even if you then create more task functions or alter existing ones, the worker will keep using its in-memory copies as they were when it read them. – Mark Jul 23, 2013 at 8:19

Note: you can verify that your task is or is not registered by running celery inspect registered. – Nick Brady Mar 8, 2016 at 18:52

You can also start celery with the option --autoreload, which will restart celery each time the code is changed. – Sergey Lyapustin Aug 2, 2016 at 15:09

Unfortunately deprecated. One could use a solution from this link: avilpage.com/2017/05/… – Tomasz Szkudlarek May 17, 2019 at 8:53

I had the same problem: the reason for "Received unregistered task of type..." was that the celeryd service didn't find and register the tasks on service start (by the way, their list is visible when you start ./manage.py celeryd --loglevel=info).

These tasks should be declared in CELERY_IMPORTS = ("tasks", ) in the settings file.
If you have a special celery_settings.py file, it has to be declared on celeryd service start as --settings=celery_settings.py, as digivampire wrote.

Thanks, I actually had the issue because I started celery using ~/path/to/celery/celeryd instead of using the manage.py command! – Antoine Feb 17, 2014 at 10:22

You can see the current list of registered tasks in the celery.registry.TaskRegistry class. It could be that your celeryconfig (in the current directory) is not on PYTHONPATH, so celery can't find it and falls back to defaults. Simply specify it explicitly when starting celery:

celeryd --loglevel=INFO --settings=celeryconfig

You can also set --loglevel=DEBUG and you should probably see the problem immediately.
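For a quick sanity check, here is a minimal sketch of inspecting that registry from a Python shell (celery.registry.tasks is the mapping mentioned in the comments below; on Celery 4+ the equivalent is app.tasks):

# List what this process has actually registered; importing your task
# module is what triggers registration, so 'tasks.add' should then
# appear in the output.
import celery.registry
import tasks

print(sorted(celery.registry.tasks.keys()))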

celeryd is obsolete. Now one should run celery worker, e.g. for Django: celery --app=your_app.celery worker --loglevel=info – andilabs Jan 14, 2016 at 11:25

For me (celery 3.1.23), I had to use celery.registry.tasks to see a list of all of my current tasks. You can always check by running dir(celery.registry). – Nick Brady Sep 30, 2016 at 18:21

Whether you use CELERY_IMPORTS or autodiscover_tasks, the important point is that the tasks can be found, and that the names of the tasks registered in Celery match the names the workers try to fetch.

When you launch Celery, say with celery worker -A project --loglevel=DEBUG, you should see the names of the tasks. For example, here I have a debug_task task in my celery.py:

[tasks]
. project.celery.debug_task
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap

If you can't see your tasks in the list, please check that your celery configuration imports the tasks correctly, whether via --settings, --config, celeryconfig or config_from_object.

If you are using celery beat, make sure the task name, task, you use in CELERYBEAT_SCHEDULE matches the name in the celery task list.
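For example, a minimal sketch of a matching beat entry, using the debug_task name from the [tasks] list above (the schedule value is just an illustration):

# The 'task' value is copied verbatim from the [tasks] list the worker
# prints at startup; any mismatch reproduces the unregistered-task error.
CELERYBEAT_SCHEDULE = {
    'run-debug-task-every-30s': {
        'task': 'project.celery.debug_task',
        'schedule': 30.0,
    },
}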

This was very helpful. The name of the task needs to match the 'task' key in your CELERYBEAT_SCHEDULE. – ss_millionaire Dec 2, 2018 at 1:23

*The important point is that the tasks can be found and the names of the tasks registered in Celery match the names the workers try to fetch.* Good point!!! – Light.G Jan 25, 2019 at 7:09

This is the correct answer. Your task name in the BEAT_SCHEDULER should match whatever shows up in the list of autodiscovered tasks. So if you used @task(name='check_periodically') then it should match what you put in the beat schedule, i.e.: CELERY_BEAT_SCHEDULE = { 'check_periodically': { 'task': 'check_periodically', 'schedule': timedelta(seconds=1) } } – Mormoran Aug 13, 2019 at 14:03

Pass include=['proj.tasks'] when creating your Celery app (a minimal sketch follows below). You also need to go to the top directory, then execute one of these:

celery -A app.celery_module.celeryapp worker --loglevel=info
celery -A celeryapp worker --loglevel=info

In your celeryconfig.py, set imports = ("path.path.tasks",).

And invoke the task from another module!
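For reference, a minimal sketch of that include parameter, using the hypothetical module paths from the commands above:

# Hypothetical layout: app/celery_module/celeryapp.py defines the app and
# the tasks live in proj/tasks.py; 'include' takes absolute import paths,
# which is why the worker must be started from the top directory.
from celery import Celery

celeryapp = Celery(
    'proj',
    broker='amqp://guest@localhost:5672//',
    include=['proj.tasks'],
)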

The include param needs to be added if you're using relative imports. I solved my issue by adding it. – CK.Nguyen Sep 28, 2018 at 11:32

This should be the accepted answer; you need to call the worker from the top dir so that the path in the celery launch command matches the import path in the client. – Edward Gaere Feb 13, 2022 at 20:34

I don't understand this at all. Is there any chance of a code sample, or of explaining what 'proj.tasks' means? Are you giving the root folder name where settings.py is, or the app where tasks.py is held? – codyc4321 Sep 4, 2022 at 19:00

Using --settings did not work for me. I had to use the following to get it all to work:

celery --config=celeryconfig --loglevel=INFO

Here is the celeryconfig file that has the CELERY_IMPORTS added:

# Celery configuration file
BROKER_URL = 'amqp://'
CELERY_RESULT_BACKEND = 'amqp://'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'America/Los_Angeles'
CELERY_ENABLE_UTC = True
CELERY_IMPORTS = ("tasks",)

My setup was a little trickier because I'm using supervisor to launch celery as a daemon.

What worked for me was to add an explicit name to the celery task decorator. I changed my task declaration from @app.task to @app.task(name='module.submodule.task').

Here is an example

At first my task looked like this:

# tasks/test_tasks.py
@celery.task
def test_task():
    print("Celery Task  !!!!")

I changed it to:

# tasks/test_tasks.py
@celery.task(name='tasks.test_tasks.test_task')
def test_task():
    print("Celery Task  !!!!")

This method is helpful when you don't have a dedicated tasks.py file to include in the celery config.

This also worked for me, but not if I indicated the full path in the name kwarg; only if I just copied the name, so just celery.task(name='test_task'). Stupid, but it worked. Trying to figure out why. – Chris Oct 5, 2021 at 15:11

In my case the issue was that my project was not picking up autodiscover_tasks properly.

In the celery.py file, the code for autodiscover_tasks was:

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

I changed it to the following:

from django.apps import apps
app.autodiscover_tasks(lambda: [n.name for n in apps.get_app_configs()])

Best wishes to you.

I had this problem mysteriously crop up when I added some signal handling to my django app. In doing so I converted the app to use an AppConfig, meaning that instead of simply reading as 'booking' in INSTALLED_APPS, it read 'booking.app.BookingConfig'.

Celery doesn't understand what that means, so I added INSTALLED_APPS_WITH_APPCONFIGS = ('booking',) to my django settings, and modified my celery.py from

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

to

app.autodiscover_tasks(
    lambda: settings.INSTALLED_APPS + settings.INSTALLED_APPS_WITH_APPCONFIGS
)

I had the same problem running tasks from Celery Beat. Celery doesn't like relative imports so in my celeryconfig.py, I had to explicitly set the full package name:

app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'full.path.to.add',
        'schedule': 30.0,
        'args': (16, 16),
    },
}

I wish the celery docs had more examples with full package names. After seeing full.path.to.add in this answer, I found out I did not need the imports. I knew the solution was simple, and just needed a better example of app.conf.beat_schedule. – zerocog Aug 11, 2017 at 17:26

Try importing the Celery task in a Python Shell - Celery might silently be failing to register your tasks because of a bad import statement.

I had an ImportError exception in my tasks.py file that was causing Celery to not register the tasks in the module. All other module tasks were registered correctly.

This error wasn't evident until I tried importing the Celery task within a Python Shell. I fixed the bad import statement and then the tasks were successfully registered.
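A minimal sketch of that check, run from the project root (tasks is just the module name from the question; substitute your own):

# Any traceback printed by the import is the same failure that silently
# kept the worker from registering the module's tasks.
>>> import tasks
>>> tasks.add  # the decorated task object should now be accessible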

This, strangely, can also be because of a missing package. Run pip to install all necessary packages: pip install -r requirements.txt

autodiscover_tasks wasn't picking up tasks that used missing packages.

I had a similar issue. I think what happens is an exception during import causes parts of the auto-discovery to not complete. – Tim Tisdall Dec 19, 2018 at 14:42

I did not have any issues with Django, but encountered this when I was using Flask. The solution was setting the config option:

celery worker -A app.celery --loglevel=DEBUG --config=settings

while with Django, I just had:

python manage.py celery worker -c 2 --loglevel=info
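For context, here is a minimal sketch of the Flask-side wiring that the Flask command above assumes (app.py exposing a module-level celery object, settings.py as the config module; the file names and broker URL are assumptions, not from the original answer):

# Hypothetical app.py: 'app.celery' in the worker command resolves to this
# object, and --config=settings loads the CELERY_* options from settings.py.
from celery import Celery
from flask import Flask

flask_app = Flask(__name__)
celery = Celery(flask_app.import_name, broker='amqp://guest@localhost:5672//')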

I encountered this problem as well, but it is not quite the same, so just FYI. Recent upgrades cause this error message due to this decorator syntax:

ERROR/MainProcess] Received unregistered task of type 'my_server_check'.

@task('my_server_check')

It had to be changed to just

@task()

No clue why.
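If you still need an explicit name, my understanding (an assumption, not something from the original answer) is that it must be passed as a keyword argument, since a bare positional argument is where the decorator expects the function being wrapped:

# Assumed-correct keyword form; shared_task is used here as a stand-in
# for whatever task decorator the project actually imports.
from celery import shared_task

@shared_task(name='my_server_check')
def my_server_check():
    ...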

Then, in your app config, import the tasks inside a try block in the ready method, like this:

from django.apps import AppConfig

class MyAppConfig(AppConfig):
    name = 'apps.myapp'

    def ready(self):
        try:
            import apps.myapp.signals  # noqa: F401
            import apps.myapp.tasks  # noqa: F401
        except ImportError:
            pass

As some other answers have already pointed out, there are many reasons why celery would silently ignore tasks, including dependency issues but also any syntax or code problem.

One quick way to find them is to run:

./manage.py check

Many times, after fixing the errors that are reported, the tasks are recognized by celery.

If you are using docker or docker-compose, this is the answer. Rebuild; for some reason it doesn't work quite right otherwise. I have my suspicions why, but not the time to explore them. Not just restart: rebuild. – ThatGuyRob Dec 7, 2021 at 20:10

Probably, your app context and celery worker's context don't match. Using celery with 3 different frameworks taught me the real reason. :D – AgE Jun 14, 2022 at 17:26

For me, restarting the broker (Redis) solved it.

The task already showed up correctly in Celery's task list and all relevant Django settings and imports worked fine.

My broker was running before I wrote the task, and restarting Celery and Django alone didn't solve it.

However, stopping Redis with Ctrl+C and then restarting it with redis-server helped Celery to correctly identify the task.

If you are running into this kind of error, there are a number of possible causes, but the solution I found was that my celeryd config file in /etc/defaults/celeryd was configured for standard use, not for my specific django project. As soon as I converted it to the format specified in the celery docs, all was well.

Only the latter command was showing task names at all.

I have also tried adding a CELERY_APP line to /etc/default/celeryd, but that didn't work either:

CELERY_APP="tasks"

I had the issue with PeriodicTask classes in django-celery: while their names showed up fine when starting the celery worker, every execution triggered:

KeyError: u'my_app.tasks.run'

My task was a class named 'CleanUp', not just a method called 'run'.

When I checked the 'djcelery_periodictask' table I saw outdated entries, and deleting them fixed the issue.
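A hedged sketch of that cleanup via the django-celery ORM rather than raw SQL (the task name is the one from the KeyError above):

# django-celery exposes the djcelery_periodictask table as a model; removing
# the stale row whose registered task no longer exists stops the KeyError.
from djcelery.models import PeriodicTask

PeriodicTask.objects.filter(task='my_app.tasks.run').delete()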

Just to add my two cents for my case with this error...

My path is /vagrant/devops/test with app.py and __init__.py in it.

When I run cd /vagrant/devops/ && celery worker -A test.app.celery --loglevel=info, I get this error.

But when I run it like cd /vagrant/devops/test && celery worker -A app.celery --loglevel=info everything is OK.

I found that one of our programmers had added the following line to one of the imported modules:

os.chdir(<path_to_a_local_folder>)

This caused the Celery worker to change its working directory from the project's default working directory (where it could find the tasks) to a different directory (where it couldn't find the tasks).

After removing this line of code, all tasks were found and registered.

Celery doesn't support relative imports, so in my celeryconfig.py you need an absolute import:

from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'add_num': {
        'task': 'app.tasks.add_num.add_nums',
        'schedule': timedelta(seconds=10),
        'args': (1, 2),
    },
}

An additional item to a really useful list.

I have found Celery unforgiving of errors in tasks (or at least I haven't been able to trace the appropriate log entries): it simply doesn't register them. I have had a number of issues with running Celery as a service, predominantly permissions related.

The latest was related to permissions for writing to a log file. I had no issues in development or running celery at the command line, but the service reported the task as unregistered.

I needed to change the log folder permissions to enable the service to write to it.

My 2 cents

I was getting this in a docker image using alpine. The django settings referenced /dev/log for logging to syslog. The django app and celery worker were both based on the same image. The entrypoint of the django app image was launching syslogd on start, but the one for the celery worker was not. This was causing things like ./manage.py shell to fail because there wouldn't be any /dev/log. The celery worker was not failing; instead, it silently ignored the rest of the app launch, which included loading shared_task entries from applications in the django project.
