
Error Message "No handlers could be found for logger 'multiprocessing'" Using Celery

RabbitMQ now seems to be working correctly. However, when I run `python -m celery.bin.celeryd --loglevel=INFO` (plain `celeryd` doesn't work), I get the error: No handlers could be found for logger "multiprocessing".

Solution 1:

You need to ensure that all processes started from the main process also set up logging correctly. Here's a post which discusses how best to do logging with multiprocessing. Although that discusses features which came in with Python 3.2, you can also get this functionality for earlier Python versions - see this other post.

Update: The point is that each process needs to initialise logging, and you need to arrange this in your code, in a similar way to what is done in the example in the first post: see listener_configurer and worker_configurer. You can use logutils to help you, but the main thing is to realise that each process needs to configure its own logging if you are to avoid the "no handlers could be found" message.
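As a minimal sketch of the idea (the names worker and worker_configurer here are hypothetical, loosely modelled on the configurer functions mentioned above): each child process calls a configurer that attaches a handler to its own root logger, because handlers set up in the parent are not automatically available in children.

```python
import logging
import multiprocessing


def worker_configurer():
    # Each process must attach its own handler; without this, log calls
    # in the child trigger "no handlers could be found" on Python 2.
    root = logging.getLogger()
    handler = logging.StreamHandler()
    handler.setFormatter(
        logging.Formatter("%(processName)s %(levelname)s %(message)s"))
    root.addHandler(handler)
    root.setLevel(logging.INFO)


def worker(configurer):
    configurer()  # configure logging inside the child process itself
    logging.getLogger("multiprocessing").info("logging is configured")


if __name__ == "__main__":
    p = multiprocessing.Process(target=worker, args=(worker_configurer,))
    p.start()
    p.join()
```

The same pattern scales to a pool of workers: pass the configurer (or its arguments, such as a queue for a QueueHandler) to every child at start-up.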

Solution 2:

There seems to be something strange in the way celery initializes logging. If I put the switch --logfile=yourfile.log on the command line, it works reliably, but reading the logging settings from the config file gives unreliable results, including the behaviour you're seeing.

Solution 3:

Make sure the log file directory exists and the process has write permission to it.

I faced a similar issue; after creating the log file directory, the problem went away.
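A small sketch of that pre-flight check (the path "/var/log/myapp" and the helper name ensure_log_dir are placeholders, not part of Celery's API):

```python
import os


def ensure_log_dir(path):
    # Create the log directory if it is missing, then verify the
    # current process can actually write to it before celeryd starts.
    if not os.path.isdir(path):
        os.makedirs(path)
    if not os.access(path, os.W_OK):
        raise PermissionError("log directory is not writable: %s" % path)
    return path


# Example placeholder path; adjust to wherever --logfile points.
LOG_DIR = "/var/log/myapp"
```

Running this check at start-up turns a silent logging failure into an explicit error.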
