Start Redis worker automatically

I am using Celery with Redis and it is working fine, but the problem is that I have to start the worker manually. Is there any way to start the 'Redis worker' automatically, without doing it by hand?

There is a whole section in the Celery documentation about it.
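In particular, the Daemonization section shows how to run the worker as a service. As a minimal sketch, assuming a systemd-based Linux and a Celery app called proj (the app name, user paths, and binary location are placeholders, not from the question):

[Unit]
Description=Celery worker
After=network.target redis.service

[Service]
WorkingDirectory=/opt/proj
ExecStart=/usr/local/bin/celery -A proj worker --loglevel=info
Restart=always

[Install]
WantedBy=multi-user.target

Saved as /etc/systemd/system/celery.service and enabled with systemctl enable --now celery.service, the worker then starts at boot with no manual step.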

Related

Is it safe to run SCRIPT FLUSH on Redis cluster?

Recently, I started to have some trouble with one of my Redis clusters: used_memory and used_memory_rss were increasing constantly.
According to some Googling, I found the following discussion:
https://github.com/antirez/redis/issues/4570
Now I am wondering whether it is safe to run the SCRIPT FLUSH command on my production Redis cluster.
Yes - you can run the SCRIPT FLUSH command safely in a production cluster. The only potential side effect is blocking the server while it executes. Note, however, that you'll want to call it on each of your nodes.
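Since Redis 5, redis-cli can broadcast a command to every node in the cluster for you; the address below is a placeholder for any one of your nodes:

redis-cli --cluster call 127.0.0.1:7000 SCRIPT FLUSH

On older versions, loop over the nodes yourself and issue SCRIPT FLUSH against each one individually.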

Airflow celery with redis - timeout after 6h

I'm having some trouble using Airflow 1.9.0 with the CeleryExecutor and Redis as the broker.
I need to run a job that takes more than 6 hours to complete, and I'm losing my Celery workers.
Looking into the Airflow code on GitHub, there is a hard-coded configuration:
https://github.com/apache/incubator-airflow/blob/d760d63e1a141a43a4a43daee9abd54cf11c894b/airflow/config_templates/default_celery.py#L31
How can I work around this problem?
This is configurable in airflow.cfg under the section celery_broker_transport_options.
See the commit that added this option: https://github.com/apache/incubator-airflow/commit/be79f87f36b6b99649e0a1f6ab92b41640b3beaa
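With that in place, raising the visibility timeout beyond the hard-coded 6 hours (21600 seconds) is just a config change in airflow.cfg; the 86400 value below is an example, not a recommendation:

[celery_broker_transport_options]
visibility_timeout = 86400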

Redis still has data after Celery run

I have set up a Celery task that uses RabbitMQ as the broker and Redis as the backend. After running it, I noticed that my Redis server was still using a lot of memory. Upon inspection I found that there were still keys for each task that had been created.
Is there a way to get Celery to clean up these keys once the response has been received? I know some message brokers use acks; is there an equivalent for a Redis backend in Celery?
Yes, use result_expires. Please note that celery beat should run as well, as written in the documentation:
A built-in periodic task will delete the results after this time (celery.backend_cleanup), assuming that celery beat is enabled. The task runs daily at 4am.
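As a sketch, assuming a Celery app instance named app (the one-hour value is arbitrary):

app.conf.result_expires = 3600  # delete task results after one hour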
Unfortunately Celery doesn't have acks for its backend, so the best solution for my project was to call forget() on my results after I was done with them.
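In code that looks roughly like this (my_task is a placeholder task):

result = my_task.delay()
value = result.get(timeout=10)
result.forget()  # removes the result key from the Redis backend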

Celery works without broker and backend running

I'm running Celery on my laptop, with RabbitMQ as the broker and Redis as the backend. I just used all the default settings and ran celery -A tasks worker --loglevel=info, and it all worked. The workers get jobs done and I can fetch the execution results by calling result.get(). My question is: why does it work even though I never started the RabbitMQ and Redis servers at all? I did not set up accounts on the servers either. In many tutorials, the first step is to start the broker and backend servers before starting Celery.
I'm new to these tools and do not quite understand how they work behind the scenes. Any input would be greatly appreciated. Thanks in advance.
Never mind. I just realized that Redis and RabbitMQ start automatically after installation or at system startup. They must be running for Celery to work.
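To verify this on a systemd-based Linux, something like the following shows whether both servers are already up (service names vary by distribution):

systemctl status redis-server rabbitmq-server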

Trying to start redis and resque scheduler within a rake task

I want to start Redis and resque-scheduler from a rake task, so I'm doing the following:
namespace :raketask do
  task :start do
    system("QUEUE=* rake resque:work &")
    system("rake redis:start")
    system("rake resque:scheduler")
  end
end
The problem is that Redis starts in the foreground, and then this never kicks off the scheduler. It won't start in the background (using &). The scheduler must be started AFTER Redis is up and running.
Similar to nirvdrum's answer: the Resque workers are going to fail/quit if Redis isn't already running and accepting connections.
Check out this gist for an example of how to get things started with monit (Linux).
Monit allows one service to be dependent on another, and makes sure they stay alive by monitoring a .pid file.
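As a rough sketch of that idea in monit's syntax (paths and start commands are placeholders, not from the gist, and each program is assumed to daemonize and write the pidfile monit watches):

check process redis with pidfile /var/run/redis.pid
  start program = "/usr/bin/redis-server /etc/redis/redis.conf"
  stop program = "/usr/bin/redis-cli shutdown"

check process resque_scheduler with pidfile /var/run/resque_scheduler.pid
  start program = "/bin/bash -c 'cd /opt/app && rake resque:scheduler'"
  depends on redis

The depends on line is what enforces the ordering you need: monit will not start the scheduler until the redis process it monitors is up.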
That strikes me as not a great idea. You should have your redis server started via an init script or something. But, if you really want to go this way, you probably need to modify your redis:start task to use nohup and background the process so you can disconnect from the TTY and keep the process running.
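For illustration, a hypothetical redis:start task along those lines might look like this (the config and log paths are assumptions):

namespace :redis do
  task :start do
    # nohup plus & detaches redis-server from the TTY so the rake task can continue
    system("nohup redis-server /etc/redis/redis.conf > redis.log 2>&1 &")
    sleep 2 # crude wait so Redis is accepting connections before the scheduler starts
  end
end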