Celery 2.2 Documentation (505 pages, 878.66 KB, 1 year ago)
… SQLAlchemy [http://www.sqlalchemy.org/] or the Django ORM [http://djangoproject.com/] is also available. Celery is easy to integrate with Django [http://djangoproject.com/] and Pylons [http://pylonshq.com/] … you can only fetch the result of a task once (as it is sent as a message). For a list of available backends and related options, see Task result backend settings. … 3. Finally, we list the modules the worker should import, including tasks.py, which we added earlier: CELERY_IMPORTS = ("tasks", ) That's it. There are more options available, such as how many processes to use for processing work in parallel (the CELERY_CONCURRENCY setting) …
Celery 2.3 Documentation (530 pages, 900.64 KB, 1 year ago)
… SQLAlchemy [http://www.sqlalchemy.org/] or the Django ORM [http://djangoproject.com/] is also available. Celery is easy to integrate with Django [http://djangoproject.com/] and Pylons [http://pylonshq.com/] … you can only fetch the result of a task once (as it is sent as a message). For a list of available backends and related options, see Task result backend settings. … 3. Finally, we list the modules the worker should import, including tasks.py, which we added earlier: CELERY_IMPORTS = ("tasks", ) That's it. There are more options available, such as how many processes to use for processing work in parallel (the CELERY_CONCURRENCY setting) …
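The CELERY_IMPORTS and CELERY_CONCURRENCY settings named in the 2.2/2.3 excerpts above live in the project's configuration module. The following is a minimal sketch of such a module; the concurrency value is an illustrative assumption, not a documented default.

```python
# celeryconfig.py -- minimal Celery 2.x configuration sketch based on the
# settings named in the excerpts above. Values are illustrative assumptions.

# Modules the worker imports at start-up so it can find task definitions
# (here, the tasks.py module the tutorial adds).
CELERY_IMPORTS = ("tasks",)

# How many worker processes handle tasks in parallel (assumed value).
CELERY_CONCURRENCY = 4
```

The worker reads this module at start-up, so any task module listed in CELERY_IMPORTS must be importable from the worker's working directory.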
Celery 3.1 Documentation (887 pages, 1.22 MB, 1 year ago)
… app = Celery('hello', broker='amqp://guest@localhost//') @app.task def hello(): return 'hello world' … Highly Available: workers and clients will automatically retry in the event of connection loss or failure, and … migrate tasks to a new broker; see a list of event message types; contribute to Celery; learn about available configuration settings; receive email when a task fails; get a list of people and companies using … $ pip install "celery[librabbitmq]" $ pip install "celery[librabbitmq,redis,auth,msgpack]" The following bundles are available: Serializers: celery[auth], for using the auth serializer; celery[msgpack], for using the msgpack …
Celery 3.1 Documentation (607 pages, 2.27 MB, 1 year ago)
… app = Celery('hello', broker='amqp://guest@localhost//') @app.task def hello(): return 'hello world' … • Highly Available: workers and clients will automatically retry in the event of connection loss or failure, and … • migrate tasks to a new broker • see a list of event message types • contribute to Celery • learn about available configuration settings • receive email when a task fails • get a list of people and companies using … $ pip install "celery[librabbitmq]" $ pip install "celery[librabbitmq,redis,auth,msgpack]" The following bundles are available: Serializers • celery[auth]: for using the auth serializer • celery[msgpack]: for using the msgpack …
Celery 2.4 Documentation (543 pages, 957.42 KB, 1 year ago)
… SQLAlchemy [http://www.sqlalchemy.org/] or the Django ORM [http://djangoproject.com/] is also available. Celery is easy to integrate with Django [http://djangoproject.com/], Pylons [http://pylonshq.com/] … can be used to install Celery and the dependencies for a given feature. The following bundles are available: celery-with-redis, for using Redis as a broker; celery-with-mongodb, for using MongoDB as a … For a description of broker URLs and a full list of the various broker configuration options available to Celery, see Broker Settings. Installing the RabbitMQ Server: see Installing RabbitMQ [http://www …
Celery 2.3 Documentation (334 pages, 1.25 MB, 1 year ago)
… for Redis, Beanstalk, MongoDB, CouchDB, and databases (using SQLAlchemy or the Django ORM) is also available. Celery is easy to integrate with Django, Pylons, and Flask, using the django-celery, celery-pylons … you can only fetch the result of a task once (as it is sent as a message). For a list of available backends and related options, see Task result backend settings. … 3. Finally, we list the modules the worker should import, including tasks.py, which we added earlier: CELERY_IMPORTS = ("tasks", ) That's it. There are more options available, such as how many processes to use for processing work in parallel (the CELERY_CONCURRENCY setting) …
Celery 2.2 Documentation (314 pages, 1.26 MB, 1 year ago)
… for Redis, Beanstalk, MongoDB, CouchDB, and databases (using SQLAlchemy or the Django ORM) is also available. Celery is easy to integrate with Django, Pylons, and Flask, using the django-celery, celery-pylons … you can only fetch the result of a task once (as it is sent as a message). For a list of available backends and related options, see Task result backend settings. … 3. Finally, we list the modules the worker should import, including tasks.py, which we added earlier: CELERY_IMPORTS = ("tasks", ) That's it. There are more options available, such as how many processes to use for processing work in parallel (the CELERY_CONCURRENCY setting) …
Celery 2.5 Documentation (647 pages, 1011.88 KB, 1 year ago)
… SQLAlchemy [http://www.sqlalchemy.org/] or the Django ORM [http://djangoproject.com/] is also available. Celery is easy to integrate with web frameworks, some of which even have integration packages: … can be used to install Celery and the dependencies for a given feature. The following bundles are available: celery-with-redis, for using Redis as a broker; celery-with-mongodb, for using MongoDB as a … For a description of broker URLs and a full list of the various broker configuration options available to Celery, see Broker Settings. Installing the RabbitMQ Server: see Installing RabbitMQ [http://www …
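The "Broker Settings" referenced in the 2.4/2.5 excerpts above configure the broker through a single URL of the form transport://user:password@host:port/virtual_host. The sketch below assumes a local RabbitMQ server with its stock guest account and the "/" virtual host; those specifics are assumptions, not values taken from the excerpts.

```python
# Broker URL sketch for Celery 2.4/2.5 (see "Broker Settings" in the docs).
# Assumes a local RabbitMQ server with the default guest:guest account and
# the default "/" virtual host (URL-encoded as the trailing "//").
BROKER_URL = "amqp://guest:guest@localhost:5672//"
```

Swapping the transport scheme (for example to redis://) is how the same setting points Celery at one of the other supported brokers.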
Celery 1.0 Documentation (123 pages, 400.69 KB, 1 year ago)
… webhooks. The recommended message broker is RabbitMQ, but support for Redis and databases is also available. 1.1.1 Overview: this is a high-level overview of the architecture. The broker pushes tasks to … In this example we don't want to store the results of the tasks, so we'll use the simplest backend available, the AMQP backend: CELERY_RESULT_BACKEND = "amqp" 3. Finally, we list the modules to import … Getting Started, Celery Documentation, Release 1.0.6 (stable). That's it. There are more options available, such as how many processes to use for processing work in parallel (the CELERY_CONCURRENCY setting) …
Celery 1.0 Documentation (221 pages, 283.64 KB, 1 year ago)
… The recommended message broker is RabbitMQ [http://www.rabbitmq.com/], but support for Redis and databases is also available. Overview: this is a high-level overview of the architecture. http://cloud.github.com/downloa… In this example we don't want to store the results of the tasks, so we'll use the simplest backend available, the AMQP backend: CELERY_RESULT_BACKEND = "amqp" … Finally, we list the modules to import, including tasks.py, which we added earlier: CELERY_IMPORTS = ("tasks", ) That's it. There are more options available, such as how many processes to use for processing work in parallel (the CELERY_CONCURRENCY setting) …
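The two Celery 1.0 settings quoted above sit together in one configuration module. The sketch below combines them, with the caveat noted in the 2.x excerpts that an AMQP-backed result can be fetched only once, because it is delivered as a message.

```python
# Celery 1.0-style configuration sketch combining the settings quoted above.

# Store task results via the AMQP backend; each result can be fetched only
# once, since it is delivered to the caller as a message.
CELERY_RESULT_BACKEND = "amqp"

# Modules to import so the worker finds tasks.py, which defines the tasks.
CELERY_IMPORTS = ("tasks",)
```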
共 51 条
- 1
- 2
- 3
- 4
- 5
- 6













