{ "info": { "author": "Stefan Talpalaru", "author_email": "stefantalpalaru@yahoo.com", "bugtrack_url": null, "classifiers": [ "Development Status :: 5 - Production/Stable", "License :: OSI Approved :: BSD License", "Operating System :: POSIX", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: Implementation :: CPython", "Topic :: Software Development :: Object Brokering", "Topic :: System :: Distributed Computing" ], "description": "problem\n-------\n\nThe Celery daemon needs to be restarted every time an existing task is\nmodified or new tasks are added.\n\nsolution\n--------\n\nUse only one Celery task that's generic enough to run arbitrary Python\nfunctions with arbitrary arguments and wrap this task in a custom\ndecorator.\n\nAs a bonus, the job will still work (synchronously) when Celery/the\nbroker are not running.\n\nThe job's calling API is the same as Celery's: .s(), .delay() and\n.apply\\_async()\n\nexample\n-------\n\n- in celeryapp.py which is in the same directory as celeryconfig.py,\n create a Celery app and then import the custom decorator:\n\n.. code:: python\n\n import celery\n import os\n import sys\n\n sys.path.insert(0, os.path.join(os.path.dirname(__file__), '.'))\n #os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings')\n app = celery.Celery()\n app.config_from_object('celeryconfig')\n\n from generic_celery_task.decorators import task\n\n- start Celery along with its broker. You no longer need to restart\n Celery after this.\n\n- in another file, importable from celeryapp.py, create your task:\n\n.. code:: python\n\n from celeryapp import task\n\n @task\n def job(x, y):\n return 'x + y = %d' % (x + y)\n\n- now use it:\n\n.. code:: python\n\n import celery\n\n # direct function call\n assert job(1, 2) == 'x + y = 3'\n\n # using the delay() method\n res = job.delay(1, 2)\n # if the Celery daemon and its backend broker are running, 'res' is an instance of AsyncResult\n if isinstance(res, celery.result.AsyncResult):\n assert res.get() == 'x + y = 3'\n else:\n # if either one is not running, a warning is issued and the function is executed synchronously\n # with 'res' being the actual function result\n # if you want to silence the warnings, use '@task(quiet=True)'\n assert res == 'x + y = 3'\n\n # using the apply_async() method\n res = job.apply_async(args=[1, 2])\n # and process the result as above, if you need to\n\ninstallation\n------------\n\nA setup.py is provided. You know what to do with it.\n\ntesting\n-------\n\nThe tests require nose, redis, redis-py and assume that the port 6389 is\nfree.\n\nRun the tests with \"python setup.py test\" or with \"nosetests -v\".\n\nThis package was tested with python-2.7.6, python-3.3.4, nose-1.3.0,\ncelery-3.1.10, redis-2.8.7 and redis-py-2.9.1 .\n\ncaveats\n-------\n\n- the module which holds your custom task will be reloaded. If it\n contains a class using 'super' and its instance, you might run into\n the problem described\n `here `__.\n Apply one of the proposed fixes.\n\n- the state of the Celery daemon and its broker are checked only once,\n when the first .delay() or .apply\\_async() method is called on a\n custom task.\n\ncredits\n-------\n\n- author: Stefan Talpalaru stefantalpalaru@yahoo.com\n\n- homepage: https://github.com/stefantalpalaru/generic\\_celery\\_task", "description_content_type": null, "docs_url": null, "download_url": "UNKNOWN", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/stefantalpalaru/generic_celery_task", "keywords": null, "license": "BSD", "maintainer": null, "maintainer_email": null, "name": "generic_celery_task", "package_url": "https://pypi.org/project/generic_celery_task/", "platform": "UNKNOWN", "project_url": "https://pypi.org/project/generic_celery_task/", "project_urls": { "Download": "UNKNOWN", "Homepage": "https://github.com/stefantalpalaru/generic_celery_task" }, "release_url": "https://pypi.org/project/generic_celery_task/0.2/", "requires_dist": null, "requires_python": null, "summary": "A workaround for the lack of dynamic tasks in Celery", "version": "0.2" }, "last_serial": 1049466, "releases": { "0.2": [ { "comment_text": "", "digests": { "md5": "d53611ee3c98f5e6ab39aaa6e8bcfd46", "sha256": "96f3530da8b3cf073ebdc17a92334e8e1bbbc3980deb9a6964ad10acb3e763d0" }, "downloads": -1, "filename": "generic_celery_task-0.2.tar.gz", "has_sig": false, "md5_digest": "d53611ee3c98f5e6ab39aaa6e8bcfd46", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5638, "upload_time": "2014-04-02T20:08:29", "url": "https://files.pythonhosted.org/packages/b9/91/1540e6cbabed30f51b580e4800a51ae4e4c91846d545f459e93ab9e21912/generic_celery_task-0.2.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "d53611ee3c98f5e6ab39aaa6e8bcfd46", "sha256": "96f3530da8b3cf073ebdc17a92334e8e1bbbc3980deb9a6964ad10acb3e763d0" }, "downloads": -1, "filename": "generic_celery_task-0.2.tar.gz", "has_sig": false, "md5_digest": "d53611ee3c98f5e6ab39aaa6e8bcfd46", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5638, "upload_time": "2014-04-02T20:08:29", "url": "https://files.pythonhosted.org/packages/b9/91/1540e6cbabed30f51b580e4800a51ae4e4c91846d545f459e93ab9e21912/generic_celery_task-0.2.tar.gz" } ] }