{ "info": { "author": "Horst Gutmann", "author_email": "horst@zerokspot.com", "bugtrack_url": null, "classifiers": [ "Development Status :: 3 - Alpha", "Environment :: Console", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3 :: Only" ], "description": "==========================\ncelery-prometheus-exporter\n==========================\n\n.. image:: https://img.shields.io/docker/automated/zerok/celery-prometheus-exporter.svg?maxAge=2592000\n :target: https://hub.docker.com/r/zerok/celery-prometheus-exporter/\n\ncelery-prometheus-exporter is a little exporter for Celery related metrics in\norder to get picked up by Prometheus. As with other exporters like\nmongodb\\_exporter or node\\_exporter this has been implemented as a\nstandalone-service to make reuse easier across different frameworks.\n\nSo far it provides access to the following metrics:\n\n* ``celery_tasks`` exposes the number of tasks currently known to the queue\n grouped by ``state`` (RECEIVED, STARTED, ...).\n* ``celery_tasks_by_name`` exposes the number of tasks currently known to the queue\n grouped by ``name`` and ``state``.\n* ``celery_workers`` exposes the number of currently probably alive workers\n* ``celery_task_latency`` exposes a histogram of task latency, i.e. the time until\n tasks are picked up by a worker\n* ``celery_tasks_runtime_seconds`` tracks the number of seconds tasks take\n until completed as histogram\n\n\nHow to use\n==========\n\nThere are multiple ways to install this. 
The obvious one is using ``pip install\ncelery-prometheus-exporter`` and then using the ``celery-prometheus-exporter``\ncommand::\n\n $ celery-prometheus-exporter\n Starting HTTPD on 0.0.0.0:8888\n\nThis package only depends on Celery directly, so you will have to install\nwhatever other dependencies you need for it to speak with your broker \ud83d\ude42\n\nCelery workers have to be configured to send task-related events:\nhttp://docs.celeryproject.org/en/latest/userguide/configuration.html#worker-send-task-events.\n\nRunning ``celery-prometheus-exporter`` with the ``--enable-events`` argument\nwill periodically enable events on the workers. This is useful because it\nallows running Celery workers with events disabled until\n``celery-prometheus-exporter`` is deployed, at which time events get enabled\non the workers.\n\nAlternatively, you can use the bundled Makefile and Dockerfile to generate a\nDocker image.\n\nBy default, the HTTPD will listen at ``0.0.0.0:8888``. If you want the HTTPD\nto listen on another address or port, use the ``--addr`` option or the\nenvironment variable ``DEFAULT_ADDR``.\n\nBy default, this will expect the broker to be available through\n``redis://redis:6379/0``, although you can change this via the environment\nvariable ``BROKER_URL``. If you're using AMQP or something other than\nRedis, take a look at the Celery documentation and install the additional\nrequirements \ud83d\ude0a You can also use the ``--broker`` option to specify a\ndifferent broker URL.\n\nIf you need to pass additional options to your broker's transport, use the\n``--transport-options`` option. The value is parsed as a JSON object into a dict,\ne.g. to set the master name when using Redis Sentinel for broker discovery:\n``--transport-options '{\"master_name\": \"mymaster\"}'``\n\nUse ``--tz`` to specify the timezone the Celery app is using.\n
Otherwise, the\nsystem's local time will be used.\n\nBy default, the histogram buckets are the same as the default ones in the Prometheus client:\nhttps://github.com/prometheus/client_python#histogram.\nThis means they are intended to cover typical web/RPC requests ranging from milliseconds to seconds,\nso you may want to customize them.\nThis can be done via the environment variable ``RUNTIME_HISTOGRAM_BUCKETS`` for task runtimes and\nvia the environment variable ``LATENCY_HISTOGRAM_BUCKETS`` for task latencies.\nBuckets should be passed as a comma-separated list of float values,\ne.g. ``\".005, .05, 0.1, 1.0, 2.5\"``.\n\nUse ``--queue-list`` to specify the list of queues whose lengths will be\nmonitored (automatic discovery of queues isn't supported right now; see the\nlimitations/caveats below). You can use the ``QUEUE_LIST`` environment variable as well.\n\nIf you then look at the exposed metrics, you should see something like this::\n\n $ http get http://localhost:8888/metrics | grep celery_\n # HELP celery_workers Number of alive workers\n # TYPE celery_workers gauge\n celery_workers 1.0\n # HELP celery_tasks Number of tasks per state\n # TYPE celery_tasks gauge\n celery_tasks{state=\"RECEIVED\"} 3.0\n celery_tasks{state=\"PENDING\"} 0.0\n celery_tasks{state=\"STARTED\"} 1.0\n celery_tasks{state=\"RETRY\"} 2.0\n celery_tasks{state=\"FAILURE\"} 1.0\n celery_tasks{state=\"REVOKED\"} 0.0\n celery_tasks{state=\"SUCCESS\"} 8.0\n # HELP celery_tasks_by_name Number of tasks per state\n # TYPE celery_tasks_by_name gauge\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"RECEIVED\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"PENDING\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"STARTED\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"RETRY\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"FAILURE\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"REVOKED\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.calculate_something\",state=\"SUCCESS\"} 1.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"RECEIVED\"} 3.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"PENDING\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"STARTED\"} 1.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"RETRY\"} 2.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"FAILURE\"} 1.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"REVOKED\"} 0.0\n celery_tasks_by_name{name=\"my_app.tasks.fetch_some_data\",state=\"SUCCESS\"} 7.0\n # HELP celery_task_latency Seconds between a task is received and started.\n # TYPE celery_task_latency histogram\n celery_task_latency_bucket{le=\"0.005\"} 2.0\n celery_task_latency_bucket{le=\"0.01\"} 3.0\n celery_task_latency_bucket{le=\"0.025\"} 4.0\n celery_task_latency_bucket{le=\"0.05\"} 4.0\n celery_task_latency_bucket{le=\"0.075\"} 5.0\n celery_task_latency_bucket{le=\"0.1\"} 5.0\n celery_task_latency_bucket{le=\"0.25\"} 5.0\n celery_task_latency_bucket{le=\"0.5\"} 5.0\n celery_task_latency_bucket{le=\"0.75\"} 5.0\n celery_task_latency_bucket{le=\"1.0\"} 5.0\n celery_task_latency_bucket{le=\"2.5\"} 8.0\n celery_task_latency_bucket{le=\"5.0\"} 11.0\n celery_task_latency_bucket{le=\"7.5\"} 11.0\n celery_task_latency_bucket{le=\"10.0\"} 11.0\n celery_task_latency_bucket{le=\"+Inf\"} 11.0\n celery_task_latency_count 11.0\n celery_task_latency_sum 16.478713035583496\n celery_queue_length{queue_name=\"queue1\"} 35.0\n celery_queue_length{queue_name=\"queue2\"} 0.0\n\nLimitations\n===========\n\n* Among many other missing features, celery-prometheus-exporter doesn't support\n per-queue task stats. As far as I can tell, only the routing key is exposed\n through the events API, which might be enough to figure out the final queue,\n though.\n* This has only been tested with Redis so far.\n* At this point, you should specify the queues to be monitored via the\n ``--queue-list`` argument or the ``QUEUE_LIST`` environment variable.\n\n", "description_content_type": "", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/zerok/celery-prometheus-exporter", "keywords": "", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "celery-prometheus-exporter", "package_url": "https://pypi.org/project/celery-prometheus-exporter/", "platform": "", "project_url": "https://pypi.org/project/celery-prometheus-exporter/", "project_urls": { "Homepage": "https://github.com/zerok/celery-prometheus-exporter" }, "release_url": "https://pypi.org/project/celery-prometheus-exporter/1.7.0/", "requires_dist": [ "celery (>=3)", "prometheus-client (>=0.0.20)" ], "requires_python": "", "summary": "Simple Prometheus metrics exporter for Celery", "version": "1.7.0" }, "last_serial": 5226013, "releases": { "1.0.0": [ { "comment_text": "", "digests": { "md5": "5da8ecf627c32dbc236e38a672598e84", "sha256": "2c7dc102a058653c15c22616499ee207697b404dc050d8d326f7f550bc5af116" }, "downloads": -1, "filename": "celery_prometheus_exporter-1.0.0-py3-none-any.whl", "has_sig": false, "md5_digest": "5da8ecf627c32dbc236e38a672598e84", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 6629, "upload_time": "2016-06-02T19:53:29", "url": "https://files.pythonhosted.org/packages/1b/dd/dde4ae5a71e73fac312edd3ec475a020a48de188ee6ea66a18efa19fc19c/celery_prometheus_exporter-1.0.0-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "33d8a4ee490f41c7c3ac632e35f27468", "sha256": "c068d707a8c22d9b005017d14d6120ac2920a268c569105f4794860d20d3b227" }, "downloads": -1, "filename":
"celery-prometheus-exporter-1.0.0.tar.gz", "has_sig": false, "md5_digest": "33d8a4ee490f41c7c3ac632e35f27468", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4116, "upload_time": "2016-06-02T19:53:33", "url": "https://files.pythonhosted.org/packages/8c/5f/cb844d4dc4518007d350087f46aa80018f81876d959dfde6e3cf33fbffa8/celery-prometheus-exporter-1.0.0.tar.gz" } ], "1.0.1": [ { "comment_text": "", "digests": { "md5": "8659c0096a97c8abf6bc4b7d1f82318c", "sha256": "42aee25b996999bb0dda2ac7573c559ad6e7268a6b4f9c31490a5c90bc6cd8a5" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.0.1.tar.gz", "has_sig": false, "md5_digest": "8659c0096a97c8abf6bc4b7d1f82318c", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4266, "upload_time": "2017-06-15T10:03:19", "url": "https://files.pythonhosted.org/packages/50/3f/f7b14a07afc1817ba58c13b586203fe9c13a2396239ebcdf51178b7b8396/celery-prometheus-exporter-1.0.1.tar.gz" } ], "1.1.0": [ { "comment_text": "", "digests": { "md5": "44effce5431cee7ac3a38ea9c1f45379", "sha256": "4534b7b11f6dc3763fd5e1b2ab6c06af230953632c849e025bef02953ccc6dcd" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.1.0.tar.gz", "has_sig": false, "md5_digest": "44effce5431cee7ac3a38ea9c1f45379", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4475, "upload_time": "2017-06-18T09:48:06", "url": "https://files.pythonhosted.org/packages/6e/1c/c59c5017d016ad9caab1a173b8dd46af9bb251b915b2489fcb881cbcc33e/celery-prometheus-exporter-1.1.0.tar.gz" } ], "1.2.0": [ { "comment_text": "", "digests": { "md5": "4bd4aeb7189175f69d717d9e5c514e41", "sha256": "7a674105eb1460825bd1063239c3adab11fe6edbbf8ca2578875e2343e4ef4ef" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.2.0.tar.gz", "has_sig": false, "md5_digest": "4bd4aeb7189175f69d717d9e5c514e41", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 7658, 
"upload_time": "2018-04-05T18:39:56", "url": "https://files.pythonhosted.org/packages/a0/93/91044286521e3518a986fb3e6e25ad6702034ecc76b20a06f7d9ec102d2a/celery-prometheus-exporter-1.2.0.tar.gz" } ], "1.3.0": [ { "comment_text": "", "digests": { "md5": "c969508b56a38f85d7209434cc5c6263", "sha256": "7c45bd5f9e5543a67925694672db0956f5c7577f2229773b15a06d8a7e3a4826" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.3.0.tar.gz", "has_sig": false, "md5_digest": "c969508b56a38f85d7209434cc5c6263", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 8099, "upload_time": "2018-08-06T15:02:06", "url": "https://files.pythonhosted.org/packages/81/2e/38d978cde0b30a1e891e059bac960697d092bb2d7ebf771f67933e176d87/celery-prometheus-exporter-1.3.0.tar.gz" } ], "1.4.0": [ { "comment_text": "", "digests": { "md5": "fc4ae7926667e98476ef41c214ac1e54", "sha256": "4a6ada47e9b4b6a25c8b3da2b9943735d23e0569941572cb21446e97d3bd06b8" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.4.0.tar.gz", "has_sig": false, "md5_digest": "fc4ae7926667e98476ef41c214ac1e54", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 8093, "upload_time": "2018-12-16T16:57:17", "url": "https://files.pythonhosted.org/packages/28/d6/adf099e3e5dee72b54732e4a64cd20fc3a53dbca5c9dc4c65b4c541189ce/celery-prometheus-exporter-1.4.0.tar.gz" } ], "1.5.0": [ { "comment_text": "", "digests": { "md5": "40770fced858ab1763d15210565bb839", "sha256": "178605026e1e9921e28c5083edccac1d95578739f09870b7fa797b3741262ade" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.5.0.tar.gz", "has_sig": false, "md5_digest": "40770fced858ab1763d15210565bb839", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 8141, "upload_time": "2019-02-07T18:16:10", "url": "https://files.pythonhosted.org/packages/cb/bb/d06b77ea1b3298b056c23bd8324c65ab1c30236f7c62718803e7a3b33449/celery-prometheus-exporter-1.5.0.tar.gz" } ], 
"1.6.0": [ { "comment_text": "", "digests": { "md5": "f5dee6ad07ff0e1fe95cfa09b14da703", "sha256": "2a7b4e0768d74a1f5ca7566f3d9ecd1c15c2e985a3b1b2d7bbd31b34ae528699" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.6.0.tar.gz", "has_sig": false, "md5_digest": "f5dee6ad07ff0e1fe95cfa09b14da703", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 8472, "upload_time": "2019-04-01T18:53:35", "url": "https://files.pythonhosted.org/packages/de/ec/63543693549df6e016de17b55247beedd06f149ee9c2e2bfb3805fbb6e7b/celery-prometheus-exporter-1.6.0.tar.gz" } ], "1.7.0": [ { "comment_text": "", "digests": { "md5": "aa842d5414f24c0f09b2de12b2b96955", "sha256": "a3ba0d3340b546ae82b36fef7645ccbc54c2b696fc3df05bb9ee28a402e710e1" }, "downloads": -1, "filename": "celery_prometheus_exporter-1.7.0-py2-none-any.whl", "has_sig": false, "md5_digest": "aa842d5414f24c0f09b2de12b2b96955", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, "size": 8987, "upload_time": "2019-05-04T16:24:18", "url": "https://files.pythonhosted.org/packages/bd/5d/71ff513923ed105303b4697b9f982fcfc16ca29b3b27557005a7d001a346/celery_prometheus_exporter-1.7.0-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "7a191a0e26eb3d2eefe5e5e3462085dd", "sha256": "8fc2d5909921c44f01c8c1b7d956d92e6966f2e14eec196bf60735e39a0e0991" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.7.0.tar.gz", "has_sig": false, "md5_digest": "7a191a0e26eb3d2eefe5e5e3462085dd", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11369, "upload_time": "2019-05-04T16:24:20", "url": "https://files.pythonhosted.org/packages/c5/3f/c7ae53ad33c736c48067730d184f27d58cbd0e60031f321d5787534751d7/celery-prometheus-exporter-1.7.0.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "aa842d5414f24c0f09b2de12b2b96955", "sha256": "a3ba0d3340b546ae82b36fef7645ccbc54c2b696fc3df05bb9ee28a402e710e1" }, "downloads": -1, 
"filename": "celery_prometheus_exporter-1.7.0-py2-none-any.whl", "has_sig": false, "md5_digest": "aa842d5414f24c0f09b2de12b2b96955", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, "size": 8987, "upload_time": "2019-05-04T16:24:18", "url": "https://files.pythonhosted.org/packages/bd/5d/71ff513923ed105303b4697b9f982fcfc16ca29b3b27557005a7d001a346/celery_prometheus_exporter-1.7.0-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "7a191a0e26eb3d2eefe5e5e3462085dd", "sha256": "8fc2d5909921c44f01c8c1b7d956d92e6966f2e14eec196bf60735e39a0e0991" }, "downloads": -1, "filename": "celery-prometheus-exporter-1.7.0.tar.gz", "has_sig": false, "md5_digest": "7a191a0e26eb3d2eefe5e5e3462085dd", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11369, "upload_time": "2019-05-04T16:24:20", "url": "https://files.pythonhosted.org/packages/c5/3f/c7ae53ad33c736c48067730d184f27d58cbd0e60031f321d5787534751d7/celery-prometheus-exporter-1.7.0.tar.gz" } ] }