{ "info": { "author": "", "author_email": "", "bugtrack_url": null, "classifiers": [ "Environment :: Web Environment", "Intended Audience :: System Administrators", "Natural Language :: English", "Operating System :: OS Independent", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 3", "Topic :: System :: Monitoring" ], "description": "# Airflow prometheus exporter\n\nExposes dag and task based metrics from Airflow to a Prometheus compatible endpoint.\n\n## Discussion\n\nYou can ask questions in Gitter channel: https://gitter.im/epoch8/airflow-exporter\n\n## Screenshots\n\n\n\n## Compatibility\n\n**Note: this version is compatible with Airflow 1.10.3+ only, see [#46](https://github.com/epoch8/airflow-exporter/issues/46) for details**\n\nFor compatibility with previous versions of Airflow use older version: [v0.5.4](https://github.com/epoch8/airflow-exporter/releases/tag/v0.5.4)\n\n* Airflow: airflow1.10.3+\n* Python: python2, python3\n* DataBase: postgresql, mysql\n\n## Install\n\n```sh\npip install airflow-exporter\n```\n\nThat's it. 
You're done.\n\n## Exporting extra labels to Prometheus\n\nIt is possible to add extra labels to DAG-related metrics by providing a `labels` dict in the DAG `params`.\n\n### Example\n\n```python\nfrom datetime import timedelta\n\nfrom airflow import DAG\n\ndag = DAG(\n    'dummy_dag',\n    schedule_interval=timedelta(hours=5),\n    default_args=default_args,\n    catchup=False,\n    params={\n        'labels': {\n            'env': 'test'\n        }\n    }\n)\n```\n\nThe label `env` with value `test` will be added to all metrics related to `dummy_dag`:\n\n`airflow_dag_status{dag_id=\"dummy_dag\",env=\"test\",owner=\"owner\",status=\"running\"} 12.0`\n\n## Metrics\n\nMetrics are available at:\n\n```\nhttp://<airflow_host>/admin/metrics/\n```\n\n### `airflow_task_status`\n\nLabels:\n\n* `dag_id`\n* `task_id`\n* `owner`\n* `status`\n\nValue: number of tasks in a specific status.\n\n### `airflow_dag_status`\n\nLabels:\n\n* `dag_id`\n* `owner`\n* `status`\n\nValue: number of DAGs in a specific status.\n\n### `airflow_dag_run_duration`\n\nLabels:\n\n* `dag_id`: unique identifier for a given DAG\n\nValue: duration in seconds of the longest DAG run for a given DAG. This metric \nis not available for DAGs that have already completed.\n\n## License\n\nDistributed under the BSD license. 
See [LICENSE](LICENSE) for more\ninformation.\n\n\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/epoch8/airflow-exporter", "keywords": "airflow plugin prometheus exporter metrics", "license": "", "maintainer": "", "maintainer_email": "", "name": "airflow-exporter", "package_url": "https://pypi.org/project/airflow-exporter/", "platform": "", "project_url": "https://pypi.org/project/airflow-exporter/", "project_urls": { "Homepage": "https://github.com/epoch8/airflow-exporter" }, "release_url": "https://pypi.org/project/airflow-exporter/1.2.0/", "requires_dist": [ "apache-airflow (>=1.10.3)", "prometheus-client (>=0.4.2)" ], "requires_python": "", "summary": "Airflow plugin to export dag and task based metrics to Prometheus.", "version": "1.2.0" }, "last_serial": 5771702, "releases": { "1.0": [ { "comment_text": "", "digests": { "md5": "b0684b48fdfc79de6a8a0eea0dd10da7", "sha256": "4a15a72782c84136abc2b3616da4f663d3eb52f74febd7e773b404d617823d86" }, "downloads": -1, "filename": "airflow-exporter-1.0.tar.gz", "has_sig": false, "md5_digest": "b0684b48fdfc79de6a8a0eea0dd10da7", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3796, "upload_time": "2019-06-08T18:50:32", "url": "https://files.pythonhosted.org/packages/8a/55/b0e2618a5b9d844f5eedb60cd0a2b38b9729e93cb4dda8e8bdfb1baa7a8f/airflow-exporter-1.0.tar.gz" } ], "1.1.0": [ { "comment_text": "", "digests": { "md5": "d514872a91b60460d6fc804b61205b5a", "sha256": "b0fd1b0effdb7949e7bfaa6d1e190d8407b3d8da4b1d948d4d8c123540f95478" }, "downloads": -1, "filename": "airflow_exporter-1.1.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "d514872a91b60460d6fc804b61205b5a", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5393, "upload_time": "2019-08-20T13:05:15", "url": 
"https://files.pythonhosted.org/packages/77/95/78f3d7387b29acfcb3ffabb9d4498745ed96348df6b5ab25a08c25834952/airflow_exporter-1.1.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "ef8603621e4c9045d768fa56d651c464", "sha256": "fbe028160494fa962edc709feacf0bd4b52ee6f96c6fcb32278d04ea4bde9c5e" }, "downloads": -1, "filename": "airflow-exporter-1.1.0.tar.gz", "has_sig": false, "md5_digest": "ef8603621e4c9045d768fa56d651c464", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3778, "upload_time": "2019-08-20T13:05:16", "url": "https://files.pythonhosted.org/packages/bc/21/406cd73e965737956542d069985bb4011d2023b889a2f7f9397529142e4e/airflow-exporter-1.1.0.tar.gz" } ], "1.2.0": [ { "comment_text": "", "digests": { "md5": "3af189b4c3cc2485b2e4e2dc763ec568", "sha256": "856241ff8a7abc49a10cb9483f4431ba8bb1a2a1ed7c9407c083208fd81cc017" }, "downloads": -1, "filename": "airflow_exporter-1.2.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "3af189b4c3cc2485b2e4e2dc763ec568", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5772, "upload_time": "2019-09-02T16:17:40", "url": "https://files.pythonhosted.org/packages/60/0e/ae96a03ec748b8ea5d6065003ac4b81322bcff73475630732d22ffd905a9/airflow_exporter-1.2.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "923f8d4e981634d6a36702ff897c4abc", "sha256": "e296c0f770289396335a2c5209acfa7068f71792cf3c5600d67386199bed8cf6" }, "downloads": -1, "filename": "airflow-exporter-1.2.0.tar.gz", "has_sig": false, "md5_digest": "923f8d4e981634d6a36702ff897c4abc", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4203, "upload_time": "2019-09-02T16:17:41", "url": "https://files.pythonhosted.org/packages/f5/b1/c293bbd43818a3d47449f54a64dc63d34fe6f58bd3ab1ebb60c81f3f52ee/airflow-exporter-1.2.0.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "3af189b4c3cc2485b2e4e2dc763ec568", 
"sha256": "856241ff8a7abc49a10cb9483f4431ba8bb1a2a1ed7c9407c083208fd81cc017" }, "downloads": -1, "filename": "airflow_exporter-1.2.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "3af189b4c3cc2485b2e4e2dc763ec568", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5772, "upload_time": "2019-09-02T16:17:40", "url": "https://files.pythonhosted.org/packages/60/0e/ae96a03ec748b8ea5d6065003ac4b81322bcff73475630732d22ffd905a9/airflow_exporter-1.2.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "923f8d4e981634d6a36702ff897c4abc", "sha256": "e296c0f770289396335a2c5209acfa7068f71792cf3c5600d67386199bed8cf6" }, "downloads": -1, "filename": "airflow-exporter-1.2.0.tar.gz", "has_sig": false, "md5_digest": "923f8d4e981634d6a36702ff897c4abc", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4203, "upload_time": "2019-09-02T16:17:41", "url": "https://files.pythonhosted.org/packages/f5/b1/c293bbd43818a3d47449f54a64dc63d34fe6f58bd3ab1ebb60c81f3f52ee/airflow-exporter-1.2.0.tar.gz" } ] }