{ "info": { "author": "Hubble", "author_email": "dev@hubblehq.com", "bugtrack_url": null, "classifiers": [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3" ], "description": "# Heroku Kafka · [![CircleCI](https://circleci.com/gh/HubbleHQ/heroku-kafka.svg?style=shield)](https://circleci.com/gh/HubbleHQ/heroku-kafka)\n\n**THIS IS AN UNOFFICIAL PACKAGE**\n\nHeroku Kafka is a python package to help you get setup quickly and easily with Kafka on Heroku. There is an [offical package](https://github.com/heroku/kafka-helper) that is possibly more secure however it has not been updated to support python 3 correctly and does not seem to be maintained anymore.\n\n## Install\n\nThe easiest way to install the package is through pip.\n\n```\npip install heroku-kafka\n```\n\n## Usage\n\nThis package uses the [kafka-python package](https://github.com/dpkp/kafka-python) and the `HerokuKafkaProducer` and `HerokuKafkaConsumer` classes both inherit from the kafka-python base classes, and will contain all the same methods.\n\nNote: You can use this package on local by setting up an .env file using the same kafka\nenvironment variables as you have on your heroku site.\n\nNote: To test it is working on local I would install [heroku-kafka-util](https://github.com/osada9000/heroku-kafka-util) so you can see messages are being sent etc.\n\n### Producer\n\n```python\nfrom heroku_kafka import HerokuKafkaProducer\n\n\"\"\"\nAll the variable names here match the heroku env variable names.\nJust pass the env values straight in and it will work.\n\"\"\"\nproducer = HerokuKafkaProducer(\n url: KAFKA_URL, # Url string provided by heroku\n ssl_cert: KAFKA_CLIENT_CERT, # Client cert string\n ssl_key: KAFKA_CLIENT_CERT_KEY, # Client cert key string\n ssl_ca: KAFKA_TRUSTED_CERT, # Client trusted cert string\n prefix: KAFKA_PREFIX # Prefix provided by 
heroku\n)\n\n\"\"\"\nThe .send method will automatically prefix your topic with the KAFKA_PREFIX\nNOTE: If the message doesn't seem to be sending try `producer.flush()` to force send.\n\"\"\"\nproducer.send('topic_without_prefix', b\"some message\")\n```\n\nFor all other methods and properties refer to: [KafkaProducer Docs](https://kafka-python.readthedocs.io/en/master/apidoc/KafkaProducer.html).\n\n### Consumer\n\n```python\nfrom heroku_kafka import HerokuKafkaConsumer\n\n\"\"\"\nAll the variable names here match the heroku env variable names,\njust pass the env values straight in and it will work.\n\n*topics are optional and you can pass as many as you want in for the consumer to track,\nhowever if you want to subscribe after creation just use .subscribe as shown below.\n\nNote: The KAFKA_PREFIX will be added on automatically so don't worry about passing it in.\n\"\"\"\nconsumer = HerokuKafkaConsumer(\n 'topic_without_prefix_1', # Optional: You don't need to pass any topic at all\n 'topic_without_prefix_2', # You can list as many topics as you want to consume\n url: KAFKA_URL, # Url string provided by heroku\n ssl_cert: KAFKA_CLIENT_CERT, # Client cert string\n ssl_key: KAFKA_CLIENT_CERT_KEY, # Client cert key string\n ssl_ca: KAFKA_TRUSTED_CERT, # Client trusted cert string\n prefix: KAFKA_PREFIX # Prefix provided by heroku\n)\n\n\"\"\"\nTo subscribe to topic(s) after creating a consumer pass in a list of topics without the\nKAFKA_PREFIX.\n\"\"\"\nconsumer.subscribe(topics=('topic_without_prefix'))\n\n\"\"\"\n.assign requires a full topic name with prefix\n\"\"\"\nconsumer.assign([TopicPartition('topic_with_prefix', 2)])\n\n\"\"\"\nListening to events it is exactly the same as in kafka_python.\nRead the documention linked below for more info!\n\"\"\"\nfor msg in consumer:\n print (msg)\n```\n\nFor all other methods and properties refer to: [KafkaConsumer Docs](https://kafka-python.readthedocs.io/en/master/apidoc/KafkaConsumer.html).\n\n## Known Issues\n\n- 
`.assign` does not add the topic prefix.\n- `NamedTemporaryFile` may not work properly on a Windows system.\n\n## Contribution\n\nIf you come across any issues, feel free to fork the repo and create a PR!\n\n## Setup\n\nFork the repo (requires [Docker](https://www.docker.com/products/docker-desktop)).\n\n```bash\n>>> git clone git@github.com:.git\n>>> cd \n>>> make dev-build\n```\n\nCreate a .env file with working Kafka connection details (this currently requires two working topics).\n\n```\nKAFKA_URL=\nKAFKA_CLIENT_CERT=\nKAFKA_CLIENT_CERT_KEY=\nKAFKA_TRUSTED_CERT=\nKAFKA_PREFIX=\n\nTOPIC1=\nTOPIC2=\n```\n\nNOTE: Docker is strict about how it reads .env files: values cannot be quoted and cannot contain literal new lines, so replace any new lines with `\\n`.\n\n## Tests\n\n**The only way to check that the package works is to run the tests.**\n\nPlease make sure any extra code you write comes with a test; it doesn't need to be exhaustive, just check that what you have written works.\n\nAll tests currently require a working Kafka setup, as it is hard to verify the connection behaviour without one. This means they also require an internet connection. 
You can copy the connection details from Heroku's Kafka environment variables; note that you will need two test topics.\n\nTo run the tests:\n\n```bash\n>>> make dev-test\n```\n\n## Distribution\n\nTo create & upload the package:\n\n```bash\n>>> make package\n>>> make upload\n```\n\nNOTE: You will need to log in to PyPI to upload the package.\n\n[https://packaging.python.org/tutorials/packaging-projects/](https://packaging.python.org/tutorials/packaging-projects/)\n\n\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/HubbleHQ/heroku-kafka", "keywords": "", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "heroku-kafka", "package_url": "https://pypi.org/project/heroku-kafka/", "platform": "", "project_url": "https://pypi.org/project/heroku-kafka/", "project_urls": { "Homepage": "https://github.com/HubbleHQ/heroku-kafka" }, "release_url": "https://pypi.org/project/heroku-kafka/2.1.2/", "requires_dist": [ "kafka-python (==1.4.6)" ], "requires_python": "", "summary": "Python kafka package for use with heroku's kafka.", "version": "2.1.2" }, "last_serial": 5546237, "releases": { "1.0.0": [ { "comment_text": "", "digests": { "md5": "e6c3602d460fb6c4890f954eae6f115e", "sha256": "9ec025ce6b8bd5305c1d6640f4b136e1668ad3f7edc8b28730ca966b17a25396" }, "downloads": -1, "filename": "heroku-kafka-1.0.0.tar.gz", "has_sig": false, "md5_digest": "e6c3602d460fb6c4890f954eae6f115e", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4683, "upload_time": "2018-01-04T18:01:16", "url": "https://files.pythonhosted.org/packages/24/f6/5d645e47ea76869e492e013d678a1e8d042bb0fddb19d9f2a9c838169762/heroku-kafka-1.0.0.tar.gz" } ], "1.1.0": [ { "comment_text": "", "digests": { "md5": "fd6e17921394ea959c6c79a267b51870", "sha256": "983a46391d66ab0c672bc8ba893745694c1311f9cea550d9ede11aa2ec2abf97" }, 
"downloads": -1, "filename": "heroku-kafka-1.1.0.tar.gz", "has_sig": false, "md5_digest": "fd6e17921394ea959c6c79a267b51870", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4626, "upload_time": "2018-01-09T17:18:29", "url": "https://files.pythonhosted.org/packages/f2/f1/a082cad000c900c125089c775836299cb97a100a8aae6feea3a6998ee5d7/heroku-kafka-1.1.0.tar.gz" } ], "2.0.0": [ { "comment_text": "", "digests": { "md5": "6ad01ea198f250f3e2f5c7405f508e7c", "sha256": "b17b34303b64d8f9e81972625f95277f399472b4d58625acb3de0907f8887234" }, "downloads": -1, "filename": "heroku_kafka-2.0.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "6ad01ea198f250f3e2f5c7405f508e7c", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 4484, "upload_time": "2018-06-13T11:38:56", "url": "https://files.pythonhosted.org/packages/b7/fc/8964b2a823edbe7db305aa8c1aa3c9e4797141e6c168eac5b891855616df/heroku_kafka-2.0.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "c0ce67c239c65b502ae6bc4c6dc2abc7", "sha256": "e8f7a494ebdb4f536fc257cc86b384236c79506d7a41d24a5e0be6d4829a244d" }, "downloads": -1, "filename": "heroku-kafka-2.0.0.tar.gz", "has_sig": false, "md5_digest": "c0ce67c239c65b502ae6bc4c6dc2abc7", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4748, "upload_time": "2018-06-13T11:38:58", "url": "https://files.pythonhosted.org/packages/72/24/a2e4f231f115493749651325e4cf4ba9dc44693f922179a617119b82ef41/heroku-kafka-2.0.0.tar.gz" } ], "2.1.1": [ { "comment_text": "", "digests": { "md5": "a3f41915548cac7b08cbc2a4d4372cb7", "sha256": "99af37b0e47f78d597791858a3e5b2b07ad42c030c38c7819f15a4c3acd43cd5" }, "downloads": -1, "filename": "heroku_kafka-2.1.1-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "a3f41915548cac7b08cbc2a4d4372cb7", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5479, "upload_time": 
"2019-07-17T14:03:39", "url": "https://files.pythonhosted.org/packages/b2/7e/28df16d6403d7c126368d1864a9ba13a179e1a6d915c2900bb06855ae361/heroku_kafka-2.1.1-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "4eb748b0b224d7a397856e3f395a17e5", "sha256": "193c3ad084f9ddb6563500d12c945fca2a830c5889d70ff9a4bd22ba292e2e37" }, "downloads": -1, "filename": "heroku-kafka-2.1.1.tar.gz", "has_sig": false, "md5_digest": "4eb748b0b224d7a397856e3f395a17e5", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4927, "upload_time": "2019-07-17T14:03:41", "url": "https://files.pythonhosted.org/packages/c5/9c/621dd85ee694cee8ffa0c3e06ac2152acb8c412dfb51e88a33f191120134/heroku-kafka-2.1.1.tar.gz" } ], "2.1.2": [ { "comment_text": "", "digests": { "md5": "24d0ec492ad6ddb7a8f5171dc81b48a2", "sha256": "4dff50fddc58f3b4ca6556e4d6dcff9985a0c939a44530d4ac58246ef969b547" }, "downloads": -1, "filename": "heroku_kafka-2.1.2-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "24d0ec492ad6ddb7a8f5171dc81b48a2", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5478, "upload_time": "2019-07-17T14:36:56", "url": "https://files.pythonhosted.org/packages/3c/25/91fecb9437ca0518274cabb54c0688b99c241adae9aeb3c31c0e1a621067/heroku_kafka-2.1.2-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "9531dad58f1cf06255cdb16066f0037c", "sha256": "3b3a09466579fa1a87998cc427d21feb28a9877133791563741b214886c457c1" }, "downloads": -1, "filename": "heroku-kafka-2.1.2.tar.gz", "has_sig": false, "md5_digest": "9531dad58f1cf06255cdb16066f0037c", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4926, "upload_time": "2019-07-17T14:36:57", "url": "https://files.pythonhosted.org/packages/89/4b/3d7b1dcad4ec3bce5410a4e54e803193de5145854910f0d1b13780ccddd3/heroku-kafka-2.1.2.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "24d0ec492ad6ddb7a8f5171dc81b48a2", 
"sha256": "4dff50fddc58f3b4ca6556e4d6dcff9985a0c939a44530d4ac58246ef969b547" }, "downloads": -1, "filename": "heroku_kafka-2.1.2-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "24d0ec492ad6ddb7a8f5171dc81b48a2", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5478, "upload_time": "2019-07-17T14:36:56", "url": "https://files.pythonhosted.org/packages/3c/25/91fecb9437ca0518274cabb54c0688b99c241adae9aeb3c31c0e1a621067/heroku_kafka-2.1.2-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "9531dad58f1cf06255cdb16066f0037c", "sha256": "3b3a09466579fa1a87998cc427d21feb28a9877133791563741b214886c457c1" }, "downloads": -1, "filename": "heroku-kafka-2.1.2.tar.gz", "has_sig": false, "md5_digest": "9531dad58f1cf06255cdb16066f0037c", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4926, "upload_time": "2019-07-17T14:36:57", "url": "https://files.pythonhosted.org/packages/89/4b/3d7b1dcad4ec3bce5410a4e54e803193de5145854910f0d1b13780ccddd3/heroku-kafka-2.1.2.tar.gz" } ] }