{ "info": { "author": "Sreejoy Halder", "author_email": "sreejoy4242@gmail.com", "bugtrack_url": null, "classifiers": [ "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2.7" ], "description": "## CrawlerFriend\n\nA lightweight **Web Crawler** for **Python 2.7** that returns search results as HTML or as a\nDictionary, given URLs and keywords. If you regularly visit a few websites and look for a few keywords,\nthis Python package automates the task for you and\nopens the results as an HTML file in your web browser.\n\n### Installation\n```\npip install CrawlerFriend\n```\n\n### How to use?\n#### All Results in HTML\n```\nimport CrawlerFriend\n\nurls = [\"http://www.goal.com/\",\"http://www.skysports.com/football\",\"https://www.bbc.com/sport/football\"]\nkeywords = [\"Ronaldo\",\"Liverpool\",\"Salah\",\"Real Madrid\",\"Arsenal\",\"Chelsea\",\"Man United\",\"Man City\"]\n\ncrawler = CrawlerFriend.Crawler(urls, keywords)\ncrawler.crawl()\ncrawler.get_result_in_html()\n```\n\nThe above code opens the following HTML document in your browser:\n\n![](https://i.imgur.com/aPoNAYu.png)\n\n#### All Results in a Dictionary\n```\nresult_dict = crawler.get_result()\n```\n\n#### Changing Default Arguments\nBy default, CrawlerFriend searches four HTML tags ('title', 'h1', 'h2', 'h3') and uses max_link_limit = 50.\nBoth can be changed by passing arguments to the constructor:\n```\ncrawler = CrawlerFriend.Crawler(urls, keywords, max_link_limit=200, tags=['p','h4'])\ncrawler.crawl()\n```\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/Sreejoy/CrawlerFriend", "keywords": "", "license": "", "maintainer": "", "maintainer_email": "", "name": "CrawlerFriend", "package_url": "https://pypi.org/project/CrawlerFriend/", "platform": "", "project_url": 
"https://pypi.org/project/CrawlerFriend/", "project_urls": { "Homepage": "https://github.com/Sreejoy/CrawlerFriend" }, "release_url": "https://pypi.org/project/CrawlerFriend/1.0.11/", "requires_dist": [ "requests", "beautifulsoup4" ], "requires_python": "", "summary": "A lightweight crawler that gives search results in HTML form or in Dictionary form, given URLs and keywords.", "version": "1.0.11" }, "last_serial": 4169884, "releases": { "1.0.10": [ { "comment_text": "", "digests": { "md5": "6a80677d13516c0f77e39c56b10833c9", "sha256": "379e91f9f3e9fc8691d4c28e0a4a00e4ec9eb17845b80f2d3ae9f340b9040c99" }, "downloads": -1, "filename": "CrawlerFriend-1.0.10-py2-none-any.whl", "has_sig": false, "md5_digest": "6a80677d13516c0f77e39c56b10833c9", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, "size": 6394, "upload_time": "2018-08-14T08:07:57", "url": "https://files.pythonhosted.org/packages/51/1a/54b682ac8b969d0e90ea83c790aee2e7b21ead282b1dca0955b9ef852c21/CrawlerFriend-1.0.10-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "4aef41b21323c9d3124d4dc79d766dda", "sha256": "47e5b7767f5bd36d78c953e27ba233975ed804403ab027ea3143e59b46eea5e1" }, "downloads": -1, "filename": "CrawlerFriend-1.0.10.tar.gz", "has_sig": false, "md5_digest": "4aef41b21323c9d3124d4dc79d766dda", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4439, "upload_time": "2018-08-14T08:07:59", "url": "https://files.pythonhosted.org/packages/24/41/61e31b9fbaf0dfaecd9ab4760815cefc1d0d8228e145f41de7d3fe05ba54/CrawlerFriend-1.0.10.tar.gz" } ], "1.0.11": [ { "comment_text": "", "digests": { "md5": "0d774bf665e2f509d03077ce9e574435", "sha256": "609aefc0c1004cf31441cd5442a4f3bd3e450821c650fd7affb2179d12f53c19" }, "downloads": -1, "filename": "CrawlerFriend-1.0.11-py2-none-any.whl", "has_sig": false, "md5_digest": "0d774bf665e2f509d03077ce9e574435", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, 
"size": 5023, "upload_time": "2018-08-14T15:14:29", "url": "https://files.pythonhosted.org/packages/bd/5f/f92072259846abe7c5253f9d1388778fb9aed6a675374925a5c814ba6f0c/CrawlerFriend-1.0.11-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "7f8918aab8edc656dafd154be11cedbd", "sha256": "23982159c4c6f2e6678d0f519ad2c984968a1d503d20ef5585296b7cd1054e91" }, "downloads": -1, "filename": "CrawlerFriend-1.0.11.tar.gz", "has_sig": false, "md5_digest": "7f8918aab8edc656dafd154be11cedbd", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4637, "upload_time": "2018-08-14T15:14:31", "url": "https://files.pythonhosted.org/packages/21/8c/b16ab03a45152625455419e121a3bd052e38aed75de8b6937c0c14aa78d8/CrawlerFriend-1.0.11.tar.gz" } ], "1.0.8": [ { "comment_text": "", "digests": { "md5": "653a0a757785844b9bdc9ed6d23d0356", "sha256": "8e5fbe7678fe0b79c4491f78af533535b9e8ccb9097321f3c34a59b1ceb4179d" }, "downloads": -1, "filename": "CrawlerFriend-1.0.8-py2-none-any.whl", "has_sig": false, "md5_digest": "653a0a757785844b9bdc9ed6d23d0356", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, "size": 4762, "upload_time": "2018-07-28T07:07:39", "url": "https://files.pythonhosted.org/packages/00/42/5318d23ccf517312c3ed6d7e44fa285e18235e1f383ca1e337bdd4ee9109/CrawlerFriend-1.0.8-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "e26497b7f50c863f6acb8b3cb31d119f", "sha256": "2f51efee0a5743f65a148bf6610a549d75c595ff5555ebd8e67e40831f9a80d1" }, "downloads": -1, "filename": "CrawlerFriend-1.0.8.tar.gz", "has_sig": false, "md5_digest": "e26497b7f50c863f6acb8b3cb31d119f", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4432, "upload_time": "2018-07-28T07:07:40", "url": "https://files.pythonhosted.org/packages/95/95/890b97e5866e3505ecccd938c9be4dcbbdc937fef7f307bd1582b046cdf9/CrawlerFriend-1.0.8.tar.gz" } ], "1.0.9": [ { "comment_text": "", "digests": { "md5": 
"a78a17d06746817c956e1a3e61539909", "sha256": "e2b3bebcc9c12d02d13034577d02a20abab5fabf4ccacd711f6c7aa307bb6fd9" }, "downloads": -1, "filename": "CrawlerFriend-1.0.9-py2-none-any.whl", "has_sig": false, "md5_digest": "a78a17d06746817c956e1a3e61539909", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, "size": 5217, "upload_time": "2018-08-04T04:10:05", "url": "https://files.pythonhosted.org/packages/30/09/c5f48a8ac349e9c139f923914eb47d8c08404c185a2c031b160637fa05d4/CrawlerFriend-1.0.9-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "3954dbd874243a1e22b8c206bf1a56e8", "sha256": "93e41f2c63c0f7aae2002d3306ee630fc267cc4e9e9411f028452d5d250317f2" }, "downloads": -1, "filename": "CrawlerFriend-1.0.9.tar.gz", "has_sig": false, "md5_digest": "3954dbd874243a1e22b8c206bf1a56e8", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4941, "upload_time": "2018-08-04T04:10:06", "url": "https://files.pythonhosted.org/packages/9a/66/32ee9967f19bc9fc17baf103ad8e5f32ea70618824f8b0e4018d1158dd57/CrawlerFriend-1.0.9.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "0d774bf665e2f509d03077ce9e574435", "sha256": "609aefc0c1004cf31441cd5442a4f3bd3e450821c650fd7affb2179d12f53c19" }, "downloads": -1, "filename": "CrawlerFriend-1.0.11-py2-none-any.whl", "has_sig": false, "md5_digest": "0d774bf665e2f509d03077ce9e574435", "packagetype": "bdist_wheel", "python_version": "py2", "requires_python": null, "size": 5023, "upload_time": "2018-08-14T15:14:29", "url": "https://files.pythonhosted.org/packages/bd/5f/f92072259846abe7c5253f9d1388778fb9aed6a675374925a5c814ba6f0c/CrawlerFriend-1.0.11-py2-none-any.whl" }, { "comment_text": "", "digests": { "md5": "7f8918aab8edc656dafd154be11cedbd", "sha256": "23982159c4c6f2e6678d0f519ad2c984968a1d503d20ef5585296b7cd1054e91" }, "downloads": -1, "filename": "CrawlerFriend-1.0.11.tar.gz", "has_sig": false, "md5_digest": "7f8918aab8edc656dafd154be11cedbd", 
"packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4637, "upload_time": "2018-08-14T15:14:31", "url": "https://files.pythonhosted.org/packages/21/8c/b16ab03a45152625455419e121a3bd052e38aed75de8b6937c0c14aa78d8/CrawlerFriend-1.0.11.tar.gz" } ] }