{ "info": { "author": "Neal Wong", "author_email": "ibprnd@gmail.com", "bugtrack_url": null, "classifiers": [ "Development Status :: 5 - Production/Stable", "Framework :: Scrapy", "Intended Audience :: Developers", "License :: OSI Approved :: BSD License", "Programming Language :: Python", "Topic :: Internet :: WWW/HTTP" ], "description": "scrapy-fake-useragent-fix\n=========================\n\nRandom User-Agent middleware based on fake-useragent. It picks up\n``User-Agent`` strings based on usage statistics from a real-world\ndatabase of browsers.\n\nInstallation\n------------\n\nThe simplest way is to install it via ``pip``::\n\n    pip install scrapy-fake-useragent-fix\n\nConfiguration\n-------------\n\nTurn off the built-in ``UserAgentMiddleware`` and add\n``RandomUserAgentMiddleware``.\n\nIn Scrapy >=1.0:\n\n.. code:: python\n\n    DOWNLOADER_MIDDLEWARES = {\n        'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,\n        'scrapy_fake_useragent.middleware.RandomUserAgentMiddleware': 400,\n    }\n\nIn Scrapy <1.0:\n\n.. code:: python\n\n    DOWNLOADER_MIDDLEWARES = {\n        'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware': None,\n        'scrapy_fake_useragent.middleware.RandomUserAgentMiddleware': 400,\n    }\n\nConfiguring User-Agent type\n---------------------------\n\nThere's a configuration parameter ``RANDOM_UA_TYPE``, defaulting to\n``random``, which is passed verbatim to fake-useragent to choose a\nUser-Agent. Supported values are ``random``, ``chrome``, ``firefox``,\n``safari`` and ``internetexplorer``. To choose from a specific device\ntype, prefix the browser type with a device name, such as\n``desktop.chrome`` or ``mobile.chrome``; only ``desktop``, ``mobile``\nand ``tablet`` are supported.\n\nUsage with `scrapy-proxies`\n---------------------------\n\nTo use this middleware together with a random-proxy middleware such as\n`scrapy-proxies`, you need to:\n\n1. Set ``RANDOM_UA_PER_PROXY`` to ``True`` to pick a new User-Agent for\n   each proxy.\n\n2. Set the priority of ``RandomUserAgentMiddleware`` to be greater than\n   that of ``scrapy-proxies``, so that the proxy is set before the\n   User-Agent is chosen.\n\nConfiguring Fake-UserAgent fallback\n-----------------------------------\n\nThere's a configuration parameter ``FAKEUSERAGENT_FALLBACK``, defaulting\nto ``None``. You can set it to a string value, for example ``Mozilla``;\nwhen fake-useragent fails to provide a User-Agent, the fallback value is\nused instead of raising an exception.\n", "description_content_type": "", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/Qiulin-Wang/scrapy-fake-useragent", "keywords": "scrapy proxy user-agent web-scraping", "license": "New BSD License", "maintainer": "", "maintainer_email": "", "name": "scrapy-fake-useragent-fix", "package_url": "https://pypi.org/project/scrapy-fake-useragent-fix/", "platform": "", "project_url": "https://pypi.org/project/scrapy-fake-useragent-fix/", "project_urls": { "Homepage": "https://github.com/Qiulin-Wang/scrapy-fake-useragent" }, "release_url": "https://pypi.org/project/scrapy-fake-useragent-fix/0.1.1/", "requires_dist": [ "fake-useragent", "user-agents" ], "requires_python": "", "summary": "Use a random User-Agent provided by fake-useragent for every request", "version": "0.1.1" }, "last_serial": 4365133, "releases": { "0.1.0": [ { "comment_text": "", "digests": { "md5": "d82e854436f6a83bf3e6f981d87d050b", "sha256": "6b6626324646016610505717628739a12cfc9828eced66beff7a058f1fccf595" }, "downloads": -1, "filename": "scrapy_fake_useragent_fix-0.1.0-py2.py3-none-any.whl", "has_sig": 
false, "md5_digest": "d82e854436f6a83bf3e6f981d87d050b", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5554, "upload_time": "2018-10-11T16:40:21", "url": "https://files.pythonhosted.org/packages/b4/ea/dfeda14979491715ce1a37fcca81770ddfe1c7016ed1653c3815dec50d89/scrapy_fake_useragent_fix-0.1.0-py2.py3-none-any.whl" } ], "0.1.1": [ { "comment_text": "", "digests": { "md5": "edd8714fdcf8119ff36ad977d9b760d8", "sha256": "5a952a3004c57145c0cb55ebcfa81542a60913eae5c34efa88f61eaaf13bc52a" }, "downloads": -1, "filename": "scrapy_fake_useragent_fix-0.1.1-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "edd8714fdcf8119ff36ad977d9b760d8", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5552, "upload_time": "2018-10-11T16:42:33", "url": "https://files.pythonhosted.org/packages/d3/55/95575029a529f244216dda18149ef17bd1d023c0c8a6af60b5582d339c08/scrapy_fake_useragent_fix-0.1.1-py2.py3-none-any.whl" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "edd8714fdcf8119ff36ad977d9b760d8", "sha256": "5a952a3004c57145c0cb55ebcfa81542a60913eae5c34efa88f61eaaf13bc52a" }, "downloads": -1, "filename": "scrapy_fake_useragent_fix-0.1.1-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "edd8714fdcf8119ff36ad977d9b760d8", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 5552, "upload_time": "2018-10-11T16:42:33", "url": "https://files.pythonhosted.org/packages/d3/55/95575029a529f244216dda18149ef17bd1d023c0c8a6af60b5582d339c08/scrapy_fake_useragent_fix-0.1.1-py2.py3-none-any.whl" } ] }