{ "info": { "author": "Thomas Gaigher", "author_email": "info@pypyr.io", "bugtrack_url": null, "classifiers": [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: Apache Software License", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Topic :: Software Development :: Build Tools", "Topic :: Software Development :: Libraries :: Python Modules" ], "description": "#################\npypyr aws plug-in\n#################\n\n.. image:: https://pypyr.io/images/pypyr-logo-small.png\n :alt: pypyr-logo\n :align: left\n\n*pypyr*\n pronounce how you like, but I generally say *piper* as in \"piping down the\n valleys wild\"\n\n `pypyr `__ is a command line interface to\n run pipelines defined in yaml.\n\n*the pypyr aws plug-in*\n Run anything on aws. No really, anything. If the aws api supports it, the\n pypyr aws plug-in supports it.\n\n It's a pretty easy way of invoking the aws api as a step\n in a series of steps.\n Why use this when you could just use the aws-cli instead? The aws cli is all\n kinds of awesome, but I find more often than not it's not just one or two aws\n *ad hoc* cli or aws api methods you have to execute, but especially when\n automating and scripting you actually need to run a sequence of commands,\n where the output of a previous command influences what you pass to the next\n command.\n\n Sure, you can bash it up, and I do that too, but running it as a pipeline\n via pypyr has actually made my life quite a bit easier in terms of not having\n to deal with conditionals, error traps and input validation.\n\n|build-status| |coverage| |pypi|\n\n.. contents::\n\n.. section-numbering::\n\n************\nInstallation\n************\n\npip\n===\n.. code-block:: bash\n\n # pip install --upgrade pypyraws\n\npypyraws depends on the ``pypyr`` cli. 
The above ``pip`` will install it for\nyou if you don't have it already.\n\nPython version\n==============\nTested against Python >=3.6\n\n********\nExamples\n********\nIf you prefer reading code to reading words, https://github.com/pypyr/pypyr-example\n\n*****\nsteps\n*****\n+-------------------------------+-------------------------------------------------+------------------------------+\n| **step** | **description** | **input context properties** |\n+-------------------------------+-------------------------------------------------+------------------------------+\n| `pypyraws.steps.client`_ | Execute any low-level aws client method | awsClientIn (dict) |\n+-------------------------------+-------------------------------------------------+------------------------------+\n| `pypyraws.steps.ecswaitprep`_ | Run me after an ecs task run or stop to prepare | awsClientOut (dict) |\n| | an ecs waiter. | |\n| | | awsEcsWaitPrepCluster (str) |\n+-------------------------------+-------------------------------------------------+------------------------------+\n| `pypyraws.steps.s3fetchjson`_ | Fetch a json file from s3 into the pypyr | s3Fetch (dict) |\n| | context. | |\n+-------------------------------+-------------------------------------------------+------------------------------+\n| `pypyraws.steps.s3fetchyaml`_ | Fetch a yaml file from s3 into the pypyr | s3Fetch (dict) |\n| | context. | |\n+-------------------------------+-------------------------------------------------+------------------------------+\n| `pypyraws.steps.wait`_ | Wait for an aws client waiter method to | awsWaitIn (dict) |\n| | complete. | |\n+-------------------------------+-------------------------------------------------+------------------------------+\n| `pypyraws.steps.waitfor`_ | Wait for any aws client method to complete, | awsWaitFor (dict) |\n| | even when it doesn't have an official waiter. 
| |\n+-------------------------------+-------------------------------------------------+------------------------------+\n\npypyraws.steps.client\n=====================\nWhat can I do with pypyraws.steps.client?\n-----------------------------------------\nThis step provides an easy way of getting at the low-level AWS api from the\npypyr pipeline runner. So in short, pretty much anything you can do with the\nAWS api, you got it, as the Big O might have said.\n\nThis step lets you specify the service name and the service method you want to\nexecute dynamically. You can also control the service header arguments and the\nmethod arguments themselves.\n\nThe arguments you pass to the service and its methods are exactly as given by\nthe AWS help documentation. So you do not have to learn yet another\nconfiguration based abstraction on top of the AWS api that might not even\nsupport all the methods you need.\n\nYou can actually pretty much just grab the json as written from the excellent\nAWS help docs, paste it into some json that pypyr consumes and tadaaa!\nAlternatively, grab the samples from the boto3 python documentation to include\nin some yaml - the python dictionary structures map to yaml without too much\nfaff.\n\nSupported AWS services\n----------------------\nClients provide a low-level interface to AWS whose methods map close to 1:1\nwith the AWS REST service APIs. 
All service operations are supported by clients.\n\nRun any method on any of the following aws low-level client services:\n\n acm, apigateway, application-autoscaling, appstream, autoscaling,\n batch, budgets, clouddirectory, cloudformation, cloudfront, cloudhsm,\n cloudsearch, cloudsearchdomain, cloudtrail, cloudwatch, codebuild, codecommit,\n codedeploy, codepipeline, codestar, cognito-identity, cognito-idp,\n cognito-sync, config, cur, datapipeline, devicefarm, directconnect, discovery,\n dms, ds, dynamodb, dynamodbstreams, ec2, ecr, ecs, efs, elasticache,\n elasticbeanstalk, elastictranscoder, elb, elbv2, emr, es, events, firehose,\n gamelift, glacier, health, iam, importexport, inspector, iot, iot-data,\n kinesis, kinesisanalytics, kms, lambda, lex-models, lex-runtime, lightsail,\n logs, machinelearning, marketplace-entitlement, marketplacecommerceanalytics,\n meteringmarketplace, mturk, opsworks, opsworkscm, organizations, pinpoint,\n polly, rds, redshift, rekognition, resourcegroupstaggingapi, route53,\n route53domains, s3, sdb, servicecatalog, ses, shield, sms, snowball, sns, sqs,\n ssm, stepfunctions, storagegateway, sts, support, swf, waf, waf-regional,\n workdocs, workspaces, xray\n\nYou can find full details for the supported services and what methods you can\nrun against them here: http://boto3.readthedocs.io/en/latest/reference/services/\n\nWith the speed of new features and services AWS introduces, it's pretty\nunlikely I'll get round to updating the list each and every time.\n\npypyr-aws will automatically support new services AWS releases for the boto3\nclient, in case the list above gets out of date. So while the document might\nnot update, the code already will dynamically use new features and services on\nthe boto3 client.\n\npypyr context\n-------------\nRequires the following context items:\n\n.. 
code-block:: yaml\n\n awsClientIn:\n serviceName: 'aws service name here'\n methodName: 'execute this method of the aws service'\n clientArgs: # optional\n arg1Name: arg1Value\n arg2Name: arg2Value\n methodArgs: # optional\n arg1Name: arg1Value\n arg2Name: arg2Value\n\nThe *awsClientIn* context supports text `Substitutions`_.\n\nAWS response\n------------\nAfter this step completes the full response is available to subsequent steps\nin the pypyr context in the *awsClientOut* key.\n\nSample pipeline\n---------------\nHere is some sample yaml of what a pipeline using the pypyr-aws plug-in *client*\nstep could look like:\n\n.. code-block:: yaml\n\n context_parser: pypyr.parser.keyvaluepairs\n steps:\n - name: pypyraws.steps.client\n description: upload a file to s3\n in:\n awsClientIn:\n serviceName: s3\n methodName: upload_file\n methodArgs:\n Filename: ./testfiles/arb.txt\n Bucket: '{bucket}'\n Key: arb.txt\n\nIf you saved this yaml as ``./pipelines/go-go-s3.yaml``, you can run\nfrom ./ the following to upload *arb.txt* to your specified bucket:\n\n.. 
code-block:: bash\n\n $ pypyr go-go-s3 \"bucket=myuniquebucketname\"\n\n\nSee a worked example for `pypyr aws s3 here\n`__.\n\npypyraws.steps.ecswaitprep\n==========================\nRun me after an ecs task run or stop to prepare an ecs waiter.\n\nPrepares the awsWaitIn context key for pypyraws.steps.wait.\n\nAvailable ecs waiters are:\n\n- ServicesInactive\n- ServicesStable\n- TasksRunning\n- TasksStopped\n\nFull details here: http://boto3.readthedocs.io/en/latest/reference/services/ecs.html#waiters\n\nUse this step after any of the following ecs client methods if you want to use\none of the ecs waiters to wait for a specific state:\n\n- describe_services\n- describe_tasks\n- list_services - specify awsEcsWaitPrepCluster if you don't want the default\n- list_tasks - specify awsEcsWaitPrepCluster if you don't want the default\n- run_task\n- start_task\n- stop_task\n- update_service\n\nYou don't have to use this step; you could always just construct the awsWaitIn\ndictionary in context yourself. It just so happens this step saves you some\nlegwork in doing so.\n\nRequired context:\n\n- awsClientOut\n\n - dict. mandatory.\n - This is the context key that any ecs command executed by\n pypyraws.steps.client adds. Chances are pretty good you don't want to\n construct this by hand yourself - the idea is to use the output as\n generated by one of the supported ecs methods.\n\n- awsEcsWaitPrepCluster\n\n - string. optional.\n - The short name or full arn of the cluster that hosts the task to\n describe. If you do not specify a cluster, the default cluster is\n assumed. 
For most of the ecs methods the code automatically deduces the\n cluster from awsClientOut, so don't worry about it.\n - But, when following list_services and list_tasks, you have to specify\n this parameter.\n - Specifying this parameter will override any automatically deduced cluster arn.\n\nSee a worked example for `pypyr aws ecs here\n`__.\n\npypyraws.steps.s3fetchjson\n==========================\nFetch a json file from s3 and put the json values into context.\n\nRequired input context is:\n\n.. code-block:: yaml\n\n s3Fetch:\n clientArgs: # optional\n arg1Name: arg1Value\n methodArgs:\n Bucket: '{bucket}'\n Key: arb.json\n outKey: 'destination pypyr context key' # optional\n\n- *clientArgs* are passed to the aws s3 client constructor. These are optional.\n- *methodArgs* are passed to the s3 ``get_object`` call. The minimum required\n values are:\n\n - Bucket\n - Key\n\n- Check here for all available arguments (including SSE server-side encryption):\n http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.get_object\n- *outKey* writes fetched json to this context key. If not specified, json\n writes directly to context root.\n\nJson parsed from the file will be merged into the pypyr context. This will\noverwrite existing values if the same keys are already in there.\n\nI.e. if the file json has ``{'eggs': 'boiled'}``, but context ``{'eggs': 'fried'}``\nalready exists, returned ``context['eggs']`` will be 'boiled'.\n\nIf *outKey* is not specified, the json should not be an Array [] at the root\nlevel, but rather an Object {}.\n\nThe *s3Fetch* input context supports text `Substitutions`_.\n\nSee a worked example for `pypyr aws s3fetch here\n`__.\n\npypyraws.steps.s3fetchyaml\n==========================\nFetch a yaml file from s3 and put the yaml structure into context.\n\nRequired input context is:\n\n.. 
code-block:: yaml\n\n s3Fetch:\n clientArgs: # optional\n arg1Name: arg1Value\n methodArgs:\n Bucket: '{bucket}'\n Key: arb.yaml\n outKey: 'destination pypyr context key' # optional\n\n- *clientArgs* are passed to the aws s3 client constructor. These are optional.\n- *methodArgs* are passed to the s3 ``get_object`` call. The minimum required\n values are:\n\n - Bucket\n - Key\n\n- Check here for all available arguments (including SSE server-side encryption):\n http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.get_object\n- *outKey* writes fetched yaml to this context key. If not specified, yaml\n writes directly to context root.\n\nThe *s3Fetch* context supports text `Substitutions`_.\n\nYaml parsed from the file will be merged into the pypyr context. This will\noverwrite existing values if the same keys are already in there.\n\nI.e. if the file yaml has\n\n.. code-block:: yaml\n\n eggs: boiled\n\nbut context ``{'eggs': 'fried'}`` already exists, returned ``context['eggs']``\nwill be 'boiled'.\n\nIf *outKey* is not specified, the yaml should not be a list at the top level,\nbut rather a mapping. So the top-level yaml should not look like this:\n\n.. code-block:: yaml\n\n - eggs\n - ham\n\nbut rather like this:\n\n.. code-block:: yaml\n\n breakfastOfChampions:\n - eggs\n - ham\n\nSee a worked example for `pypyr aws s3fetch here\n`__.\n\npypyraws.steps.wait\n===================\nWait for things in AWS to complete before continuing the pipeline.\n\nRun any low-level boto3 client wait() from get_waiter.\n\nWaiters use a client's service operations to poll the status of an AWS resource\nand suspend execution until the AWS resource reaches the state that the waiter\nis polling for or a failure occurs while polling.\n\nhttp://boto3.readthedocs.io/en/latest/guide/clients.html#waiters\n\nThe input context requires:\n\n.. 
code-block:: yaml\n\n awsWaitIn:\n serviceName: 'service name' # Available services here: http://boto3.readthedocs.io/en/latest/reference/services/\n waiterName: 'waiter name' # Check service docs for available waiters for each service\n waiterArgs:\n arg1Name: arg1Value # optional. Dict. kwargs for get_waiter\n waitArgs:\n arg1Name: arg1Value # optional. Dict. kwargs for wait\n\nThe *awsWaitIn* context supports text `Substitutions`_.\n\npypyraws.steps.waitfor\n======================\nCustom waiter for any aws client operation. Where `pypyraws.steps.wait`_ uses\nthe official AWS waiters from the low-level client api, this step allows you to\nexecute *any* aws low-level client method and wait for a specified field in\nthe response to become the value you want it to be.\n\nThis is especially handy for things like Beanstalk, because Elastic Beanstalk\ndoes not have Waiters for environment creation.\n\nThe input context looks like this:\n\n.. code-block:: yaml\n\n awsWaitFor:\n awsClientIn: # required. awsClientIn allows the same arguments as pypyraws.steps.client.\n serviceName: elasticbeanstalk\n methodName: describe_environments\n methodArgs:\n ApplicationName: my wonderful beanstalk default application\n EnvironmentNames:\n - my-wonderful-environment\n VersionLabel: v0.1\n waitForField: '{Environments[0][Status]}' # required. format expression for field name to check in awsClient response\n toBe: Ready # required. Stop waiting when waitForField equals this value\n pollInterval: 30 # optional. Seconds to wait between polling attempts. Defaults to 30 if not specified.\n maxAttempts: 10 # optional. Defaults to 10 if not specified.\n errorOnWaitTimeout: True # optional. Defaults to True if not specified. 
Stop processing if maxAttempts exhausted without reaching toBe value.\n\nSee `pypyraws.steps.client`_ for a full listing of available arguments under\n*awsClientIn*.\n\nIf ``errorOnWaitTimeout`` is True and ``maxAttempts`` is exhausted before reaching\nthe desired target state, pypyr will stop processing with a\n``pypyraws.errors.WaitTimeOut`` error.\n\nOnce this step completes it adds ``awsWaitForTimedOut`` to the pypyr context.\nThis is a boolean with the following values:\n\n+--------------------------+---------------------------------------------------+\n| awsWaitForTimedOut | Description |\n+--------------------------+---------------------------------------------------+\n| True | ``errorOnWaitTimeout=False`` and ``maxAttempts`` |\n| | exhausted without reaching ``toBe``. |\n+--------------------------+---------------------------------------------------+\n| False | ``waitForField``'s value becomes ``toBe`` within |\n| | ``maxAttempts``. |\n+--------------------------+---------------------------------------------------+\n\n\nThe *awsWaitFor* context supports text `Substitutions`_. Do note that while\n``waitForField`` uses substitution-style format strings, the substitutions are\nmade against the response object that returns from the aws client call specified\nin *awsClientIn*, and not from the pypyr context itself.\n\nSee a worked example for an `elastic beanstalk custom waiter for environment\ncreation here\n`__.\n\n*************\nSubstitutions\n*************\nYou can use substitution tokens, aka string interpolation, where specified for\ncontext items. This substitutes anything between {curly braces} with the\ncontext value for that key. This also works where you have dictionaries/lists\ninside dictionaries/lists. For example, if your context looked like this:\n\n.. 
code-block:: yaml\n\n bucketValue: the.bucket\n keyValue: dont.kick\n moreArbText: wild\n awsClientIn:\n serviceName: s3\n methodName: get_object\n methodArgs:\n Bucket: '{bucketValue}'\n Key: '{keyValue}'\n\nThis will run s3 get_object to retrieve file *dont.kick* from *the.bucket*.\n\n- *Bucket: '{bucketValue}'* becomes *Bucket: the.bucket*\n- *Key: '{keyValue}'* becomes *Key: dont.kick*\n\nIn json & yaml, curlies need to be inside quotes to make sure they parse as\nstrings.\n\nEscape literal curly braces with doubles: {{ for {, }} for }\n\nSee a worked example `for substitutions here\n`__.\n\n\n******************\naws authentication\n******************\nConfiguring credentials\n=======================\npypyr-aws pretty much just uses the underlying boto3 authentication mechanisms.\nMore info here: http://boto3.readthedocs.io/en/latest/guide/configuration.html\n\nThis means any of the following will work:\n\n- If you are running inside of AWS - on EC2 or inside an ECS container, it will\n automatically use IAM role credentials if it does not find credentials in any\n of the other places listed below.\n- In the pypyr context\n\n .. code-block:: python\n\n context['awsClientIn']['clientArgs'] = {\n 'aws_access_key_id': ACCESS_KEY,\n 'aws_secret_access_key': SECRET_KEY,\n 'aws_session_token': SESSION_TOKEN,\n }\n\n- $ENV variables\n\n - AWS_ACCESS_KEY_ID\n - AWS_SECRET_ACCESS_KEY\n - AWS_SESSION_TOKEN\n\n- Credentials file at *~/.aws/credentials* or *~/.aws/config*\n\n - If you have the aws-cli installed, run ``aws configure`` to get these\n configured for you automatically.\n\nTip: On dev boxes I generally don't bother with credentials, because chances\nare pretty good that I have the aws-cli installed already anyway, so pypyr\nwill just re-use the aws shared configuration files that are already there.\n\nEnsure secrets stay secret\n==========================\nBe safe! Don't hard-code your aws credentials. 
Don't check credentials into a\npublic repo.\n\nTip: if you're running pypyr inside of aws - e.g. in an ec2 instance or an ecs\ncontainer that is running under an IAM role, you don't actually *need* to\nexplicitly configure credentials for pypyr-aws.\n\nDo remember not to fling your key & secret around as shell arguments - it could\nvery easily leak that way into logs or be exposed via ``ps``. I generally use one\nof the pypyr built-in context parsers like *pypyr.parser.jsonfile* or\n*pypyr.parser.yamlfile*, see\n`here for details `__.\n\nDo remember also that $ENV variables are not a particularly secure place to\nkeep your secrets.\n\n*******\nTesting\n*******\nTesting without worrying about dependencies\n===========================================\nRun from tox to test the packaging cycle inside a virtual env, plus run all\ntests:\n\n.. code-block:: bash\n\n # just run tests\n $ tox -e dev -- tests\n # run tests, validate README.rst, run flake8 linter\n $ tox -e stage -- tests\n\nIf tox is taking too long\n=========================\nThe test framework is pytest. If you only want to run tests:\n\n.. code-block:: bash\n\n $ pip install -e .[dev,test]\n\nDay-to-day testing\n==================\n- Tests live under */tests* (surprising, eh?). Mirror the directory structure of\n the code being tested.\n- Prefix a test definition with *test_* - so a unit test looks like\n\n .. code-block:: python\n\n def test_this_should_totally_work():\n\n- To execute tests, from root directory:\n\n .. code-block:: bash\n\n pytest tests\n\n- For a bit more info on running tests:\n\n .. code-block:: bash\n\n pytest --verbose [path]\n\n- To execute a specific test module:\n\n .. code-block:: bash\n\n pytest tests/unit/arb_test_file.py\n\n*****\nHelp!\n*****\nDon't Panic! 
For help, community or talk, join the chat on |discord|!\n\n**********\nContribute\n**********\nDevelopers\n==========\nFor information on how to help with pypyr, run tests and coverage, please do\ncheck out the `contribution guide `_.\n\nBugs\n====\nWell, you know. No one's perfect. Feel free to `create an issue\n`_.\n\n\n.. |build-status| image:: https://api.shippable.com/projects/58efdfe130eb380700e559a4/badge?branch=master\n :alt: build status\n :target: https://app.shippable.com/github/pypyr/pypyr-aws\n\n.. |coverage| image:: https://api.shippable.com/projects/58efdfe130eb380700e559a4/coverageBadge?branch=master\n :alt: coverage status\n :target: https://app.shippable.com/github/pypyr/pypyr-aws\n\n.. |pypi| image:: https://badge.fury.io/py/pypyraws.svg\n :alt: pypi version\n :target: https://pypi.python.org/pypi/pypyraws/\n :align: bottom\n\n.. |discord| replace:: `discord `__\n\n\n", "description_content_type": "", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/pypyr/pypyr-aws", "keywords": "pypyr aws plugin devops pipeline runner", "license": "Apache License 2.0", "maintainer": "", "maintainer_email": "", "name": "pypyraws", "package_url": "https://pypi.org/project/pypyraws/", "platform": "", "project_url": "https://pypi.org/project/pypyraws/", "project_urls": { "Homepage": "https://github.com/pypyr/pypyr-aws" }, "release_url": "https://pypi.org/project/pypyraws/1.1.1/", "requires_dist": [ "pypyr", "boto3", "ruamel.yaml", "bumpversion; extra == 'deploy'", "twine; extra == 'deploy'", "check-manifest; extra == 'dev'", "flake8; extra == 'dev'", "pytest; extra == 'test'", "pytest-cov; extra == 'test'", "tox; extra == 'test'" ], "requires_python": "", "summary": "pypyr pipeline runner AWS plugin. 
Steps for ECS, S3, Beanstalk.", "version": "1.1.1" }, "last_serial": 5693243, "releases": { "0.0.0": [], "0.0.1": [ { "comment_text": "", "digests": { "md5": "88600e65ee5ff2a175ca3f74271c850f", "sha256": "0448c347d6fecac699c60be28852e870aa50ff46c4a6c8fde2f91cdeb0390915" }, "downloads": -1, "filename": "pypyraws-0.0.1-py3-none-any.whl", "has_sig": false, "md5_digest": "88600e65ee5ff2a175ca3f74271c850f", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 14801, "upload_time": "2017-05-12T20:40:00", "url": "https://files.pythonhosted.org/packages/f6/41/380f4948284e55c577d44c75a253aebc3d85e8bdc2751c38ce60afef16ef/pypyraws-0.0.1-py3-none-any.whl" } ], "0.0.2": [ { "comment_text": "", "digests": { "md5": "07839907be21a50944704ff1e25efb83", "sha256": "899d4aca4a1369cf3e75c9a42a325888ab46b48cf65a4a59159e95e052e9ff44" }, "downloads": -1, "filename": "pypyraws-0.0.2-py3-none-any.whl", "has_sig": false, "md5_digest": "07839907be21a50944704ff1e25efb83", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 23841, "upload_time": "2017-05-12T22:13:42", "url": "https://files.pythonhosted.org/packages/e2/b6/ae717aba0f06e928b939af58bc4735632e4fcad74f961597ac8fa36e3c88/pypyraws-0.0.2-py3-none-any.whl" } ], "0.0.3": [ { "comment_text": "", "digests": { "md5": "bb718c4c031b72514e9aec1de9c0ac3d", "sha256": "bb3552cdf7010516209fb8d46578f45850eece7d4c8749cc59ef461d82b65c4a" }, "downloads": -1, "filename": "pypyraws-0.0.3-py3-none-any.whl", "has_sig": false, "md5_digest": "bb718c4c031b72514e9aec1de9c0ac3d", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 28855, "upload_time": "2017-06-02T15:05:02", "url": "https://files.pythonhosted.org/packages/a3/f5/04f85105889eed90df13abc0789e7037372a728ace51ec9a5d30a8e8a553/pypyraws-0.0.3-py3-none-any.whl" } ], "0.0.4": [ { "comment_text": "", "digests": { "md5": "e5fffc3f5573bf020392abd23b932c5c", "sha256": 
"1983aa7361270eb467b16221d52d9c158e8adcdf0bedf1f759353736aa3815f5" }, "downloads": -1, "filename": "pypyraws-0.0.4-py3-none-any.whl", "has_sig": false, "md5_digest": "e5fffc3f5573bf020392abd23b932c5c", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 28873, "upload_time": "2017-06-05T08:07:23", "url": "https://files.pythonhosted.org/packages/5c/46/4176f8ba05c1bca194493a9f2dd6c3a42188249a075501ba8e800d3ed3a1/pypyraws-0.0.4-py3-none-any.whl" } ], "0.1.1": [ { "comment_text": "", "digests": { "md5": "546087e09830e7ea2fcdfc0139bfc9f8", "sha256": "bc1860145ef940af5d61bc38cf8203b9810cf849d71ae2bc7bca22bfc8a5c9d5" }, "downloads": -1, "filename": "pypyraws-0.1.1-py3-none-any.whl", "has_sig": false, "md5_digest": "546087e09830e7ea2fcdfc0139bfc9f8", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 20708, "upload_time": "2018-06-14T02:19:25", "url": "https://files.pythonhosted.org/packages/1f/66/8577be3b71cd904a99fe2bb53b4a218ee51747384f1668d08c1e76368a67/pypyraws-0.1.1-py3-none-any.whl" } ], "1.0.0": [ { "comment_text": "", "digests": { "md5": "b603f9532b683861ad7758aea3c6aa9c", "sha256": "8f641e1235cfa12094aa9817f29ac03d297e05e28753982352e7269b17774d73" }, "downloads": -1, "filename": "pypyraws-1.0.0-py3-none-any.whl", "has_sig": false, "md5_digest": "b603f9532b683861ad7758aea3c6aa9c", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 20718, "upload_time": "2018-07-26T20:34:54", "url": "https://files.pythonhosted.org/packages/4c/73/9c602778fafa67b05f153b70ef4de162ed82ff0bb44b843d9b89034bf1fd/pypyraws-1.0.0-py3-none-any.whl" } ], "1.1.0": [ { "comment_text": "", "digests": { "md5": "999a4bf251ee478c2248d22b1a4bfb0a", "sha256": "6513bc88f59f21f09c68cabe210700f06171a7166fd1374ee9c554dd200faae5" }, "downloads": -1, "filename": "pypyraws-1.1.0-py3-none-any.whl", "has_sig": false, "md5_digest": "999a4bf251ee478c2248d22b1a4bfb0a", "packagetype": "bdist_wheel", 
"python_version": "py3", "requires_python": null, "size": 21447, "upload_time": "2018-11-21T22:38:01", "url": "https://files.pythonhosted.org/packages/11/e8/2f630a96e61e77199cd55a6d5b2736f7f71f0824017703b033a34ffb766c/pypyraws-1.1.0-py3-none-any.whl" } ], "1.1.1": [ { "comment_text": "", "digests": { "md5": "3a389e016ba92200940763ed24fc5a7c", "sha256": "46a030dd4ce6f168953fabcefd33ef03815524cc9d17ec70ee115497d345c31d" }, "downloads": -1, "filename": "pypyraws-1.1.1-py3-none-any.whl", "has_sig": false, "md5_digest": "3a389e016ba92200940763ed24fc5a7c", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 21433, "upload_time": "2019-08-18T02:20:34", "url": "https://files.pythonhosted.org/packages/9d/5f/c0ac3128650865e703a7e026fe062aaf1b99c3439daaec258f2240d4d1e5/pypyraws-1.1.1-py3-none-any.whl" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "3a389e016ba92200940763ed24fc5a7c", "sha256": "46a030dd4ce6f168953fabcefd33ef03815524cc9d17ec70ee115497d345c31d" }, "downloads": -1, "filename": "pypyraws-1.1.1-py3-none-any.whl", "has_sig": false, "md5_digest": "3a389e016ba92200940763ed24fc5a7c", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 21433, "upload_time": "2019-08-18T02:20:34", "url": "https://files.pythonhosted.org/packages/9d/5f/c0ac3128650865e703a7e026fe062aaf1b99c3439daaec258f2240d4d1e5/pypyraws-1.1.1-py3-none-any.whl" } ] }