{ "info": { "author": "source{d}", "author_email": "machine-learning@sourced.tech", "bugtrack_url": null, "classifiers": [ "Development Status :: 5 - Production/Stable", "Environment :: Console", "Intended Audience :: Developers", "License :: OSI Approved :: Apache Software License", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8" ], "description": "

\n\n

\n

Hercules

\n

\n Fast, insightful and highly customizable Git history analysis.

\n \"GoDoc\"\n \"Travis\n \"AppVeyor\n \"PyPi\n \"Docker\n \"Code\n \"Go\n \"Apache\n

\n

\n Overview \u2022\n How To Use \u2022\n Installation \u2022\n Contributions \u2022\n License\n

\n\n--------\n\n\nTable of Contents\n=================\n\n * [Overview](#overview)\n * [Installation](#installation)\n * [Build from source](#build-from-source)\n * [GitHub Action](#github-action)\n * [Contributions](#contributions)\n * [License](#license)\n * [Usage](#usage)\n * [Caching](#caching)\n * [GitHub Action](#github-action-1)\n * [Docker image](#docker-image)\n * [Built-in analyses](#built-in-analyses)\n * [Project burndown](#project-burndown)\n * [Files](#files)\n * [People](#people)\n * [Overwrites matrix](#overwrites-matrix)\n * [Code ownership](#code-ownership)\n * [Couples](#couples)\n * [Structural hotness](#structural-hotness)\n * [Aligned commit series](#aligned-commit-series)\n * [Added vs changed lines through time](#added-vs-changed-lines-through-time)\n * [Efforts through time](#efforts-through-time)\n * [Sentiment (positive and negative comments)](#sentiment-positive-and-negative-comments)\n * [Everything in a single pass](#everything-in-a-single-pass)\n * [Plugins](#plugins)\n * [Merging](#merging)\n * [Bad unicode errors](#bad-unicode-errors)\n * [Plotting](#plotting)\n * [Custom plotting backend](#custom-plotting-backend)\n * [Caveats](#caveats)\n * [Burndown Out-Of-Memory](#burndown-out-of-memory)\n\n## Overview\n\nHercules is an amazingly fast and highly customizable Git repository analysis engine written in Go. Batteries are included.\nPowered by [go-git](https://github.com/src-d/go-git) and [Babelfish](https://doc.bblf.sh).\n\nThere are two command-line tools: `hercules` and `labours`. The first is a program\nwritten in Go which takes a Git repository and executes a Directed Acyclic Graph (DAG) of [analysis tasks](doc/PIPELINE_ITEMS.md) over the full commit history.\nThe second is a Python script which shows some predefined plots over the collected data. These two tools are normally used together through\na pipe. It is possible to write custom analyses using the plugin system. 
It is also possible\nto merge several analysis results together - relevant for organizations. \nThe analyzed commit history includes branches, merges, etc.\n\nHercules has been successfully used for several internal projects at [source{d}](https://sourced.tech).\nThere are blog posts: [1](https://blog.sourced.tech/post/hercules-v4), [2](https://blog.sourced.tech/post/hercules) and\na [presentation](http://vmarkovtsev.github.io/gowayfest-2018-minsk/). Please [contribute](#contributions)\nby testing, fixing bugs, adding [new analyses](https://github.com/src-d/hercules/issues?q=is%3Aissue+is%3Aopen+label%3Anew-analysis), or coding swagger!\n\n![Hercules DAG of Burndown analysis](doc/dag.png)\n

The DAG of burndown and couples analyses with UAST diff refining. Generated with hercules --burndown --burndown-people --couples --feature=uast --dry-run --dump-dag doc/dag.dot https://github.com/src-d/hercules

\n\n![git/git image](doc/linux.png)\n

torvalds/linux line burndown (granularity 30, sampling 30, resampled by year). Generated with hercules --burndown --first-parent --pb https://github.com/torvalds/linux | labours -f pb -m burndown-project in 1h 40min.

\n\n## Installation\n\nGrab the `hercules` binary from the [Releases page](https://github.com/src-d/hercules/releases).\n`labours` is installable from [PyPi](https://pypi.org/):\n\n```\npip3 install labours\n```\n\n[`pip3`](https://pip.pypa.io/en/stable/installing/) is the Python package manager.\n\nNumpy and Scipy can be installed on Windows using http://www.lfd.uci.edu/~gohlke/pythonlibs/\n\n### Build from source\nYou are going to need Go (>= v1.11) and [`protoc`](https://github.com/google/protobuf/releases).\n```\ngit clone https://github.com/src-d/hercules && cd hercules\nmake\npip3 install -e ./python\n```\n\n### GitHub Action\n\nIt is possible to run Hercules as a [GitHub Action](https://help.github.com/en/articles/about-github-actions):\n[Hercules on GitHub Marketplace](https://github.com/marketplace/actions/hercules-insights).\nPlease refer to the [sample workflow](.github/workflows/main.yml) which demonstrates how to set it up.\n\n## Contributions\n\n...are welcome! See [CONTRIBUTING](CONTRIBUTING.md) and [code of conduct](CODE_OF_CONDUCT.md).\n\n## License\n[Apache 2.0](LICENSE.md)\n\n## Usage\n\nThe most useful and reliably up-to-date command line reference:\n\n```\nhercules --help\n```\n\nSome examples:\n\n```\n# Use the \"memory\" go-git backend and display the burndown plot. 
\"memory\" is the fastest but the repository's git data must fit into RAM.\nhercules --burndown https://github.com/src-d/go-git | labours -m burndown-project --resample month\n# Use \"file system\" go-git backend and print some basic information about the repository.\nhercules /path/to/cloned/go-git\n# Use \"file system\" go-git backend, cache the cloned repository to /tmp/repo-cache, use Protocol Buffers and display the burndown plot without resampling.\nhercules --burndown --pb https://github.com/git/git /tmp/repo-cache | labours -m burndown-project -f pb --resample raw\n\n# Now something fun\n# Get the linear history from git rev-list, reverse it\n# Pipe to hercules, produce burndown snapshots for every 30 days grouped by 30 days\n# Save the raw data to cache.yaml, so that later is possible to labours -i cache.yaml\n# Pipe the raw data to labours, set text font size to 16pt, use Agg matplotlib backend and save the plot to output.png\ngit rev-list HEAD | tac | hercules --commits - --burndown https://github.com/git/git | tee cache.yaml | labours -m burndown-project --font-size 16 --backend Agg --output git.png\n```\n\n`labours -i /path/to/yaml` allows to read the output from `hercules` which was saved on disk.\n\n### Caching\n\nIt is possible to store the cloned repository on disk. The subsequent analysis can run on the\ncorresponding directory instead of cloning from scratch:\n\n```\n# First time - cache\nhercules https://github.com/git/git /tmp/repo-cache\n\n# Second time - use the cache\nhercules --some-analysis /tmp/repo-cache\n```\n\n### GitHub Action\n\nThe action produces the artifact named\n`hercules_charts`. Since it is currently impossible to pack several files in one artifact, all the\ncharts and Tensorflow Projector files are packed in the inner tar archive. In order to view the embeddings,\ngo to [projector.tensorflow.org](https://projector.tensorflow.org), click \"Load\" and choose the two TSVs. 
Then use UMAP or T-SNE.\n\n### Docker image\n\n```\ndocker run --rm srcd/hercules hercules --burndown --pb https://github.com/git/git | docker run --rm -i -v $(pwd):/io srcd/hercules labours -f pb -m burndown-project -o /io/git_git.png\n```\n\n### Built-in analyses\n\n#### Project burndown\n\n```\nhercules --burndown\nlabours -m burndown-project\n```\n\nLine burndown statistics for the whole repository.\nExactly the same as what [git-of-theseus](https://github.com/erikbern/git-of-theseus)\ndoes, but much faster. Blaming is performed efficiently and incrementally using a custom RB tree tracking\nalgorithm, and only the last modification date is recorded while running the analysis.\n\nAll burndown analyses depend on the values of *granularity* and *sampling*.\nGranularity is the number of days each band in the stack consists of. Sampling\nis the frequency with which the burndown state is snapshotted. The smaller the\nvalue, the smoother the plot, but the more work is done.\n\nThere is an option to resample the bands inside `labours`, so that you can\ndefine a very precise distribution and visualize it in different ways. Besides,\nresampling aligns the bands across periodic boundaries, e.g. months or years.\nUnresampled bands are not aligned and start from the project's birth date.\n\n#### Files\n\n```\nhercules --burndown --burndown-files\nlabours -m burndown-file\n```\n\nBurndown statistics for every file in the repository which is alive in the latest revision.\n\nNote: it will generate a separate graph for every file. You don't want to run it on a repository with many files.\n\n#### People\n\n```\nhercules --burndown --burndown-people [--people-dict=/path/to/identities]\nlabours -m burndown-person\n```\n\nBurndown statistics for the repository's contributors. If `--people-dict` is not specified, the identities are\ndiscovered by the following algorithm:\n\n0. We start from the root commit towards the HEAD. Emails and names are converted to lower case.\n1. 
If we process an unknown email and name, record them as a new developer.\n2. If we process a known email but an unknown name, match to the developer with the matching email,\nand add the unknown name to the list of that developer's names.\n3. If we process an unknown email but a known name, match to the developer with the matching name,\nand add the unknown email to the list of that developer's emails.\n\nIf `--people-dict` is specified, it should point to a text file with the custom identities. The\nformat: every line is a single developer and contains all the matching emails and names separated\nby `|`. The case is ignored.\n\n#### Overwrites matrix\n\n![Wireshark top 20 overwrites matrix](doc/wireshark_overwrites_matrix.png)\n

Wireshark top 20 devs - overwrites matrix
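The identity-discovery rules and the `--people-dict` format from the People section above can be sketched in Python (an illustrative reading of the file format; the sample identities are hypothetical, and this is not the actual Go implementation):

```python
def parse_people_dict(text):
    """Parse a --people-dict file: every non-empty line is one developer
    and lists all of that developer's emails and names separated by '|';
    matching is case-insensitive, so everything is lowercased."""
    developers = []
    for line in text.splitlines():
        line = line.strip().lower()
        if line:
            developers.append(set(line.split("|")))
    return developers

# Hypothetical identities file with two developers:
sample = "Jane Doe|jane@example.com|jdoe\nJohn Smith|john@example.com"
developers = parse_people_dict(sample)
```

Each resulting set then stands for one developer when attributing commits.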

\n\n```\nhercules --burndown --burndown-people [--people-dict=/path/to/identities]\nlabours -m overwrites-matrix\n```\n\nBesides the burndown information, `--burndown-people` collects the added and deleted line statistics per\ndeveloper. This makes it possible to visualize how many lines written by developer A were removed by developer B.\nThis indicates collaboration between people and defines expertise teams.\n\nThe format is a matrix with N rows and (N+2) columns, where N is the number of developers.\n\n1. The first column is the number of lines the developer wrote.\n2. The second column is how many lines were written by the developer and deleted by unidentified developers\n(if `--people-dict` is not specified, it is always 0).\n3. The rest of the columns show how many lines were written by the developer and deleted by identified\ndevelopers.\n\nThe sequence of developers is stored in the `people_sequence` YAML node.\n\n#### Code ownership\n\n![Ember.js top 20 code ownership](doc/emberjs_people.png)\n

Ember.js top 20 devs - code ownership
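As a sketch of how to read one row of the overwrites matrix described above (N rows, N+2 columns; the developer names here are hypothetical):

```python
def decode_overwrites_row(person, row, people_sequence):
    """Decode one row of the overwrites matrix: column 0 is the number of
    lines the developer wrote, column 1 is lines deleted by unidentified
    developers, and the remaining N columns are lines deleted by each
    developer in people_sequence order."""
    return {
        "author": person,
        "written": row[0],
        "deleted_by_unidentified": row[1],
        "deleted_by": dict(zip(people_sequence, row[2:])),
    }

row = decode_overwrites_row("alice", [100, 5, 10, 20], ["alice", "bob"])
```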

\n\n```\nhercules --burndown --burndown-people [--people-dict=/path/to/identities]\nlabours -m ownership\n```\n\n`--burndown-people` also allows drawing a stacked area plot of code share through time. That is,\nhow many lines are alive at the sampled moments in time for each identified developer.\n\n#### Couples\n\n![Linux kernel file couples](doc/tfprojcouples.png)\n

torvalds/linux files' coupling in Tensorflow Projector

\n\n```\nhercules --couples [--people-dict=/path/to/identities]\nlabours -m couples -o [--couples-tmp-dir=/tmp]\n```\n\n**Important**: it requires Tensorflow to be installed, please follow [official instructions](https://www.tensorflow.org/install/).\n\nThe files are coupled if they are changed in the same commit. The developers are coupled if they\nchange the same file. `hercules` records the number of couples throughout the whole commit history\nand outputs the two corresponding co-occurrence matrices. `labours` then trains\n[Swivel embeddings](https://github.com/src-d/tensorflow-swivel) - dense vectors which reflect the\nco-occurrence probability through the Euclidean distance. The training requires a working\n[Tensorflow](http://tensorflow.org) installation. The intermediate files are stored in the\nsystem temporary directory or `--couples-tmp-dir` if it is specified. The trained embeddings are\nwritten to the current working directory with the name depending on `-o`. The output format is TSV\nand matches [Tensorflow Projector](http://projector.tensorflow.org/) so that the files and people\ncan be visualized with t-SNE implemented in TF Projector.\n\n#### Structural hotness\n\n```\n 46 jinja2/compiler.py:visit_Template [FunctionDef]\n 42 jinja2/compiler.py:visit_For [FunctionDef]\n 34 jinja2/compiler.py:visit_Output [FunctionDef]\n 29 jinja2/environment.py:compile [FunctionDef]\n 27 jinja2/compiler.py:visit_Include [FunctionDef]\n 22 jinja2/compiler.py:visit_Macro [FunctionDef]\n 22 jinja2/compiler.py:visit_FromImport [FunctionDef]\n 21 jinja2/compiler.py:visit_Filter [FunctionDef]\n 21 jinja2/runtime.py:__call__ [FunctionDef]\n 20 jinja2/compiler.py:visit_Block [FunctionDef]\n```\n\nThanks to Babelfish, hercules is able to measure how many times each structural unit has been modified.\nBy default, it looks at functions; refer to [Semantic UAST XPath](https://docs.sourced.tech/babelfish/using-babelfish/uast-querying)\nmanual to switch to something 
else.\n\n```\nhercules --shotness [--shotness-xpath-*]\nlabours -m shotness\n```\n\nCouples analysis automatically loads \"shotness\" data if available.\n\n![Jinja2 functions grouped by structural hotness](doc/jinja.png)\n

hercules --shotness --pb https://github.com/pallets/jinja | labours -m couples -f pb
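The co-occurrence counting that underlies the couples analysis can be sketched like this (a simplified illustration of the idea, not Hercules' actual Go implementation):

```python
from collections import Counter
from itertools import combinations

def file_couples(commits):
    """Files are coupled when they change in the same commit: count
    pair co-occurrences over a list of per-commit file lists."""
    counts = Counter()
    for files in commits:
        for pair in combinations(sorted(set(files)), 2):
            counts[pair] += 1
    return counts

counts = file_couples([["a.go", "b.go"], ["a.go", "b.go", "c.go"], ["c.go"]])
```

The same idea applies to developers, who are coupled through the files they change; the resulting co-occurrence matrices are what the Swivel embeddings are trained on.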

\n\n#### Aligned commit series\n\n![tensorflow/tensorflow](doc/devs_tensorflow.png)\n

tensorflow/tensorflow aligned commit series of top 50 developers by commit number.

\n\n```\nhercules --devs [--people-dict=/path/to/identities]\nlabours -m devs -o \n```\n\nWe record how many commits were made, as well as lines added, removed and changed per day for each developer.\nWe plot the resulting commit time series using a few tricks to show the temporal grouping. In other words,\ntwo adjacent commit series should look similar after normalization.\n\n1. We compute the distance matrix of the commit series. Our distance metric is\n[Dynamic Time Warping](https://en.wikipedia.org/wiki/Dynamic_time_warping).\nWe use the [FastDTW](https://cs.fit.edu/~pkc/papers/tdm04.pdf) algorithm, which has complexity\nlinear in the length of the time series. Thus the overall complexity of computing the matrix is quadratic.\n2. We compile the linear list of commit series with the\n[Seriation](http://nicolas.kruchten.com/content/2018/02/seriation/) technique.\nParticularly, we solve the [Travelling Salesman Problem](https://en.wikipedia.org/wiki/Travelling_salesman_problem) which is NP-complete.\nHowever, given the typical number of developers which is less than 1,000, there is a good chance that\nthe solution does not take much time. We use the [Google or-tools](https://developers.google.com/optimization/routing/tsp) solver.\n3. We find 1-dimensional clusters in the resulting path with the [HDBSCAN](https://hdbscan.readthedocs.io/en/latest/how_hdbscan_works.html)\nalgorithm and assign colors accordingly.\n4. Time series are smoothed by convolving with the [Slepian window](https://en.wikipedia.org/wiki/Window_function#DPSS_or_Slepian_window).\n\nThis plot allows discovering how the development team evolved through time. It also shows \"commit flashmobs\"\nsuch as [Hacktoberfest](https://hacktoberfest.digitalocean.com/). For example, here are the revealed\ninsights from the `tensorflow/tensorflow` plot above:\n\n1. \"Tensorflow Gardener\" is classified as the only outlier.\n2. 
The \"blue\" group of developers covers the global maintainers and a few people who left (at the top).\n3. The \"red\" group shows how core developers join the project or become less active.\n\n#### Added vs changed lines through time\n\n![tensorflow/tensorflow](doc/add_vs_changed.png)\n

tensorflow/tensorflow added and changed lines through time.

\n\n```\nhercules --devs [--people-dict=/path/to/identities]\nlabours -m old-vs-new -o \n```\n\n`--devs` from the previous section allows plotting how many lines were added and how many existing lines were changed\n(deleted or replaced) through time. This plot is smoothed.\n\n#### Efforts through time\n\n![kubernetes/kubernetes](doc/k8s_efforts.png)\n

kubernetes/kubernetes efforts through time.

\n\n```\nhercules --devs [--people-dict=/path/to/identities]\nlabours -m devs-efforts -o \n```\n\nBesides, `--devs` allows plotting how many lines have been changed (added or removed) by each developer.\nThe upper part of the plot is the accumulated (integrated) lower part. It is impossible to have the same scale\nfor both parts, so the lower values are scaled, and hence there are no lower Y axis ticks.\nThere is a difference between the efforts plot and the ownership plot, although changing lines correlates\nwith owning lines.\n\n#### Sentiment (positive and negative comments)\n\n![Django sentiment](doc/sentiment.png)\n

It can be clearly seen that Django comments were positive/optimistic in the beginning, but later became negative/pessimistic.
hercules --sentiment --pb https://github.com/django/django | labours -m sentiment -f pb

\n\nWe extract new and changed comments from source code on every commit, apply the [BiDiSentiment](https://github.com/vmarkovtsev/bidisentiment)\ngeneral-purpose sentiment recurrent neural network and plot the results. Requires\n[libtensorflow](https://www.tensorflow.org/install/install_go).\nE.g. [`sadly, we need to hide the rect from the documentation finder for now`](https://github.com/pygame/pygame/commit/b6091d38c8a5639d311858660b38841d96598509#diff-eae59f175858fcef57cb17e733981c73R27) is negative and\n[`Theano has a built-in optimization for logsumexp (...) so we can just write the expression directly`](https://github.com/keras-team/keras/commit/7d52af64c03e71bcd23112a7086dc8aab1b37ed2#diff-ff634bb5c5441d7052449f20018872b8R549)\nis positive. Don't expect too much though - as noted, the sentiment model is\ngeneral purpose and code comments are of a different nature, so there is no magic (for now).\n\nHercules must be built with the \"tensorflow\" tag - it is not enabled by default:\n\n```\nmake TAGS=tensorflow\n```\n\nSuch a build requires [`libtensorflow`](https://www.tensorflow.org/install/install_go).\n\n#### Everything in a single pass\n\n```\nhercules --burndown --burndown-files --burndown-people --couples --shotness --devs [--people-dict=/path/to/identities]\nlabours -m all\n```\n\n### Plugins\n\nHercules has a plugin system and allows running custom analyses. See [PLUGINS.md](PLUGINS.md).\n\n### Merging\n\n`hercules combine` is the command which joins several analysis results in Protocol Buffers format together.\n\n```\nhercules --burndown --pb https://github.com/src-d/go-git > go-git.pb\nhercules --burndown --pb https://github.com/src-d/hercules > hercules.pb\nhercules combine go-git.pb hercules.pb | labours -f pb -m burndown-project --resample M\n```\n\n### Bad unicode errors\n\nYAML does not support the whole range of Unicode characters and the parser on the `labours` side\nmay raise exceptions. 
Filter the output from `hercules` through `fix_yaml_unicode.py` to discard\nsuch offending characters.\n\n```\nhercules --burndown --burndown-people https://github.com/... | python3 fix_yaml_unicode.py | labours -m people\n```\n\n### Plotting\n\nThese options affect all plots:\n\n```\nlabours [--style=white|black] [--backend=] [--size=Y,X]\n```\n\n`--style` sets the general style of the plot (see `labours --help`).\n`--background` changes the plot background to be either white or black.\n`--backend` chooses the Matplotlib backend.\n`--size` sets the size of the figure in inches. The default is `12,9`.\n\n(Required on macOS) You can pin the default Matplotlib backend with\n\n```\necho \"backend: TkAgg\" > ~/.matplotlib/matplotlibrc\n```\n\nThese options are effective in burndown charts only:\n\n```\nlabours [--text-size] [--relative]\n```\n\n`--text-size` changes the font size, `--relative` activates the stretched burndown layout.\n\n### Custom plotting backend\n\nIt is possible to output all the information needed to draw the plots in JSON format.\nSimply append `.json` to the output (`-o`) and you are done. The data format is not fully\nspecified and depends on the Python code which generates it. Each JSON file should\ncontain `\"type\"` which reflects the plot kind.\n\n### Caveats\n\n1. Processing all the commits may fail in some rare cases. If you get an error similar to https://github.com/src-d/hercules/issues/106\nplease report there and specify `--first-parent` as a workaround.\n1. Burndown collection may fail with an Out-Of-Memory error. See the next section for the workarounds.\n1. Parsing YAML in Python is slow when the number of internal objects is big. `hercules`' output\nfor the Linux kernel in \"couples\" mode is 1.5 GB and takes more than an hour / 180GB RAM to be\nparsed. However, most of the repositories are parsed within a minute. Try using Protocol Buffers\ninstead (`hercules --pb` and `labours -f pb`).\n1. 
To speed up YAML parsing:\n ```\n # Debian, Ubuntu\n apt install libyaml-dev\n # macOS\n brew install yaml-cpp libyaml\n\n # you might need to re-install pyyaml for the changes to take effect\n pip uninstall pyyaml\n pip --no-cache-dir install pyyaml\n ```\n\n### Burndown Out-Of-Memory\n\nIf the analyzed repository is big and extensively uses branching, the burndown stats collection may\nfail with an OOM. You should try the following:\n\n1. Read the repo from disk instead of cloning into memory.\n2. Use `--skip-blacklist` to avoid analyzing the unwanted files. It is also possible to constrain the `--language`.\n3. Use the [hibernation](doc/HIBERNATION.md) feature: `--hibernation-distance 10 --burndown-hibernation-threshold=1000`. Play with those two numbers to start hibernating right before the OOM.\n4. Hibernate on disk: `--burndown-hibernation-disk --burndown-hibernation-dir /path`.\n5. `--first-parent`, you win.\n\n\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "https://github.com/src-d/hercules", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/src-d/hercules", "keywords": "git,mloncode,mining software repositories,hercules", "license": "Apache-2.0", "maintainer": "", "maintainer_email": "", "name": "labours", "package_url": "https://pypi.org/project/labours/", "platform": "", "project_url": "https://pypi.org/project/labours/", "project_urls": { "Download": "https://github.com/src-d/hercules", "Homepage": "https://github.com/src-d/hercules" }, "release_url": "https://pypi.org/project/labours/10.5.1/", "requires_dist": [ "matplotlib (<4.0,>=2.0)", "numpy (<2.0,>=1.12.0)", "pandas (<1.0,>=0.20.0)", "PyYAML (<5.0,>=3.0)", "scipy (<1.2.2,>=0.19.0)", "protobuf (<4.0,>=3.5.0)", "munch (<3.0,>=2.0)", "hdbscan (<2.0,>=0.8.0)", "seriate (<2.0,>=1.0)", "fastdtw (<2.0,>=0.3.2)", "python-dateutil (<3.0,>=2.6.0)", "lifelines (<2.0,>=0.20.0)", "tqdm (<5.0,>=4.3)" ], "requires_python": "", 
"summary": "Python companion for github.com/src-d/hercules to visualize the results.", "version": "10.5.1" }, "last_serial": 5949801, "releases": { "10.0.0": [ { "comment_text": "", "digests": { "md5": "2648d7a3c906068d6f27c41a871aa8bd", "sha256": "8e00a209c00fa2b3bb5eae0af35048698a1749a34254dd833410f788cbee7ba2" }, "downloads": -1, "filename": "labours-10.0.0-py3-none-any.whl", "has_sig": false, "md5_digest": "2648d7a3c906068d6f27c41a871aa8bd", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 55953, "upload_time": "2019-03-23T08:54:16", "url": "https://files.pythonhosted.org/packages/1f/9a/0bc624835472281121090d7ef0c1e274351709b897796a50a0662527a0d7/labours-10.0.0-py3-none-any.whl" } ], "10.0.1": [ { "comment_text": "", "digests": { "md5": "2a7afa25590a5b16ef9dfd2c813d9fc5", "sha256": "2da3a15e68b2dcd62117fcec47a6b5ceb396c18b62812fd2a8d7aa24a48dbade" }, "downloads": -1, "filename": "labours-10.0.1-py3-none-any.whl", "has_sig": false, "md5_digest": "2a7afa25590a5b16ef9dfd2c813d9fc5", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 56497, "upload_time": "2019-04-06T15:20:07", "url": "https://files.pythonhosted.org/packages/d0/2e/f6268dcb4d2b8eb67ba823ea872d246b26ca8b9a8025030f0ae51c74fa20/labours-10.0.1-py3-none-any.whl" } ], "10.0.2": [ { "comment_text": "", "digests": { "md5": "14cb49fa8f78ee0d87926258cdeccb52", "sha256": "8b9824677b5d77e7677647d33821f14aa0989cf11e077ed4823ebb81fc07f9be" }, "downloads": -1, "filename": "labours-10.0.2-py3-none-any.whl", "has_sig": false, "md5_digest": "14cb49fa8f78ee0d87926258cdeccb52", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 56498, "upload_time": "2019-04-09T07:12:19", "url": "https://files.pythonhosted.org/packages/fe/12/0b55ab2985a9a80f0bec20ac979dc5e80c3b7575089178de3bd0eb135a24/labours-10.0.2-py3-none-any.whl" } ], "10.0.3": [ { "comment_text": "", "digests": { "md5": 
"f365ac4be0751844b3b392d0ba70324e", "sha256": "09dde4997e7dd7608ae0cb20393dda49d00a3a55b11675d4148a2c822215f103" }, "downloads": -1, "filename": "labours-10.0.3-py3-none-any.whl", "has_sig": false, "md5_digest": "f365ac4be0751844b3b392d0ba70324e", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 56969, "upload_time": "2019-04-09T06:46:00", "url": "https://files.pythonhosted.org/packages/da/b1/f84d56ffe1a6ef9093eaf70e47d29137ab6658077fe610cd989e8227e8f0/labours-10.0.3-py3-none-any.whl" } ], "10.0.4": [ { "comment_text": "", "digests": { "md5": "13db9fac259cfd8d98c51b57d6cd77d7", "sha256": "31bb22a7f9bc1555c2b57eee144676d23d91d1161e30d1726025e1e529969bed" }, "downloads": -1, "filename": "labours-10.0.4-py3-none-any.whl", "has_sig": false, "md5_digest": "13db9fac259cfd8d98c51b57d6cd77d7", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 56969, "upload_time": "2019-04-09T21:46:11", "url": "https://files.pythonhosted.org/packages/ef/a3/9158d7a318b1949db4c2c2fa2c8b5857da1a9cb4b07a17ceca8c98b92d5b/labours-10.0.4-py3-none-any.whl" } ], "10.1.0": [ { "comment_text": "", "digests": { "md5": "7adc0954963e4f157f88f09d845cf8fa", "sha256": "c3a945454e47a69db390f508ca59b30ce137b776ce76c89e0f8e9a8bb2361e5e" }, "downloads": -1, "filename": "labours-10.1.0-py3-none-any.whl", "has_sig": false, "md5_digest": "7adc0954963e4f157f88f09d845cf8fa", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 57090, "upload_time": "2019-04-26T08:53:46", "url": "https://files.pythonhosted.org/packages/f8/ce/d6a4818f65e7637ef342fb654c36c2891394164dcdec63d7383a372ffda1/labours-10.1.0-py3-none-any.whl" } ], "10.2.0": [ { "comment_text": "", "digests": { "md5": "b04d3be3858d7d03c0409287b7463c64", "sha256": "742099da014a383a0a65b536c0bf947c59e30ae6441a732910a98dc6a0d4ec13" }, "downloads": -1, "filename": "labours-10.2.0-py3-none-any.whl", "has_sig": false, "md5_digest": 
"b04d3be3858d7d03c0409287b7463c64", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 57677, "upload_time": "2019-06-21T11:32:11", "url": "https://files.pythonhosted.org/packages/38/a5/25c894dd03a141d309c4ef85f823273f431a9b3d8341911a4e27f7615cc9/labours-10.2.0-py3-none-any.whl" } ], "10.3.0": [ { "comment_text": "", "digests": { "md5": "f6868755c2456e003db8c0498af2d89a", "sha256": "b644bd5b1cf3b88b9f8bb403b0cc66eea1ca5fccf239107267317dd81dee4515" }, "downloads": -1, "filename": "labours-10.3.0-py3-none-any.whl", "has_sig": false, "md5_digest": "f6868755c2456e003db8c0498af2d89a", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 57688, "upload_time": "2019-07-11T20:59:27", "url": "https://files.pythonhosted.org/packages/32/88/bc9f4ffd1c325e4824abd5a46a35e05aa9381c609b2d496ab38c92ba4486/labours-10.3.0-py3-none-any.whl" } ], "10.4.0": [ { "comment_text": "", "digests": { "md5": "1d35794fd35879188c80ac1e923fed83", "sha256": "917e4af10e7d5a5f4aa401b0eebfad31426fd9b63589d875bb9c84b81fc99793" }, "downloads": -1, "filename": "labours-10.4.0-py3-none-any.whl", "has_sig": false, "md5_digest": "1d35794fd35879188c80ac1e923fed83", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 57780, "upload_time": "2019-09-18T09:26:39", "url": "https://files.pythonhosted.org/packages/2c/1f/1d941dcc4e7a5a9d919d00d4df226715abb46e02a8dea4d84fd3bfb12a51/labours-10.4.0-py3-none-any.whl" } ], "10.4.2": [ { "comment_text": "", "digests": { "md5": "39525df9a8185043b8634cc5dcb41f25", "sha256": "93eeb6387bb5696e2d37a96911be7cfd9686452e62ab2cec17982d47ddb705b9" }, "downloads": -1, "filename": "labours-10.4.2-py3-none-any.whl", "has_sig": false, "md5_digest": "39525df9a8185043b8634cc5dcb41f25", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 57874, "upload_time": "2019-09-28T18:03:09", "url": 
"https://files.pythonhosted.org/packages/d6/5b/fa14316d1ea4f02a67446d900dc86eaa6f26bfa621e7dfce6ea3327422bc/labours-10.4.2-py3-none-any.whl" } ], "10.4.3": [ { "comment_text": "", "digests": { "md5": "affaaa1e1d3bd7c4ddb2c1ee705fe04d", "sha256": "ca8ff52e1ea7e9a3c16f69af0c3af5d9285d71db28588013605f24c4893d1580" }, "downloads": -1, "filename": "labours-10.4.3-py3-none-any.whl", "has_sig": false, "md5_digest": "affaaa1e1d3bd7c4ddb2c1ee705fe04d", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 58032, "upload_time": "2019-09-28T19:00:03", "url": "https://files.pythonhosted.org/packages/d1/6c/4a3c94964fab583ba8b64e7d05b2907fd5fe453841e106a4dcf8a5cd80cc/labours-10.4.3-py3-none-any.whl" } ], "10.5.0": [ { "comment_text": "", "digests": { "md5": "cbf4352d65d992b2f860003fe97da813", "sha256": "14a512df555e03a80899e7f20f274a58b4da0ebc29624e1c66d2b7a07d590959" }, "downloads": -1, "filename": "labours-10.5.0-py3-none-any.whl", "has_sig": false, "md5_digest": "cbf4352d65d992b2f860003fe97da813", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 68351, "upload_time": "2019-10-07T09:12:08", "url": "https://files.pythonhosted.org/packages/d3/91/2584c7615a78ca3142b4261210d3bd78ea93e3255543cc129a570f88643a/labours-10.5.0-py3-none-any.whl" } ], "10.5.1": [ { "comment_text": "", "digests": { "md5": "3a5cb918eb346c70eae8cdc0223c86f0", "sha256": "15fdfc6e7bf9afb728a0e62f64d7392810d7cffe31a3722a9ffcab6467045513" }, "downloads": -1, "filename": "labours-10.5.1-py3-none-any.whl", "has_sig": false, "md5_digest": "3a5cb918eb346c70eae8cdc0223c86f0", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 68348, "upload_time": "2019-10-09T13:29:33", "url": "https://files.pythonhosted.org/packages/9a/f7/aea88e09f04213b8e2e1a899b235f8d2d579cdb42909b43e4e503cc4d427/labours-10.5.1-py3-none-any.whl" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "3a5cb918eb346c70eae8cdc0223c86f0", 
"sha256": "15fdfc6e7bf9afb728a0e62f64d7392810d7cffe31a3722a9ffcab6467045513" }, "downloads": -1, "filename": "labours-10.5.1-py3-none-any.whl", "has_sig": false, "md5_digest": "3a5cb918eb346c70eae8cdc0223c86f0", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 68348, "upload_time": "2019-10-09T13:29:33", "url": "https://files.pythonhosted.org/packages/9a/f7/aea88e09f04213b8e2e1a899b235f8d2d579cdb42909b43e4e503cc4d427/labours-10.5.1-py3-none-any.whl" } ] }