{ "info": { "author": "Jarek Potiuk, Szymon Przedwojski, Kamil Bregu\u0142a, Feng Lu, Cameron Moberg", "author_email": "jarek.potiuk@polidea.com, szymon.przedwojski@polidea.com, kamil.bregula@polidea.com, fenglu@google.com, cjmoberg@google.com", "bugtrack_url": null, "classifiers": [ "License :: OSI Approved :: Apache Software License", "Operating System :: OS Independent", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8" ], "description": "\n# Oozie to Airflow\n\n[![Build Status](https://travis-ci.org/GoogleCloudPlatform/oozie-to-airflow.svg?branch=master)](https://travis-ci.org/GoogleCloudPlatform/oozie-to-airflow)\n[![codecov](https://codecov.io/gh/GoogleCloudPlatform/oozie-to-airflow/branch/master/graph/badge.svg)](https://codecov.io/gh/GoogleCloudPlatform/oozie-to-airflow)\n[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)\n[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)\n[![Dependabot Status](https://api.dependabot.com/badges/status?host=github&repo=GoogleCloudPlatform/oozie-to-airflow)](https://dependabot.com)\n[![Python 3](https://pyup.io/repos/github/GoogleCloudPlatform/oozie-to-airflow/python-3-shield.svg)](https://pyup.io/repos/github/GoogleCloudPlatform/oozie-to-airflow/)\n\nA tool to easily convert between [Apache Oozie](http://oozie.apache.org/) workflows\nand [Apache Airflow](https://airflow.apache.org) workflows.\n\nThe program targets Apache Airflow >= 1.10 and Apache Oozie 1.0 XML schema.\n\nIf you want to contribute to the project, please take a look at [CONTRIBUTING.md](CONTRIBUTING.md)\n\n# Table of Contents\n\n\n\n\n\n- [Background](#background)\n- [Running the Program](#running-the-program)\n - [Installing from PyPi](#installing-from-pypi)\n - [Installing from sources](#installing-from-sources)\n - [Running the 
conversion](#running-the-conversion)\n - [Structure of the application folder](#structure-of-the-application-folder)\n - [The o2a libraries](#the-o2a-libraries)\n- [Supported Oozie features](#supported-oozie-features)\n - [Control nodes](#control-nodes)\n - [EL Functions](#el-functions)\n - [Workflow and node notifications](#workflow-and-node-notifications)\n- [Airflow-specific optimisations](#airflow-specific-optimisations)\n - [Removing unnecessary control nodes](#removing-unnecessary-control-nodes)\n - [Removing inaccessible nodes](#removing-inaccessible-nodes)\n- [Common Known Limitations](#common-known-limitations)\n - [File/Archive functionality](#filearchive-functionality)\n - [Not all global configuration methods are supported](#not-all-global-configuration-methods-are-supported)\n - [Support for uber.jar feature](#support-for-uberjar-feature)\n - [Support for .so and .jar lib files](#support-for-so-and-jar-lib-files)\n - [Custom messages missing for Kill Node](#custom-messages-missing-for-kill-node)\n - [Capturing output is not supported](#capturing-output-is-not-supported)\n - [Subworkflow DAGs must be placed in examples](#subworkflow-dags-must-be-placed-in-examples)\n - [EL functions support](#el-functions-support)\n - [Notification proxy is not supported](#notification-proxy-is-not-supported)\n- [Cloud execution environment for Oozie to Airflow conversion](#cloud-execution-environment-for-oozie-to-airflow-conversion)\n - [Cloud environment setup](#cloud-environment-setup)\n- [Examples](#examples)\n - [EL Example](#el-example)\n - [SSH Example](#ssh-example)\n - [Email Example](#email-example)\n - [MapReduce Example](#mapreduce-example)\n - [FS Example](#fs-example)\n - [Java Example](#java-example)\n - [Pig Example](#pig-example)\n - [Shell Example](#shell-example)\n - [Spark Example](#spark-example)\n - [Sub-workflow Example](#sub-workflow-example)\n - [DistCp Example](#distcp-example)\n - [Decision Example](#decision-example)\n - [Hive/Hive2 
Example](#hivehive2-example)\n - [Demo Example](#demo-example)\n - [Childwf Example](#childwf-example)\n\n\n\n# Background\n\nApache Airflow is a workflow management system developed by AirBnB in 2014.\nIt is a platform to programmatically author, schedule, and monitor workflows.\nAirflow workflows are designed as [Directed Acyclic Graphs](https://airflow.apache.org/tutorial.html#example-pipeline-definition)\n(DAGs) of tasks in Python. The Airflow scheduler executes your tasks on an array of\nworkers while following the specified dependencies.\n\nApache Oozie is a workflow scheduler system to manage Apache Hadoop jobs.\nOozie workflows are also designed as [Directed Acyclic Graphs](https://oozie.apache.org/docs/3.1.3-incubating/DG_Overview.html)\n(DAGs) in XML.\n\nThere are a few differences noted below:\n\n\n| | Spec. | Task | Dependencies | \"Subworkflows\" | Parameterization | Notification |\n|---------|--------|-------------|---------------------------------|----------------|------------------------------|---------------------|\n| Oozie | XML | Action Node | Control Node | Subworkflow | EL functions/Properties file | URL based callbacks |\n| Airflow | Python | Operators | Trigger Rules, set_downstream() | SubDag | jinja2 and macros | Callbacks/Emails |\n\n\n# Running the Program\n\nNote that you need Python >= 3.6 to run the converter.\n\n## Installing from PyPi\n\nYou can install `o2a` from PyPi via `pip install o2a`. After installation, the\n[o2a](bin/o2a) and [o2a-validate-workflows](bin/o2a-validate-workflows) scripts should be available on your path.\n\n## Installing from sources\n\n1. (Optional) Install virtualenv:\n\n If you use the sources of `o2a`, the environment can be set up via the virtualenv setup\n(you can create one using [virtualenvwrapper](https://virtualenvwrapper.readthedocs.io/en/latest/)\nfor example).\n\n2. Install Oozie-to-Airflow - you have 2 options to do so:\n\n 1. 
automatically: install `o2a` from the local folder using `pip install -e .`\n\n This will take care of, among other things, adding the [bin](bin) subdirectory to the PATH.\n\n 2. more manually:\n\n 1. While in your virtualenv, you can install all the requirements via `pip install -r requirements.txt`.\n\n 2. You can add the [bin](bin) subdirectory to your\n PATH, then all the scripts below can be run without adding the `./bin` prefix.\n This can be done for example by adding a line similar to the one below to your `.bash_profile`\n or `bin/postactivate` from your virtual environment:\n\n ```bash\n export PATH=${PATH}:<PATH_TO_OOZIE_TO_AIRFLOW>/bin\n ```\n\n Otherwise you need to run all the scripts from the bin subdirectory, for example:\n\n ```bash\n ./bin/o2a --help\n ```\n\nIn all the example commands below, it is assumed that the [bin](bin) directory is in your PATH -\neither installed from PyPi or from the sources.\n\n## Running the conversion\n\nYou can run the program by calling:\n`o2a -i <INPUT_DIRECTORY_PATH> -o <OUTPUT_DIRECTORY_PATH>`\n\nExample:\n`o2a -i examples/demo -o output/demo`\n\nThis is the full usage guide, available by running `o2a -h`:\n\n```\nusage: o2a [-h] -i INPUT_DIRECTORY_PATH -o OUTPUT_DIRECTORY_PATH [-n DAG_NAME]\n [-u USER] [-s START_DAYS_AGO] [-v SCHEDULE_INTERVAL] [-d]\n\nConvert Apache Oozie workflows to Apache Airflow workflows.\n\noptional arguments:\n -h, --help show this help message and exit\n -i INPUT_DIRECTORY_PATH, --input-directory-path INPUT_DIRECTORY_PATH\n Path to input directory\n -o OUTPUT_DIRECTORY_PATH, --output-directory-path OUTPUT_DIRECTORY_PATH\n Desired output directory\n -n DAG_NAME, --dag-name DAG_NAME\n Desired DAG name [defaults to input directory name]\n -u USER, --user USER The user to be used in place of all ${user.name}\n [defaults to user who ran the conversion]\n -s START_DAYS_AGO, --start-days-ago START_DAYS_AGO\n Desired DAG start as number of days ago\n -v SCHEDULE_INTERVAL, --schedule-interval SCHEDULE_INTERVAL\n Desired DAG schedule interval as number of days\n -d, --dot 
Renders workflow files in DOT format\n```\n\n## Structure of the application folder\n\nThe input application directory has to follow the structure defined as follows:\n\n```\n<APPLICATION>/\n |- job.properties - job properties that are used to run the job\n |- hdfs - folder with the application - should be copied to HDFS\n | |- workflow.xml - Oozie workflow xml (1.0 schema)\n | |- ... - additional folders required to be copied to HDFS\n |- configuration.template.properties - template of configuration values used during conversion\n |- configuration.properties - generated properties for configuration values\n```\n\n## The o2a libraries\n\nConverted Airflow DAGs use common libraries. Those libraries should be available on the PYTHONPATH for all\nAirflow components - scheduler, webserver and workers - so that they can be imported when DAGs are parsed.\n\nThose libraries are in the [o2a/o2a_libs](o2a/o2a_libs) folder and the easiest way to make them available to\nall the DAGs is to copy them (preserving the o2a parent directory) to the \"dags\" folder of Airflow. This\nis done automatically during the [automated system tests](CONTRIBUTING.md#running-system-tests)\nexecuted in the Composer environment, where the libs are copied to Composer's DAG folder in Google Cloud Storage.\n\n\n# Supported Oozie features\n\n## Control nodes\n### Fork and Join\n\nA [fork node](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.5_Fork_and_Join_Control_Nodes)\nsplits the path of execution into multiple concurrent paths of execution.\n\nA [join node](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.5_Fork_and_Join_Control_Nodes)\nwaits until every concurrent execution path of the previous fork node arrives at it. The fork and join nodes must be used in pairs. 
The join node\nassumes concurrent execution paths are children of the same fork node.\n~~~~\n<workflow-app name=\"[WF-DEF-NAME]\" xmlns=\"uri:oozie:workflow:1.0\">\n    ...\n    <fork name=\"[FORK-NODE-NAME]\">\n        <path start=\"[NODE-NAME]\" />\n        ...\n        <path start=\"[NODE-NAME]\" />\n    </fork>\n    ...\n    <join name=\"[JOIN-NODE-NAME]\" to=\"[NODE-NAME]\" />\n    ...\n</workflow-app>\n~~~~\n\n### Decision\n\nA [decision node](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.4_Decision_Control_Node)\nenables a workflow to make a selection on the execution path to follow.\n\nThe behavior of a decision node can be seen as a switch-case statement.\n\nA decision node consists of a list of predicate-transition pairs plus a default transition. Predicates are evaluated in order of appearance until one of them evaluates to true and the corresponding transition is taken. If none of the predicates evaluates to true the default transition is taken.\n\nPredicates are JSP Expression Language (EL) expressions (refer to section 4.2 of this document) that resolve into a boolean value, true or false. For example:\n`${fs:fileSize('/usr/foo/myinputdir') gt 10 * GB}`\n\n~~~~\n<workflow-app name=\"[WF-DEF-NAME]\" xmlns=\"uri:oozie:workflow:1.0\">\n    ...\n    <decision name=\"[NODE-NAME]\">\n        <switch>\n            <case to=\"[NODE-NAME]\">[PREDICATE]</case>\n            ...\n            <case to=\"[NODE-NAME]\">[PREDICATE]</case>\n            <default to=\"[NODE-NAME]\" />\n        </switch>\n    </decision>\n    ...\n</workflow-app>\n~~~~\n### Start\n\nThe [start node](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.1_Start_Control_Node)\nis the entry point for a workflow job; it indicates the first workflow node the workflow job must transition to.\n\nWhen a workflow is started, it automatically transitions to the node specified in the start.\n\nA workflow definition must have one start node.\n\n~~~~\n<workflow-app name=\"[WF-DEF-NAME]\" xmlns=\"uri:oozie:workflow:1.0\">\n    ...\n    <start to=\"[NODE-NAME]\" />\n    ...\n</workflow-app>\n~~~~\n### End\n\nThe [end node](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.2_End_Control_Node)\nis the end for a workflow job; it indicates that the workflow job has completed successfully.\n\nWhen a workflow job reaches the end it finishes successfully (SUCCEEDED).\n\nIf one or more actions started by the workflow job are executing when the end node is reached, the actions will be killed. 
In this scenario the workflow job is still considered to have run successfully.\n\nA workflow definition must have one end node.\n\n~~~~\n<workflow-app name=\"[WF-DEF-NAME]\" xmlns=\"uri:oozie:workflow:1.0\">\n    ...\n    <end name=\"[NODE-NAME]\" />\n    ...\n</workflow-app>\n~~~~\n\n### Kill\n\nThe [kill node](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.3_Kill_Control_Node)\nallows a workflow job to exit with an error.\n\nWhen a workflow job reaches the kill node it finishes in error (KILLED).\n\nIf one or more actions started by the workflow job are executing when the kill node is reached, the actions will be killed.\n\nA workflow definition may have zero or more kill nodes.\n\n~~~~\n<workflow-app name=\"[WF-DEF-NAME]\" xmlns=\"uri:oozie:workflow:1.0\">\n    ...\n    <kill name=\"[NODE-NAME]\">\n        <message>[MESSAGE-TO-LOG]</message>\n    </kill>\n    ...\n</workflow-app>\n~~~~\n\n## EL Functions\n\nAs of now, a very minimal set of [Oozie EL](https://oozie.apache.org/docs/4.0.1/WorkflowFunctionalSpec.html#a4.2_Expression_Language_Functions)\nfunctions is supported. The way they work is that an EL expression is translated into\na jinja template. The translation is performed using [Lark](https://lark-parser.readthedocs.io/en/latest/).\nAll required variables should be passed in `job.properties`. Equivalents of EL functions can be found in\n`o2a_libs/functions.py`.\n\nFor example, the following EL expression\n```${wf:user() == firstNotNull(arg1, arg2)}```\nis translated to the following jinja equivalent:\n```{{functions.wf.user() == functions.first_not_null(arg1, arg2)}}```\nand it requires that `job.properties` includes values for `arg1` and `arg2`.\n\nThis design allows for custom EL function mapping if one so chooses. By\ndefault everything gets mapped to the module `o2a_libs.functions`. This means that in\norder to use EL function mapping, the `o2a_libs` folder should\nbe copied over to the Airflow DAG folder. 
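To illustrate this mapping, here is a minimal, hypothetical stand-in for one of these functions (the real implementations live in `o2a_libs/functions.py`; `first_not_null` here mirrors Oozie's `firstNotNull`):

```python
# Hypothetical stand-in for o2a_libs.functions.first_not_null - the real
# implementation ships with o2a. Camel-case Oozie EL names map to
# snake_case Python names (firstNotNull -> first_not_null).
def first_not_null(value1, value2):
    """Return value1 if it is set, otherwise value2 (or "" if both are unset)."""
    if value1:
        return value1
    return value2 if value2 else ""


# In a converted DAG, the jinja expression
#   {{functions.first_not_null(arg1, arg2)}}
# ends up calling such a function with arg1/arg2 supplied from job.properties.
print(first_not_null(None, "fallback"))       # -> fallback
print(first_not_null("primary", "fallback"))  # -> primary
```

In a real deployment it is the whole `o2a_libs` folder (not a single function) that gets copied to the DAG folder.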
This should then be picked up and\nparsed by the Airflow workers and then become available to all DAGs.\n\n## Workflow and node notifications\n\nWorkflow jobs can be configured to make an HTTP GET notification upon start and end of a workflow action node\nand upon the start and completion of a workflow job. More information can be found in the [Oozie docs](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a5_Workflow_Notifications).\n\nOozie-to-Airflow supports this feature.\nThe `job.properties` file has to contain URLs for workflow and action node notifications - example below:\n\n```\noozie.wf.workflow.notification.url=http://example.com/workflow?job-id=$jobId&status=$status\noozie.wf.action.notification.url=http://example.com/action?job-id=$jobId&node-name=$nodeName&status=$status\n```\n\nIf they are present, Oozie-to-Airflow will insert an additional `BashOperator` into the generated DAG\nfor each notification to be sent, right before or after the appropriate node (for node notifications) or at the beginning or end\nof the workflow (for workflow notifications).\nThe `BashOperator` uses `curl` to send an HTTP GET request to the appropriate URL endpoint.\n\nExample DAG without notifications:\n\n![dag without notifications](images/childwf_without_notifications.png)\n\nThe same DAG with notifications:\n\n![dag with notifications](images/childwf_with_notifications.png)\n\n# Airflow-specific optimisations\n\nBecause Oozie and Airflow differ with regard to some aspects of running workflows,\nthere may be some differences in the output Airflow DAG with respect to the Oozie XML.\n\n## Removing unnecessary control nodes\n\nIn Airflow you don't need as many explicit control nodes as in Oozie. For example you don't ever need a Start\nnode and in most cases End is also not needed.\n\nWe introduced the concept of `Transformers` in O2A, which modify the workflow. 
Below are the ones that\nremove unnecessary control nodes:\n\n- `RemoveEndTransformer` - removes End nodes with all relations when they are not connected to a Decision node,\n- `RemoveKillTransformer` - removes Kill nodes with all relations when they are not connected to a Decision node,\n- `RemoveStartTransformer` - removes Start nodes with all relations,\n- `RemoveForkTransformer` - removes Fork nodes when there are no upstream nodes,\n- `RemoveJoinTransformer` - removes Join nodes when there are no downstream nodes.\n\n## Removing inaccessible nodes\n\nIn Oozie, for a node to be executed it has to be traceable back to the Start node.\nIf a node is \"loose\" and is not connected to Start in any way (directly or indirectly via its \"parents\") it\nwill be skipped.\n\nIn Airflow, however, all tasks will be executed. Therefore, in order to replicate the \"skipping\" of loose nodes\nbehaviour of Oozie, we need to remove nodes unconnected to Start during the conversion phase.\n\nThis is achieved thanks to the `RemoveInaccessibleNodeTransformer`.\n\n# Common Known Limitations\n\nThere are a few limitations in the implementation of the Oozie-To-Airflow converter. It's not possible to\nwrite a converter that handles all cases of complex workflows from Oozie because some of\nthe available functionalities cannot be mapped easily to existing Airflow Operators or\ncannot be tested because of the current Dataproc + Composer limitations. Some of those limitations\nmight be removed in the future. Below is a list of the common known limitations that we are aware of for now.\n\nMany of those limitations are not blockers - the workflows will still be converted to Python DAGs\nand it should be possible to manually (or automatically) post-process the DAGs to add custom\nfunctionality. 
So even with those limitations in place you can still save a ton of work when\nconverting many Oozie workflows.\n\nIn the following, \"Examples\" section more specific per-action limitations are listed as well.\n\n## File/Archive functionality\n\nAt the time of this writing we were not able to determine if file/archive\nfunctionality works as intended. While we map appropriate file/archive methods it seems that Oozie\ntreats file/archive somewhat erraticaly. This is not a blocker to run most of the operations, however\nsome particular complex workflows might be problematic. Further testing with real, production Oozie\nworkflows is needed to verify our implementation.\n\n[Example Oozie docs](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.2.2.1_Adding_Files_and_Archives_for_the_Job)\n\n* [File/Archive in Pig doesn't work](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/243)\n\n## Not all global configuration methods are supported\n\nOozie implements a number of ways how configuration parameters are passed to actions. Out of the existing\nconfiguration options the following ones are not supported (but can be easily added as needed):\n\n* [The config-default.xml file](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/137)\n* [Parameters section of workflow.xml](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/138)\n* [Handle Global configuration properties](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/134)\n\n## Support for uber.jar feature\n\nThe uber.jar feature is not supported. [Oozie docs](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#AppDeployment)\n\n* [Support uber.jar feature](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/140)\n\n## Support for .so and .jar lib files\n\nOozie adds .so and .jar files from the lib folder to Local Cache for all the jobs run to\nLD_LIBRARY_PATH/CLASSPATH. 
Currently only the Java Mapper supports it.\n\n* [Support for Oozie lib .so files](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/302)\n* [Support for Oozie lib .jar files](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/301)\n\n## Custom messages missing for Kill Node\n\nThe Kill Node might have a custom log message specified. This is not implemented.\n[Oozie docs](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.1.3_Kill_Control_Node)\n\n* [Add handling of custom Kill Node message](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/97)\n\n## Capturing output is not supported\n\nIn several actions you can capture output from tasks. This is not yet implemented.\n[Example Oozie docs](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.2.6_Java_Action)\n\n* [Add support for capture-output](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/155)\n\n## Subworkflow DAGs must be placed in examples\n\nCurrently all subworkflow DAGs must be placed in the examples folder.\n\n* [Subworkflow conversion expects to be run in examples](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/213)\n\n## EL functions support\n\nCurrently many EL functions are implemented (basic functions, fs functions and a subset of wf functions).\nCheck this [document](https://docs.google.com/spreadsheets/d/1lQJ101GDEkXyzKmB8l9nESao6CF9G_1qCURdcqmQ0uA/edit?usp=sharing)\nfor full information about the current state. 
The following `wf:functions` are not implemented:\n\n* [`wf:actionTrackerUri`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/372)\n* [`wf:actionExternalId`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/371)\n* [`wf:actionData`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/370)\n* [`wf:run`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/369)\n* [`wf:errorMessage`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/368)\n* [`wf:errorCode`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/367)\n* [`wf:transition`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/366)\n* [`wf:callback`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/365)\n* [`wf:group`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/364)\n* [`wf:appPath`](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/363)\n* [Hadoop Counters](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/392)\n\nAll implemented functions can be found in the [o2a_libs](https://github.com/GoogleCloudPlatform/oozie-to-airflow/tree/master/o2a/o2a_libs)\nmodule. Camel-case names of Oozie functions were substituted with snake-case equivalents (e.g. lastErrorNode becomes\nlast_error_node).\n\nAdditionally, some already implemented functions may not preserve the full logic of the original EL expression\ndue to differences between Oozie and Airflow. 
It's difficult to implement them in a generic-enough way to cover\nall possible cases; it's much easier to leave the implementation of those functions to the user.\nIt's perfectly possible to provide your own implementation of each of those functions if you need\nto customise it, and in many cases a specific implementation will be easier to write than a generic one.\n\n## Notification proxy is not supported\n\nIn Oozie, the `oozie.wf.workflow.notification.proxy` property can be used to configure a proxy\nthrough which notifications will be sent.\n\nThis is not supported. Currently notifications will be sent directly, without a proxy.\n\n# Cloud execution environment for Oozie to Airflow conversion\n\n## Cloud environment setup\n\nAn easy way of running Oozie workflows as well as running the oozie-to-airflow converted DAGs in\nAirflow is by using Cloud Composer and Dataproc in GCP. This is the environment currently supported by the\nconverter and the one that it was heavily tested with. These services allow testing without much need for an\non-premises setup. Here are some details about the environment that is supported:\n\n### Cloud Composer\n\n* composer-1.5.0-airflow-1.10.1\n* python version 3 (3.6.6)\n* machine n1-standard-1\n* node count: 3\n* Additional PyPi packages:\n * sshtunnel==0.1.4\n\n### Cloud Dataproc Cluster with Oozie\n\n* n1-standard-2, 4 vCPU, 20 GB memory (! Minimum 16 GB RAM needed)\n* primary disk size, 50 GB\n* Image 1.3.29-debian9\n* Hadoop version\n* Init action: [oozie-5.1.sh](dataproc/oozie-5.1.sh)\n\nThose are the steps you should follow to set it up:\n\n1. Create a Dataproc cluster - see [Creating Dataproc Cluster](#creating-dataproc-cluster) below\n1. 
Create a [Cloud Composer Environment](https://cloud.google.com/composer/docs/how-to/managing/creating#creating_a_new_environment)\n with at least Airflow version 1.10 to test the Apache Airflow workflows.\n Since Airflow 1.10 is in Beta for Cloud Composer, you must\n [enable beta features in Cloud Console](https://cloud.google.com/composer/docs/concepts/beta-support#enable-beta)\n1. Set up all required [Airflow Connections](https://airflow.apache.org/howto/connection/index.html)\n in Composer. This is required for things like `SSHOperator`.\n\n### Creating Dataproc cluster\n\nWe prepared a Dataproc [initialization action](https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/init-actions)\nthat makes it possible to run Oozie 5.1.0 on Dataproc.\n\nPlease upload [oozie-5.1.sh](dataproc/oozie-5.1.sh) to your GCS bucket and create the cluster using the following command:\n\nNote that you need at least 20 GB RAM to run Oozie jobs on the cluster. The custom machine type below has enough RAM\nto handle Oozie.\n\n```bash\ngcloud dataproc clusters create <CLUSTER_NAME> --region europe-west1 --subnet default --zone \"<ZONE>\" \\\n --single-node --master-machine-type custom-4-20480 --master-boot-disk-size 500 \\\n --image-version 1.3-deb9 --project <PROJECT_ID> --initialization-actions 'gs://<BUCKET>/<FOLDER>/oozie-5.1.sh' \\\n --initialization-action-timeout=30m\n```\n\n**Note 1:** it might take ~20 minutes to create the cluster.\n**Note 2:** the init-action works only with a\n [single-node cluster](https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/single-node-clusters)\n and Dataproc 1.3.\n\nOnce the cluster is created, the steps from the [example map reduce job](dataproc/example-map-reduce-job.sh) can be\nrun on the master node to execute Oozie's example Map-Reduce job.\n\nOozie serves its web UI on port 11000. 
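For quick manual checks, one way to reach that port is standard SSH port forwarding to the master node (an illustrative command template with placeholder values, not from the official docs; the officially supported access methods are described in the instructions linked below):

```shell
# Forward local port 11000 to the Oozie UI on the Dataproc master node.
# <CLUSTER_NAME> and <ZONE> are placeholders; Dataproc names the master
# node <CLUSTER_NAME>-m. The Oozie UI is then at http://localhost:11000
gcloud compute ssh <CLUSTER_NAME>-m --zone <ZONE> -- -N -L 11000:localhost:11000
```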
To enable access to it please follow the\n[official instructions](https://cloud.google.com/dataproc/docs/concepts/accessing/cluster-web-interfaces)\non how to connect to the cluster web interfaces.\n\nA list of jobs with their statuses can also be shown by issuing the `oozie jobs` command on the master node.\n\nMore about testing the Oozie to Airflow conversion process can be found in\n[CONTRIBUTING.md](CONTRIBUTING.md#running-system-tests)\n\n# Examples\n\nAll examples can be found in the [examples](examples) directory.\n\n* [EL](#el-example)\n* [SSH](#ssh-example)\n* [Email](#email-example)\n* [MapReduce](#mapreduce-example)\n* [FS](#fs-example)\n* [Java](#java-example)\n* [Pig](#pig-example)\n* [Shell](#shell-example)\n* [Spark](#spark-example)\n* [Sub-workflow](#sub-workflow-example)\n* [DistCp](#distcp-example)\n* [Decision](#decision-example)\n* [Hive/Hive2](#hivehive2-example)\n* [Demo](#demo-example)\n* [Child workflow](#childwf-example)\n\n## EL Example\n\n### Running\n\nThe Oozie Expression Language (EL) example can be run as:\n`o2a -i examples/el -o output/el`\n\nThis will showcase the ability to use the [o2a/o2a_libs](o2a/o2a_libs) folder to map EL functions\nto Python methods. This example assumes that the user has a valid Apache Airflow\nSSH connection set up and the [o2a/o2a_libs](o2a/o2a_libs) folder has been copied to the dags\nfolder (preserving the o2a parent directory).\n\nPlease keep in mind that as of the current version only a single EL variable\nor a single EL function is supported. Variable/function chaining is not currently supported.\n\n### Output\nIn this example the output will be created in the `./output/el/` folder.\n\n### Known limitations\n\nThe decision example is not yet fully functional, as EL functions are not yet fully implemented, so the condition is\nhard-coded for now. 
Once EL functions are implemented, the condition in the example will be updated.\n\nGithub issue: [Implement decision node](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/42)\n\n\n## SSH Example\n\n### Prerequisites\n\nIn order to change the `user` or `host` in the example, please edit\n`examples/ssh/hdfs/workflow.xml`.\n\n### Running\n\nThe SSH example can be run as:\n\n`o2a -i examples/ssh -o output/ssh`\n\nThis will convert the specified Oozie XML and write the output into the\nspecified output directory, in this case `output/ssh/ssh.py`.\n\nThere are some differences between Apache Oozie and Apache Airflow as far as the SSH specification goes.\nIn Airflow you will have to add/edit an SSH-specific connection that contains\nthe credentials required for the specified SSH action. For example, if\nthe SSH node looks like:\n```xml\n<action name=\"ssh\">\n    <ssh xmlns=\"uri:oozie:ssh-action:0.1\">\n        <host>user@apache.org</host>\n        <command>echo</command>\n        <args>\"Hello Oozie!\"</args>\n    </ssh>\n    <ok to=\"end\"/>\n    <error to=\"fail\"/>\n</action>\n```\nthen the default Airflow SSH connection, `ssh_default`, should have at\nthe very least a password set. This can be found in the Airflow Web UI\nunder **Admin > Connections**. 
From the command line it is impossible to\nedit connections, so you must add one like this:\n\n`airflow connections --add --conn_id <CONN_ID> --conn_type SSH --conn_password <PASSWORD>`\n\nMore information can be found in [Airflow's documentation](https://airflow.apache.org/cli.html#connections).\n\n### Output\nIn this example the output will be created in the `./output/ssh/` folder.\n\nThe converted DAG uses the `SSHOperator` in Airflow.\n\n### Known limitations\n\nNo known limitations.\n\n\n## Email Example\n\n### Prerequisites\n\nMake sure to first copy `examples/email/configuration.template.properties`, rename it to\n`configuration.properties` and fill it in with configuration data.\n\n### Running\n\nThe Email example can be run as:\n\n`o2a -i examples/email -o output/email`\n\n### Output\nIn this example the output will be created in the `./output/email/` folder.\n\nThe converted DAG uses the `EmailOperator` in Airflow.\n\n### SMTP configuration\nIn Oozie the SMTP server configuration is located in `oozie-site.xml`.\n\nFor Airflow it needs to be located in `airflow.cfg`.\nExample Airflow SMTP configuration:\n```\n[email]\nemail_backend = airflow.utils.email.send_email_smtp\n\n[smtp]\nsmtp_host = example.com\nsmtp_starttls = True\nsmtp_ssl = False\nsmtp_user = airflow_user\nsmtp_password = password\nsmtp_port = 587\nsmtp_mail_from = airflow_user@example.com\n```\n\nFor more information on setting Airflow configuration options\n[see here](https://airflow.readthedocs.io/en/stable/howto/set-config.html).\n\n### Known limitations\n\n**1. Attachments are not supported**\n\nDue to the complexity of extracting files from HDFS inside Airflow and providing them\nto the `EmailOperator`, the functionality of sending attachments has not yet been\nimplemented.\n\n*Solution:* Implement in O2A a mechanism to extract a file from HDFS inside Airflow.\n\nGithub Issue: [Add support for attachment in Email mapper](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/335)\n\n**2. 
`<content_type>` tag is not supported**\n\nFrom the Oozie docs:\n> From uri:oozie:email-action:0.2 one can also specify mail content type as\ntext/html. \u201ctext/plain\u201d is the default.\n\nUnfortunately, currently the `EmailOperator` only accepts the `mime_subtype` parameter.\nHowever it only works for multipart subtypes, as the operator appends the subtype\nto the `multipart/` prefix. Therefore passing either `html` or `plain` from Oozie makes no sense.\n\nAs a result the email will always be sent with the `EmailOperator`'s default Content-Type value,\nwhich is `multipart/mixed`.\n\n*Solution:* Modify Airflow's `EmailOperator` to support more\ncontent types.\n\nGithub Issue: [Content type support in Email mapper](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/343)\n\n**3. `cc` and `bcc` fields are not templated in EmailOperator**\n\nOnly the 'to', 'subject' and 'html_content' fields in EmailOperator are templated.\nIn practice this covers all fields of an Oozie email action node apart from `cc` and `bcc`.\n\nTherefore if there is an EL function in either of these two fields of the action node\nthat requires a Jinja expression in Airflow, it will not work - the expression will\nnot be evaluated, but rather treated as a plain string.\n\n*Solution:* Modify Airflow's `EmailOperator` to mark more fields as\n`template_fields`.\n\nGithub Issue: [The CC: and BCC: fields are not templated in EmailOperator](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/344)\n\n\n## MapReduce Example\n\n### Prerequisites\n\nMake sure to first copy `examples/mapreduce/configuration.template.properties`, rename it to\n`configuration.properties` and fill it in with configuration data.\n\n### Running\n\nThe MapReduce example can be run as:\n\n`o2a -i examples/mapreduce -o output/mapreduce`\n\n### Output\nIn this example the output will be created in the `./output/mapreduce/` folder.\n\nThe converted DAG uses the `DataProcHadoopOperator` in Airflow.\n\n### Known 
limitations\n\n**1. Exit status not available**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.2.2_Map-Reduce_Action):\n> The counters of the Hadoop job and job exit status (FAILED, KILLED or SUCCEEDED) must be available to the\nworkflow job after the Hadoop jobs ends. This information can be used from within decision nodes and other\nactions configurations.\n\nCurrently we use the `DataProcHadoopOperator`, which does not store the job exit status in an XCom for other tasks to use.\n\nGitHub issue: [Implement exit status and counters in MapReduce Action](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/337)\n\n**2. Configuration options**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.2.2_Map-Reduce_Action)\n(the strikethrough is from us):\n> Hadoop JobConf properties can be specified as part of\n> - ~~the config-default.xml or~~\n> - ~~JobConf XML file bundled with the workflow application or~~\n> - ~~\\ tag in workflow definition or~~\n> - Inline map-reduce action configuration or\n> - ~~An implementation of OozieActionConfigurator specified by the tag in workflow definition.~~\n\nCurrently, the only supported way of configuring the map-reduce action is with the\ninline action configuration, i.e. using the `` tag in the workflow's XML file definition.\n\nGitHub issues:\n* [Add support for config-default.xml](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/137)\n* [Add support for parameters section of the workflow.xml](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/138)\n* [Handle global configuration properties](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/134)\n\n**3. 
Streaming and pipes**\n\nStreaming and pipes are currently not supported.\n\nGitHub issue: [Implement streaming support](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/336)\n\n## FS Example\n\n### Prerequisites\n\nMake sure to first copy `examples/fs/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe FS example can be run as:\n\n`o2a -i examples/fs -o output/fs`\n\n### Output\nIn this example the output will be created in the `./output/fs/` folder.\n\nThe converted DAG uses the `BashOperator` in Airflow.\n\n### Known limitations\n\nNot all FS operations are currently idempotent. This is not a problem when the prepare action is used in other tasks,\nbut it might be a problem in certain situations. Making the operators idempotent requires more complex\nlogic, and support for Pig actions is currently missing.\n\nGitHub issue: [FS Mapper and idempotence](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/175)\n\n`dirFiles` are not supported in FsMapper.\n\nGitHub issue: [Add support for dirFiles in FsMapper](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/80)\n\n## Java Example\n\n### Prerequisites\n\nMake sure to first copy `examples/java/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe Java example can be run as:\n\n`o2a -i examples/java -o output/java`\n\n### Output\nIn this example the output will be created in the `./output/java/` folder.\n\nThe converted DAG uses the `DataProcHadoopOperator` in Airflow.\n\n### Known limitations\n\n1. 
Overriding the action's main class via `oozie.launcher.action.main.class` is not implemented.\n\nGitHub issue: [Override Java main class with property](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/338)\n\n## Pig Example\n\n### Prerequisites\n\nMake sure to first copy `examples/pig/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe Pig example can be run as:\n\n`o2a -i examples/pig -o output/pig`\n\n### Output\nIn this example the output will be created in the `./output/pig/` folder.\n\nThe converted DAG uses the `DataProcPigOperator` in Airflow.\n\n### Known limitations\n\n**1. Configuration options**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/WorkflowFunctionalSpec.html#a3.2.3_Pig_Action)\n(the strikethrough is from us):\n> Hadoop JobConf properties can be specified as part of\n> - ~~the config-default.xml or~~\n> - ~~JobConf XML file bundled with the workflow application or~~\n> - ~~\\ tag in workflow definition or~~\n> - Inline pig action configuration.\n\nCurrently, the only supported way of configuring the pig action is with the\ninline action configuration, i.e. 
using the `` tag in the workflow's XML file definition.\n\nGitHub issues:\n* [Add support for config-default.xml](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/137)\n* [Add support for parameters section of the workflow.xml](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/138)\n* [Handle global configuration properties](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/134)\n\n\n## Shell Example\n\n### Prerequisites\n\nMake sure to first copy `examples/shell/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe Shell example can be run as:\n\n`o2a -i examples/shell -o output/shell`\n\n### Output\nIn this example the output will be created in the `./output/shell/` folder.\n\nThe converted DAG uses the `BashOperator` in Airflow, which executes the desired shell\naction with Pig by invoking `gcloud dataproc jobs submit pig --cluster= --region=\n--execute 'sh '`.\n\n### Known limitations\n\n**1. Exit status not available**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/DG_ShellActionExtension.html):\n> The output (STDOUT) of the Shell job can be made available to the workflow job after the Shell job ends.\nThis information could be used from within decision nodes.\n\nCurrently we use the `BashOperator`, which can store only the last line of the job output in an XCom.\nIn this case the line is not helpful, as it relates to the Dataproc job submission status and\nnot the Shell action's result.\n\nGitHub issue: [Finalize shell mapper](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/50)\n\n**2. 
No Shell launcher configuration**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/DG_ShellActionExtension.html):\n> Shell launcher configuration can be specified with a file, using the job-xml element, and inline,\nusing the configuration elements.\n\nCurrently there is no way to specify the shell launcher configuration (it is ignored).\n\nGitHub issue: [Shell Launcher Configuration](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/340)\n\n## Spark Example\n\n### Prerequisites\n\nMake sure to first copy `examples/spark/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe Spark example can be run as:\n\n`o2a -i examples/spark -o output/spark`\n\n### Output\nIn this example the output will be created in the `./output/spark/` folder.\n\nThe converted DAG uses the `DataProcSparkOperator` in Airflow.\n\n### Known limitations\n\n**1. Only tasks written in Java are supported**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/DG_SparkActionExtension.html):\n> The jar element indicates a comma separated list of jars or python files.\n\nThe solution was tested with only a single JAR file.\n\n**2. No Spark launcher configuration**\n\nFrom the [Oozie documentation](https://oozie.apache.org/docs/5.1.0/DG_SparkActionExtension.html):\n> Shell launcher configuration can be specified with a file, using the job-xml element, and inline,\nusing the configuration elements.\n\nCurrently there is no way to specify the Spark launcher configuration (it is ignored).\n\n**3. 
Not all elements are supported**\n\nThe following elements are not supported: `job-tracker`, `name-node`, `master`, `mode`.\n\n## Sub-workflow Example\n\n### Prerequisites\n\nMake sure to first copy `examples/subwf/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe Sub-workflow example can be run as:\n\n`o2a -i examples/subwf -o output/subwf`\n\n### Output\nIn this example the output (together with the sub-workflow DAG) will be created in the `./output/subwf/` folder.\n\nThe converted DAG uses the `SubDagOperator` in Airflow.\n\n### Known limitations\n\nNo known limitations.\n\n## DistCp Example\n\n### Prerequisites\n\nMake sure to first copy `examples/distcp/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe DistCp example can be run as:\n\n`o2a -i examples/distcp -o output/distcp`\n\n### Output\nIn this example the output will be created in the `./output/distcp/` folder.\n\nThe converted DAG uses the `BashOperator` in Airflow, which submits the Hadoop DistCp job using the\n`gcloud dataproc jobs submit hadoop` command.\n\n### Known limitations\n\nThe system test of the example, when run with Oozie, fails for unknown reasons. The converted DAG run by Airflow\ncompletes successfully.\n\n## Decision Example\n\n### Prerequisites\n\nMake sure to first copy `examples/decision/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe decision example can be run as:\n\n`o2a -i examples/decision -o output/decision`\n\n### Output\nIn this example the output will be created in the `./output/decision/` folder.\n\nThe converted DAG uses the `BranchPythonOperator` in Airflow.\n\n### Known limitations\n\nThe decision example is not yet fully functional, as EL functions are not fully implemented, so the condition is\nhard-coded for now. 
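Conceptually, an Oozie decision node maps onto a branch callable that inspects the context and returns the id of the task to follow. A minimal, Airflow-free sketch of this idea (the case predicates, task ids and context keys below are hypothetical stand-ins for EL expressions; the real converter generates a `BranchPythonOperator` around a similar callable):

```python
# Toy model of how an Oozie decision node becomes a branching callable.
# Each case pairs a predicate with the task to follow; the first match
# wins, and `default_task` plays the role of Oozie's default case.
def make_branch_callable(cases, default_task):
    def branch(context):
        for predicate, task_id in cases:
            if predicate(context):
                return task_id
        return default_task
    return branch

# Hypothetical cases standing in for EL predicates.
branch = make_branch_callable(
    cases=[
        (lambda ctx: ctx["file_size"] > 0, "process_file"),
        (lambda ctx: ctx["retry_count"] < 3, "retry"),
    ],
    default_task="end",
)

print(branch({"file_size": 10, "retry_count": 0}))  # -> process_file
print(branch({"file_size": 0, "retry_count": 5}))   # -> end
```

Until EL support lands, the generated callable simply returns a fixed task id instead of evaluating predicates like these.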
Once EL functions are implemented, the condition in the example will be updated.\n\nGitHub issue: [Implement decision node](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/42)\n\n\n## Hive/Hive2 Example\n\n### Prerequisites\n\nMake sure to first copy `examples/hive/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe Hive example can be run as:\n\n`o2a -i examples/hive -o output/hive`\n\n### Output\nIn this example the output will be created in the `./output/hive/` folder.\n\nThe converted DAG uses the `DataProcHiveOperator` in Airflow.\n\n### Known limitations\n\n**1. Only the connection to the local Hive instance is supported.**\n\nConnection configuration options are not supported.\n\n\n**2. Not all elements are supported**\n\nFor Hive, the following elements are not supported: `job-tracker`, `name-node`.\nFor Hive2, the following elements are not supported: `job-tracker`, `name-node`, `jdbc-url`, `password`.\n\nThe GitHub issue for both problems: [Hive connection configuration and other elements](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/342)\n\n\n## Demo Example\n\nThe demo example contains several action and control nodes. The control\nnodes are `fork`, `join`, `decision`, `start`, `end`, and `kill`. As far as action\nnodes go, there are `fs`, `map-reduce`, and `pig`.\n\nMost of these are already supported, but when the program encounters a node it does\nnot know how to parse, it performs a sort of \"skeleton transformation\":\nit converts all the unknown nodes to dummy nodes. 
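The skeleton transformation can be pictured as a simple fallback in the node-mapping step. A rough, self-contained sketch (the mapping table and function names are illustrative, not the converter's real API):

```python
# Toy fallback: known node types get a real operator mapping, while
# everything else becomes a placeholder task that only preserves the
# control flow of the original workflow.
KNOWN_MAPPERS = {
    "fs": "BashOperator",
    "map-reduce": "DataProcHadoopOperator",
    "pig": "DataProcPigOperator",
}

def map_node(node_type):
    # Unknown action nodes degrade to a no-op DummyOperator so the
    # DAG's structure survives and can be filled in by hand later.
    return KNOWN_MAPPERS.get(node_type, "DummyOperator")

print(map_node("pig"))       # -> DataProcPigOperator
print(map_node("custom-x"))  # -> DummyOperator
```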
This preserves the control flow and\nallows users to convert those nodes manually if they wish.\n\n### Prerequisites\n\nMake sure to first copy `examples/demo/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe demo can be run as:\n\n`o2a -i examples/demo -o output/demo`\n\nThis will parse and write to an output file in the `output/demo` directory.\n\n### Known limitations\n\nThe decision node is not fully functional, as not all EL functions are currently supported.\nIn order for the DAG to run in Airflow you may need to\nedit the generated Python file and change the decision node expression.\n\nGitHub issue: [Implement decision node](https://github.com/GoogleCloudPlatform/oozie-to-airflow/issues/42)\n\n### Output\nIn this example the output (including the sub-workflow DAG) will be created in the `./output/demo/` folder.\n\n## Childwf Example\n\n### Prerequisites\n\nMake sure to first copy `examples/subwf/configuration.template.properties`, rename it as\n`configuration.properties` and fill in with configuration data.\n\n### Running\n\nThe childwf example is a sub-workflow of the `demo` example. 
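When a sub-workflow such as childwf is embedded in its parent via `SubDagOperator`, Airflow expects the child DAG's id to be namespaced under the parent's id. A tiny sketch of that naming convention (the ids here are illustrative):

```python
# SubDagOperator-style embedding requires the child DAG id to have the
# form "<parent_dag_id>.<task_id>"; otherwise Airflow rejects the subdag.
def subdag_id(parent_dag_id, task_id):
    return "%s.%s" % (parent_dag_id, task_id)

print(subdag_id("demo", "childwf"))  # -> demo.childwf
```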
It can be run as:\n\n`o2a -i examples/childwf -o output/childwf`\n\n### Output\nIn this example the output will be created in the `./output/childwf/` folder.\n\n### Known limitations\n\nNo known limitations.\n\n\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/GoogleCloudPlatform/oozie-to-airflow", "keywords": "", "license": "", "maintainer": "", "maintainer_email": "", "name": "o2a", "package_url": "https://pypi.org/project/o2a/", "platform": "", "project_url": "https://pypi.org/project/o2a/", "project_urls": { "Homepage": "https://github.com/GoogleCloudPlatform/oozie-to-airflow" }, "release_url": "https://pypi.org/project/o2a/1.0.1/", "requires_dist": [ "apache-airflow (==1.10.2)", "autoflake (==1.3)", "black (==19.3b0)", "flake8 (==3.7.8)", "google-api-python-client (==1.7.10)", "isort (==4.3.21)", "j2cli (==0.3.10)", "Jinja2 (==2.10.0)", "lark-parser (==0.7.1)", "mypy (==0.720)", "parameterized (==0.7.0)", "paramiko (==2.6.0)", "pre-commit (==1.17.0)", "pydeps (==1.7.3)", "pylint (==2.3.1)", "pytest (==5.0.1)", "pytest-cov (==2.7.1)", "safety (==1.8.5)", "sshtunnel (==0.1.5)", "twine (==1.13.0)", "tzlocal (==1.5.1)", "Werkzeug (==0.14.1)", "yamllint (==1.16.0)" ], "requires_python": "", "summary": "Oozie To Airflow migration tool", "version": "1.0.1" }, "last_serial": 5599849, "releases": { "0.0.1": [ { "comment_text": "", "digests": { "md5": "2876a9cc4c72c1023339314a25df998f", "sha256": "5cb069153cfd2f286bb377bdbe908f5f9c92847ce5da807f67227cc3cf8f9d9c" }, "downloads": -1, "filename": "o2a-0.0.1-py3-none-any.whl", "has_sig": false, "md5_digest": "2876a9cc4c72c1023339314a25df998f", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 111360, "upload_time": "2019-05-21T15:15:19", "url": 
"https://files.pythonhosted.org/packages/27/d9/f36c6ceb99954ed5877cadf8ab7a8358342a616c0c56f9377bbd30dd09d6/o2a-0.0.1-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "1b601454beeb5893b39bd738b0724176", "sha256": "b323507f920def9f79bb62077fcafbabc3c41f2669d8765c03bae58b20a93190" }, "downloads": -1, "filename": "o2a-0.0.1.tar.gz", "has_sig": false, "md5_digest": "1b601454beeb5893b39bd738b0724176", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 61712, "upload_time": "2019-05-21T15:15:25", "url": "https://files.pythonhosted.org/packages/45/82/f058ba398b45b612c97d5348184426e9ad86975fe1eb679ef17d3f9b1da7/o2a-0.0.1.tar.gz" } ], "0.0.12": [ { "comment_text": "", "digests": { "md5": "257999bb40f8d07b48515c46134d3d04", "sha256": "2381efcffeebb7eef38fb76bda90100f84a5158e663de1aaf62c97e95fcfa133" }, "downloads": -1, "filename": "o2a-0.0.12-py3-none-any.whl", "has_sig": false, "md5_digest": "257999bb40f8d07b48515c46134d3d04", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 136205, "upload_time": "2019-05-22T00:38:55", "url": "https://files.pythonhosted.org/packages/b7/2f/ad876c266114ce24a5f4b53c186e1ff2b0bf431315165143c066e6c1aa68/o2a-0.0.12-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "330310340c7191f5df72600744461582", "sha256": "a79ad768644bf9b695b065717b752a40592020019fa2dd274d6b218033986f51" }, "downloads": -1, "filename": "o2a-0.0.12.tar.gz", "has_sig": false, "md5_digest": "330310340c7191f5df72600744461582", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 71284, "upload_time": "2019-05-22T00:38:57", "url": "https://files.pythonhosted.org/packages/21/7c/b4dcba00baf3673faddf3d15b5b2d031b8d6c0cc24a2d97c0fefec63bcfd/o2a-0.0.12.tar.gz" } ], "0.0.13": [ { "comment_text": "", "digests": { "md5": "21f8ea6a346f85e85e74d5947157cf8e", "sha256": "463baebd793f5727a772a4f83613ef9bf0b2b520ded935cf35e1d4b29daa9bf2" }, "downloads": -1, 
"filename": "o2a-0.0.13-py3-none-any.whl", "has_sig": false, "md5_digest": "21f8ea6a346f85e85e74d5947157cf8e", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 136458, "upload_time": "2019-05-24T11:53:27", "url": "https://files.pythonhosted.org/packages/27/82/a8451ab298e138b40dff9b9a29f713efa01a2352740cf2ea2d6bee090c42/o2a-0.0.13-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "d3c9f1d3bec9de8ab051c84393956bfb", "sha256": "a88f9a7a0317ba27783cee98b73b1a3fd9bc5d0af8fc31969ed286e45512e720" }, "downloads": -1, "filename": "o2a-0.0.13.tar.gz", "has_sig": false, "md5_digest": "d3c9f1d3bec9de8ab051c84393956bfb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 71883, "upload_time": "2019-05-24T11:53:29", "url": "https://files.pythonhosted.org/packages/4b/51/ca5d0b1b5289ea53545a3d770467790e7bf956f85aed86b4dd57a74e03b5/o2a-0.0.13.tar.gz" } ], "1.0.0": [ { "comment_text": "", "digests": { "md5": "eac52ccd1fbe14d681e3378c5faf38fd", "sha256": "b86925348270d4dc1dab2eddbe334752f6ece802ce3ce24ea125edbdf1667eb1" }, "downloads": -1, "filename": "o2a-1.0.0-py3-none-any.whl", "has_sig": false, "md5_digest": "eac52ccd1fbe14d681e3378c5faf38fd", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 156477, "upload_time": "2019-07-28T10:45:06", "url": "https://files.pythonhosted.org/packages/11/c5/bbee6d595c6d73d4e9e1074f7667a5c421533300c6cdec9e51f49ebcbc01/o2a-1.0.0-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "f2fb1f9e017529fa6fb07533d514e2aa", "sha256": "2a83d07bb2af6d833a408b4f78383afa550467fea4baf94c0b55680786f90fd2" }, "downloads": -1, "filename": "o2a-1.0.0.tar.gz", "has_sig": false, "md5_digest": "f2fb1f9e017529fa6fb07533d514e2aa", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 88025, "upload_time": "2019-07-28T10:45:09", "url": 
"https://files.pythonhosted.org/packages/92/5a/c6a7b42a4a0da6be9508c92bbb3162e2c0e0d495dea7f0b9dfa49c08fbd6/o2a-1.0.0.tar.gz" } ], "1.0.1": [ { "comment_text": "", "digests": { "md5": "f2d0b97b1db2a5100c3ef859fb2a94da", "sha256": "c763aa3e9212c9a59e1ffaf52a9ce2b3a87be6445dc4e66a39e3f94086c4f736" }, "downloads": -1, "filename": "o2a-1.0.1-py3-none-any.whl", "has_sig": false, "md5_digest": "f2d0b97b1db2a5100c3ef859fb2a94da", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 144642, "upload_time": "2019-07-29T14:32:25", "url": "https://files.pythonhosted.org/packages/e9/c0/1444131bb6f358408fd2dce213eab0b0912de25014f845783b2fd11615a0/o2a-1.0.1-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "d7010afb896ed633177eb0954f0c15bb", "sha256": "45bdadd6b736c3f01487c609cdf437312b6f0a649a978fc5060b163fab894f10" }, "downloads": -1, "filename": "o2a-1.0.1.tar.gz", "has_sig": false, "md5_digest": "d7010afb896ed633177eb0954f0c15bb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 90261, "upload_time": "2019-07-29T14:32:28", "url": "https://files.pythonhosted.org/packages/ef/8c/49a37c66f02298e3f0be47b5267dfd22cdd3b8cc805e717c90fe996be99f/o2a-1.0.1.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "f2d0b97b1db2a5100c3ef859fb2a94da", "sha256": "c763aa3e9212c9a59e1ffaf52a9ce2b3a87be6445dc4e66a39e3f94086c4f736" }, "downloads": -1, "filename": "o2a-1.0.1-py3-none-any.whl", "has_sig": false, "md5_digest": "f2d0b97b1db2a5100c3ef859fb2a94da", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": null, "size": 144642, "upload_time": "2019-07-29T14:32:25", "url": "https://files.pythonhosted.org/packages/e9/c0/1444131bb6f358408fd2dce213eab0b0912de25014f845783b2fd11615a0/o2a-1.0.1-py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "d7010afb896ed633177eb0954f0c15bb", "sha256": "45bdadd6b736c3f01487c609cdf437312b6f0a649a978fc5060b163fab894f10" }, "downloads": 
-1, "filename": "o2a-1.0.1.tar.gz", "has_sig": false, "md5_digest": "d7010afb896ed633177eb0954f0c15bb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 90261, "upload_time": "2019-07-29T14:32:28", "url": "https://files.pythonhosted.org/packages/ef/8c/49a37c66f02298e3f0be47b5267dfd22cdd3b8cc805e717c90fe996be99f/o2a-1.0.1.tar.gz" } ] }