{ "info": { "author": "Christopher Flynn", "author_email": "crf204@gmail.com", "bugtrack_url": null, "classifiers": [ "License :: OSI Approved :: MIT License", "Natural Language :: English", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: Implementation :: CPython" ], "description": "databricks-api\n==============\n\n|pypi| |pyversions|\n\n.. |pypi| image:: https://img.shields.io/pypi/v/databricks-api.svg\n :target: https://pypi.python.org/pypi/databricks-api\n\n.. |pyversions| image:: https://img.shields.io/pypi/pyversions/databricks-api.svg\n :target: https://pypi.python.org/pypi/databricks-api\n\n*[This documentation is auto-generated]*\n\nThis package provides a simplified interface for the Databricks REST API.\nThe interface is autogenerated on instantiation using the underlying client\nlibrary used in the official ``databricks-cli`` python package.\n\nInstall using\n\n.. code-block:: bash\n\n pip install databricks-api\n \n\nThe docs here describe the interface for version **0.9.0** of\nthe ``databricks-cli`` package for API version **2.0**.\nAssuming there are no new major or minor versions to the ``databricks-cli`` package\nstructure, this package should continue to work without a required update.\n\nThe ``databricks-api`` package contains a ``DatabricksAPI`` class which provides\ninstance attributes for the ``databricks-cli`` ``ApiClient``, as well as each of\nthe available service instances. The attributes of a ``DatabricksAPI`` instance are:\n\n* DatabricksAPI.client **\n* DatabricksAPI.jobs **\n* DatabricksAPI.cluster **\n* DatabricksAPI.managed_library **\n* DatabricksAPI.dbfs **\n* DatabricksAPI.workspace **\n* DatabricksAPI.secret **\n* DatabricksAPI.groups **\n* DatabricksAPI.instance_pool **\n\nTo instantiate the client, provide the databricks host and either a token or\nuser and password. Also shown is the full signature of the\nunderlying ``ApiClient.__init__``\n\n.. code-block:: python\n\n from databricks_api import DatabricksAPI\n\n # Provide a host and token\n db = DatabricksAPI(\n host=\"example.cloud.databricks.com\",\n token=\"dpapi123...\"\n )\n\n # OR a host and user and password\n db = DatabricksAPI(\n host=\"example.cloud.databricks.com\",\n user=\"me@example.com\",\n password=\"password\"\n )\n\n # Full __init__ signature\n db = DatabricksAPI(\n user=None,\n password=None,\n host=None,\n token=None,\n apiVersion=2.0,\n default_headers={},\n verify=True,\n command_name=''\n )\n\nRefer to the `official documentation `_\non the functionality and required arguments of each method below.\n\nEach of the service instance attributes provides the following public methods:\n\nDatabricksAPI.jobs\n------------------\n\n.. 
code-block:: python\n\n DatabricksAPI.jobs.cancel_run(\n run_id,\n headers=None,\n )\n\n DatabricksAPI.jobs.create_job(\n name=None,\n existing_cluster_id=None,\n new_cluster=None,\n libraries=None,\n email_notifications=None,\n timeout_seconds=None,\n max_retries=None,\n min_retry_interval_millis=None,\n retry_on_timeout=None,\n schedule=None,\n notebook_task=None,\n spark_jar_task=None,\n spark_python_task=None,\n spark_submit_task=None,\n max_concurrent_runs=None,\n headers=None,\n )\n\n DatabricksAPI.jobs.delete_job(\n job_id,\n headers=None,\n )\n\n DatabricksAPI.jobs.delete_run(\n run_id=None,\n headers=None,\n )\n\n DatabricksAPI.jobs.export_run(\n run_id,\n views_to_export=None,\n headers=None,\n )\n\n DatabricksAPI.jobs.get_job(\n job_id,\n headers=None,\n )\n\n DatabricksAPI.jobs.get_run(\n run_id=None,\n headers=None,\n )\n\n DatabricksAPI.jobs.get_run_output(\n run_id,\n headers=None,\n )\n\n DatabricksAPI.jobs.list_jobs(headers=None)\n\n DatabricksAPI.jobs.list_runs(\n job_id=None,\n active_only=None,\n completed_only=None,\n offset=None,\n limit=None,\n headers=None,\n )\n\n DatabricksAPI.jobs.reset_job(\n job_id,\n new_settings,\n headers=None,\n )\n\n DatabricksAPI.jobs.run_now(\n job_id=None,\n jar_params=None,\n notebook_params=None,\n python_params=None,\n spark_submit_params=None,\n headers=None,\n )\n\n DatabricksAPI.jobs.submit_run(\n run_name=None,\n existing_cluster_id=None,\n new_cluster=None,\n libraries=None,\n notebook_task=None,\n spark_jar_task=None,\n spark_python_task=None,\n spark_submit_task=None,\n timeout_seconds=None,\n headers=None,\n )\n\n\nDatabricksAPI.cluster\n---------------------\n\n.. code-block:: python\n\n DatabricksAPI.cluster.create_cluster(\n num_workers=None,\n autoscale=None,\n cluster_name=None,\n spark_version=None,\n spark_conf=None,\n aws_attributes=None,\n node_type_id=None,\n driver_node_type_id=None,\n ssh_public_keys=None,\n custom_tags=None,\n cluster_log_conf=None,\n spark_env_vars=None,\n autotermination_minutes=None,\n enable_elastic_disk=None,\n cluster_source=None,\n instance_pool_id=None,\n headers=None,\n )\n\n DatabricksAPI.cluster.delete_cluster(\n cluster_id,\n headers=None,\n )\n\n DatabricksAPI.cluster.edit_cluster(\n cluster_id,\n num_workers=None,\n autoscale=None,\n cluster_name=None,\n spark_version=None,\n spark_conf=None,\n aws_attributes=None,\n node_type_id=None,\n driver_node_type_id=None,\n ssh_public_keys=None,\n custom_tags=None,\n cluster_log_conf=None,\n spark_env_vars=None,\n autotermination_minutes=None,\n enable_elastic_disk=None,\n cluster_source=None,\n instance_pool_id=None,\n headers=None,\n )\n\n DatabricksAPI.cluster.get_cluster(\n cluster_id,\n headers=None,\n )\n\n DatabricksAPI.cluster.get_events(\n cluster_id,\n start_time=None,\n end_time=None,\n order=None,\n event_types=None,\n offset=None,\n limit=None,\n headers=None,\n )\n\n DatabricksAPI.cluster.list_available_zones(headers=None)\n\n DatabricksAPI.cluster.list_clusters(headers=None)\n\n DatabricksAPI.cluster.list_node_types(headers=None)\n\n DatabricksAPI.cluster.list_spark_versions(headers=None)\n\n DatabricksAPI.cluster.permanent_delete_cluster(\n cluster_id,\n headers=None,\n )\n\n DatabricksAPI.cluster.pin_cluster(\n cluster_id,\n headers=None,\n )\n\n DatabricksAPI.cluster.resize_cluster(\n cluster_id,\n num_workers=None,\n autoscale=None,\n headers=None,\n )\n\n DatabricksAPI.cluster.restart_cluster(\n cluster_id,\n headers=None,\n )\n\n DatabricksAPI.cluster.start_cluster(\n cluster_id,\n headers=None,\n )\n\n 
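# A hedged usage sketch rather than a signature; the cluster_id value is hypothetical.\n # db.cluster.start_cluster(cluster_id=\"1234-567890-ab12cd34\")\n # db.cluster.resize_cluster(cluster_id=\"1234-567890-ab12cd34\", num_workers=4)\n\n 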
DatabricksAPI.cluster.unpin_cluster(\n cluster_id,\n headers=None,\n )\n\n\nDatabricksAPI.managed_library\n-----------------------------\n\n.. code-block:: python\n\n DatabricksAPI.managed_library.all_cluster_statuses(headers=None)\n\n DatabricksAPI.managed_library.cluster_status(\n cluster_id,\n headers=None,\n )\n\n DatabricksAPI.managed_library.install_libraries(\n cluster_id,\n libraries=None,\n headers=None,\n )\n\n DatabricksAPI.managed_library.uninstall_libraries(\n cluster_id,\n libraries=None,\n headers=None,\n )\n\n\nDatabricksAPI.dbfs\n------------------\n\n.. code-block:: python\n\n DatabricksAPI.dbfs.add_block(\n handle,\n data,\n headers=None,\n )\n\n DatabricksAPI.dbfs.close(\n handle,\n headers=None,\n )\n\n DatabricksAPI.dbfs.create(\n path,\n overwrite=None,\n headers=None,\n )\n\n DatabricksAPI.dbfs.delete(\n path,\n recursive=None,\n headers=None,\n )\n\n DatabricksAPI.dbfs.get_status(\n path,\n headers=None,\n )\n\n DatabricksAPI.dbfs.list(\n path,\n headers=None,\n )\n\n DatabricksAPI.dbfs.mkdirs(\n path,\n headers=None,\n )\n\n DatabricksAPI.dbfs.move(\n source_path,\n destination_path,\n headers=None,\n )\n\n DatabricksAPI.dbfs.put(\n path,\n contents=None,\n overwrite=None,\n headers=None,\n )\n\n DatabricksAPI.dbfs.read(\n path,\n offset=None,\n length=None,\n headers=None,\n )\n\n\nDatabricksAPI.workspace\n-----------------------\n\n.. code-block:: python\n\n DatabricksAPI.workspace.delete(\n path,\n recursive=None,\n headers=None,\n )\n\n DatabricksAPI.workspace.export_workspace(\n path,\n format=None,\n direct_download=None,\n headers=None,\n )\n\n DatabricksAPI.workspace.get_status(\n path,\n headers=None,\n )\n\n DatabricksAPI.workspace.import_workspace(\n path,\n format=None,\n language=None,\n content=None,\n overwrite=None,\n headers=None,\n )\n\n DatabricksAPI.workspace.list(\n path,\n headers=None,\n )\n\n DatabricksAPI.workspace.mkdirs(\n path,\n headers=None,\n )\n\n\nDatabricksAPI.secret\n--------------------\n\n.. code-block:: python\n\n DatabricksAPI.secret.create_scope(\n scope,\n initial_manage_principal=None,\n scope_backend_type=None,\n headers=None,\n )\n\n DatabricksAPI.secret.delete_acl(\n scope,\n principal,\n headers=None,\n )\n\n DatabricksAPI.secret.delete_scope(\n scope,\n headers=None,\n )\n\n DatabricksAPI.secret.delete_secret(\n scope,\n key,\n headers=None,\n )\n\n DatabricksAPI.secret.get_acl(\n scope,\n principal,\n headers=None,\n )\n\n DatabricksAPI.secret.list_acls(\n scope,\n headers=None,\n )\n\n DatabricksAPI.secret.list_scopes(headers=None)\n\n DatabricksAPI.secret.list_secrets(\n scope,\n headers=None,\n )\n\n DatabricksAPI.secret.put_acl(\n scope,\n principal,\n permission,\n headers=None,\n )\n\n DatabricksAPI.secret.put_secret(\n scope,\n key,\n string_value=None,\n bytes_value=None,\n headers=None,\n )\n\n\nDatabricksAPI.groups\n--------------------\n\n.. 
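code-block:: python\n\n # A hedged usage sketch; the group and user names are hypothetical.\n db.groups.create_group(group_name=\"data-eng\")\n db.groups.add_to_group(parent_name=\"data-eng\", user_name=\"me@example.com\")\n\nThe complete signatures follow:\n\n.. 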
code-block:: python\n\n DatabricksAPI.groups.add_to_group(\n parent_name,\n user_name=None,\n group_name=None,\n headers=None,\n )\n\n DatabricksAPI.groups.create_group(\n group_name,\n headers=None,\n )\n\n DatabricksAPI.groups.get_group_members(\n group_name,\n headers=None,\n )\n\n DatabricksAPI.groups.get_groups(headers=None)\n\n DatabricksAPI.groups.get_groups_for_principal(\n user_name=None,\n group_name=None,\n headers=None,\n )\n\n DatabricksAPI.groups.remove_from_group(\n parent_name,\n user_name=None,\n group_name=None,\n headers=None,\n )\n\n DatabricksAPI.groups.remove_group(\n group_name,\n headers=None,\n )\n\n\nDatabricksAPI.instance_pool\n---------------------------\n\n.. code-block:: python\n\n DatabricksAPI.instance_pool.create_instance_pool(\n instance_pool_name=None,\n min_idle_instances=None,\n max_capacity=None,\n aws_attributes=None,\n node_type_id=None,\n custom_tags=None,\n idle_instance_autotermination_minutes=None,\n enable_elastic_disk=None,\n disk_spec=None,\n preloaded_spark_versions=None,\n headers=None,\n )\n\n DatabricksAPI.instance_pool.delete_instance_pool(\n instance_pool_id=None,\n headers=None,\n )\n\n DatabricksAPI.instance_pool.edit_instance_pool(\n instance_pool_id,\n instance_pool_name=None,\n min_idle_instances=None,\n max_capacity=None,\n aws_attributes=None,\n node_type_id=None,\n custom_tags=None,\n idle_instance_autotermination_minutes=None,\n enable_elastic_disk=None,\n disk_spec=None,\n preloaded_spark_versions=None,\n headers=None,\n )\n\n DatabricksAPI.instance_pool.get_instance_pool(\n instance_pool_id=None,\n headers=None,\n )\n\n DatabricksAPI.instance_pool.list_instance_pools(headers=None)\n\n\n\n", "description_content_type": "text/x-rst", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/crflynn/databricks-api", "keywords": "databricks,api,client", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "databricks-api", "package_url": "https://pypi.org/project/databricks-api/", "platform": "", "project_url": "https://pypi.org/project/databricks-api/", "project_urls": { "Homepage": "https://github.com/crflynn/databricks-api", "Repository": "https://github.com/crflynn/databricks-api" }, "release_url": "https://pypi.org/project/databricks-api/0.3.0/", "requires_dist": [ "databricks-cli (>=0.9.0,<0.10.0)" ], "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "summary": "Databricks API client auto-generated from the official databricks-cli package", "version": "0.3.0" }, "last_serial": 5747598, "releases": { "0.1.0": [ { "comment_text": "", "digests": { "md5": "db7ce28762263b40eca51e1d9eb42a55", "sha256": "276ab4a26b8aa60d9a86c5fb305431c133acac4cc8e4188fbaecbc9e46636a68" }, "downloads": -1, "filename": "databricks_api-0.1.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "db7ce28762263b40eca51e1d9eb42a55", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": null, "size": 4269, "upload_time": "2018-10-11T02:08:33", "url": "https://files.pythonhosted.org/packages/30/52/f2f6abd00d59059d0f0733a2a1b5088b1ce3bc4a2eafd8ff55d032c245de/databricks_api-0.1.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "a058889dfcb13c3c93265a9a3320d724", "sha256": "e9796bea26f3236b9d4d82781292ce52ca7f67d81841c61c077301e05c6bd7cd" }, "downloads": -1, "filename": "databricks-api-0.1.0.tar.gz", "has_sig": false, "md5_digest": "a058889dfcb13c3c93265a9a3320d724", "packagetype": "sdist", 
"python_version": "source", "requires_python": null, "size": 5464, "upload_time": "2018-10-11T02:08:34", "url": "https://files.pythonhosted.org/packages/9d/1c/2862a98a6541043fb5d3105456d05789c318f23aff5753bd7244ccf2cebf/databricks-api-0.1.0.tar.gz" } ], "0.2.0": [ { "comment_text": "", "digests": { "md5": "b56a24549841bea146b7560cfbc872b9", "sha256": "54c10540c0c056447791a7b473c6285eaa33541d1b11e2435cebc28810a94d7e" }, "downloads": -1, "filename": "databricks_api-0.2.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "b56a24549841bea146b7560cfbc872b9", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "size": 5480, "upload_time": "2019-08-17T23:20:44", "url": "https://files.pythonhosted.org/packages/77/68/3775cd41db067d77710a9877e9fb61c82217e09d20f9408fb8c31da045b8/databricks_api-0.2.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "5f9d1a95043eb32e15c417e720ca393e", "sha256": "6c4397cf790129947df1654d3eb2d96c2560defa0de1ff27bf8ce1fc353ec49b" }, "downloads": -1, "filename": "databricks_api-0.2.0.tar.gz", "has_sig": false, "md5_digest": "5f9d1a95043eb32e15c417e720ca393e", "packagetype": "sdist", "python_version": "source", "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "size": 6919, "upload_time": "2019-08-17T23:20:46", "url": "https://files.pythonhosted.org/packages/b9/01/b72afd655bb93bef4a48605624d5b1605340ddbb500396c4e65f8d5f5a9f/databricks_api-0.2.0.tar.gz" } ], "0.3.0": [ { "comment_text": "", "digests": { "md5": "00121d957e06d4720076560a37ccf686", "sha256": "8f637288d05b3d214ea1636bb31d6423c751d85fdccd25fd0d0c86657e7c0abc" }, "downloads": -1, "filename": "databricks_api-0.3.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "00121d957e06d4720076560a37ccf686", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "size": 5718, "upload_time": "2019-08-29T02:04:35", "url": "https://files.pythonhosted.org/packages/40/c7/02501c2494ad160c007aadac18c7dea2e83c9e0f4bc8137b45e3a4bc11a2/databricks_api-0.3.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "2b50af6ce0dccaaea11da4ef6d52044f", "sha256": "3e37422b2e52b1d8aa68e3ce36034d86313170f56fac11f88295713d450dbd99" }, "downloads": -1, "filename": "databricks_api-0.3.0.tar.gz", "has_sig": false, "md5_digest": "2b50af6ce0dccaaea11da4ef6d52044f", "packagetype": "sdist", "python_version": "source", "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "size": 7234, "upload_time": "2019-08-29T02:04:38", "url": "https://files.pythonhosted.org/packages/3e/c1/42d380a153910f8becd19840c9a27166006ecf84048fe92a2e16e98c1de1/databricks_api-0.3.0.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "00121d957e06d4720076560a37ccf686", "sha256": "8f637288d05b3d214ea1636bb31d6423c751d85fdccd25fd0d0c86657e7c0abc" }, "downloads": -1, "filename": "databricks_api-0.3.0-py2.py3-none-any.whl", "has_sig": false, "md5_digest": "00121d957e06d4720076560a37ccf686", "packagetype": "bdist_wheel", "python_version": "py2.py3", "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "size": 5718, "upload_time": "2019-08-29T02:04:35", "url": "https://files.pythonhosted.org/packages/40/c7/02501c2494ad160c007aadac18c7dea2e83c9e0f4bc8137b45e3a4bc11a2/databricks_api-0.3.0-py2.py3-none-any.whl" }, { "comment_text": "", "digests": { "md5": "2b50af6ce0dccaaea11da4ef6d52044f", "sha256": 
"3e37422b2e52b1d8aa68e3ce36034d86313170f56fac11f88295713d450dbd99" }, "downloads": -1, "filename": "databricks_api-0.3.0.tar.gz", "has_sig": false, "md5_digest": "2b50af6ce0dccaaea11da4ef6d52044f", "packagetype": "sdist", "python_version": "source", "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*", "size": 7234, "upload_time": "2019-08-29T02:04:38", "url": "https://files.pythonhosted.org/packages/3e/c1/42d380a153910f8becd19840c9a27166006ecf84048fe92a2e16e98c1de1/databricks_api-0.3.0.tar.gz" } ] }