{
"info": {
"author": "WISE-PaaS/AFS",
"author_email": "stacy.yeh@advantech.com.tw",
"bugtrack_url": null,
"classifiers": [],
"description": "# AFS2-DataSource SDK\nThe AFS2-DataSource SDK package allows developers to easily access PostgreSQL, MongoDB, InfluxDB, S3 and APM.\n\n## Installation\nSupport Python version 3.6 or later\n```\npip install afs2-datasource\n```\n\n## Notice\nAFS2-DataSource SDK uses `asyncio` package, and Jupyter kernel is also using `asyncio` and running an event loop, but these loops can't be nested.\n(https://github.com/jupyter/notebook/issues/3397)\n\nIf using AFS2-DataSource SDK in Jupyter Notebook, please add the following codes to resolve this issue:\n```python\n!pip install nest_asyncio\nimport nest_asyncio\nnest_asyncio.apply()\n```\n\n## API\n### DBManager\n+ Init DBManager\n+ DBManager.connect()\n+ DBManager.disconnect()\n+ DBManager.is_connected()\n+ DBManager.is_connecting()\n+ DBManager.get_dbtype()\n+ DBManager.execute_query()\n+ DBManager.create_table(table_name, columns)\n+ DBManager.is_table_exist(table_name)\n+ DBManager.is_file_exist(table_name, file_name)\n+ DBManager.insert(table_name, columns, records, source, destination)\n+ DBManager.delete_file(table_name, file_name)\n----\n\n#### Init DBManager\n\n##### With Database Config\nImport database config via Python.\n```python\nfrom afs2datasource import DBManager, constant\n\n# For PostgreSQL\nmanager = DBManager(db_type=constant.DB_TYPE['POSTGRES'],\n username=username,\n password=password,\n host=host,\n port=port,\n database=database,\n querySql=\"select {field} from {schema}.{table}\"\n)\n\n# For MongoDB\nmanager = DBManager(db_type=constant.DB_TYPE['MONGODB'],\n username=username,\n password=password,\n host=host,\n port=port,\n database=database,\n collection=collection,\n querySql=\"{\"{key}\": {value}}\"\n)\n\n# For InfluxDB\nmanager = DBManager(db_type=constant.DB_TYPE['INFLUXDB'],\n username=username,\n password=password,\n host=host,\n port=port,\n database=database,\n querySql=\"select {field_key} from {measurement_name}\"\n)\n\n# For S3\nmanager = 
DBManager(db_type=constant.DB_TYPE['S3'],\n endpoint=endpoint,\n access_key=access_key,\n secret_key=secret_key,\n buckets=[{\n 'bucket': 'bucket_name',\n 'blobs': {\n 'files': ['file_name'],\n 'folders': ['folder_name']\n }\n }]\n)\n\n# For APM\nmanager = DBManager(db_type=constant.DB_TYPE['APM'],\n username=username, # SSO username\n password=password, # SSO password\n apmUrl=apmUrl,\n machineIdList=[machineId], # APM Machine Id\n parameterList=[parameter], # APM Parameter\n mongouri=mongouri,\n # timeRange or timeLast\n timeRange=[{'start': start_ts, 'end': end_ts}],\n timeLast={'lastDays': lastDay, 'lastHours': lastHour, 'lastMins': lastMin}\n)\n\n# For Azure Blob\nmanager = DBManager(db_type=constant.DB_TYPE['AZUREBLOB'],\n account_name=account_name,\n account_key=account_key,\n containers=[{\n 'container': container_name,\n 'blobs': {\n 'files': ['file_name'],\n 'folders': ['folder_name']\n }\n }]\n)\n```\n----\n\n#### DBManager.connect()\nConnect to PostgreSQL, MongoDB, InfluxDB, S3, APM or Azure Blob as specified by the given config.\n```python\nmanager.connect()\n```\n----\n\n#### DBManager.disconnect()\nClose the connection.\nNote: the S3 data source does not support this function.\n```python\nmanager.disconnect()\n```\n----\n\n#### DBManager.is_connected()\nReturn whether the connection is established.\n```python\nmanager.is_connected()\n```\n----\n\n#### DBManager.is_connecting()\nReturn whether the connection is being established.\n```python\nmanager.is_connecting()\n```\n----\n\n#### DBManager.get_dbtype()\nReturn the database type of the connection.\n```python\nmanager.get_dbtype()\n```\n----\n\n#### DBManager.execute_query()\nReturn the result from PostgreSQL, MongoDB or InfluxDB after executing the `querySql` in the config.\n\nDownload the files specified in `buckets` in the S3 config or `containers` in the Azure Blob config, and return an array of bucket or container names.\n\nReturn the data of `Machine` and `Parameter` within `timeRange` or `timeLast` from APM.\n\n```python\n# For Postgres, MongoDB, 
InfluxDB and APM\ndf = manager.execute_query()\n# Return type: DataFrame\n\"\"\"\n Age Cabin Embarked Fare ... Sex Survived Ticket_info Title2\n0 22.0 7.0 2.0 7.2500 ... 1.0 0.0 2.0 2.0\n1 38.0 2.0 0.0 71.2833 ... 0.0 1.0 14.0 3.0\n2 26.0 7.0 2.0 7.9250 ... 0.0 1.0 31.0 1.0\n3 35.0 2.0 2.0 53.1000 ... 0.0 1.0 36.0 3.0\n4 35.0 7.0 2.0 8.0500 ... 1.0 0.0 36.0 2.0\n...\n\"\"\"\n\n# For Azure Blob\ncontainer_names = manager.execute_query()\n# Return type: Array\n\"\"\"\n['container1', 'container2']\n\"\"\"\n\n# For S3\nbucket_names = manager.execute_query()\n# Return type: Array\n\"\"\"\n['bucket1', 'bucket2']\n\"\"\"\n\n```\n----\n\n#### DBManager.create_table(table_name, columns=[])\nCreate a table in the database for Postgres, MongoDB and InfluxDB.\n\nCreate a bucket/container in S3/Azure Blob.\n\nNote: for PostgreSQL, `table_name` must use the **schema.table** format.\n```python\n# For Postgres, MongoDB and InfluxDB\ntable_name = 'titanic'\ncolumns = [\n {'name': 'index', 'type': 'INTEGER', 'is_primary': True},\n {'name': 'survived', 'type': 'FLOAT', 'is_not_null': True},\n {'name': 'age', 'type': 'FLOAT'},\n {'name': 'embarked', 'type': 'INTEGER'}\n]\nmanager.create_table(table_name=table_name, columns=columns)\n\n# For S3\nbucket_name = 'bucket'\nmanager.create_table(table_name=bucket_name)\n\n# For Azure Blob\ncontainer_name = 'container'\nmanager.create_table(table_name=container_name)\n```\n----\n\n#### DBManager.is_table_exist(table_name)\nReturn whether the table exists in Postgres, MongoDB or InfluxDB.\n\nReturn whether the bucket exists in S3.\n\nReturn whether the container exists in Azure Blob.\n\n```python\n# For Postgres, MongoDB and InfluxDB\ntable_name = 'titanic'\nmanager.is_table_exist(table_name=table_name)\n\n# For S3\nbucket_name = 'bucket'\nmanager.is_table_exist(table_name=bucket_name)\n\n# For Azure Blob\ncontainer_name = 'container'\nmanager.is_table_exist(table_name=container_name)\n```\n----\n\n#### DBManager.is_file_exist(table_name, file_name)\nReturn whether the file exists in a bucket in S3.\nReturn whether the file exists in a container in Azure Blob.\n\nNote: this function only supports S3 and Azure Blob.\n```python\n# For S3\nbucket_name = 'bucket'\nfile_name = 'test.csv'\nmanager.is_file_exist(table_name=bucket_name, file_name=file_name)\n# Return: Boolean\n\n# For Azure Blob\ncontainer_name = 'container'\nfile_name = 'test.csv'\nmanager.is_file_exist(table_name=container_name, file_name=file_name)\n# Return: Boolean\n```\n----\n\n#### DBManager.insert(table_name, columns=[], records=[], source='', destination='')\nInsert records into a table in Postgres, MongoDB or InfluxDB.\n\nUpload a file to S3 or Azure Blob.\n\n```python\n# For Postgres, MongoDB and InfluxDB\ntable_name = 'titanic'\ncolumns = ['index', 'survived', 'age', 'embarked']\nrecords = [\n [0, 1, 22.0, 7.0],\n [1, 1, 2.0, 0.0],\n [2, 0, 26.0, 7.0]\n]\nmanager.insert(table_name=table_name, columns=columns, records=records)\n\n# For S3\nbucket_name = 'bucket'\nsource = 'test.csv' # local file path\ndestination = 'test_s3.csv' # the file path and name in S3\nmanager.insert(table_name=bucket_name, source=source, destination=destination)\n\n# For Azure Blob\ncontainer_name = 'container'\nsource = 'test.csv' # local file path\ndestination = 'test_azure.csv' # the file path and name in Azure Blob\nmanager.insert(table_name=container_name, source=source, destination=destination)\n```\n---\n#### Use APM data source\n* Get historical raw data from the SCADA Mongo database\n* Required\n - username: APM SSO username\n - password: APM SSO password\n - mongouri: Mongo database URI\n - apmUrl: APM API URL\n - machineIdList: APM machine Id list (**type: Array**)\n - parameterList: APM parameter name list (**type: Array**)\n - timeRange: training date range\n * example:\n ```json\n [{\"start\": \"2019-05-01\", \"end\": \"2019-05-31\"}]\n ```\n----\n\n#### DBManager.delete_file(table_name, file_name)\nDelete a file in a bucket in S3 and return whether the file was deleted successfully.\n\nNote: this function only supports S3.\n\n```python\n# For 
S3\nbucket_name = 'bucket'\nfile_name = 'test_s3.csv'\nmanager.delete_file(table_name=bucket_name, file_name=file_name)\n# Return: Boolean\n```\n---\n# Example\n\n## MongoDB Example\n\n```python\nfrom afs2datasource import DBManager, constant\n\n# Init DBManager\nmanager = DBManager(\n db_type=constant.DB_TYPE['MONGODB'],\n username={USERNAME},\n password={PASSWORD},\n host={HOST},\n port={PORT},\n database={DATABASE},\n collection={COLLECTION},\n querySql={QUERYSQL}\n)\n\n# Connect DB\nmanager.connect()\n\n# Check the status of the connection\nis_connected = manager.is_connected()\n# Return type: boolean\n\n# Check whether the table exists\ntable_name = 'titanic'\nmanager.is_table_exist(table_name)\n# Return type: boolean\n\n# Create Table\ncolumns = [\n {'name': 'index', 'type': 'INTEGER', 'is_not_null': True},\n {'name': 'survived', 'type': 'INTEGER'},\n {'name': 'age', 'type': 'FLOAT'},\n {'name': 'embarked', 'type': 'INTEGER'}\n]\nmanager.create_table(table_name=table_name, columns=columns)\n\n# Insert Records\ncolumns = ['index', 'survived', 'age', 'embarked']\nrecords = [\n [0, 1, 22.0, 7.0],\n [1, 1, 2.0, 0.0],\n [2, 0, 26.0, 7.0]\n]\nmanager.insert(table_name=table_name, columns=columns, records=records)\n\n# Execute querySql in DB config\ndata = manager.execute_query()\n# Return type: DataFrame\n\"\"\"\n index survived age embarked\n0 0 1 22.0 7.0\n1 1 1 2.0 0.0\n2 2 0 26.0 7.0\n...\n\"\"\"\n\n# Disconnect from DB\nmanager.disconnect()\n```\n---\n## S3 Example\n\n```python\nfrom afs2datasource import DBManager, constant\n\n# Init DBManager\nmanager = DBManager(\n db_type=constant.DB_TYPE['S3'],\n endpoint={ENDPOINT},\n access_key={ACCESSKEY},\n secret_key={SECRETKEY},\n buckets=[{\n 'bucket': {BUCKET_NAME},\n 'blobs': {\n 'files': ['dataset/train.csv'],\n 'folders': ['models/']\n }\n }]\n)\n\n# Connect S3\nmanager.connect()\n\n# Check whether the bucket exists\nbucket_name = 'titanic'\nmanager.is_table_exist(table_name=bucket_name)\n# Return type: boolean\n\n# Create 
Bucket\nmanager.create_table(table_name=bucket_name)\n\n# Upload File to S3\nlocal_file = '../test.csv'\ns3_file = 'dataset/test.csv'\nmanager.insert(table_name=bucket_name, source=local_file, destination=s3_file)\n\n# Download files in `buckets`\n# Download all files in the directory\nbucket_names = manager.execute_query()\n# Return type: Array\n\n# Check whether the file exists\nis_exist = manager.is_file_exist(table_name=bucket_name, file_name=s3_file)\n# Return type: Boolean\n\n# Delete the file in the bucket and return whether the file was deleted successfully\nis_success = manager.delete_file(table_name=bucket_name, file_name=s3_file)\n# Return type: Boolean\n```\n---\n\n## APM Data source example\n```python\nAPMDSHelper(\n username,\n password,\n apmurl,\n machineIdList,\n parameterList,\n mongouri,\n timeRange)\nAPMDSHelper.execute()\n```\n---\n\n## Azure Blob Example\n\n```python\nfrom afs2datasource import DBManager, constant\n\n# Init DBManager\nmanager = DBManager(\n db_type=constant.DB_TYPE['AZUREBLOB'],\n account_key={ACCESS_KEY},\n account_name={ACCESS_NAME},\n containers=[{\n 'container': {CONTAINER_NAME},\n 'blobs': {\n 'files': ['titanic.csv', 'models/train.csv'],\n 'folders': ['test/']\n }\n }]\n)\n\n# Connect Azure Blob\nmanager.connect()\n\n# Check whether the container exists\ncontainer_name = 'container'\nmanager.is_table_exist(table_name=container_name)\n# Return type: boolean\n\n# Create container\nmanager.create_table(table_name=container_name)\n\n# Upload File to Azure Blob\nlocal_file = '../test.csv'\nazure_file = 'dataset/test.csv'\nmanager.insert(table_name=container_name, source=local_file, destination=azure_file)\n\n# Download files in `containers`\n# Download all files in the directory\ncontainer_names = manager.execute_query()\n# Return type: Array\n\n# Check whether the file exists in the container\nis_exist = manager.is_file_exist(table_name=container_name, file_name=azure_file)\n# Return type: Boolean\n```\n\n",
"description_content_type": "text/markdown",
"docs_url": null,
"download_url": "",
"downloads": {
"last_day": -1,
"last_month": -1,
"last_week": -1
},
"home_page": "https://github.com/stacy0416/afs2-datasource",
"keywords": "AFS",
"license": "Apache License 2.0",
"maintainer": "",
"maintainer_email": "",
"name": "afs2-datasource",
"package_url": "https://pypi.org/project/afs2-datasource/",
"platform": "",
"project_url": "https://pypi.org/project/afs2-datasource/",
"project_urls": {
"Homepage": "https://github.com/stacy0416/afs2-datasource"
},
"release_url": "https://pypi.org/project/afs2-datasource/2.1.27/",
"requires_dist": [
"pymongo (==3.7.2)",
"pandas (==0.24.2)",
"psycopg2-binary (==2.8.1)",
"influxdb (==5.2.2)",
"boto3 (==1.9.156)",
"requests (==2.22.0)",
"motor (==2.0.0)",
"azure-storage-blob (==2.0.1)"
],
"requires_python": "",
"summary": "For AFS developer to access Datasource",
"version": "2.1.27"
},
"last_serial": 5735765,
"releases": {
"1.0.11": [
{
"comment_text": "",
"digests": {
"md5": "9b6eedf8172656877ab2b7d3cd247da4",
"sha256": "e51ecaf054088d590612072e955edba7b919a67f97be78cab6c9b4b3a3178c80"
},
"downloads": -1,
"filename": "afs2_datasource-1.0.11-py3-none-any.whl",
"has_sig": false,
"md5_digest": "9b6eedf8172656877ab2b7d3cd247da4",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 9342,
"upload_time": "2019-05-13T02:48:38",
"url": "https://files.pythonhosted.org/packages/13/a8/46532af306a02ae0776c383037de6102a353ceb0a65ac2f3a7acfe93c351/afs2_datasource-1.0.11-py3-none-any.whl"
},
{
"comment_text": "",
"digests": {
"md5": "62a3c4971264c6b83f527698b12aea48",
"sha256": "79a4ba75eee2295543438989abe66e40ca2d9ef1bc8ceb6fc5c51ceafe46d01e"
},
"downloads": -1,
"filename": "afs2-datasource-1.0.11.tar.gz",
"has_sig": false,
"md5_digest": "62a3c4971264c6b83f527698b12aea48",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 2875,
"upload_time": "2019-05-09T03:27:46",
"url": "https://files.pythonhosted.org/packages/85/e3/2173025362a64f069360a7196526cd769b93fe7441649b3d92ff9dd84d2f/afs2-datasource-1.0.11.tar.gz"
}
],
"1.1.1": [
{
"comment_text": "",
"digests": {
"md5": "474ab568334ac02b7743d8f6c28673a6",
"sha256": "3b309848a0f2364dcbb14e06c829e9b2f2adb957444c120c800438cc2b9522eb"
},
"downloads": -1,
"filename": "afs2_datasource-1.1.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "474ab568334ac02b7743d8f6c28673a6",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 10667,
"upload_time": "2019-05-13T02:48:39",
"url": "https://files.pythonhosted.org/packages/ee/f4/eeb24d4d30499b5fc9fb504ffb980b9599f55851372eac5f9e85eca7a609/afs2_datasource-1.1.1-py3-none-any.whl"
}
],
"2.1.14": [
{
"comment_text": "",
"digests": {
"md5": "2de1a63cb48b8582788138b2882ae7d5",
"sha256": "706f9ebfbfb95d09889f8681ce4cca8caeb1872c9256ef3d638c2218a9ef6aba"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.14-py3-none-any.whl",
"has_sig": false,
"md5_digest": "2de1a63cb48b8582788138b2882ae7d5",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 10080,
"upload_time": "2019-05-13T02:48:41",
"url": "https://files.pythonhosted.org/packages/24/ce/e665536369f465c931ee2c7f24062211c3281b7bb9c22f3fac7848867814/afs2_datasource-2.1.14-py3-none-any.whl"
},
{
"comment_text": "",
"digests": {
"md5": "84bff330c79f4bc7d6de53a9de469cdd",
"sha256": "401282578cfe13779bf0af87e9dfc2150dfb91c679d64aa903563a79dc6ab2f4"
},
"downloads": -1,
"filename": "afs2-datasource-2.1.14.tar.gz",
"has_sig": false,
"md5_digest": "84bff330c79f4bc7d6de53a9de469cdd",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 3887,
"upload_time": "2019-05-13T02:48:43",
"url": "https://files.pythonhosted.org/packages/78/3d/8a9958823a4494af8b9ed9d942234deff7f6625904e0003d6deb753b443f/afs2-datasource-2.1.14.tar.gz"
}
],
"2.1.15": [
{
"comment_text": "",
"digests": {
"md5": "15865333039c630df52774e8bac38144",
"sha256": "81e51f7bf5faa1cd3e04b48e9212a0ad7d29f7717783189c2eef4508f0ce5405"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.15-py3-none-any.whl",
"has_sig": false,
"md5_digest": "15865333039c630df52774e8bac38144",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 11511,
"upload_time": "2019-05-16T06:20:56",
"url": "https://files.pythonhosted.org/packages/f2/06/129ee8761f5c0ea0b7facdf11305cc82021045480004f04c855585800f33/afs2_datasource-2.1.15-py3-none-any.whl"
},
{
"comment_text": "",
"digests": {
"md5": "d2dc1e1ed345e1184b0c342ade5bbc71",
"sha256": "701d5706277c85b244bfb41a2cf44929d3d982de184d923b6a2862b0e03365c5"
},
"downloads": -1,
"filename": "afs2-datasource-2.1.15.tar.gz",
"has_sig": false,
"md5_digest": "d2dc1e1ed345e1184b0c342ade5bbc71",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 5028,
"upload_time": "2019-05-16T06:21:04",
"url": "https://files.pythonhosted.org/packages/a9/b1/9ba751ee4c26c97cd0bf1ab1de6ab961ed0c6bb43423c1144b4b269e0335/afs2-datasource-2.1.15.tar.gz"
}
],
"2.1.17": [
{
"comment_text": "",
"digests": {
"md5": "f2f7ba401876b2179ee29f94e14904c9",
"sha256": "5d71a8c1dd8bb4e8b8928ea8af143c7f9d023be3a3ba68457d1bc4bb183bffe5"
},
"downloads": -1,
"filename": "afs2-datasource-2.1.17.tar.gz",
"has_sig": false,
"md5_digest": "f2f7ba401876b2179ee29f94e14904c9",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 6687,
"upload_time": "2019-05-30T06:07:39",
"url": "https://files.pythonhosted.org/packages/d8/9d/7ac1f3d9eb8a5c91ba97c294dccfa68614078fe92bf759c6db8546603964/afs2-datasource-2.1.17.tar.gz"
}
],
"2.1.18": [
{
"comment_text": "",
"digests": {
"md5": "d230784103e586741f50015f5ec01c1f",
"sha256": "2852fcc8b3ab62743bbd824a6db6af7dd45734588137911c269e12239f743d7e"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.18-0617-py3-none-any.whl",
"has_sig": false,
"md5_digest": "d230784103e586741f50015f5ec01c1f",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 17850,
"upload_time": "2019-06-17T01:52:24",
"url": "https://files.pythonhosted.org/packages/4d/44/ab57119fd6ee430fcd07ab23f72271217c79eedd3811625b37a8c12b6be8/afs2_datasource-2.1.18-0617-py3-none-any.whl"
}
],
"2.1.19": [
{
"comment_text": "",
"digests": {
"md5": "a203eb67ebc7e20dbcc4ef355b09eed3",
"sha256": "137239f157ef3fcb2f8a1b2fd0c1982347affbac5f73d0511de9f27f72851140"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.19-1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "a203eb67ebc7e20dbcc4ef355b09eed3",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 21097,
"upload_time": "2019-06-25T02:41:40",
"url": "https://files.pythonhosted.org/packages/51/bd/c9e6a5adcee9e067c6584b49f73cc6201f85777590af0aceaf4129538c58/afs2_datasource-2.1.19-1-py3-none-any.whl"
}
],
"2.1.20": [
{
"comment_text": "",
"digests": {
"md5": "cb0bcc1c09493bac1f5bca43d9681a32",
"sha256": "bea9f0ca54b61c1e78b7962b6122e2010957c98202e2179c756bade71f2a43be"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.20-py3-none-any.whl",
"has_sig": false,
"md5_digest": "cb0bcc1c09493bac1f5bca43d9681a32",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 21834,
"upload_time": "2019-07-01T02:53:31",
"url": "https://files.pythonhosted.org/packages/65/79/e5d022fcd5aed4a44918c71c85d719221b16bfec360b2fc9930b7c47e70f/afs2_datasource-2.1.20-py3-none-any.whl"
}
],
"2.1.23": [
{
"comment_text": "",
"digests": {
"md5": "005aa375fb527c9a63a5578c00327f79",
"sha256": "fe467992444cccdf3480f11dd86ea1327b8cb669cc7a412516893dd5cce4c712"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.23-py3-none-any.whl",
"has_sig": false,
"md5_digest": "005aa375fb527c9a63a5578c00327f79",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 24402,
"upload_time": "2019-07-24T06:28:57",
"url": "https://files.pythonhosted.org/packages/dc/27/eaea7a5c8847e38ec987336f1caa47e3619b4ab1518ed63746c6ff9d78bf/afs2_datasource-2.1.23-py3-none-any.whl"
}
],
"2.1.25": [
{
"comment_text": "",
"digests": {
"md5": "c67ac3b79c53e4363259c59293962045",
"sha256": "849b957c54fcbe5f5ae6b188f5d7054d5243239e6ca5143818356e8accb28d40"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.25-py3-none-any.whl",
"has_sig": false,
"md5_digest": "c67ac3b79c53e4363259c59293962045",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 24518,
"upload_time": "2019-08-02T03:28:34",
"url": "https://files.pythonhosted.org/packages/f7/6a/0f0f6c1cbfe921f731659572de8b4bd0e1f288dca7b92ebba414a53c6d5c/afs2_datasource-2.1.25-py3-none-any.whl"
}
],
"2.1.27": [
{
"comment_text": "",
"digests": {
"md5": "fef24a6aab964f712c14036e541143d4",
"sha256": "0782e835a977c52819d0063274b31fa0bdc02bab432e1719c015884992486a2c"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.27-py3-none-any.whl",
"has_sig": false,
"md5_digest": "fef24a6aab964f712c14036e541143d4",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 24803,
"upload_time": "2019-08-27T09:24:48",
"url": "https://files.pythonhosted.org/packages/5a/6b/c6d0a16b67dd223cf6f9e354f0f44fb7c7ea174398e1a2929f8e2bd5e7f0/afs2_datasource-2.1.27-py3-none-any.whl"
}
]
},
"urls": [
{
"comment_text": "",
"digests": {
"md5": "fef24a6aab964f712c14036e541143d4",
"sha256": "0782e835a977c52819d0063274b31fa0bdc02bab432e1719c015884992486a2c"
},
"downloads": -1,
"filename": "afs2_datasource-2.1.27-py3-none-any.whl",
"has_sig": false,
"md5_digest": "fef24a6aab964f712c14036e541143d4",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 24803,
"upload_time": "2019-08-27T09:24:48",
"url": "https://files.pythonhosted.org/packages/5a/6b/c6d0a16b67dd223cf6f9e354f0f44fb7c7ea174398e1a2929f8e2bd5e7f0/afs2_datasource-2.1.27-py3-none-any.whl"
}
]
}