{ "info": { "author": "Thor Whalen", "author_email": "", "bugtrack_url": null, "classifiers": [ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "License :: OSI Approved :: Apache Software License", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Topic :: Software Development" ], "description": "# py2store\nStorage CRUD how and where you want it.\n\nList, read, write, and delete data in a structured data source/target, \nas if manipulating simple python builtins (dicts, lists), or through the interface **you** want to interact with, \nwith configuration or physical particularities out of the way. \nAlso, being able to change these particularities without having to change the business-logic code. \n\n# Quickstart\n\nInstall it (e.g. `pip install py2store`).\n\nThink of type of storage you want to use and just go ahead, like you're using a dict.\nHere's an example for local storage (you must you string keys only here). \n\n```\n>>> from py2store import QuickStore\n>>>\n>>> store = QuickStore() # will print what (tmp) rootdir it is choosing\n>>> # Write something and then read it out again\n>>> store['foo'] = 'baz'\n>>> store['foo']\n'baz'\n>>> \n>>> # Go see that there is now a file in the rootdir, named 'foo'!\n>>> \n>>> # Write something more complicated\n>>> store['hello/world'] = [1, 'flew', {'over': 'a', \"cuckoo's\": map}]\n>>> store['hello/world'] == [1, 'flew', {'over': 'a', \"cuckoo's\": map}]\nTrue\n>>>\n>>> # do you have the key 'foo' in your store?\n>>> 'foo' in store\nTrue\n>>> # how many items do you have now?\n>>> assert len(store) >= 2 # can't be sure there were no elements before, so can't assert == 2\n>>> \n>>> # delete the stuff you've written\n>>> del store['foo']\n>>> del store['hello/world']\n```\n\n`QuickStore` will by default store things in local files, using pickle as the serializer.\nIf a root directory is not specified, \nit will use a tmp directory it will create (the first time you 
try to store something). \nIt will create any directories that need to be created to satisfy any/key/that/contains/slashes.\nOf course, everything is configurable.\n\n# More examples\n\n## Looks like a dict\nBelow, we make a default store and demo a few basic operations on it.\nThe default store uses a dict as its backend persister. \nA dict is neither really a backend, nor a persister. But it helps to try things out with no\nfootprint.\n\n```python\nfrom py2store.base import Store\n\ns = Store()\nassert list(s) == []\ns['foo'] = 'bar' # put 'bar' in 'foo'\nassert 'foo' in s # check that 'foo' is in (i.e. a key of) s\nassert s['foo'] == 'bar' # see that the value that 'foo' contains is 'bar'\nassert list(s) == ['foo'] # list all the keys (there's only one)\nassert list(s.items()) == [('foo', 'bar')] # list all the (key, value) pairs\nassert list(s.values()) == ['bar'] # list all the values\nassert len(s) == 1 # Number of items in my store\ns['another'] = 'item' # store another item\nassert len(s) == 2 # Now I have two!\nassert list(s) == ['foo', 'another'] # here they are\n```\n\nThere's nothing fantastic in the above code. \nI've just demoed some operations on a dict.\nBut it's exactly this simplicity that py2store aims for. \nYou can now replace the `s = Store()` with `s = AnotherStore(...)` where `AnotherStore` \nnow uses some other backend that could be remote or local, could be a database, or any \nsystem that can store `something` (the value) `somewhere` (the key).\n\nYou can choose from an existing store (e.g. for local files, AWS S3, or MongoDB) or \nquite easily make your own (more on that later).\n\nAnd yet, it will still look like you're talking to a dict. This not only means that you can \ntalk to various storage systems without having to actually learn how to, but also means \nthat the same business logic code you've written can be reused with no modification. 
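\nTo make that claim concrete, here is a minimal sketch (plain Python, standard library only; the function name is made up for illustration) of business logic written against the dict interface, which runs unchanged against a plain dict or against any dict-like store:\n\n```python\nfrom collections.abc import MutableMapping\n\ndef get_or_compute(store: MutableMapping, key, compute):\n    # business logic: only knows the dict interface, not the backend\n    if key not in store:\n        store[key] = compute()\n    return store[key]\n\nbackend = {}  # swap in any dict-like store here; the function doesn't change\nassert get_or_compute(backend, 'answer', lambda: 42) == 42\nassert get_or_compute(backend, 'answer', lambda: 0) == 42  # cached, not recomputed\n```\n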
\n\nBut py2store offers more than just a simple consistent facade to **where** you store things, \nbut also provides means to define **how** you do it.\n\nIn the case of key-value storage, the \"how\" is defined on the basis of the keys (how you reference) \nthe objects you're storing and the values (how you serialize and deserialize those objects).\n \n\n## Converting keys: Relative paths and absolute paths\nTake a look at the following example, that adds a layer of key conversion to a store.\n\n```python\n# defining the store\nfrom py2store.base import Store\n\nclass PrefixedKeyStore(Store):\n prefix = ''\n def _id_of_key(self, key):\n return self.prefix + key # prepend prefix before passing on to store\n def _key_of_id(self, _id):\n if not _id.startswith(self.prefix):\n raise ValueError(f\"_id {_id} wasn't prefixed with {self.prefix}\")\n else:\n return _id[len(self.prefix):] # don't show the user the prefix\n \n# trying the store out \ns = PrefixedKeyStore()\ns.prefix = '/ROOT/'\nassert list(s) == []\ns['foo'] = 'bar' # put 'bar' in 'foo'\nassert 'foo' in s # check that 'foo' is in (i.e. a key of) s\nassert s['foo'] == 'bar' # see that the value that 'foo' contains is 'bar'\nassert list(s) == ['foo'] # list all the keys (there's only one)\nassert list(s.items()) == [('foo', 'bar')] # list all the (key, value) pairs\nassert list(s.values()) == ['bar'] # list all the values\nassert len(s) == 1 # Number of items in my store\ns['another'] = 'item' # store another item\nassert len(s) == 2 # Now I have two!\nassert list(s) == ['foo', 'another'] # here they are \n```\n\n\nQ: That wasn't impressive! It's just the same as the first Store. What's this prefix all about?\n\nA: The prefix thing is hidden, and that's the point. You want to talk the \"relative\" (i.e \"prefix-free\")\nlanguage, but may have the need for this prefix to be prepended to the key before persisting the data\nand that prefix to be removed before being displayed to the user. 
\nThink of working with files. Do you want to have to specify the root folder every time you store something\nor retrieve something?\n\nQ: Prove it!\n\nA: Okay, let's look under the hood at what the underlying store (a dict) is dealing with:\n\n```python\nassert list(s.store.items()) == [('/ROOT/foo', 'bar'), ('/ROOT/another', 'item')]\n```\n\nYou see? The keys that the \"backend\" is using are actually prefixed with `\"/ROOT/\"`.\n\n## Serialization/Deserialization\n\nLet's now demo serialization and deserialization. \n\nSay we want to deserialize any text we stored by prepending \"hello \" to it when we read it back.\n\n```python\n# defining the store\nfrom py2store.base import Store\n\nclass MyFunnyStore(Store):\n def _obj_of_data(self, data):\n return f'hello {data}'\n \n# trying the store out \ns = MyFunnyStore()\nassert list(s) == []\ns['foo'] = 'bar' # put 'bar' in 'foo'\nassert 'foo' in s # check that 'foo' is in (i.e. a key of) s\nassert s['foo'] == 'hello bar' # the deserializer prepended 'hello ' to the stored 'bar'\nassert list(s) == ['foo'] # list all the keys (there's only one)\nassert list(s.items()) == [('foo', 'hello bar')] # list all the (key, value) pairs\nassert list(s.values()) == ['hello bar'] # list all the values \n```\n\nIn the following, we serialize our text by upper-casing it, so we'll see it upper-cased \nwhen we retrieve it.\n\n```python\n# defining the store\nfrom py2store.base import Store\n\nclass MyOtherFunnyStore(Store):\n def _data_of_obj(self, obj):\n return obj.upper()\n \n# trying the store out \ns = MyOtherFunnyStore()\nassert list(s) == []\ns['foo'] = 'bar' # put 'bar' in 'foo'\nassert 'foo' in s # check that 'foo' is in (i.e. 
a key of) s\nassert s['foo'] == 'BAR' # the serializer upper-cased 'bar' before storing it\nassert list(s) == ['foo'] # list all the keys (there's only one)\nassert list(s.items()) == [('foo', 'BAR')] # list all the (key, value) pairs\nassert list(s.values()) == ['BAR'] # list all the values\n``` \n\nIn the last two serialization examples, we only implemented one-way transformations. \nThat's all fine if you just want to have a writer (so only need a serializer) or a reader (so only \nneed a deserializer). \nIn most cases though, you will need two-way transformations, specifying how the object \nshould be serialized to be stored, and how it should be deserialized to get your object back. \n\n\n## A pickle store\n\nSay you wanted the store to use pickle as its serializer. Here's how that could look.\n\n```python\n# defining the store\nimport pickle\nfrom py2store.base import Store\n\n\nclass PickleStore(Store):\n protocol = None\n fix_imports = True\n encoding = 'ASCII'\n def _data_of_obj(self, obj): # serializer\n return pickle.dumps(obj, protocol=self.protocol, fix_imports=self.fix_imports)\n def _obj_of_data(self, data): # deserializer\n return pickle.loads(data, fix_imports=self.fix_imports, encoding=self.encoding)\n\n# trying the store out \ns = PickleStore()\nassert list(s) == []\ns['foo'] = 'bar' # put 'bar' in 'foo'\nassert s['foo'] == 'bar' # I can get 'bar' back\n# behind the scenes though, it's really a pickle that is stored:\nassert s.store['foo'] == b'\\x80\\x03X\\x03\\x00\\x00\\x00barq\\x00.'\n``` \n\nAgain, it doesn't seem that impressive that you can get back a string that you stored in a dict, \nfor two reasons: (1) you don't really need to serialize strings to store them and (2) you don't need to serialize python \nobjects to store them in a dict. 
\nBut if you (1) were trying to store more complex types and (2) were actually persisting them in a file system or database, \nthen you'd need to serialize.\nThe point here is that the serialization and persisting concerns are separated from the storage and retrieval concern. \nThe code still looks like you're working with a dict.\n\n## But how do you change the persister?\n\nBy using a persister that persists where you want. \nYou can also write your own. All a persister needs in order to work with py2store is to follow python's \n`collections.abc.MutableMapping` interface (or a subset thereof). More on how to make your own persister later.\n\nBelow is a simple example of how to persist in files under a given folder.\n(Warning: If you want a local file store, don't use this, but one of the easier-to-use, robust and safe stores in the \nstores folder!)\n\n```python\nimport os\nfrom collections.abc import MutableMapping\n\nclass SimpleFilePersister(MutableMapping):\n \"\"\"Read/write (text or binary) data to files under a given rootdir.\n Keys must be absolute file paths.\n Paths that don't start with rootdir will raise a KeyValidationError.\n \"\"\"\n\n def __init__(self, rootdir, mode='t'):\n if not rootdir.endswith(os.path.sep):\n rootdir = rootdir + os.path.sep\n self.rootdir = rootdir\n assert mode in {'t', 'b', ''}, f\"mode ({mode}) not valid: Must be 't', 'b' or ''\"\n self.mode = mode\n\n def __getitem__(self, k):\n with open(k, 'r' + self.mode) as fp:\n data = fp.read()\n return data\n\n def __setitem__(self, k, v):\n with open(k, 'w' + self.mode) as fp:\n fp.write(v)\n\n def __delitem__(self, k):\n os.remove(k)\n\n def __contains__(self, k):\n \"\"\" Implementation of \"k in self\" check.\n Note: MutableMapping gives you this for free, using a try/except on __getitem__,\n but the following uses faster os functionality.\"\"\"\n return os.path.isfile(k)\n\n def __iter__(self):\n yield from 
filter(os.path.isfile, \n map(lambda x: os.path.join(self.rootdir, x), \n os.listdir(self.rootdir)))\n \n def __len__(self):\n \"\"\"Note: There are system-specific faster ways to do this.\"\"\"\n count = 0\n for _ in self.__iter__():\n count += 1\n return count\n \n def clear(self):\n \"\"\"MutableMapping creates a 'delete all' functionality by default. Better disable it!\"\"\"\n raise NotImplementedError(\"If you really want to do that, loop on all keys and remove them one by one.\")\n```\n\nNow try this out:\n```python\nimport os\n# What folder you want to use. Defaulting to the home folder; choose another place if you prefer.\nrootdir = os.path.expanduser('~/')\n\npersister = SimpleFilePersister(rootdir)\nfoo_fullpath = os.path.join(rootdir, 'foo')\npersister[foo_fullpath] = 'bar' # write 'bar' to a file named foo_fullpath\nassert persister[foo_fullpath] == 'bar' # see that you can read the contents of that file to get your 'bar' back\nassert foo_fullpath in persister # the full filepath indeed exists in (i.e. \"is a key of\") the persister\nassert foo_fullpath in list(persister) # you can list all the contents of the rootdir and find foo_fullpath in it\n```\n\n## Talk your own CRUD dialect\n\nDon't like this dict-like interface? Want to talk **your own** CRUD words? \nWe got you covered! 
Just subclass `SimpleFilePersister` and make the changes you want to make:\n\n```python\nclass MySimpleFilePersister(SimpleFilePersister): \n # If it's just renaming, it's easy\n read = SimpleFilePersister.__getitem__\n exists = SimpleFilePersister.__contains__\n n_files = SimpleFilePersister.__len__\n \n # here we want a new method that gives us an actual list of the filepaths in the rootdir\n list_files = lambda self: list(self.__iter__())\n\n # And for write we want val and key to be swapped in our interface.\n def write(self, val, key): # note that we wanted val to come first here (as with the json.dump and pickle.dump interfaces)\n return self.__setitem__(key, val) \n\nmy_persister = MySimpleFilePersister(rootdir)\n\nfoo_fullpath = os.path.join(rootdir, 'foo1')\nmy_persister.write('bar1', foo_fullpath) # write 'bar1' to a file named foo_fullpath\nassert my_persister.read(foo_fullpath) == 'bar1' # see that you can read the contents of that file to get your 'bar1' back\nassert my_persister.exists(foo_fullpath) # the full filepath indeed exists in (i.e. \"is a key of\") the persister\nassert foo_fullpath in my_persister.list_files() # you can list all the contents of the rootdir and find foo_fullpath in it\n```\n\n## Transforming keys\n\nBut dealing with full paths can be annoying, and might couple code too tightly with a particular local system.\nWe'd like to use relative paths instead. \nEasy: Wrap the persister in the `PrefixedKeyStore` defined earlier. 
\n\n```python\ns = PrefixedKeyStore(store=persister) # wrap your persister with the PrefixedKeyStore defined earlier\nif not rootdir.endswith(os.path.sep): \n rootdir = rootdir + os.path.sep # make sure the rootdir ends with slash\ns.prefix = rootdir # use rootdir as prefix in keys\n\ns['foo2'] = 'bar2' # write 'bar2' to a file \nassert s['foo2'] == 'bar2' # see that you can read the contents of that file to get your 'bar2' back\nassert 'foo2' in s \nassert 'foo2' in list(s) \n```\n\n# How it works\n\npy2store offers three aspects that you can define or modify to store things where you like and how you like:\n* **Persistence**: Where things are actually stored (memory, files, DBs, etc.)\n* **Serialization**: Value transformation. \nHow python objects should be transformed before they are persisted, \nand how persisted data should be transformed back into python objects.\n* **Indexing**: Key transformation. How you name/id/index your data. \nFull or relative paths. A unique combination of parameters (e.g. (country, city)). Etc.\n\nAll of this allows you to do operations such as \"store this (value) in there (persistence) as that (key)\", \nmoving the tedious particularities of the \"in there\", as well as how the \"this\" and \"that\" are transformed to fit \nin there, all out of the way of the business logic code. The way it should be.\n\n![alt text](img/py2store_how_it_works.png)\n\nNote: Where data is actually persisted just depends on what the base CRUD methods \n(`__getitem__`, `__setitem__`, `__delitem__`, `__iter__`, etc.) define them to be. 
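\n\nAs a toy sketch of how the three aspects compose (illustrative only; real py2store stores wire these hooks differently), here is a class that prefixes keys (indexing), pickles values (serialization), and keeps everything in a plain dict (persistence):\n\n```python\nimport pickle\n\nclass TinyStore:\n    \"\"\"Toy illustration: key transformation + value serialization around a dict backend.\"\"\"\n    def __init__(self, prefix='/ROOT/'):\n        self.prefix = prefix # indexing: how keys are transformed\n        self.store = {} # persistence: where data actually lives\n    def __setitem__(self, k, v):\n        self.store[self.prefix + k] = pickle.dumps(v) # serialize on the way in\n    def __getitem__(self, k):\n        return pickle.loads(self.store[self.prefix + k]) # deserialize on the way out\n\ns = TinyStore()\ns['foo'] = {'some': 'object'}\nassert s['foo'] == {'some': 'object'}\nassert list(s.store) == ['/ROOT/foo'] # what the backend actually sees\n```\n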
\n \n# A few persisters you can use\n\nWe'll go through a few basic persisters that are ready to use.\nThere are more in each category, and we'll be adding new categories, but \nthis should get you started.\n\nHere is a useful function to perform a basic test on a store, given a key and value.\nIt doesn't test all store methods (see the test modules for that), but demos \nthe basic functionality that pretty much every store should be able to do.\n\n```python\ndef basic_test(store, k='foo', v='bar'):\n \"\"\"Perform a basic test of a store's functionality, using key k and value v.\n Warning: Don't use on a key k whose value you don't want to lose!\"\"\"\n if k in store: # start fresh: delete k if it's already in the store\n del store[k]\n assert (k in store) == False # see that key is not in store (and testing __contains__)\n orig_length = len(store) # the length of the store before insertion\n store[k] = v # write v to k (testing __setitem__)\n assert store[k] == v # see that the value can be retrieved (testing __getitem__, and that __setitem__ worked)\n assert len(store) == orig_length + 1 # see that the number of items in the store increased by 1\n assert (k in store) == True # see that key is in store now (and testing __contains__ again)\n assert k in list(store) # testing listing the (key) contents of a store (and seeing that k is in it)\n assert store.get(k) == v # the get method\n _ = next(iter(store.keys())) # get the first key (test keys method)\n _ = next(iter(store.__iter__())) # get the first key (through __iter__)\n k in store.keys() # test that the __contains__ of store.keys() works\n \n try: \n _ = next(iter(store.values())) # get the first value (test values method)\n _ = next(iter(store.items())) # get the first (key, val) pair (test items method)\n except Exception:\n print(\"values() (therefore items()) didn't work: Probably testing a persister that had other data in it that the store couldn't deserialize\")\n \n assert (k in store) == True # testing __contains__ again\n del store[k] # clean up (and test delete)\n```\n## Local Files\n\nThere are many 
choices of local file stores according to what you're trying to do. \nOne general (but not too general) purpose local file store is \n`py2store.stores.local_store.RelativePathFormatStoreEnforcingFormat`.\nIt can do a lot for you, like add a prefix to your keys (so you can talk in relative instead of absolute paths),\nlist all files in subdirectories recursively, \nshow you only the files that match a given pattern when you list them, \nand not allow you to write to a key that doesn't fit the pattern. \nFurther, it also has what it takes to create parametrized paths or parse out the parameters of a path. \n\n```python\nfrom py2store.stores.local_store import RelativePathFormatStoreEnforcingFormat as LocalFileStore\nimport os\n\nrootdir = os.path.expanduser('~/pystore_tests/') # or replace by the folder you want to use\nos.makedirs(rootdir, exist_ok=True) # this will make all directories that don't exist. Don't use if you don't want that.\n\nstore = LocalFileStore(path_format=rootdir)\nbasic_test(store, k='foo', v='bar')\n```\n \nThe signature of LocalFileStore is:\n```python\nLocalFileStore(path_format, mode='',\n buffering=-1, encoding=None, errors=None, newline=None, closefd=True, opener=None)\n```\n\nOften path_format is just used to specify the rootdir, as above. \nBut you can specify the desired format further.\nFor example, the following will only yield .wav files, \nand only allow you to write to keys that end with .wav:\n```python\nstore = LocalFileStore(path_format='/THE/ROOT/DIR/{}.wav')\n```\n\nThe following will additionally add the restriction that those .wav files have the format 'SOMESTRING_' \nfollowed by digits:\n```python\nstore = LocalFileStore(path_format='/THE/ROOT/DIR/{:s}_{:d}.wav')\n```\n\nYou get the point...\n\nThe other arguments of LocalFileStore are more or less those of python's `open` function.\nThe slight difference is that here the `mode` argument applies both to read and write. 
\nIf `mode='b'` for example, the file will be opened with `mode='rb'` when opened to read and\nwith `mode='wb'` when opened to write. For asymmetrical read/write modes, the \nuser can specify a `read_mode` and `write_mode` (in this case the `mode` argument is ignored).\n\n## MongoDB\n\nA MongoDB collection is not as natural a key-value store as a file system is.\nMongoDB stores \"documents\", which are JSON-like records having many (possibly nested) fields that are not \nby default enforced by a schema. So in order to talk to mongo as a key-value store, we need to \nspecify what fields should be considered as keys, and what fields should be considered as data. \n\nBy default, the `_id` field (the only field ensured by default to contain unique values) is the single key field, and \nall other fields are considered to be data fields.\n\n```python\nfrom py2store.stores.mongo_store import MongoStore\n# The following makes a default MongoStore, with the default pymongo.MongoClient settings, \n# and db_name='py2store', collection_name='test', key_fields=('_id',)\nstore = MongoStore()\nbasic_test(store, k={'_id': 'foo'}, v={'val': 'bar', 'other_val': 3})\n```\n\nBut it can get annoying to specify the key as a dict every time.\nThe key schema is fixed, so you should be able to just specify the tuple of values making up the keys.\nAnd you can, with `MongoTupleKeyStore`:\n\n```python\nfrom py2store.stores.mongo_store import MongoTupleKeyStore\nstore = MongoTupleKeyStore(key_fields=('_id', 'name'))\nbasic_test(store, k=(1234, 'bob'), v={'age': 42, 'gender': 'unspecified'})\n```\n\n## S3\n\nIt works pretty much like the local stores, but stores in S3. You'll need to have an account with \nAWS to use this. Find S3 stores in `py2store.stores.s3_stores`.\n\n# Use cases\n\n## Interfacing reads\n\nHow many times did someone share some data with you in the form of a zip of some nested folders \nwhose structure and naming choices are fascinatingly obscure? 
And how much time do you then spend writing code \nto interface with that freak of nature? Well, one of the intents of py2store is to make that easier to do. \nYou still need to understand the structure of the data store and how to deserialize that data into python \nobjects you can manipulate. But with the proper tool, you shouldn't have to do much more than that.\n\n## Changing where and how things are stored\n\nEver have to switch where you persist things (say from file system to S3), or change the way you key into your data, \nor the way that data is serialized? If you use py2store tools to separate the different storage concerns, \nit'll be quite easy to change, since change will be localized. And if you're dealing with code that was already \nwritten, with concerns all mixed up, py2store should still be able to help since you'll be able to\nmore easily give the new system a facade that makes it look like the old one. \n\nAll of this can be applied to databases as well, insofar as the CRUD operations you're using \nare covered by the base methods.\n\n## Adapters: When the learning curve is in the way of learning\n\nShiny new storage mechanisms (DBs etc.) are born constantly, and some folks start using them, and we are eventually led to use them \nas well if we need to work with those folks' systems. And though we'd love to learn the wonderful new \ncapabilities the new kid on the block has, sometimes we just don't have time for that. \n\nWouldn't it be nice if someone wrote an adapter to the new system that had an interface we were familiar with? \nTalking to SQL as if it were mongo (or vice versa). Talking to S3 as if it were a file system. \nNow it's not a long-term solution: If we're really going to be using the new system intensively, we \nshould learn it. But when you've just got to get stuff done, having a familiar facade to something new \nis a life saver. 
\n\npy2store would like to make it easier for you to roll out an adapter to be able to talk \nto the new system in the way **you** are familiar with.\n \n## Thinking about storage later, if ever\n\nYou have a new project or need to write a new app. You'll need to store stuff and read stuff back. \nStuff: Different kinds of resources that your app will need to function. Some people enjoy thinking \nof how to optimize that aspect. I don't. I'll leave it to the experts to do so when the time comes. \nOften though, the time is later, if ever. Few proofs of concept and MVPs ever make it to prod. \n\nSo instead, I'd like to just get on with the business logic and write my program. \nWhat I need is an easy way to get some minimal storage functionality. \nBut when the time comes to optimize, I shouldn't have to change my code, but instead just change the way my \nDAO does things. What I need is py2store.\n\n# Is a store an ORM? A DAO?\n\nCall it what you want, really.\n\nIt would be tempting to coin py2store as ya(p)orm (yet another (python) object-relational mapping), \nbut that would be misleading. The intent of py2store is not to map objects to db entries, \nbut rather to offer a consistent interface for basic storage operations. \n\nIn that sense, py2store is more akin to an implementation of the data access object (DAO) pattern. \nOf course, the difference between ORM and DAO can be blurry, so all this should be taken with a grain of salt.\n\nThe advantages and disadvantages of such abstractions are easy to search for and find.\n\nMost data interaction mechanisms can be satisfied by a subset of the collections.abc interfaces.\nFor example, one can use python's collections.abc.Mapping interface for any key-value storage, making the data access \nobject have the look and feel of a dict, instead of using other popular method name choices \nsuch as read/write, load/dump, etc. 
\nOne of the dangers there is that, since the DAO looks and acts like a dict (but is not), a user might underestimate \nthe running costs of some operations.\n\n# Some links\n\nORM: Object-relational mapping: https://en.wikipedia.org/wiki/Object-relational_mapping\n\nDAO: Data access object: https://en.wikipedia.org/wiki/Data_access_object", "description_content_type": "text/markdown", "docs_url": null, "download_url": "https://github.com/i2mint/py2store/archive/v0.0.3.zip", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/i2mint/py2store", "keywords": "storage,interface", "license": "Apache", "maintainer": "", "maintainer_email": "", "name": "py2store", "package_url": "https://pypi.org/project/py2store/", "platform": "", "project_url": "https://pypi.org/project/py2store/", "project_urls": { "Download": "https://github.com/i2mint/py2store/archive/v0.0.3.zip", "Homepage": "https://github.com/i2mint/py2store" }, "release_url": "https://pypi.org/project/py2store/0.0.3/", "requires_dist": null, "requires_python": "", "summary": "DAO for Python: Tools to create simple and consistent interfaces to complicated and varied data sources.", "version": "0.0.3" }, "last_serial": 5695392, "releases": { "0.0.2": [ { "comment_text": "", "digests": { "md5": "e6d94e54421296894f5db060915eca17", "sha256": "5294eaf695758fdb4ee60d3022b2cb750e9967453c618ad7e848da719308aaad" }, "downloads": -1, "filename": "py2store-0.0.2.tar.gz", "has_sig": false, "md5_digest": "e6d94e54421296894f5db060915eca17", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 78406, "upload_time": "2019-08-15T17:01:35", "url": "https://files.pythonhosted.org/packages/da/cd/647550fd4c49692a4d2c600f70d84ff303365460791e6a219a67950d9c4f/py2store-0.0.2.tar.gz" } ], "0.0.3": [ { "comment_text": "", "digests": { "md5": "fa147201e55d5dc606f81c3fdedcbf87", "sha256": "d894c9cbb1a5dba1d72a89f951119fb5adf63c1b14e327edfe318ee7646e5cb0" }, 
"downloads": -1, "filename": "py2store-0.0.3.tar.gz", "has_sig": false, "md5_digest": "fa147201e55d5dc606f81c3fdedcbf87", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 82655, "upload_time": "2019-08-18T17:56:11", "url": "https://files.pythonhosted.org/packages/36/2f/b1b35acc5c1d13e75a3ca64ac67bbb06c522bfb77cd9246be23444d69f16/py2store-0.0.3.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "fa147201e55d5dc606f81c3fdedcbf87", "sha256": "d894c9cbb1a5dba1d72a89f951119fb5adf63c1b14e327edfe318ee7646e5cb0" }, "downloads": -1, "filename": "py2store-0.0.3.tar.gz", "has_sig": false, "md5_digest": "fa147201e55d5dc606f81c3fdedcbf87", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 82655, "upload_time": "2019-08-18T17:56:11", "url": "https://files.pythonhosted.org/packages/36/2f/b1b35acc5c1d13e75a3ca64ac67bbb06c522bfb77cd9246be23444d69f16/py2store-0.0.3.tar.gz" } ] }