{ "info": { "author": "Angelos Katharopoulos", "author_email": "angelos.katharopoulos@idiap.ch", "bugtrack_url": null, "classifiers": [ "Intended Audience :: Developers", "Intended Audience :: Science/Research", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.6", "Topic :: Scientific/Engineering" ], "description": "Attention Sampling\n==================\n\nThis repository provides a python library to accelerate the training and\ninference of neural networks on large data. This code is the reference\nimplementation of the methods described in our ICML 2019 publication\n`\"Processing Megapixel Images with Deep Attention-Sampling Models\"\n`_.\n\n\nUsage\n------\n\nYou can find examples of how to use our library in the provided `scripts\n`_ or a very\nconcise one below.\n\n.. code:: python\n\n # Keras imports\n\n from ats.core import attention_sampling\n from ats.utils.layers import SampleSoftmax\n from ats.utils.regularizers import multinomial_entropy\n\n # Create our two inputs.\n # Note that x_low could also be an input if we have access to a precomputed\n # downsampled image.\n x_high = Input(shape=(H, W, C))\n x_low = AveragePooling2D(pool_size=(10,))(x_high)\n\n # Create our attention model\n attention = Sequential([\n ...\n Conv2D(1),\n SampleSoftmax(squeeze_channels=True)\n ])\n\n # Create our feature extractor per patch, we assume that it returns a\n # vector per patch.\n feature = Sequential([\n ...\n GlobalAveragePooling2D(),\n L2Normalize()\n ])\n\n features, attention, patches = attention_sampling(\n attention,\n feature,\n patch_size=(32, 32),\n n_patches=10,\n attention_regularizer=multinomial_entropy(0.01)\n )([x_low, x_high])\n\n y = Dense(output_size, activation=\"softmax\")(features)\n\n model = Model(inputs=x_high, outputs=y)\n\nDependencies & Installation\n----------------------------\n\nTo install the library just run ``pip install attention-sampling``. If you want\nto extend our code clone the repository and install it in development mode.\n\nThe dependencies of ``attention-sampling`` are\n\n* TensorFlow\n* C++ tool chain\n* CUDA (optional)\n\nDocumentation\n-------------\n\nThere exists a dedicated `documentation site `_\nbut you are also encouraged to read the `source code\n` and the `scripts\n`_ to get an\nidea of how the library should be used and extended.\n\nResearch\n---------\n\nIf you found this work influential or helpful in your research in any way, we\nwould appreciate if you cited us.\n\n.. code::\n\n @inproceedings{katharopoulos2019ats,\n title={Processing Megapixel Images with Deep Attention-Sampling Models},\n author={Katharopoulos, A. and Fleuret, F.},\n booktitle={Proceedings of the International Conference on Machine Learning (ICML)},\n year={2019}\n }\n", "description_content_type": "", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "http://www.idiap.ch/~katharas/", "keywords": "", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "attention-sampling", "package_url": "https://pypi.org/project/attention-sampling/", "platform": "", "project_url": "https://pypi.org/project/attention-sampling/", "project_urls": { "Homepage": "http://www.idiap.ch/~katharas/" }, "release_url": "https://pypi.org/project/attention-sampling/0.2/", "requires_dist": null, "requires_python": "", "summary": "Train networks on large data using attention sampling.", "version": "0.2" }, "last_serial": 5568940, "releases": { "0.1": [ { "comment_text": "", "digests": { "md5": "10d8b3d2a638b4be2120cb8d46b6b6d6", "sha256": "d6a80df34d1bb07d6a9753e2b9f97f5f42a13da21679cbe73c0eb4b539682847" }, "downloads": -1, "filename": "attention-sampling-0.1.tar.gz", "has_sig": false, "md5_digest": "10d8b3d2a638b4be2120cb8d46b6b6d6", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 23544, "upload_time": "2019-07-22T14:42:49", "url": "https://files.pythonhosted.org/packages/d5/a1/69be75b89c40789b47e7bdac4150ba894376747d367a3a54bf00754123c3/attention-sampling-0.1.tar.gz" } ], "0.1.1": [ { "comment_text": "", "digests": { "md5": "b5f932f53b680a8ff49a5cef326594de", "sha256": "30b7590af2546cf4dff68ab1125f9ef54a1953a52abba542a1e43a3ef98d6d13" }, "downloads": -1, "filename": "attention-sampling-0.1.1.tar.gz", "has_sig": false, "md5_digest": "b5f932f53b680a8ff49a5cef326594de", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 23548, "upload_time": "2019-07-22T14:45:14", "url": "https://files.pythonhosted.org/packages/47/9f/49568dab359a58122ed6428584f3704fbeed48790adbede223c529f1a7ce/attention-sampling-0.1.1.tar.gz" } ], "0.1.2": [ { "comment_text": "", "digests": { "md5": "e62de2c3b404d961e532cd8d56ccde91", "sha256": "cbc20f6eccda9e47d5df07d35c765c5ad1c913c24eaae6773a564463fc65bb81" }, "downloads": -1, "filename": "attention-sampling-0.1.2.tar.gz", "has_sig": false, "md5_digest": "e62de2c3b404d961e532cd8d56ccde91", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 24691, "upload_time": "2019-07-22T14:55:07", "url": "https://files.pythonhosted.org/packages/ca/64/2691fc1b7fdc6e3515eb02645c995bfe8dfac6905085c66eb24a4e461951/attention-sampling-0.1.2.tar.gz" } ], "0.2": [ { "comment_text": "", "digests": { "md5": "f8f72846f254a70474687cd7641b9f96", "sha256": "7e2df98c2e05532799316c6a9bca5311960c9c404176f7bd98c8e971c2994321" }, "downloads": -1, "filename": "attention-sampling-0.2.tar.gz", "has_sig": false, "md5_digest": "f8f72846f254a70474687cd7641b9f96", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 24636, "upload_time": "2019-07-22T20:17:17", "url": "https://files.pythonhosted.org/packages/ff/2d/4474d1f516865eb83419c67045d885adc03266f97858fa8eeb14117c709d/attention-sampling-0.2.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "f8f72846f254a70474687cd7641b9f96", "sha256": "7e2df98c2e05532799316c6a9bca5311960c9c404176f7bd98c8e971c2994321" }, "downloads": -1, "filename": "attention-sampling-0.2.tar.gz", "has_sig": false, "md5_digest": "f8f72846f254a70474687cd7641b9f96", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 24636, "upload_time": "2019-07-22T20:17:17", "url": "https://files.pythonhosted.org/packages/ff/2d/4474d1f516865eb83419c67045d885adc03266f97858fa8eeb14117c709d/attention-sampling-0.2.tar.gz" } ] }