{ "info": { "author": "CyberZHG", "author_email": "CyberZHG@gmail.com", "bugtrack_url": null, "classifiers": [ "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3.6" ], "description": "# Keras Self-Attention\n\n[![Travis](https://travis-ci.org/CyberZHG/keras-self-attention.svg)](https://travis-ci.org/CyberZHG/keras-self-attention)\n[![Coverage](https://coveralls.io/repos/github/CyberZHG/keras-self-attention/badge.svg?branch=master)](https://coveralls.io/github/CyberZHG/keras-self-attention)\n[![PyPI](https://img.shields.io/pypi/pyversions/keras-self-attention.svg)](https://pypi.org/project/keras-self-attention/)\n[![Codacy Badge](https://api.codacy.com/project/badge/Grade/5a99d0419bec42cfb73c4af06d746c8a)](https://www.codacy.com/project/CyberZHG/keras-self-attention/dashboard?utm_source=github.com&utm_medium=referral&utm_content=CyberZHG/keras-self-attention&utm_campaign=Badge_Grade_Dashboard)\n\nAttention mechanism for processing sequential data that considers the context for each timestamp.\n\n* ![](https://user-images.githubusercontent.com/853842/44248592-1fbd0500-a21e-11e8-9fe0-52a1e4a48329.gif)\n* ![](https://user-images.githubusercontent.com/853842/44248591-1e8bd800-a21e-11e8-9ca8-9198c2725108.gif)\n* ![](https://user-images.githubusercontent.com/853842/44248590-1df34180-a21e-11e8-8ff1-268217f466ba.gif)\n* ![](https://user-images.githubusercontent.com/853842/44249018-8ba06d00-a220-11e8-80e3-802677b658ed.gif)\n\n## Install\n\n```bash\npip install keras-self-attention\n```\n\n## Usage\n\n### Basic\n\nBy default, the attention layer uses additive attention and considers the whole context while calculating the relevance. 
The following code creates an attention layer that follows the equations in the first section (`attention_activation` is the activation function of `e_{t, t'}`):\n\n```python\nimport keras\nfrom keras_self_attention import SeqSelfAttention\n\n\nmodel = keras.models.Sequential()\nmodel.add(keras.layers.Embedding(input_dim=10000,\n output_dim=300,\n mask_zero=True))\nmodel.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128,\n return_sequences=True)))\nmodel.add(SeqSelfAttention(attention_activation='sigmoid'))\nmodel.add(keras.layers.Dense(units=5, activation='softmax'))\nmodel.compile(\n optimizer='adam',\n loss='categorical_crossentropy',\n metrics=['categorical_accuracy'],\n)\nmodel.summary()\n```\n\n### Local Attention\n\nThe global context may be too broad for a single piece of data. The parameter `attention_width` controls the width of the local context:\n\n```python\nfrom keras_self_attention import SeqSelfAttention\n\nSeqSelfAttention(\n attention_width=15,\n attention_activation='sigmoid',\n name='Attention',\n)\n```\n\n### Multiplicative Attention\n\nYou can use multiplicative attention by setting `attention_type`:\n\n![](https://user-images.githubusercontent.com/853842/44253887-a03a3080-a233-11e8-9d49-3fd7e622a0f7.gif)\n\n```python\nimport keras\nfrom keras_self_attention import SeqSelfAttention\n\nSeqSelfAttention(\n attention_width=15,\n attention_type=SeqSelfAttention.ATTENTION_TYPE_MUL,\n attention_activation=None,\n kernel_regularizer=keras.regularizers.l2(1e-6),\n use_attention_bias=False,\n name='Attention',\n)\n```\n\n### Regularizer\n\n![](https://user-images.githubusercontent.com/853842/44250188-f99b6300-a225-11e8-8fab-8dcf0d99616e.gif)\n\nTo use the regularizer, set `attention_regularizer_weight` to a positive number:\n\n```python\nimport keras\nfrom keras_self_attention import SeqSelfAttention\n\ninputs = keras.layers.Input(shape=(None,))\nembd = keras.layers.Embedding(input_dim=32,\n output_dim=16,\n mask_zero=True)(inputs)\nlstm = 
keras.layers.Bidirectional(keras.layers.LSTM(units=16,\n return_sequences=True))(embd)\natt = SeqSelfAttention(attention_type=SeqSelfAttention.ATTENTION_TYPE_MUL,\n kernel_regularizer=keras.regularizers.l2(1e-4),\n bias_regularizer=keras.regularizers.l1(1e-4),\n attention_regularizer_weight=1e-4,\n name='Attention')(lstm)\ndense = keras.layers.Dense(units=5, activation='softmax', name='Dense')(att)\nmodel = keras.models.Model(inputs=inputs, outputs=[dense])\nmodel.compile(\n optimizer='adam',\n loss={'Dense': 'sparse_categorical_crossentropy'},\n metrics={'Dense': 'sparse_categorical_accuracy'},\n)\nmodel.summary(line_length=100)\n```\n\n### Load the Model\n\nMake sure to add `SeqSelfAttention` to custom objects:\n\n```python\nimport keras\nfrom keras_self_attention import SeqSelfAttention\n\nkeras.models.load_model(model_path, custom_objects=SeqSelfAttention.get_custom_objects())\n```\n\n### History Only\n\nSet `history_only` to `True` when only historical data can be used:\n\n```python\nSeqSelfAttention(\n attention_width=3,\n history_only=True,\n name='Attention',\n)\n```\n\n### Multi-Head\n\nPlease refer to [keras-multi-head](https://github.com/CyberZHG/keras-multi-head).", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/CyberZHG/keras-self-attention", "keywords": "", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "keras-self-attention", "package_url": "https://pypi.org/project/keras-self-attention/", "platform": "", "project_url": "https://pypi.org/project/keras-self-attention/", "project_urls": { "Homepage": "https://github.com/CyberZHG/keras-self-attention" }, "release_url": "https://pypi.org/project/keras-self-attention/0.42.0/", "requires_dist": null, "requires_python": "", "summary": "Attention mechanism for processing sequential data that considers the context for each timestamp", "version": "0.42.0" }, "last_serial": 5792056, "releases": { "0.0.10": [ { "comment_text": "", 
"digests": { "md5": "251f5167df1c20296c461d0f6b6b8e02", "sha256": "b1e853a2fce372a7410d0f83150f8a4d80bd11bcdb8e4b32377fcb1a046fd1d4" }, "downloads": -1, "filename": "keras-self-attention-0.0.10.tar.gz", "has_sig": false, "md5_digest": "251f5167df1c20296c461d0f6b6b8e02", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3493, "upload_time": "2018-08-16T09:19:02", "url": "https://files.pythonhosted.org/packages/6e/ae/4b84b483fec410604414d138d39b770fa3d8d7249010eda670bfbc114c03/keras-self-attention-0.0.10.tar.gz" } ], "0.0.11": [ { "comment_text": "", "digests": { "md5": "47ef106a58bc96e6aff55f91dd1894c9", "sha256": "765ce94dd2c748420ce834d2a8cbc7b14214f0a94525aedc5dccb5973f4ece49" }, "downloads": -1, "filename": "keras-self-attention-0.0.11.tar.gz", "has_sig": false, "md5_digest": "47ef106a58bc96e6aff55f91dd1894c9", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3843, "upload_time": "2018-08-16T10:25:19", "url": "https://files.pythonhosted.org/packages/33/3c/18a76ecb79c9ed85cb9120c6bb7226a1ad96829bf73779c0b2b1d6a87d92/keras-self-attention-0.0.11.tar.gz" } ], "0.0.12": [ { "comment_text": "", "digests": { "md5": "609cb773eb2d30b08c4b54134283e1eb", "sha256": "c2c5a07ee9a534698a63f9e3f5d73d1f88a9bb5d5a7b973b1d9388eb854a1b7e" }, "downloads": -1, "filename": "keras-self-attention-0.0.12.tar.gz", "has_sig": false, "md5_digest": "609cb773eb2d30b08c4b54134283e1eb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 4195, "upload_time": "2018-08-17T05:18:52", "url": "https://files.pythonhosted.org/packages/b6/61/be0efde53072c21c88c70514583cb35c34d0fdb157e6d3d33d42b67beff2/keras-self-attention-0.0.12.tar.gz" } ], "0.0.13": [ { "comment_text": "", "digests": { "md5": "efb1f55355cff5ec2238920e04405f19", "sha256": "6ec3be5e1164567233dc30732f05efe22c1efd6865e802a7197616669f714d45" }, "downloads": -1, "filename": "keras-self-attention-0.0.13.tar.gz", "has_sig": false, 
"md5_digest": "efb1f55355cff5ec2238920e04405f19", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5109, "upload_time": "2018-08-17T06:35:35", "url": "https://files.pythonhosted.org/packages/13/ea/f455dcbe43ba091f336e9d3d340da67e560b88dc3bd8dea4c9bd4cdf0348/keras-self-attention-0.0.13.tar.gz" } ], "0.0.14": [ { "comment_text": "", "digests": { "md5": "3b361c105b8c26a234247e59edd575a5", "sha256": "b1cf7bd1172b7e186a48e0e6f54165f6a237802e17f6309b789431f946231835" }, "downloads": -1, "filename": "keras-self-attention-0.0.14.tar.gz", "has_sig": false, "md5_digest": "3b361c105b8c26a234247e59edd575a5", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5145, "upload_time": "2018-08-17T07:40:21", "url": "https://files.pythonhosted.org/packages/11/81/3cddb03574113b22904288439c75bd7afda04fdabe486486b283e983612d/keras-self-attention-0.0.14.tar.gz" } ], "0.0.15": [ { "comment_text": "", "digests": { "md5": "e4cbffdd292479271916d328b5b9fe54", "sha256": "2f60a4579d66206959ac96acb52a771372c446a103d6534132813e5dc78231ad" }, "downloads": -1, "filename": "keras-self-attention-0.0.15.tar.gz", "has_sig": false, "md5_digest": "e4cbffdd292479271916d328b5b9fe54", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5143, "upload_time": "2018-08-17T07:43:16", "url": "https://files.pythonhosted.org/packages/44/bc/495c15f3325921d6f24e64b10ce832346ec76ca69df5fa385aea7138300c/keras-self-attention-0.0.15.tar.gz" } ], "0.0.16": [ { "comment_text": "", "digests": { "md5": "6cdc1ef7144c966ff38f67bc55345c6a", "sha256": "e6c3a51fe29b88477454e233d1acc1cd6a22fc58bc3583fcfe14e3787590e584" }, "downloads": -1, "filename": "keras-self-attention-0.0.16.tar.gz", "has_sig": false, "md5_digest": "6cdc1ef7144c966ff38f67bc55345c6a", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5209, "upload_time": "2018-08-17T09:32:42", "url": 
"https://files.pythonhosted.org/packages/77/44/03ddfa115da1db2875526c47cb1f48a4785fcc484248e16730fab06678ae/keras-self-attention-0.0.16.tar.gz" } ], "0.0.17": [ { "comment_text": "", "digests": { "md5": "06744eacc3d2f0cdc26845bd862c2d64", "sha256": "0c2c40b77102ce9c68b4cde14c66f78f6fdb93eb2a91ab06acadc5268c2aadb0" }, "downloads": -1, "filename": "keras-self-attention-0.0.17.tar.gz", "has_sig": false, "md5_digest": "06744eacc3d2f0cdc26845bd862c2d64", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5166, "upload_time": "2018-08-20T01:44:57", "url": "https://files.pythonhosted.org/packages/80/70/799461e149104506bcae2dc469c18c608e8952440734888c58f112262eee/keras-self-attention-0.0.17.tar.gz" } ], "0.0.18": [ { "comment_text": "", "digests": { "md5": "d449d781aaefb71cbe75cfb0d85b19e8", "sha256": "f2cac3d2766e3ca2f28b687abcf06e382777bb8945669d9d128aae408bcdb105" }, "downloads": -1, "filename": "keras-self-attention-0.0.18.tar.gz", "has_sig": false, "md5_digest": "d449d781aaefb71cbe75cfb0d85b19e8", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5292, "upload_time": "2018-08-30T09:09:57", "url": "https://files.pythonhosted.org/packages/8b/f4/30fc144b94da39e8c68626bb851e86bc60ea1d96c937e80351f410915ef5/keras-self-attention-0.0.18.tar.gz" } ], "0.0.19": [ { "comment_text": "", "digests": { "md5": "21becd48ff485b5ae0fd07dff083e18e", "sha256": "72aef482c37f172f7b383044a01040764d80d5f05685cfd61726daacb7847535" }, "downloads": -1, "filename": "keras-self-attention-0.0.19.tar.gz", "has_sig": false, "md5_digest": "21becd48ff485b5ae0fd07dff083e18e", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5311, "upload_time": "2018-08-30T09:26:16", "url": "https://files.pythonhosted.org/packages/16/22/48456f14d6cda6d1bb6a89a9c98f99d43338f5d9a288b4d40ba736568bcd/keras-self-attention-0.0.19.tar.gz" } ], "0.0.20": [ { "comment_text": "", "digests": { "md5": 
"addc37e3c160fd19254c16cc62111449", "sha256": "cd356be8a9e2648c8ab9ce80eac266bfb9fba2ecb76a20c7dc9ab9fed3b00679" }, "downloads": -1, "filename": "keras-self-attention-0.0.20.tar.gz", "has_sig": false, "md5_digest": "addc37e3c160fd19254c16cc62111449", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5308, "upload_time": "2018-08-31T07:18:30", "url": "https://files.pythonhosted.org/packages/73/b5/0f15be9b287c30bd2de65b47d3b6deaf23229d4309eda32d06f580f82061/keras-self-attention-0.0.20.tar.gz" } ], "0.0.21": [ { "comment_text": "", "digests": { "md5": "80a26fd3fb6feab7b6f2350da84f4bd2", "sha256": "03357507d7ab5830e5d3af91ee91ec29b2379a84abaf3f3c9290dd3144b804c4" }, "downloads": -1, "filename": "keras-self-attention-0.0.21.tar.gz", "has_sig": false, "md5_digest": "80a26fd3fb6feab7b6f2350da84f4bd2", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 5466, "upload_time": "2018-09-06T09:23:00", "url": "https://files.pythonhosted.org/packages/61/f4/1a7b8745944f33477846ef8ad4e634c635e71e7a0a4bf66fe3e0185c26d0/keras-self-attention-0.0.21.tar.gz" } ], "0.0.8": [ { "comment_text": "", "digests": { "md5": "b1e572fdc146b93f025b50243caba91f", "sha256": "dc1d48a791b2fa0edd15e8774df044601cb635226ff5b30845fd597db2fdcf4f" }, "downloads": -1, "filename": "keras-self-attention-0.0.8.tar.gz", "has_sig": false, "md5_digest": "b1e572fdc146b93f025b50243caba91f", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3437, "upload_time": "2018-08-16T08:14:04", "url": "https://files.pythonhosted.org/packages/4e/95/dd3303f7fb0e3c46f098f42d181e8b1b2d649d1c860219a2f1ff33f805f2/keras-self-attention-0.0.8.tar.gz" } ], "0.0.9": [ { "comment_text": "", "digests": { "md5": "6a41e12775972bf3479e682439fe20ee", "sha256": "6421b965c26b68f836d666ee825e5d91c8f8ef796e2df6175e5f7bca7c87530d" }, "downloads": -1, "filename": "keras-self-attention-0.0.9.tar.gz", "has_sig": false, "md5_digest": 
"6a41e12775972bf3479e682439fe20ee", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3430, "upload_time": "2018-08-16T08:16:21", "url": "https://files.pythonhosted.org/packages/95/0f/992c377ee1df1c4022c24184145cb3b708ffa465e0e9cd40ffbd22840a6e/keras-self-attention-0.0.9.tar.gz" } ], "0.30.0": [ { "comment_text": "", "digests": { "md5": "fd993d92b93516bf8f595928ba57ea62", "sha256": "758160551f8b48d553c3f0ffea6e8216246972ab8b7addc147f84f822d77d70f" }, "downloads": -1, "filename": "keras-self-attention-0.30.0.tar.gz", "has_sig": false, "md5_digest": "fd993d92b93516bf8f595928ba57ea62", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 6586, "upload_time": "2018-11-06T09:09:29", "url": "https://files.pythonhosted.org/packages/4f/e7/cca1975a2f4a43c874ccb02dad3d230caa34b46f8b2d3ca3fb84b4551d29/keras-self-attention-0.30.0.tar.gz" } ], "0.31.0": [ { "comment_text": "", "digests": { "md5": "e43000dd4027adba0d22009c6d867225", "sha256": "1490f443d20137f10c10757206647429f232453ebe78f932feeba38a6bff0ed1" }, "downloads": -1, "filename": "keras-self-attention-0.31.0.tar.gz", "has_sig": false, "md5_digest": "e43000dd4027adba0d22009c6d867225", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 6591, "upload_time": "2018-11-12T06:38:16", "url": "https://files.pythonhosted.org/packages/ea/10/3959f1a19511be96b4198df9b3b812385377bb0b5d1a7c8c39c545c6f665/keras-self-attention-0.31.0.tar.gz" } ], "0.32.0": [ { "comment_text": "", "digests": { "md5": "4d60a8cd467db9390278214a3202c088", "sha256": "d6ab68924245df2b2cc865fc6894250df9047cb9315dbc38fcae874fae6187b7" }, "downloads": -1, "filename": "keras-self-attention-0.32.0.tar.gz", "has_sig": false, "md5_digest": "4d60a8cd467db9390278214a3202c088", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 6639, "upload_time": "2018-11-26T06:01:12", "url": 
"https://files.pythonhosted.org/packages/b7/8f/92d1889a9319e50a8edea6446c218ee324d776b152e39ac49d43bdcb9ec6/keras-self-attention-0.32.0.tar.gz" } ], "0.33.0": [ { "comment_text": "", "digests": { "md5": "ed2a3a3a5b9fcd6662bffcfa35d7aa2d", "sha256": "73e4214e4e9ecf66d8fcc383a375e3432c7ce0dd95b9ae0b5cb5c0c9e937dc7b" }, "downloads": -1, "filename": "keras-self-attention-0.33.0.tar.gz", "has_sig": false, "md5_digest": "ed2a3a3a5b9fcd6662bffcfa35d7aa2d", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 6644, "upload_time": "2019-01-31T05:49:54", "url": "https://files.pythonhosted.org/packages/cf/43/439a625c26915181f46e3375fb079fd7c27eda7a93d977beec49307e2e93/keras-self-attention-0.33.0.tar.gz" } ], "0.34.0": [ { "comment_text": "", "digests": { "md5": "6b93c545a28eb80996a0bdf3c2a8bfa7", "sha256": "a21f6a3f63c9f6df24f2085667c345edcddd2c0fc82911b94973054c688b665f" }, "downloads": -1, "filename": "keras-self-attention-0.34.0.tar.gz", "has_sig": false, "md5_digest": "6b93c545a28eb80996a0bdf3c2a8bfa7", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11985, "upload_time": "2019-02-01T03:03:54", "url": "https://files.pythonhosted.org/packages/8e/c7/ab6b511363ca59d0c1f655ae3bfd08c638c66b4f480f41a345b17164e840/keras-self-attention-0.34.0.tar.gz" } ], "0.35.0": [ { "comment_text": "", "digests": { "md5": "daf0a8022d65b9b495e168519b057984", "sha256": "6cee52c1123047fa5ff420c7210b97eafab63f9cbd887931d0d9d0c606c4c358" }, "downloads": -1, "filename": "keras-self-attention-0.35.0.tar.gz", "has_sig": false, "md5_digest": "daf0a8022d65b9b495e168519b057984", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11980, "upload_time": "2019-03-11T06:07:08", "url": "https://files.pythonhosted.org/packages/1f/b7/93e762cd4db6cd444ff96f64d1f6ba4410358337aeafaf47972b7687f96a/keras-self-attention-0.35.0.tar.gz" } ], "0.36.0": [ { "comment_text": "", "digests": { "md5": 
"b6e021b60d764e8df706dd4c22450dc7", "sha256": "0379ca565356e3dc022cf0edeb9dd55c283eb7d3ffea1fc1abf52161fdd7a8f2" }, "downloads": -1, "filename": "keras-self-attention-0.36.0.tar.gz", "has_sig": false, "md5_digest": "b6e021b60d764e8df706dd4c22450dc7", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11995, "upload_time": "2019-04-01T07:05:07", "url": "https://files.pythonhosted.org/packages/6a/c1/26fab70aba688fc40aa231fb475f958ff57db67ca8050eb9c2c8fe05fe72/keras-self-attention-0.36.0.tar.gz" } ], "0.37.0": [ { "comment_text": "", "digests": { "md5": "d45f9adb9c6ffc7b1519ff204c6729d2", "sha256": "0685638afe3ec5c262c84596e8c71a4228cff373ecb787b641dfb7cbe4d991cc" }, "downloads": -1, "filename": "keras-self-attention-0.37.0.tar.gz", "has_sig": false, "md5_digest": "d45f9adb9c6ffc7b1519ff204c6729d2", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 12003, "upload_time": "2019-04-16T07:01:37", "url": "https://files.pythonhosted.org/packages/4a/aa/50a37e8fb2224c71537db21b9fc8b4b4b06493fe1786744eec37bf3a3265/keras-self-attention-0.37.0.tar.gz" } ], "0.38.0": [ { "comment_text": "", "digests": { "md5": "1c3bcc03984f43cb8c6c7bae8e557e39", "sha256": "b75205cec5d955dd3e4b2d837caa15c63f8ff78833ae02bab58b5e8dbdbb1588" }, "downloads": -1, "filename": "keras-self-attention-0.38.0.tar.gz", "has_sig": false, "md5_digest": "1c3bcc03984f43cb8c6c7bae8e557e39", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 12008, "upload_time": "2019-04-16T07:27:43", "url": "https://files.pythonhosted.org/packages/6d/b4/30b245cb0eeef78d480b2412e2fc8b24b9eaf925c395c52508bddcbded04/keras-self-attention-0.38.0.tar.gz" } ], "0.39.0": [ { "comment_text": "", "digests": { "md5": "bef1ba407bde11cefb6d313e0a4635fb", "sha256": "66353811481b2a68fefca72c5d6e1f99b034dd2d97c128d3454565ae0bd8a6ae" }, "downloads": -1, "filename": "keras-self-attention-0.39.0.tar.gz", "has_sig": false, "md5_digest": 
"bef1ba407bde11cefb6d313e0a4635fb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11539, "upload_time": "2019-04-16T08:26:53", "url": "https://files.pythonhosted.org/packages/91/70/51150779d5bbd1488a30c62026b141073873faf81eac7a62c6460cb5efe0/keras-self-attention-0.39.0.tar.gz" } ], "0.40.0": [ { "comment_text": "", "digests": { "md5": "d0632949ddc952d2216b281adb5e02f4", "sha256": "352983212ba9c7893a56487c7f53c460a24be31941123acfa3b5597f2cf771aa" }, "downloads": -1, "filename": "keras-self-attention-0.40.0.tar.gz", "has_sig": false, "md5_digest": "d0632949ddc952d2216b281adb5e02f4", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11464, "upload_time": "2019-04-24T10:22:13", "url": "https://files.pythonhosted.org/packages/a1/4b/20980621869bc2b1a06b3add3add81e4334876d35db6a6f1ecfe1c812ca6/keras-self-attention-0.40.0.tar.gz" } ], "0.41.0": [ { "comment_text": "", "digests": { "md5": "dfcc849113d79b38f4380f2359a93150", "sha256": "f49751f7bf57407cd66d7c3c3eaf32af62dc5cc1126f5a2087659fbfeb24efbb" }, "downloads": -1, "filename": "keras-self-attention-0.41.0.tar.gz", "has_sig": false, "md5_digest": "dfcc849113d79b38f4380f2359a93150", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 9332, "upload_time": "2019-05-11T09:07:33", "url": "https://files.pythonhosted.org/packages/1b/1c/01599219bef7266fa43b3316e4f55bcb487734d3bafdc60ffd564f3cfe29/keras-self-attention-0.41.0.tar.gz" } ], "0.42.0": [ { "comment_text": "", "digests": { "md5": "35f986dbabe51bb446a3e6be73a55f83", "sha256": "e602c19203acb133eab05a5ff0b62b3110c4a18b14c33bfe5ab4a199f6acc3a6" }, "downloads": -1, "filename": "keras-self-attention-0.42.0.tar.gz", "has_sig": false, "md5_digest": "35f986dbabe51bb446a3e6be73a55f83", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 9329, "upload_time": "2019-09-06T12:38:02", "url": 
"https://files.pythonhosted.org/packages/44/3e/eb1a7c7545eede073ceda2f5d78442b6cad33b5b750d7f0742866907c34b/keras-self-attention-0.42.0.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "35f986dbabe51bb446a3e6be73a55f83", "sha256": "e602c19203acb133eab05a5ff0b62b3110c4a18b14c33bfe5ab4a199f6acc3a6" }, "downloads": -1, "filename": "keras-self-attention-0.42.0.tar.gz", "has_sig": false, "md5_digest": "35f986dbabe51bb446a3e6be73a55f83", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 9329, "upload_time": "2019-09-06T12:38:02", "url": "https://files.pythonhosted.org/packages/44/3e/eb1a7c7545eede073ceda2f5d78442b6cad33b5b750d7f0742866907c34b/keras-self-attention-0.42.0.tar.gz" } ] }