{ "info": { "author": "Lasagne contributors", "author_email": "lasagne-users@googlegroups.com", "bugtrack_url": null, "classifiers": [ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "Intended Audience :: Science/Research", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Topic :: Scientific/Engineering :: Artificial Intelligence" ], "description": ".. image:: http://img.shields.io/badge/docs-latest-brightgreen.svg\n :target: http://lasagne.readthedocs.org/en/latest/\n\n.. image:: https://travis-ci.org/Lasagne/Lasagne.svg?branch=master\n :target: https://travis-ci.org/Lasagne/Lasagne\n\n.. image:: https://img.shields.io/coveralls/Lasagne/Lasagne.svg\n :target: https://coveralls.io/r/Lasagne/Lasagne\n\n.. image:: https://img.shields.io/badge/license-MIT-blue.svg\n :target: https://github.com/Lasagne/Lasagne/blob/master/LICENSE\n\n.. image:: https://zenodo.org/badge/16974/Lasagne/Lasagne.svg\n :target: https://zenodo.org/badge/latestdoi/16974/Lasagne/Lasagne\n\nLasagne\n=======\n\nLasagne is a lightweight library to build and train neural networks in Theano.\nIts main features are:\n\n* Supports feed-forward networks such as Convolutional Neural Networks (CNNs),\n recurrent networks including Long Short-Term Memory (LSTM), and any\n combination thereof\n* Allows architectures of multiple inputs and multiple outputs, including\n auxiliary classifiers\n* Many optimization methods including Nesterov momentum, RMSprop and ADAM\n* Freely definable cost function and no need to derive gradients due to\n Theano's symbolic differentiation\n* Transparent support of CPUs and GPUs due to Theano's expression compiler\n\nIts design is governed by `six principles\n`_:\n\n* Simplicity: Be easy to use, easy to understand and easy to extend, to\n facilitate use in research\n* Transparency: Do not hide Theano behind abstractions, directly process 
and\n return Theano expressions or Python / numpy data types\n* Modularity: Allow all parts (layers, regularizers, optimizers, ...) to be\n used independently of Lasagne\n* Pragmatism: Make common use cases easy, do not overrate uncommon cases\n* Restraint: Do not obstruct users with features they decide not to use\n* Focus: \"Do one thing and do it well\"\n\n\nInstallation\n------------\n\nIn short, you can install a known compatible version of Theano and the latest\nLasagne development version via:\n\n.. code-block:: bash\n\n pip install -r https://raw.githubusercontent.com/Lasagne/Lasagne/master/requirements.txt\n pip install https://github.com/Lasagne/Lasagne/archive/master.zip\n\nFor more details and alternatives, please see the `Installation instructions\n`_.\n\n\nDocumentation\n-------------\n\nDocumentation is available online: http://lasagne.readthedocs.org/\n\nFor support, please refer to the `lasagne-users mailing list\n`_.\n\n\nExample\n-------\n\n.. code-block:: python\n\n import lasagne\n import theano\n import theano.tensor as T\n\n # create Theano variables for input and target minibatch\n input_var = T.tensor4('X')\n target_var = T.ivector('y')\n\n # create a small convolutional neural network\n from lasagne.nonlinearities import leaky_rectify, softmax\n network = lasagne.layers.InputLayer((None, 3, 32, 32), input_var)\n network = lasagne.layers.Conv2DLayer(network, 64, (3, 3),\n nonlinearity=leaky_rectify)\n network = lasagne.layers.Conv2DLayer(network, 32, (3, 3),\n nonlinearity=leaky_rectify)\n network = lasagne.layers.Pool2DLayer(network, (3, 3), stride=2, mode='max')\n network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),\n 128, nonlinearity=leaky_rectify,\n W=lasagne.init.Orthogonal())\n network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),\n 10, nonlinearity=softmax)\n\n # create loss function\n prediction = lasagne.layers.get_output(network)\n loss = 
lasagne.objectives.categorical_crossentropy(prediction, target_var)\n loss = loss.mean() + 1e-4 * lasagne.regularization.regularize_network_params(\n network, lasagne.regularization.l2)\n\n # create parameter update expressions\n params = lasagne.layers.get_all_params(network, trainable=True)\n updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=0.01,\n momentum=0.9)\n\n # compile training function that updates parameters and returns training loss\n train_fn = theano.function([input_var, target_var], loss, updates=updates)\n\n # train network (assuming you've got some training data in numpy arrays)\n for epoch in range(100):\n loss = 0\n for input_batch, target_batch in training_data:\n loss += train_fn(input_batch, target_batch)\n print(\"Epoch %d: Loss %g\" % (epoch + 1, loss / len(training_data)))\n\n # use trained network for predictions\n test_prediction = lasagne.layers.get_output(network, deterministic=True)\n predict_fn = theano.function([input_var], T.argmax(test_prediction, axis=1))\n print(\"Predicted class for first test input: %r\" % predict_fn(test_data[0]))\n\nFor a fully-functional example, see `examples/mnist.py `_,\nand check the `Tutorial\n`_ for in-depth\nexplanations of the same. 
More examples, code snippets and reproductions of\nrecent research papers are maintained in the separate `Lasagne Recipes\n`_ repository.\n\n\nDevelopment\n-----------\n\nLasagne is a work in progress; input is welcome.\n\nPlease see the `Contribution instructions\n`_ for details\non how you can contribute!\n\n\nChangelog\n---------\n\n0.1 (2015-08-13)\n~~~~~~~~~~~~~~~~\n\nFirst release.\n\n* core contributors, in alphabetical order:\n\n * Eric Battenberg (@ebattenberg)\n * Sander Dieleman (@benanne)\n * Daniel Nouri (@dnouri)\n * Eben Olson (@ebenolson)\n * A\u00e4ron van den Oord (@avdnoord)\n * Colin Raffel (@craffel)\n * Jan Schl\u00fcter (@f0k)\n * S\u00f8ren Kaae S\u00f8nderby (@skaae)\n\n* extra contributors, in chronological order:\n\n * Daniel Maturana (@dimatura): documentation, cuDNN layers, LRN\n * Jonas Degrave (@317070): get_all_param_values() fix\n * Jack Kelly (@JackKelly): help with recurrent layers\n * G\u00e1bor Tak\u00e1cs (@takacsg84): support broadcastable parameters in lasagne.updates\n * Diogo Moitinho de Almeida (@diogo149): MNIST example fixes\n * Brian McFee (@bmcfee): MaxPool2DLayer fix\n * Martin Thoma (@MartinThoma): documentation\n * Jeffrey De Fauw (@JeffreyDF): documentation, ADAM fix\n * Michael Heilman (@mheilman): NonlinearityLayer, lasagne.random\n * Gregory Sanders (@instagibbs): documentation fix\n * Jon Crall (@erotemic): check for non-positive input shapes\n * Hendrik Weideman (@hjweide): set_all_param_values() test, MaxPool2DCCLayer fix\n * Kashif Rasul (@kashif): ADAM simplification\n * Peter de Rivaz (@peterderivaz): documentation fix", "description_content_type": null, "docs_url": null, "download_url": "UNKNOWN", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/Lasagne/Lasagne", "keywords": "", "license": "MIT", "maintainer": null, "maintainer_email": null, "name": "Lasagne", "package_url": "https://pypi.org/project/Lasagne/", "platform": "UNKNOWN", "project_url": 
"https://pypi.org/project/Lasagne/", "project_urls": { "Download": "UNKNOWN", "Homepage": "https://github.com/Lasagne/Lasagne" }, "release_url": "https://pypi.org/project/Lasagne/0.1/", "requires_dist": null, "requires_python": null, "summary": "A lightweight library to build and train neural networks in Theano", "version": "0.1" }, "last_serial": 1676667, "releases": { "0.1": [ { "comment_text": "", "digests": { "md5": "44212b92bf5f3b1be3021fa0b64b5fdb", "sha256": "3c634ecab67e43e4f18520932bfd88bd3c678ec723c48177f18799dab2411233" }, "downloads": -1, "filename": "Lasagne-0.1.tar.gz", "has_sig": false, "md5_digest": "44212b92bf5f3b1be3021fa0b64b5fdb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 125077, "upload_time": "2015-08-13T21:10:53", "url": "https://files.pythonhosted.org/packages/98/bf/4b2336e4dbc8c8859c4dd81b1cff18eef2066b4973a1bd2b0ca2e5435f35/Lasagne-0.1.tar.gz" } ], "0.1dev": [] }, "urls": [ { "comment_text": "", "digests": { "md5": "44212b92bf5f3b1be3021fa0b64b5fdb", "sha256": "3c634ecab67e43e4f18520932bfd88bd3c678ec723c48177f18799dab2411233" }, "downloads": -1, "filename": "Lasagne-0.1.tar.gz", "has_sig": false, "md5_digest": "44212b92bf5f3b1be3021fa0b64b5fdb", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 125077, "upload_time": "2015-08-13T21:10:53", "url": "https://files.pythonhosted.org/packages/98/bf/4b2336e4dbc8c8859c4dd81b1cff18eef2066b4973a1bd2b0ca2e5435f35/Lasagne-0.1.tar.gz" } ] }