{ "info": { "author": "Marcello De Bernardi", "author_email": "marcello.debernardi@stcatz.ox.ac.uk", "bugtrack_url": null, "classifiers": [ "Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Topic :: Scientific/Engineering :: Artificial Intelligence" ], "description": "# loss-landscapes\n\n`loss-landscapes` is a PyTorch library for approximating neural network loss functions, and other related metrics,\nin low-dimensional subspaces of the model's parameter space. The library makes it much easier to produce\nvisualizations such as those seen in [Visualizing the Loss Landscape of Neural Nets](https://arxiv.org/abs/1712.09913v3),\naiding the analysis of the geometry of neural network loss landscapes.\n\nThis library does not provide plotting facilities, leaving the user free to decide how the data should be plotted. Other\ndeep learning frameworks are not supported, though a TensorFlow version, `loss-landscapes-tf`, is planned for\na future release.\n\n**NOTE: this library is in early development. Bugs are virtually a certainty, and the API is volatile. Do not use\nthis library in production code. For prototyping and research, always use the newest version of the library.**\n\n\n## 1. What is a Loss Landscape?\nLet `L : Parameters -> Real Numbers` be a loss function, which maps a point in the model parameter space to a\nreal number. For a neural network with `n` parameters, the loss function `L` takes an `n`-dimensional input. We\ncan define the loss landscape as the set of `(n+1)`-dimensional points `(param, L(param))` for all points\n`param` in the parameter space. For example, the image below, reproduced from the paper by Li et al. (2018) linked\nabove, provides a visual representation of what a loss function over a two-dimensional parameter space might look\nlike:\n\n



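The definition above can be sketched directly in Python. This is an illustrative stand-in, not part of the `loss-landscapes` API: the toy quadratic `loss` plays the role of a network's loss function over an `n = 2`-dimensional parameter space, and the landscape is enumerated as the set of `(n+1)`-dimensional points `(param, L(param))` over a grid.

```python
# Hypothetical sketch of the loss-landscape definition (not the library's API).
# A toy quadratic loss stands in for a neural network's loss function L,
# over a model with n = 2 parameters.

def loss(params):
    # Minimum at (w1, w2) = (1.0, -0.5), where the loss is exactly 0
    w1, w2 = params
    return (w1 - 1.0) ** 2 + (w2 + 0.5) ** 2

def landscape(grid):
    # The loss landscape: each 2-dimensional parameter point is mapped to
    # a 3-dimensional point (w1, w2, L(w1, w2))
    return [(w1, w2, loss((w1, w2))) for w1, w2 in grid]

# Sample a 9 x 9 grid over [-2, 2] x [-2, 2] in parameter space
grid = [(i * 0.5, j * 0.5) for i in range(-4, 5) for j in range(-4, 5)]
points = landscape(grid)
```

Plotting the `points` as a surface would produce a figure like the one reproduced from Li et al. (2018): the first two coordinates span the parameter plane, the third is the loss height.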