{ "info": { "author": "Lin Xiao", "author_email": "lin.xiao@gmail.com", "bugtrack_url": null, "classifiers": [], "description": "# Accelerated Bregman Proximal Gradient Methods\n\nAccelerated first-order algorithms for solving relatively-smooth convex optimization problems of the form\n\n    minimize { f(x) + P(x) | x in C }\n\nwith a reference function h(x), where\n\n* h(x) is convex and essentially smooth on C\n* f(x) is convex and differentiable, and L-smooth relative to h(x), that is, f(x)-L*h(x) is convex\n* P(x) is convex and closed (lower semi-continuous)\n* C is a closed convex set\n\n### Implemented algorithms in [HRX2018](https://arxiv.org/abs/1808.03045)\n\n* BPG_LS (Bregman proximal gradient) method with line search\n* ABPG (Accelerated BPG) method\n* ABPG-expo (ABPG with exponent adaptation)\n* ABPG-gain (ABPG with gain adaptation)\n* ABDA (Accelerated Bregman dual averaging) method\n\n## Installation\n\nClone or fork the repository from GitHub, or install from PyPI:\n\n    pip install accbpg\n\n## Usage\n\n    import accbpg\n\n    # generate a random instance of a D-optimal design problem\n    f, h, L, x0 = accbpg.D_opt_design(80, 200)\n\n    # solve the problem instance using BPG with line search\n    x1, F1, G1 = accbpg.BPG_LS(f, h, L, x0, maxitrs=1000, verbskip=100)\n\n    # solve it again using ABPG_gain with gamma=2\n    x2, F2, G2, D2 = accbpg.ABPG_gain(f, h, L, 2, x0, maxitrs=1000, verbskip=100)\n\nCompare the two methods by plotting the objective gap:\n\n    import matplotlib.pyplot as plt\n    Fmin = min(F1.min(), F2.min())\n    plt.semilogy(range(len(F1)), F1-Fmin, range(len(F2)), F2-Fmin)\n    plt.show()\n\n## Examples in [HRX2018](https://arxiv.org/abs/1808.03045)\n\n**D-optimal experiment design**\n \n    import accbpg.ex_D_opt\n\n**Nonnegative regression with KL-divergence**\n\n    import accbpg.ex_KL_regr\n\n**Poisson linear inverse problems**\n\n    import accbpg.ex_PoissonL1\n    import accbpg.ex_PoissonL2", "description_content_type": "", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1,
"last_week": -1 }, "home_page": "https://github.com/Microsoft/accbpg", "keywords": "", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "accbpg", "package_url": "https://pypi.org/project/accbpg/", "platform": "", "project_url": "https://pypi.org/project/accbpg/", "project_urls": { "Homepage": "https://github.com/Microsoft/accbpg" }, "release_url": "https://pypi.org/project/accbpg/0.1/", "requires_dist": null, "requires_python": "", "summary": "Accelerated Bregman proximal gradient (ABPG) methods", "version": "0.1" }, "last_serial": 4162144, "releases": { "0.1": [ { "comment_text": "", "digests": { "md5": "f87201ec84c29d2d6e4a8f7df5f0253d", "sha256": "d3f5bad00d00b142a05eb4dfb3cebf9fb43b0bc14d6d425131f0189c86b49045" }, "downloads": -1, "filename": "accbpg-0.1.tar.gz", "has_sig": false, "md5_digest": "f87201ec84c29d2d6e4a8f7df5f0253d", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11231, "upload_time": "2018-08-12T17:56:43", "url": "https://files.pythonhosted.org/packages/22/87/f20896f715a24d5e6b565d2bae30b30926d7541c9b607d0ff920393baa2e/accbpg-0.1.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "f87201ec84c29d2d6e4a8f7df5f0253d", "sha256": "d3f5bad00d00b142a05eb4dfb3cebf9fb43b0bc14d6d425131f0189c86b49045" }, "downloads": -1, "filename": "accbpg-0.1.tar.gz", "has_sig": false, "md5_digest": "f87201ec84c29d2d6e4a8f7df5f0253d", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 11231, "upload_time": "2018-08-12T17:56:43", "url": "https://files.pythonhosted.org/packages/22/87/f20896f715a24d5e6b565d2bae30b30926d7541c9b607d0ff920393baa2e/accbpg-0.1.tar.gz" } ] }