{ "info": { "author": "John Towne", "author_email": "towne.john@gmail.com", "bugtrack_url": null, "classifiers": [ "License :: OSI Approved :: BSD License", "Operating System :: OS Independent", "Programming Language :: Python :: 3" ], "description": "# Swarm-Simplex-Bootstrap\nSwarm-Simplex-Bootstrap is a python implementation of the Particle Swarm and Nelder-Mead Simplex minimization algorithms. Both algorithms make few assumptions about the function to be minimized (such as continuity or differentiability) so they are applicable to a wide variety of problems. Bounds, linear constraints, and nonlinear constraints are supported.\n\nThe emphasis of this library is paramterizing models using experimental data where the model parameters can be subject to bounds and constraints. Model parameterization is carried out by minimizing the Least Squares objective function and parameter uncertainty is estimated by Bootstrapping.\n\n### Table of Contents\n+ Installation\n+ Testing\n+ Project Structure, Versioning, and Documentation\n+ General Information\n+ Test Functions for Minimization\n+ Unconstrained Minimization with Nelder-Mead\n+ Bounds\n+ Bounded Minimization with Nelder-Mead\n+ Particle Swarm Minimization\n+ Particle Swarm Followed by Nelder-Mead Refinement\n+ Constraints\n+ Bounded and Constrained Minimization\n+ Model Regression\n+ Bootstrapping\n\n### Installation\nssb_optimize can be installed as a python package using pip. Dependencies include numpy, itertools, numbers, and multiprocessing. \n```console\npython -m pip install numpy ssb_optimize\n```\nssb_optimize was developed in Python 3.6 so I suspect the package will work with any Python 3 installation. That being said, it hasn't been tested with any other versions of Python (if somebody would like to help with this, please let me know). \n\n\n### Testing\nTo run unit tests, open a shell environment and run the following command in the top-level directory.\n```console\npython -m unittest discover -v\n```\n\n### Project Structure, Versioning, and Documentation\nProject structure follows [python-guide.org recommendations](https://docs.python-guide.org/writing/structure/). \n\nDocstring format follows [Google style recoomendations](https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html). \n\nVersioning for publishing to PyPI follows the \"major.minor.patch\" format based on [https://semver.org/ recommendations](https://semver.org/).\n+ major version - when you make incompatible API changes,\n+ minor version - when you add functionality in a backwards-compatible manner, and\n+ patch version - when you make backwards-compatible bug fixes.\n\nThe [Markdown cheat sheet](https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet) is a useful reference for keeping documentation up to date.\n\n### General Information\nStart off by importing the 'optimizer' module. 
\n```python\n# If installed from PyPI, the module may need to be imported from the package instead:\n# from ssb_optimize import optimizer as opt\nimport optimizer as opt\n```\nFunctions contained in the 'optimizer' module are summarized below.\n+ **bounds_check**: Check a bounds list of size 'n' for consistency and return the list with basic problems corrected.\n+ **constraints_check**: Check a constraints list for consistency and return the list with basic problems corrected.\n+ **feasible_points_random**: Generate an array of points, using random sampling, which satisfy the bounds and constraints.\n+ **best_point**: Return the point corresponding to the lowest evaluated value of 'func'.\n+ **nelder_mead**: Minimize a scalar function using the Nelder-Mead simplex algorithm.\n+ **particle_swarm**: Minimize a scalar function using the Particle Swarm algorithm.\n+ **least_squares_objective_function**: Return the value of the least squares objective function.\n+ **least_squares_bootstrap**: Return a list of the results of repeated least squares fitting of 'func' to random samples taken from 'x' and 'fx'.\n\nThe docstring for each function contains additional information about how the function works, its arguments, return values, and error handling.\n```python\nprint(function.__doc__)\nhelp(function)\n```\n\nAll examples shown in this tutorial are provided in the 'optimizer.py' file for reference.\n\n### Test Functions for Minimization\nTest functions used in this tutorial are detailed in the [Test functions for optimization](https://en.wikipedia.org/wiki/Test_functions_for_optimization) Wikipedia page. These test functions are used to demonstrate the performance of minimization algorithms in situations relevant to real-world applications.\n\nAdditional information was taken from the following reference.\n\nJamil, M., & Yang, X. S. (2013). A literature survey of benchmark functions for global optimisation problems. International Journal of Mathematical Modelling and Numerical Optimisation, 4(2), 150. https://doi.org/10.1504/ijmmno.2013.055204\n\n##### Booth Function\nThe Booth function is continuous, differentiable, non-separable, non-scalable, and unimodal. It has a smooth approach to the global minimum, which mimics the behavior of many functions near a local or global minimum.\n\n![Booth Function](https://upload.wikimedia.org/wikipedia/commons/thumb/6/6e/Booth%27s_function.pdf/page1-320px-Booth%27s_function.pdf.jpg \"Booth Function\")\n\nPython implementation of the Booth function:\n```python\ndef booth(args):\n    \"\"\"Booth function\n\n    Global minimum: f(1.0,3.0) = 0.0\n    Search domain: -10.0 <= x, y <= 10.0\n    \"\"\"\n    return (args[0] + 2*args[1] - 7)**2 + (2*args[0] + args[1] - 5)**2\n```\n
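\nThe stated global minimum is easy to verify by direct evaluation (a quick sanity check, not part of the library):\n```python\nprint(booth([1.0, 3.0]))  # (1 + 6 - 7)**2 + (2 + 3 - 5)**2 = 0.0\n```\n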
\n##### Rosenbrock Function\nThe Rosenbrock function is continuous, differentiable, non-separable, scalable, and unimodal. The global minimum is inside a long, narrow, parabolic-shaped flat valley. Finding the valley is trivial, but converging to the global minimum is difficult.\n\n![Rosenbrock Function](https://upload.wikimedia.org/wikipedia/commons/thumb/7/7e/Rosenbrock%27s_function_in_3D.pdf/page1-320px-Rosenbrock%27s_function_in_3D.pdf.jpg \"Rosenbrock Function\")\n\nPython implementation of the Rosenbrock function:\n```python\ndef rosenbrock(args):\n    \"\"\"Rosenbrock function\n\n    Global minimum: f(1,...,1) = 0.0\n    Search domain: -inf <= xi <= inf, 1 <= i <= n\n    \"\"\"\n    rosen = 0\n    for i in range(len(args) - 1):\n        rosen += 10.0*((args[i]**2) - args[i + 1])**2 + (1 - args[i])**2\n    return rosen\n```\n\n##### Ackley Function\nThe Ackley function is continuous, differentiable, non-separable, scalable, and multimodal. It has many local minima but only one global minimum. Many minimization algorithms will become trapped in one of the many local minima during a search for the global minimum.\n\n![Ackley Function](https://upload.wikimedia.org/wikipedia/commons/thumb/9/98/Ackley%27s_function.pdf/page1-320px-Ackley%27s_function.pdf.jpg \"Ackley Function\")\n\nPython implementation of the Ackley function:\n```python\nimport numpy as np\n\ndef ackley(args):\n    \"\"\"Ackley function\n\n    Global minimum: f(0,0) = 0.0\n    Search domain: -5.0 <= x, y <= 5.0\n    \"\"\"\n    first_sum = 0.0\n    second_sum = 0.0\n    for c in args:\n        first_sum += c ** 2.0\n        second_sum += np.cos(2.0 * np.pi * c)\n    n = float(len(args))\n    return -20.0*np.exp(-0.2*np.sqrt(first_sum/n)) - np.exp(second_sum/n) + 20.0 + np.e\n```\n\n### Unconstrained Minimization with Nelder-Mead\n\nThe 'nelder_mead' algorithm is a very general direct search minimization method. Its main weakness on complex problems is that it can prematurely converge to a local minimum in search of the global minimum.\n\n#### Unconstrained Minimization with Nelder-Mead Simplex, Booth function example (successful convergence)\n\nInput:\n```python\ninitial_pt = [2.0, 2.0]\nfunc = booth\nminimum = opt.nelder_mead(initial_pt, func)\nprint(minimum)\n```\nReturn:\n```python\n[1., 3.]\n```\n\n#### Unconstrained Minimization with Nelder-Mead Simplex, Rosenbrock function example (successful convergence)\n\nInput:\n```python\ninitial_pt = [2.0, 2.0]\nfunc = rosenbrock\nminimum = opt.nelder_mead(initial_pt, func)\nprint(minimum)\n```\nReturn:\n```python\n[1., 1.]\n```\n\n#### Unconstrained Minimization with Nelder-Mead Simplex, Ackley function example (successful convergence)\n\nInput:\n```python\ninitial_pt = [0.1, 0.1]\nfunc = ackley\nminimum = opt.nelder_mead(initial_pt, func)\nprint(minimum)\n```\nReturn:\n```python\n[2.39783206e-16, -1.75571593e-16]\n```\n
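\nThe 'nelder_mead' function can also forward extra positional arguments to the objective function through the 'args' keyword (this is used later in the Model Regression section). A minimal sketch, where 'shifted_sphere' is a hypothetical example function defined here for illustration:\n```python\ndef shifted_sphere(args, center):\n    \"\"\"Sum of squared distances from 'center'.\"\"\"\n    return sum((a - c)**2 for a, c in zip(args, center))\n\n# Extra positional arguments are passed to the objective after the point itself.\nminimum = opt.nelder_mead([0.0, 0.0], shifted_sphere, args=([2.0, -1.0],))\nprint(minimum)  # expect approximately [2., -1.]\n```\n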
\n### Bounds\n\nA bound specifies the minimum and maximum values allowed for each of the 'n' dimensions of the problem space. A set of bounds defines a valid region of the problem space. If a particular dimension of the problem space is unbounded (i.e. it extends to infinity or -infinity), then specify 'None' for that element of the bound.\n\nA valid bounds list is a list of bound tuples or bound lists.\n+ [(bound_tuple), ... ,(bound_tuple)]\n+ [[bound_list], ... ,[bound_list]]\n\nBound tuples and lists are defined using the following syntax.\n+ (min, max), (None, max), (min, None), (None, None)\n+ [min, max], [None, max], [min, None], [None, None]\n\nThe 'bounds_check' function checks bounds lists for consistency and returns the list with basic problems corrected.\n\nBounds specification is optional for the 'nelder_mead' algorithm. However, bounds specification is required for the 'particle_swarm' algorithm because bounds are used to generate the initial particle swarm.\n\nInput:\n```python\ninfinite_bounds = [(None, None), (None, None)]\nfinite_bounds = [(-5.0, 5.0), (-5.0, 5.0)]\n\ninf_bounds_size = len(infinite_bounds)\nfin_bounds_size = len(finite_bounds)\n\ninf_bounds_checked = opt.bounds_check(inf_bounds_size, infinite_bounds)\nfin_bounds_checked = opt.bounds_check(fin_bounds_size, finite_bounds)\n\nprint(inf_bounds_checked)\nprint(fin_bounds_checked)\n```\nReturn:\n```python\n[[-inf, inf],\n [-inf, inf]]\n[[-5., 5.],\n [-5., 5.]]\n```\n\n### Bounded Minimization with Nelder-Mead\n\n#### Bounded Minimization with Nelder-Mead Simplex, Booth function example (successful convergence)\n\nInput:\n```python\ninitial_pt = [2.0, 2.0]\nfunc = booth\nminimum = opt.nelder_mead(initial_pt, func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[1., 3.]\n```\n\n#### Bounded Minimization with Nelder-Mead Simplex, Rosenbrock function example (successful convergence)\n\nInput:\n```python\ninitial_pt = [2.0, 2.0]\nfunc = rosenbrock\nminimum = opt.nelder_mead(initial_pt, func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[1., 1.]\n```\n\n#### Bounded Minimization with Nelder-Mead Simplex, Ackley function example (convergence failure)\nThis starting point is far from the global minimum for the Ackley function, which causes the 'nelder_mead' algorithm to become trapped in a local minimum.\n\nInput:\n```python\ninitial_pt = [2.0, 2.0]\nfunc = ackley\nminimum = opt.nelder_mead(initial_pt, func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[1.97445199, 1.97445199]\n```\n\n#### Bounded Minimization with Nelder-Mead Simplex, Ackley function example (successful convergence)\nThis starting point is close to the global minimum for the Ackley function, and there are no local minima between this starting point and the global minimum. The 'nelder_mead' algorithm quickly converges to the global minimum.\n\nInput:\n```python\ninitial_pt = [0.1, 0.1]\nfunc = ackley\nminimum = opt.nelder_mead(initial_pt, func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[ 2.39783206e-16, -1.75571593e-16]\n```\n\n### Particle Swarm Minimization\nThe 'particle_swarm' algorithm is an evolutionary minimization method. Its main strength is that it effectively identifies the global minimum in problem spaces that contain many other local minima. Though the algorithm can identify the global minimum for MOST problems, there is no guarantee that it will identify the global minimum for EVERY problem. Its main weakness is that it is not efficient at converging to a tight estimate of the global minimum (after the neighborhood of the global minimum is identified).
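\n\nFor intuition, the canonical particle update at the heart of most particle swarm implementations looks like the sketch below. This is illustrative only; it is not necessarily how this library implements the algorithm, and the coefficient values shown are typical textbook defaults, not the library's settings.\n```python\nimport numpy as np\n\ndef pso_step(x, v, p_best, g_best, w=0.7, c1=1.5, c2=1.5):\n    \"\"\"One canonical particle swarm update (illustrative sketch).\n\n    x, v: (n_particles, n_dims) arrays of positions and velocities.\n    p_best: each particle's best-known position.\n    g_best: the swarm's best-known position.\n    \"\"\"\n    r1 = np.random.rand(*x.shape)\n    r2 = np.random.rand(*x.shape)\n    # inertia term + pull toward personal best + pull toward swarm best\n    v = w*v + c1*r1*(p_best - x) + c2*r2*(g_best - x)\n    return x + v, v\n```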
\n\nBounds specification is required for this implementation of the 'particle_swarm' algorithm because bounds are used to generate the initial particle swarm.\n\n#### Bounded Minimization with Particle Swarm, Booth function example (successful convergence)\n\nInput:\n```python\nfunc = booth\nminimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[0.99999974, 3.00000191]\n```\n\n#### Bounded Minimization with Particle Swarm, Rosenbrock function example (successful convergence)\n\nInput:\n```python\nfunc = rosenbrock\nminimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[0.99999155, 0.99998029]\n```\n\n#### Bounded Minimization with Particle Swarm, Ackley function example (successful convergence)\n\nInput:\n```python\nfunc = ackley\nminimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds)\nprint(minimum)\n```\nReturn:\n```python\n[-3.59268130e-11, 4.02149815e-10]\n```\n\n### Particle Swarm Followed by Nelder-Mead Refinement\nThe 'particle_swarm' and 'nelder_mead' algorithms can be used together to efficiently minimize complex objective functions. The 'particle_swarm' algorithm is used first to find an estimate for the neighborhood of the global minimum (loose convergence criteria are used). The 'particle_swarm' algorithm also generates an estimate for the initial size of the simplex in the 'nelder_mead' algorithm. The initial estimate of the global minimum and the simplex size are then passed to the 'nelder_mead' algorithm, which converges to a tight estimate of the global minimum. Though this sequential procedure can identify the global minimum for MOST problems, there is no guarantee that it will identify the global minimum for EVERY problem.\n\n#### Bounded Minimization with Combined Procedure, Booth function example (successful convergence)\n\nInput:\n```python\nfunc = booth\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[1., 3.]\n```\n\n#### Bounded Minimization with Combined Procedure, Rosenbrock function example (successful convergence)\n\nInput:\n```python\nfunc = rosenbrock\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[1., 1.]\n```\n\n#### Bounded Minimization with Combined Procedure, Ackley function example (successful convergence)\n\nInput:\n```python\nfunc = ackley\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[-2.48273518e-16, 1.10150570e-15]\n```\n
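\nSince this two-step pattern recurs throughout the rest of the tutorial, it can be convenient to wrap it in a small helper. 'global_minimize' below is a hypothetical convenience function, not part of the library (the 'constraints' keyword is introduced in the next section):\n```python\ndef global_minimize(func, bounds, constraints=None):\n    \"\"\"Particle swarm search followed by Nelder-Mead refinement (hypothetical helper).\"\"\"\n    ps_minimum, initial_size = opt.particle_swarm(func, bounds=bounds, constraints=constraints)\n    return opt.nelder_mead(ps_minimum, func, bounds=bounds, constraints=constraints,\n                           initial_size=initial_size)\n```\n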
\n### Constraints\nConstraints are functional requirements that define valid regions of the problem space. A complete constraint specification must include a constraint function, any optional arguments (args and kwargs), and a constraint type (inequality type). More than one constraint can be specified to define a problem space.\n\nA valid constraints list is a list of constraint dictionaries.\n+ [{const_dict}, ... ,{const_dict}]\n+ {'type': ineq_spec_string, 'func': callable_func, 'args': (args_tuple), 'kwargs': {kwargs_dict}}\n\nThe 'constraints_check' function checks constraints lists for consistency and returns the list with basic problems corrected.\n\nInput:\n```python\ndef cubic(args):\n    \"\"\"Cubic curve\"\"\"\n    return (args[0] - 1.0) ** 3 - args[1] + 1.0\n\n\ndef line(args):\n    \"\"\"Line\"\"\"\n    return args[0] + args[1] - 2.0\n\n\ndef circle(args):\n    \"\"\"Circle\"\"\"\n    return args[0]**2 + args[1]**2 - 2.0\n\n\nconst_a = [{'type': '<=0', 'func': cubic},\n           {'type': '<=0', 'func': line}]\nconst_b = [{'type': '<=0', 'func': circle}]\nconst_a_checked = opt.constraints_check(const_a)\nconst_b_checked = opt.constraints_check(const_b)\nprint(const_a_checked)\nprint(const_b_checked)\n```\nReturn:\n```python\n[{'type': '<=0', 'func': <function cubic at 0x...>, 'args': (), 'kwargs': {}},\n {'type': '<=0', 'func': <function line at 0x...>, 'args': (), 'kwargs': {}}]\n[{'type': '<=0', 'func': <function circle at 0x...>, 'args': (), 'kwargs': {}}]\n```\n\n### Bounded and Constrained Minimization\nIt is straightforward to minimize a function after bounds and constraints have been specified. Both the 'nelder_mead' and 'particle_swarm' algorithms can be used with bounds and constraints. However, the combined procedure ('particle_swarm' followed by 'nelder_mead' refinement) is recommended.\n\n#### Bounded and Constrained Minimization with Combined Procedure, Booth function example with 'const_a' (successful convergence)\nThe minimum of the Booth function subject to bounds and constraints is no longer the global minimum of the unconstrained Booth function. The minimum lies on the edge of the constrained problem space.\n\nInput:\n```python\nfunc = booth\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds, constraints=const_a)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, constraints=const_a,\n                             initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[-0.00337825, 2.00337825]\n```\n\n#### Bounded and Constrained Minimization with Combined Procedure, Booth function example with 'const_b' (successful convergence)\nThe minimum of the Booth function subject to bounds and constraints is no longer the global minimum of the unconstrained Booth function. The minimum lies on the edge of the constrained problem space.\n\nInput:\n```python\nfunc = booth\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds, constraints=const_b)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, constraints=const_b,\n                             initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[0.90574948, 1.08610215]\n```\n
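\nThat this minimum sits on the boundary of the feasible region is easy to verify by plugging the result back into the constraint function (values are approximate):\n```python\nprint(circle([0.90574948, 1.08610215]))  # approximately 0.0, i.e. on the circle x**2 + y**2 = 2\n```\n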
\n#### Bounded and Constrained Minimization with Combined Procedure, Rosenbrock function example with 'const_a' (successful convergence)\nThe minimum of the Rosenbrock function subject to bounds and constraints is still the global minimum of the unconstrained Rosenbrock function. The global minimum lies right on the edge of the constrained problem space.\n\nInput:\n```python\nfunc = rosenbrock\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds, constraints=const_a)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, constraints=const_a,\n                             initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[1., 1.]\n```\n\n#### Bounded and Constrained Minimization with Combined Procedure, Rosenbrock function example with 'const_b' (successful convergence)\nThe minimum of the Rosenbrock function subject to bounds and constraints is still the global minimum of the unconstrained Rosenbrock function. The global minimum lies right on the edge of the constrained problem space.\n\nInput:\n```python\nfunc = rosenbrock\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds, constraints=const_b)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, constraints=const_b,\n                             initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[1., 1.]\n```\n\n#### Bounded and Constrained Minimization with Combined Procedure, Ackley function example with 'const_a' (successful convergence)\n\nInput:\n```python\nfunc = ackley\nps_minimum, nelder_mead_initial_size = opt.particle_swarm(func, bounds=finite_bounds, constraints=const_a)\nnm_minimum = opt.nelder_mead(ps_minimum, func, bounds=finite_bounds, constraints=const_a,\n                             initial_size=nelder_mead_initial_size)\nprint(nm_minimum)\n```\nReturn:\n```python\n[-2.20115198e-15, -1.85442232e-15]\n```\n\n### Model Regression\nThe least squares objective function is the core of regression. This implementation of the least squares objective function facilitates weights as well as bootstrapping. The difference between 'fx' and 'func(x, theta)' is a measure of the goodness of fit. Minimizing this difference by adjusting 'theta' is how 'func' is regressed to fit the data set ('x' and 'fx').\n\n#### Function to be Fit\n```python\ndef quadratic(x, a, b, c):\n    \"\"\"General quadratic function\"\"\"\n    return a*x**2 + b*x + c\n```\n\n#### 'x' and 'fx' Vectors\n```python\n# 'x' values\nx = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6, 1.8, 2.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.1, 1.3, 1.5]\n# 'fx' values\nfx = [0.43, 0.27, 0.59, 0.62, 1.43, 2.04, 2.88, 3.96, 4.9, 6.27, 8.05, 0.3, 0.3, 0.48, 0.98, 1.51, 2.14, 3.1, 4.72]\n```\n\n#### Basic Least Squares Regression\n\nInput:\n```python\nfunc = quadratic\ntheta_initial_guess = [2.1, -0.4, 0.3]\ntheta = opt.nelder_mead(theta_initial_guess, opt.least_squares_objective_function, args=(func, x, fx))\nprint(theta)\n```\nReturn:\n```python\n[2.24925897, -0.67474865, 0.35668829]\n```\n\n#### Weighted Least Squares Regression\nWeights can be added, which allows additional influence to be attached to certain data points. Even weights for each term will yield the same result as unweighted least squares.\n\nInput:\n```python\neven_weight = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]\ntheta = opt.nelder_mead(theta_initial_guess, opt.least_squares_objective_function, args=(func, x, fx, even_weight))\nprint(theta)\n```\nReturn:\n```python\n[2.24925897, -0.67474865, 0.35668829]\n```\n
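\nConceptually, the quantity being minimized in these examples is the weighted sum of squared residuals. The sketch below illustrates the idea; it is not necessarily the library's exact implementation:\n```python\ndef weighted_ssr(theta, func, x, fx, weight=None):\n    \"\"\"Weighted sum of squared residuals (illustrative sketch).\"\"\"\n    if weight is None:\n        weight = [1.0] * len(x)\n    # Each weight scales its data point's contribution to the total.\n    return sum(w * (f - func(xi, *theta))**2 for xi, f, w in zip(x, fx, weight))\n```\nScaling all weights by the same constant changes the objective value but not the location of its minimum, which is why even weights reproduce the unweighted result.\n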
\nUneven weights will emphasize certain terms, which impacts the regression result. Uneven weights are commonly encountered when fitting a model to experimental data where the uncertainty is different for each measurement.\n\nInput:\n```python\nuneven_weight = [0.3, 0.4, 0.5, 0.3, 0.4, 0.5, 0.3, 0.4, 0.5, 0.3, 0.4, 0.5, 0.3, 0.4, 0.5, 0.3, 0.4, 0.5, 0.3]\ntheta = opt.nelder_mead(theta_initial_guess, opt.least_squares_objective_function, args=(func, x, fx, uneven_weight))\nprint(theta)\n```\nReturn:\n```python\n[2.24874079, -0.68614556, 0.36575071]\n```\n\n### Bootstrapping\nThe 'least_squares_bootstrap' function drives repeated evaluation of the 'least_squares_objective_function', where the inputs for each evaluation are sampled from 'x', 'fx', and 'weight' with replacement. The bootstrapping technique uses the results (i.e. fitted model parameters) from each repeated evaluation to derive summary statistics that describe the overall result set; for example, the mean and standard deviation of each column of the result set estimate the corresponding model parameter and its uncertainty.\n\nInput:\n```python\nbootstrap_set = opt.least_squares_bootstrap(theta_initial_guess, func, x, fx,\n                                            weight=None,\n                                            bounds=None, constraints=None,\n                                            multiprocess=False,\n                                            samples=100, max_iter=1000)\nprint(bootstrap_set)\n```\nReturn:\n```python\n[[2.11986049, -0.40819355, 0.33375504]\n [2.31068123, -0.80855933, 0.39918449]\n [2.39058295, -0.95418734, 0.40050302]\n [2.23447189, -0.67165168, 0.35552198]\n [2.26151338, -0.59360129, 0.25014203]\n [2.11059271, -0.41854009, 0.31450327]\n [2.24066416, -0.55588869, 0.28528897]\n [2.37982065, -0.87770428, 0.37037564]\n [2.15489980, -0.60434848, 0.37946051]\n [2.26765001, -0.74801637, 0.42275602]\n [2.36032456, -0.93812904, 0.44158196]\n [2.20161137, -0.54674909, 0.32233909]\n [2.68675659, -1.20049127, 0.41312043]\n [2.23992883, -0.73049307, 0.39298691]\n [2.42285334, -1.02007571, 0.41370413]\n [2.00304294, -0.14472707, 0.11966633]\n [2.04840949, -0.23959399, 0.27271515]\n [1.98548708, -0.24492065, 0.27279987]\n [2.11948575, -0.36589710, 0.22541193]\n [2.30513469, -0.81350200, 0.38391252]\n [2.15453856, -0.53658969, 0.39341044]\n [2.27455759, -0.75397887, 0.34604839]\n [2.29431153, -0.76836397, 0.36190855]\n [2.30590514, -0.81572687, 0.37263515]\n [2.21988265, -0.65713741, 0.36846049]]\n```\n\n\n", "description_content_type": "text/markdown", "docs_url": null, "download_url": "", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/theleftcoast/swarm-simplex-bootstrap", "keywords": "", "license": "", "maintainer": "", "maintainer_email": "", "name": "ssb-optimize", "package_url": "https://pypi.org/project/ssb-optimize/", "platform": "", "project_url": "https://pypi.org/project/ssb-optimize/", "project_urls": { "Homepage": "https://github.com/theleftcoast/swarm-simplex-bootstrap" }, "release_url": "https://pypi.org/project/ssb-optimize/0.1.3/", "requires_dist": [ "numpy" ], "requires_python": ">=3.6", "summary": "Particle Swarm and Nelder-Mead Simplex optimization algorithms with Least Squares Regression and Bootstrap confidence intervals.", "version": "0.1.3", "yanked": false, "yanked_reason": null }, "last_serial": 6041052, "releases": { "0.1.3": [ { "comment_text": "", "digests": { "md5": "0bb60946217e545d9578c6788a592e9e", "sha256": "dbbc04bf9c2accf6d8f334306a0eaa9da25e042bd7205c324025725416c29bc1" }, "downloads": -1, "filename": "ssb_optimize-0.1.3-py3-none-any.whl", "has_sig": false, "md5_digest": "0bb60946217e545d9578c6788a592e9e", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": ">=3.6", "size": 29913, "upload_time": "2019-10-28T12:12:54", 
"upload_time_iso_8601": "2019-10-28T12:12:54.352387Z", "url": "https://files.pythonhosted.org/packages/50/c7/0b96024385016ec9b89f5c2e7a8e9ba02d1bf5f1343f775a39e5ed3ba103/ssb_optimize-0.1.3-py3-none-any.whl", "yanked": false, "yanked_reason": null }, { "comment_text": "", "digests": { "md5": "6449f3a5c1f1273c50c2594e078b0aab", "sha256": "5c0c3c1681d7dc64e3732ccd7d233ba29fb3e0bb88be676a2779a834ed5d3248" }, "downloads": -1, "filename": "ssb_optimize-0.1.3.tar.gz", "has_sig": false, "md5_digest": "6449f3a5c1f1273c50c2594e078b0aab", "packagetype": "sdist", "python_version": "source", "requires_python": ">=3.6", "size": 27858, "upload_time": "2019-10-28T12:12:57", "upload_time_iso_8601": "2019-10-28T12:12:57.442751Z", "url": "https://files.pythonhosted.org/packages/92/74/49015e6c6f3059dbd60a163970a40167f092f12eafacf22a389bc52e81f1/ssb_optimize-0.1.3.tar.gz", "yanked": false, "yanked_reason": null } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "0bb60946217e545d9578c6788a592e9e", "sha256": "dbbc04bf9c2accf6d8f334306a0eaa9da25e042bd7205c324025725416c29bc1" }, "downloads": -1, "filename": "ssb_optimize-0.1.3-py3-none-any.whl", "has_sig": false, "md5_digest": "0bb60946217e545d9578c6788a592e9e", "packagetype": "bdist_wheel", "python_version": "py3", "requires_python": ">=3.6", "size": 29913, "upload_time": "2019-10-28T12:12:54", "upload_time_iso_8601": "2019-10-28T12:12:54.352387Z", "url": "https://files.pythonhosted.org/packages/50/c7/0b96024385016ec9b89f5c2e7a8e9ba02d1bf5f1343f775a39e5ed3ba103/ssb_optimize-0.1.3-py3-none-any.whl", "yanked": false, "yanked_reason": null }, { "comment_text": "", "digests": { "md5": "6449f3a5c1f1273c50c2594e078b0aab", "sha256": "5c0c3c1681d7dc64e3732ccd7d233ba29fb3e0bb88be676a2779a834ed5d3248" }, "downloads": -1, "filename": "ssb_optimize-0.1.3.tar.gz", "has_sig": false, "md5_digest": "6449f3a5c1f1273c50c2594e078b0aab", "packagetype": "sdist", "python_version": "source", "requires_python": ">=3.6", "size": 27858, "upload_time": "2019-10-28T12:12:57", "upload_time_iso_8601": "2019-10-28T12:12:57.442751Z", "url": "https://files.pythonhosted.org/packages/92/74/49015e6c6f3059dbd60a163970a40167f092f12eafacf22a389bc52e81f1/ssb_optimize-0.1.3.tar.gz", "yanked": false, "yanked_reason": null } ], "vulnerabilities": [] }