This is a record of all past estimagic releases and what went into them, in reverse chronological order. We follow semantic versioning, and all releases are available on



  • #295 Fixes a small bug in estimation_table (@mpetrosian).

  • #286 Adds pytree support for first and second derivative (@timmens).

  • #285 Allows using estimation functions with external optimization (@janosg).

  • #283 Adds fast solvers for quadratic trust-region subproblems (@segsell).

  • #282 Vastly improves estimation tables (@mpetrosian).

  • #281 Adds some tools to work with pytrees (@janosg and @timmens).

  • #278 Adds Estimagic Enhancement Proposal 1 for the use of pytrees in estimagic (@janosg).




Adds a lot of new functionality with a few minor breaking changes. We have more optimizers, better error handling, a bootstrap, and inference for the method of simulated moments. The breaking changes are:

  • Logging is disabled by default during optimization.

  • The log_option "if_exists" was renamed to "if_table_exists".

  • The comparison plot function is removed.

  • first_derivative now returns a dictionary, independent of its arguments.

  • The structure of the logging database has changed.

  • There is an additional boolean flag named scaling in minimize and maximize.
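The new first_derivative return convention (a dictionary, independent of arguments) can be illustrated with a minimal stdlib-only sketch. The function below is a hypothetical toy, not estimagic's actual implementation; it only mimics the documented return behavior:

```python
def first_derivative(func, x, step=1e-6):
    """Toy central finite-difference gradient.

    Mimics the new convention of always returning a dictionary,
    regardless of the input arguments.
    """
    grad = []
    for i in range(len(x)):
        up, down = list(x), list(x)
        up[i] += step
        down[i] -= step
        grad.append((func(up) - func(down)) / (2 * step))
    return {"derivative": grad}


result = first_derivative(lambda x: x[0] ** 2 + 3 * x[1], [1.0, 2.0])
# result["derivative"] is approximately [2.0, 3.0]
```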

0.1.3 - 2021-06-25

0.1.2 - 2021-02-07

0.1.1 - 2021-01-13

This release greatly expands the set of available optimization algorithms, has a better and prettier dashboard and improves the documentation.

0.1.0dev1 - 2020-09-08

This release entails a complete rewrite of the optimization code with many breaking changes. In particular, some optimizers that were available before are no longer available. They will be re-introduced soon. The breaking changes include:

  • The database is restructured. The new version simplifies the code, makes logging faster, and avoids the SQL column limit.

  • Users can provide a closed-form derivative and/or a criterion_and_derivative function, where the latter can exploit synergies in the calculation of criterion and derivative. This is also compatible with constraints.

  • Our own (parallelized) first_derivative function is used to calculate gradients during the optimization when no closed-form gradients are provided.

  • Optimizer options (e.g., convergence criteria) and optimization results are harmonized across optimizers.

  • Users can choose from several batch evaluators whenever we parallelize (e.g., for parallel optimizations or parallel function evaluations for numerical derivatives) or pass in their own batch evaluator function, as long as it has a compatible interface. The batch evaluator interface also standardizes error handling.

  • There is a well-defined internal optimizer interface. Users can select the pre-implemented optimizers via algorithm="name_of_optimizer" or their own optimizer via algorithm=custom_minimize_function.

  • Optimizers from pygmo and nlopt are no longer supported (they will be re-introduced).

  • Greatly improved error handling.
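The batch evaluator idea described above can be sketched in a few lines. The interface shown here (func, arguments, error_handling) is an illustrative assumption, not estimagic's exact signature; any callable with a compatible shape could be plugged in, which is also how error handling gets standardized:

```python
from concurrent.futures import ThreadPoolExecutor


def threading_batch_evaluator(func, arguments, error_handling="raise"):
    """Evaluate func on each element of arguments, possibly in parallel.

    With error_handling="continue", exceptions are returned in place of
    results instead of aborting the whole batch.
    """
    def _safe(arg):
        try:
            return func(arg)
        except Exception as e:
            if error_handling == "raise":
                raise
            return e
    with ThreadPoolExecutor() as pool:
        return list(pool.map(_safe, arguments))


results = threading_batch_evaluator(lambda x: x ** 2, [1, 2, 3])
# results == [1, 4, 9]
```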

  • #169 Add additional dashboard arguments

  • #168 Rename lower and upper to lower_bound and upper_bound (@ChristianZimpelmann)

  • #167 Improve dashboard styling (@roecla)

  • #166 Re-add POUNDERS from TAO (@tobiasraabe)

  • #165 Re-add the scipy optimizers with harmonized options (@roecla)

  • #164 Closed form derivatives for parameter transformations (@timmens)

  • #163 Complete rewrite of optimization with breaking changes (@janosg)

  • #162 Improve packaging and relax version constraints (@tobiasraabe)

  • #160 Generate parameter tables in tex and html (@mpetrosian)
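The string-or-callable algorithm selection from the internal optimizer interface above can be sketched as follows. The registry, the toy built-in algorithm, and the function names are hypothetical and only illustrate the dispatch idea:

```python
def _toy_grid_search(criterion, x0):
    # hypothetical "built-in" optimizer: evaluate a coarse grid around x0
    candidates = [x0 + step for step in (-1.0, -0.5, 0.0, 0.5, 1.0)]
    return min(candidates, key=criterion)


BUILT_IN_ALGORITHMS = {"toy_grid_search": _toy_grid_search}


def minimize(criterion, x0, algorithm):
    """Select a pre-implemented optimizer by name, or accept a custom callable."""
    if isinstance(algorithm, str):
        algorithm = BUILT_IN_ALGORITHMS[algorithm]
    return algorithm(criterion, x0)


best = minimize(lambda x: (x - 0.5) ** 2, 0.0, algorithm="toy_grid_search")
# best == 0.5
```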

0.0.31 - 2020-06-20

0.0.30 - 2020-04-22

  • #158 Allows specifying a gradient in maximize and minimize (@janosg)

0.0.29 - 2020-04-16

0.0.28 - 2020-03-17