A comprehensive Clojure library for numerical optimization, root-finding, interpolation, and regression. Built on Hipparchus, OjAlgo, jcobyla, and Incanter.
- `roots.continuous` — Univariate root-finding with multiple algorithms: Brent-Dekker, modified Newton-Raphson, Muller, plus Hipparchus methods (bisection, Brent, Illinois, Pegasus, Ridders, secant, Regula-Falsi). Includes a quadratic equation solver.
- `roots.integer` — Bisection algorithm for strictly increasing discrete functions. Returns the minimum integer with function value ≥ 0 (sketched in plain Clojure after this list).
- `roots.plateau` — Root-finding for monotonic functions that return plateau values. Supports univariate and multivariate cases.
- `roots.polynomial` — Polynomial root-finding using Laguerre's method via Hipparchus.
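To make the `roots.integer` contract concrete, here is a minimal plain-Clojure sketch of discrete bisection. It is an illustration of the idea only, not the library's API; the function name is hypothetical, and it assumes `f` is strictly increasing with `(f hi)` already ≥ 0.

```clojure
;; Illustration only, not the library API: bisection over a strictly
;; increasing integer function f, returning the minimum integer in
;; [lo, hi] whose value is >= 0 (assumes (f hi) >= 0).
(defn min-int-with-nonneg-f
  [f lo hi]
  (loop [lo lo
         hi hi]
    (if (>= lo hi)
      lo
      (let [mid (quot (+ lo hi) 2)]
        (if (>= (f mid) 0)
          (recur lo mid)            ; mid already works, keep searching left
          (recur (inc mid) hi)))))) ; mid too small, search right

;; Smallest integer n with n² - 50 >= 0:
(min-int-with-nonneg-f (fn [n] (- (* n n) 50)) 0 100) ;=> 8
```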
- `optimize.continuous-univariate` — Brent optimizer for 1D minimization/maximization over bounded intervals.
- `optimize.integer-univariate` — Integer optimizer using exponential search or ternary search (ternary search is sketched after this list). Handles unimodal functions over integer domains.
- `optimize.linear` — Two-phase Simplex Method. Minimizes/maximizes linear objectives subject to linear constraints (equality, ≤, ≥).
- `optimize.mixed-integer` — Mixed Integer Programming (MIP) using OjAlgo. Extends linear programming to support integer and binary variable constraints.
- `optimize.quadratic` — Minimizes (1/2)x^T P x + q^T x subject to equality and inequality constraints using OjAlgo.
- `optimize.nlp-constrained` — Constrained nonlinear optimization using COBYLA. Minimizes objectives subject to nonlinear inequality constraints (≥ 0).
- `optimize.nlp-unbounded` — Unbounded nonlinear optimization with multiple pure-Clojure algorithms:
  - Powell: derivative-free direction-set method
  - Nelder-Mead: simplex-based derivative-free optimization
  - L-BFGS: limited-memory quasi-Newton with optional gradient
  - Gauss-Newton: nonlinear least-squares for residual minimization
  - Auto-selecting orchestrator tries multiple algorithms
- `optimize.nlp-bounded` — Bounded nonlinear optimization with box constraints:
  - COBYLA: linear-approximation constrained optimization
  - BOBYQA: quadratic-model bound-constrained optimization
  - CMA-ES: evolutionary strategy for non-smooth/non-convex objectives
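The ternary search mentioned for `optimize.integer-univariate` can be sketched in a few lines of plain Clojure. This is an illustration of the technique under the unimodality assumption, not the library's API, and the function name is made up for the example.

```clojure
;; Illustration only, not the library API: ternary search for the
;; integer minimizer of a unimodal function f on the closed range [lo, hi].
(defn ternary-search-min
  [f lo hi]
  (loop [lo lo
         hi hi]
    (if (<= (- hi lo) 2)
      (apply min-key f (range lo (inc hi))) ; few candidates left, check directly
      (let [third (quot (- hi lo) 3)
            m1    (+ lo third)
            m2    (- hi third)]
        (if (< (f m1) (f m2))
          (recur lo m2)      ; minimizer cannot lie to the right of m2
          (recur m1 hi)))))) ; minimizer cannot lie to the left of m1

;; Minimize (n - 37)² over [0, 1000]:
(ternary-search-min (fn [n] (let [d (- n 37)] (* d d))) 0 1000) ;=> 37
```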
Solvers for systems of equations without an objective function to optimize:

- `systems.linear` — Iterative solvers for symmetric linear systems (A × y = b): SYMMLQ and Conjugate Gradient methods (the conjugate-gradient iteration is sketched below). For small/dense systems, use direct least squares from `provisdom.math.linear-algebra` instead.
- `systems.nonlinear` — Finds variable values that satisfy nonlinear constraints:
  - Nonlinear Least Squares (Levenberg-Marquardt, Gauss-Newton)
  - Nonlinear Ordered Constraints (prioritized constraint satisfaction)
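For intuition, here is a minimal Conjugate Gradient iteration in plain Clojure, assuming a dense symmetric positive-definite matrix stored as a vector of rows. It is an illustration only (the helper and function names are made up); the `systems.linear` solvers are what to use in practice.

```clojure
;; Illustration only, not the library API: Conjugate Gradient for A*x = b
;; where A is symmetric positive-definite, given as a vector of row vectors.
(defn- dot [u v] (reduce + (map * u v)))
(defn- axpy [a x y] (mapv (fn [xi yi] (+ (* a xi) yi)) x y)) ; a*x + y
(defn- mat-vec [m v] (mapv #(dot % v) m))

(defn conjugate-gradient
  [a b {:keys [max-iter tol] :or {max-iter 100 tol 1e-12}}]
  (loop [x (vec (repeat (count b) 0.0))
         r b       ; residual b - A*x (x starts at zero)
         p b       ; search direction
         iter 0]
    (let [rr (dot r r)]
      (if (or (< rr tol) (>= iter max-iter))
        x
        (let [ap    (mat-vec a p)
              alpha (/ rr (dot p ap))
              x'    (axpy alpha p x)
              r'    (axpy (- alpha) ap r)
              beta  (/ (dot r' r') rr)
              p'    (axpy beta p r')]
          (recur x' r' p' (inc iter)))))))

;; Solve [[4 1] [1 3]] * x = [1 2]  ;=> roughly [0.0909 0.6364]
(conjugate-gradient [[4.0 1.0] [1.0 3.0]] [1.0 2.0] {})
```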
Interpolation is organized by dimensionality:

- `interpolation.univariate` — 1D interpolation methods (a minimal linear-interpolation sketch with extrapolation modes follows this list):
  - Methods: cubic, cubic-hermite, cubic-closed, Akima, PCHIP (monotone-preserving), barycentric-rational, linear, polynomial, step, LOESS (smoothing)
  - Specialized: quadratic with slope, cubic-clamped with 2nd derivatives, B-spline with knots, smoothing spline (with EDF control)
  - Extras: extrapolation modes (error/flat/linear), batch evaluation, spline integration, auto-selection
  - Slope interpolation: compute derivatives at query points
- `interpolation.grid-2d` — 2D grid-based interpolation (x-vals × y-vals → f-matrix):
  - Methods: polynomial, bicubic, bicubic-Hermite, bilinear
- `interpolation.grid-3d` — 3D grid-based interpolation (x-vals × y-vals × z-vals → f-tensor):
  - Methods: tricubic
- `interpolation.scatter-nd` — N-D scattered (non-grid) data interpolation:
  - Methods: microsphere projection, RBF (radial basis functions), natural neighbor (2D only)
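As a minimal illustration of what these interpolators return (a function of x) and of the error/flat/linear extrapolation modes, here is a plain-Clojure linear interpolator. It is a sketch only, not the library's API; the actual `interpolation-1D` usage appears further down.

```clojure
;; Illustration only, not the library API: linear interpolation over
;; sorted x-vals/f-vals, with :error, :flat, or :linear extrapolation
;; outside the data range.
(defn linear-interpolator
  [x-vals f-vals extrapolation]
  (let [pts     (mapv vector x-vals f-vals)
        segment (fn [[x1 y1] [x2 y2] x]
                  (+ y1 (* (- x x1) (/ (- y2 y1) (- x2 x1)))))]
    (fn [x]
      (cond
        (< x (first x-vals))
        (case extrapolation
          :error  (throw (ex-info "x below data range" {:x x}))
          :flat   (first f-vals)
          :linear (segment (pts 0) (pts 1) x))

        (> x (last x-vals))
        (case extrapolation
          :error  (throw (ex-info "x above data range" {:x x}))
          :flat   (last f-vals)
          :linear (segment (pts (- (count pts) 2)) (peek pts) x))

        :else
        (let [[p1 p2] (first (filter (fn [[[x1 _] [x2 _]]] (<= x1 x x2))
                                     (partition 2 1 pts)))]
          (segment p1 p2 x))))))

(def lin-f (linear-interpolator [0.0 1.0 2.0] [0.0 10.0 40.0] :flat))
(lin-f 1.5) ;=> 25.0
(lin-f 5.0) ;=> 40.0 (flat extrapolation)
```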
- `fitting.curve` — Least-squares fitting to find best-fit parameters:
  - Nonlinear (Levenberg-Marquardt): Gaussian, Harmonic, Polynomial, arbitrary parametric
  - Linear basis (closed-form): custom basis functions for univariate and multivariate data (a two-basis-function sketch follows this list)
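To show what "linear basis (closed-form)" means, here is a sketch that fits y ≈ a·g1(x) + b·g2(x) by solving the 2×2 normal equations directly. It illustrates the math only; the function name and argument order are invented for the example and are not the library's API.

```clojure
;; Illustration only, not the library API: closed-form least squares for
;; a model that is linear in its parameters, y ≈ a*g1(x) + b*g2(x).
;; The 2x2 normal equations are solved with Cramer's rule.
(defn fit-two-basis
  [g1 g2 xs ys]
  (let [u   (map g1 xs)
        v   (map g2 xs)
        dot (fn [p q] (reduce + (map * p q)))
        s11 (dot u u)
        s12 (dot u v)
        s22 (dot v v)
        t1  (dot u ys)
        t2  (dot v ys)
        det (- (* s11 s22) (* s12 s12))]
    {:a (/ (- (* t1 s22) (* s12 t2)) det)
     :b (/ (- (* s11 t2) (* s12 t1)) det)}))

;; Recover a = 2, b = -0.5 from noiseless data y = 2*sin(x) - 0.5*x
(let [xs (range 0.0 5.0 0.5)
      ys (map (fn [x] (- (* 2.0 (Math/sin x)) (* 0.5 x))) xs)]
  (fit-two-basis #(Math/sin %) identity xs ys))
;;=> {:a 2.0, :b -0.5} (up to floating-point rounding)
```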
| Aspect | Interpolation | Curve Fitting |
|---|---|---|
| Returns | A function `(fn [x] -> y)` | Parameters (coefficients, etc.) |
| Passes through all points? | Yes (except LOESS) | No (minimizes error) |
| You specify | Interpolation method | Functional form + degree |
| Use case | Exact data, lookup tables | Noisy data, known physics |
Example: Given 10 data points, `(fit-polynomial points 2)` fits a parabola (3 coefficients) that best approximates the data, while `(interpolation-1D ...)` with `:polynomial` creates a degree-9 polynomial passing through all 10 points exactly.
Note: LOESS is in the interpolation namespace (same API) but doesn't pass through points—it performs local regression for smoothing noisy data.
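The "passes through all points" distinction can be made concrete with the Lagrange form of the interpolating polynomial. This is a sketch of the math only, not how the library implements `:polynomial`: for n points it builds the unique degree-(n-1) polynomial that reproduces every data point exactly, whereas a fixed-degree least-squares fit generally does not.

```clojure
;; Illustration only, not the library's implementation: Lagrange form of
;; the unique degree-(n-1) polynomial through n points.
(defn lagrange-interpolator
  [x-vals f-vals]
  (let [xv (vec x-vals)]
    (fn [x]
      (reduce +
              (map-indexed
                (fn [j yj]
                  ;; yj times the basis polynomial: 1 at x_j, 0 at every other x_k
                  (* yj
                     (reduce * (for [[k xk] (map-indexed vector xv)
                                     :when (not= j k)]
                                 (/ (- x xk) (- (xv j) xk))))))
                f-vals)))))

(def poly (lagrange-interpolator [0.0 1.0 2.0 3.0] [1.0 2.0 0.0 5.0]))
(poly 2.0) ;=> 0.0 (exact at every data point)
```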
| Aspect | Curve Fitting | Regression |
|---|---|---|
| Goal | Find parameters of a known functional form | Model statistical relationships between X and y |
| Functional form | Arbitrary: a·e^(-((x-b)/c)²), a·cos(ωx+φ) | Linear in parameters: y = Xβ (possibly transformed) |
| Optimization | Nonlinear least squares (Levenberg-Marquardt) | Closed-form (OLS/Ridge) or IRLS (GLMs) |
| Assumptions | Just minimize residuals | Error distribution, link function |
| Output | Parameters of the specific function | Coefficients + diagnostics (R², MSE, precision) |
Curve fitting answers: "Given this formula, what parameters fit best?"
Regression answers: "What's the statistical relationship between predictors X and response y?"
Logistic and beta regression aren't just curve fitting—they model the distribution of y given X (Bernoulli, Beta) with appropriate link functions, using Iteratively Reweighted Least Squares (IRLS).
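A small plain-Clojure sketch of what "modeling the distribution of y given X" means for logistic regression (an illustration with invented function names, not the library's API): the model specifies P(y=1|x) = σ(x·β), and fitting maximizes the Bernoulli log-likelihood rather than minimizing raw residuals of a formula.

```clojure
;; Illustration only, not the library API: logistic regression specifies
;; P(y=1 | x) = sigmoid(x . beta); fitting maximizes this log-likelihood.
(defn sigmoid [eta] (/ 1.0 (+ 1.0 (Math/exp (- eta)))))

(defn bernoulli-log-likelihood
  [beta xs ys]                          ; xs: rows of predictors, ys: 0/1 labels
  (reduce +
          (map (fn [x y]
                 (let [mu (sigmoid (reduce + (map * x beta)))]
                   (+ (* y (Math/log mu))
                      (* (- 1 y) (Math/log (- 1.0 mu))))))
               xs ys)))

;; Compare two candidate coefficient vectors on toy data (higher is better):
(let [xs [[1.0 0.2] [1.0 1.5] [1.0 2.9]] ; column of 1s is the intercept
      ys [0 0 1]]
  [(bernoulli-log-likelihood [-3.0 1.5] xs ys)
   (bernoulli-log-likelihood [0.0 0.0] xs ys)])
```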
All regression methods are in the `provisdom.solvers.regression` namespace hierarchy:

- `regression.ordinary` — Ordinary least squares with optional regularization:
  - OLS — Standard least squares via QR decomposition
  - Ridge (L2) — Closed-form solution with λI penalty
  - LASSO (L1) — Coordinate descent for sparse solutions (a coordinate-descent sweep is sketched after this list)
  - Elastic Net — Combined L1+L2 via coordinate descent
- `regression.logistic` — Binary logistic regression using IRLS with optional ridge regularization.
- `regression.multinomial-logistic` — Multi-class logistic regression (one-vs-all approach).
- `regression.beta` — Beta regression for responses bounded in (0, 1) using IRLS.
- `regression.kernel-grnn` — Generalized Regression Neural Network (Nadaraya-Watson kernel regression) with automatic spread optimization.
- `regression.stepwise` — Stepwise variable selection:
  - Forward — Start empty, iteratively add best predictor
  - Backward — Start full, iteratively remove worst predictor
  - Both — Combine forward and backward steps
  - Supports AIC/BIC scoring with ordinary, logistic, or beta regression
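To illustrate the LASSO bullet above, here is one coordinate-descent sweep for the objective (1/2)·||y - Xβ||² + λ·||β||₁. This is a plain-Clojure sketch of the update rule with invented function names, not the library's API.

```clojure
;; Illustration only, not the library API: one coordinate-descent sweep
;; for LASSO, minimizing (1/2)*||y - X*beta||^2 + lambda*||beta||_1.
(defn soft-threshold
  [rho lam]
  (cond (> rho lam)     (- rho lam)
        (< rho (- lam)) (+ rho lam)
        :else           0.0))

(defn lasso-sweep
  [x-rows ys beta lambda]
  (let [col     (fn [j] (mapv #(nth % j) x-rows))
        predict (fn [beta row] (reduce + (map * row beta)))]
    (reduce
      (fn [beta j]
        (let [xj  (col j)
              ;; partial residuals: leave coordinate j out of the prediction
              r   (map (fn [row y]
                         (+ (- y (predict beta row))
                            (* (nth row j) (beta j))))
                       x-rows ys)
              rho (reduce + (map * xj r))
              zj  (reduce + (map * xj xj))]
          ;; soft-thresholding is what produces exact zeros (sparsity)
          (assoc beta j (/ (soft-threshold rho lambda) zj))))
      beta
      (range (count beta)))))

;; One sweep on toy data (beta is the vector of current coefficients);
;; iterating sweeps until beta stops changing gives the sparse solution.
(lasso-sweep [[1.0 0.5] [1.0 1.5] [1.0 2.5]] [1.0 2.0 3.5] [0.0 0.0] 0.1)
```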
Logistic and beta regression use IRLS, which fits generalized linear models by iteratively solving weighted least squares problems:

- Compute working weights W based on current predictions
- Compute working response z (adjusted dependent variable)
- Solve weighted OLS: β = (X'WX)⁻¹ X'Wz
- Repeat until convergence

For logistic regression: W = diag(μ(1-μ)), z = η + (y-μ)/(μ(1-μ))

For beta regression: W = diag(φ·μ(1-μ)), z = η + (y-μ)/(μ(1-μ))
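A minimal plain-Clojure sketch of these IRLS steps for logistic regression with a single coefficient and no intercept (an illustration of the iteration with an invented function name, not the library's API); with one predictor, the weighted OLS step reduces to a ratio of sums.

```clojure
;; Illustration only, not the library API: IRLS for logistic regression
;; with one coefficient (no intercept), following the steps listed above.
(defn irls-logistic-1d
  [xs ys iterations]
  (let [sigmoid (fn [eta] (/ 1.0 (+ 1.0 (Math/exp (- eta)))))]
    (loop [beta 0.0
           iter 0]
      (if (>= iter iterations)
        beta
        (let [etas (map #(* beta %) xs)
              mus  (map sigmoid etas)
              ws   (map (fn [mu] (* mu (- 1.0 mu))) mus)         ; working weights W
              zs   (map (fn [eta mu w y] (+ eta (/ (- y mu) w))) ; working response z
                        etas mus ws ys)
              ;; weighted OLS with one predictor: beta = sum(w*x*z) / sum(w*x*x)
              num  (reduce + (map * ws xs zs))
              den  (reduce + (map (fn [w x] (* w x x)) ws xs))]
          (recur (/ num den) (inc iter)))))))

;; Toy data with overlapping classes (so the maximum-likelihood estimate is finite):
(irls-logistic-1d [-2.0 -1.0 0.0 1.0 2.0 3.0] [0 0 1 0 1 1] 25)
```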
```clojure
(require '[provisdom.solvers.roots.continuous :as roots])
(require '[provisdom.solvers.common :as common])
(require '[provisdom.solvers.optimize.nlp-bounded :as nlp])
(require '[provisdom.solvers.interpolation.univariate :as interp])
```

```clojure
;; Find root of f(x) = x³ - 3x
(roots/root-solver
  {::roots/univariate-f (fn [x] (- (* x x x) (* 3 x)))
   ::roots/guess 2.0
   ::roots/interval [-5.0 5.0]})
```

```clojure
;; Minimize f(x,y) = x² + y² subject to bounds
(nlp/bounded-nlp-without-evolutionary
  {::common/objective (fn [da] (let [[x y] da] (+ (* x x) (* y y))))
   ::common/vars-guess [1.0 1.0]
   ::nlp/var-intervals [[-10.0 10.0] [-10.0 10.0]]})
```

```clojure
;; Cubic spline interpolation
(let [f (interp/interpolation-1D
          {::interp/f-vals [0.0 1.0 4.0 9.0]
           ::interp/x-vals [0.0 1.0 2.0 3.0]})]
  (f 1.5))
```

- Hipparchus — root-finding, optimization, interpolation, least squares (replaces Apache Commons Math)
- OjAlgo — linear and quadratic programming
- jcobyla — constrained nonlinear optimization (COBYLA algorithm)
- Incanter — 2D interpolation (bicubic, bilinear, polynomial)
Copyright © 2018-2026 Provisdom Corp.
Distributed under the GNU Lesser General Public License version 3.0.