
LsqFit.jl

The LsqFit package is a small library that provides basic least-squares fitting in pure Julia under an MIT license. The basic functionality was originally part of Optim.jl before being split out into this package. At this time, LsqFit only implements the Levenberg-Marquardt algorithm for non-linear fitting.


Basic Usage

There are top-level methods curve_fit() and standard_error() that are useful for fitting data to non-linear models. See the following example:

using LsqFit

# a two-parameter exponential model
# x: array of independent variables
# p: array of model parameters
model(x, p) = p[1]*exp.(-x.*p[2])

# some example data
# xdata: independent variables
# ydata: dependent variable
xdata = range(0, stop=10, length=20)
ydata = model(xdata, [1.0, 2.0]) + 0.01*randn(length(xdata))
p0 = [0.5, 0.5]

fit = curve_fit(model, xdata, ydata, p0)
# fit is a composite type (LsqFitResult), with some interesting values:
#	fit.dof: degrees of freedom
#	fit.param: best fit parameters
#	fit.resid: vector of residuals
#	fit.jacobian: estimated Jacobian at solution

# We can estimate errors on the fit parameters,
# to get standard error of each parameter:
sigma = standard_error(fit)
# to get margin of error and confidence interval of each parameter at 5% significance level:
margin_of_error = margin_error(fit, 0.05)
confidence_interval = confidence_interval(fit, 0.05)

# The finite difference method is used above to approximate the Jacobian.
# Alternatively, a function which calculates it exactly can be supplied instead.
function jacobian_model(x, p)
    J = Array{Float64}(undef, length(x), length(p))
    J[:,1] = exp.(-x.*p[2])    #dmodel/dp[1]
    J[:,2] = -x.*p[1].*J[:,1]  #dmodel/dp[2]
    J
end
fit = curve_fit(model, jacobian_model, xdata, ydata, p0)

Existing Functionality

fit = curve_fit(model, [jacobian], x, y, [w,] p0; kwargs...):

  • model: function that takes two arguments (x, params)
  • jacobian: (optional) function that returns the Jacobian matrix of model
  • x: the independent variable
  • y: the dependent variable that constrains model
  • w: (optional) weights applied to the residuals; can be a vector of length(x) (per-point weights, typically inverse variances) or a matrix (the inverse covariance matrix of the observations)
  • p0: initial guess of the model parameters
  • kwargs: tuning parameters for fitting, passed to levenberg_marquardt, such as maxIter or show_trace
  • fit: composite type of results (LsqFitResult)

This performs a fit using a non-linear iteration to minimize the (weighted) residual between the model and the dependent variable data (y). The weight (w) can be omitted (as in the example above) to perform an unweighted fit. An unweighted fit is numerically equivalent to w=1 for each point, although unweighted error estimates are handled differently from weighted error estimates even when the weights are uniform.
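For instance, when per-point measurement uncertainties are known, a weighted fit can be sketched as follows (ydata_sigma and wt are illustrative names; the weights are taken to be the inverse variances of the observations):

# hypothetical per-point measurement standard deviations
ydata_sigma = fill(0.01, length(xdata))

# per-point weights: inverse variances of the observations
wt = 1 ./ ydata_sigma.^2

# weighted fit; maxIter and show_trace are examples of the tuning keywords listed above
wfit = curve_fit(model, xdata, ydata, wt, p0; maxIter=200, show_trace=false)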


sigma = standard_error(fit; atol, rtol):

  • fit: result of curve_fit (a LsqFitResult type)
  • atol: absolute tolerance for negativity check
  • rtol: relative tolerance for negativity check

This returns the standard error (uncertainty) of each fitted parameter, already scaled by the associated degrees of freedom. Please note, this is a LOCAL quantity calculated from the Jacobian of the model evaluated at the best-fit point and NOT the result of a parameter-space exploration.

If no weights are provided, the variance is estimated from the mean squared error of the residuals. If weights are provided, they are assumed to be the inverse of the variances (or of the covariance matrix) of the observations, and errors are estimated from these and the Jacobian, assuming a linearization of the model around the best-fit point.
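As a rough sketch of the unweighted case (not necessarily the library's exact code path), the linearized estimate amounts to:

using LinearAlgebra

J = fit.jacobian
mse = sum(abs2, fit.resid) / fit.dof      # mean squared error of the residuals
covar_approx = mse * inv(J' * J)          # linearized parameter covariance
sigma_approx = sqrt.(diag(covar_approx))  # comparable to standard_error(fit)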

margin_of_error = margin_error(fit, alpha=0.05; atol, rtol):

  • fit: result of curve_fit (a LsqFitResult type)
  • alpha: significance level
  • atol: absolute tolerance for negativity check
  • rtol: relative tolerance for negativity check

This returns the product of the standard error and the critical value for each parameter at the alpha significance level.
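Conceptually, the critical value is a two-sided Student-t quantile with fit.dof degrees of freedom; a hand-rolled sketch using Distributions.jl (an assumption for illustration, not a statement about the library's internals) might look like:

using Distributions

alpha = 0.05
crit = quantile(TDist(fit.dof), 1 - alpha/2)   # two-sided t critical value
margin_approx = standard_error(fit) .* crit    # comparable to margin_error(fit, alpha)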

confidence_interval = confidence_interval(fit, alpha=0.05; atol, rtol):

  • fit: result of curve_fit (a LsqFitResult type)
  • alpha: significance level
  • atol: absolute tolerance for negativity check
  • rtol: relative tolerance for negativity check

This returns the confidence interval of each parameter at the alpha significance level.
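Each interval is the best-fit parameter plus or minus its margin of error, so the result can be reproduced approximately as:

m = margin_error(fit, 0.05)
ci_approx = collect(zip(fit.param .- m, fit.param .+ m))  # comparable to confidence_interval(fit, 0.05)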


covar = estimate_covar(fit):

  • fit: result of curve_fit (a LsqFitResult type)
  • covar: parameter covariance matrix calculated from the jacobian of the model at the fit point, using the weights (if specified) as the inverse covariance of observations

This returns the parameter covariance matrix evaluated at the best-fit point.
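The standard errors returned by standard_error(fit) correspond to the square roots of the diagonal of this matrix, for example:

using LinearAlgebra

covar = estimate_covar(fit)
sigma_from_covar = sqrt.(diag(covar))  # should agree with standard_error(fit)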
