
VectorAutoregressions.jl

Vector autoregressive models for Julia


Installation

```julia
using Pkg
Pkg.add(url="https://github.com/lucabrugnolini/VectorAutoregressions.jl")
```

Introduction

This package is a work in progress for the estimation and identification of Vector Autoregressive (VAR) models.

Status

  • VAR
    • VAR(1) form
    • Lag-length selection
      • AIC
      • AICC
      • BIC
      • HQC
    • VAR impulse response function (IRFs)
      • Identification
        • Reduced form
        • Cholesky
        • Long-run restrictions
        • Sign restrictions
        • Heteroskedasticity
        • External instruments (e.g. high-frequency, narrative)
          • Wild bootstrap
      • Confidence bands
        • Asymptotic
        • Bootstrap
        • Bootstrap-after-bootstrap
    • Forecasting
      • BVAR
      • FAVAR
  • Local projection IRFs
    • Lag-length selection
    • Confidence bands
      • Standard
      • Bootstrap
    • Bayesian Local Projection
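
The information criteria listed above follow standard formulas; as a rough, self-contained illustration (Lütkepohl-style definitions, not the package's internal code):

```julia
using LinearAlgebra

# Illustrative lag-length criteria for a K-variable VAR(p) fit on T observations.
# Σ is the residual variance-covariance matrix. These are standard textbook
# formulas, not the package's internal implementation.
aic(Σ, K, p, T) = log(det(Σ)) + 2 * p * K^2 / T
bic(Σ, K, p, T) = log(det(Σ)) + log(T) * p * K^2 / T
hqc(Σ, K, p, T) = log(det(Σ)) + 2 * log(log(T)) * p * K^2 / T
```

One fits the VAR for each candidate lag length and picks the p that minimizes the chosen criterion.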

Example

```julia
# Example: fit a VAR(p) to the data and derive structural IRFs
# with asymptotic and bootstrap confidence bands.
using VectorAutoregressions
using DelimitedFiles: readdlm
using Plots
plotly()

# Read example data
path = joinpath(dirname(pathof(VectorAutoregressions)), "..") # base path of the package
y = readdlm(joinpath(path, "test", "cholvar_test_data.csv"), ',') # example dataset

# Set VAR parameters
intercept = false # intercept in the estimation
p = 2             # lag length
H = 15            # IRF horizon
nrep = 500        # number of bootstrap replications

# Fit VAR(2) to data
V = VAR(y, p, intercept)

# Estimate IRFs - Cholesky identification
mIRFa = IRFs_a(V, H, intercept)       # asymptotic conf. bands
mIRFb = IRFs_b(V, H, nrep, intercept) # bootstrap conf. bands

# Plot IRFs with asymptotic confidence bands
T, K = size(y)
pIRF_asy = plot(layout = grid(K, K));
[plot!(pIRF_asy, [mIRFa.CI.CIl[i,:] mIRFa.IRF[i,:] mIRFa.CI.CIh[i,:]], color = ["red" "red" "red"],
line = [:dash :solid :dash], legend = false, subplot = i) for i in 1:K^2]
gui(pIRF_asy)

# Plot IRFs with bootstrapped confidence bands
pIRF_boot = plot(layout = grid(K, K));
[plot!(pIRF_boot, [mIRFb.CI.CIl[i,:] mIRFb.IRF[i,:] mIRFb.CI.CIh[i,:]], color = ["blue" "blue" "blue"],
line = [:dash :solid :dash], legend = false, subplot = i) for i in 1:K^2]
gui(pIRF_boot)
```

In more detail, y is the data matrix, p is the lag length of the VAR fitted to the data, and intercept is a Boolean indicating whether to include an intercept (default is true). VAR(y,p,intercept) returns the fitted VAR(p) model V with the following structure:

```julia
struct VAR
  Y::Array # dependent variables
  X::Array # covariates
  β::Array # parameters
  ϵ::Array # residuals
  Σ::Array # variance-covariance matrix
  p::Int64 # lag length
  i::Bool  # true/false for including an intercept (default is true)
end
```

You can access each field by writing V. followed by the field name (for example, V.X for the covariates).
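
Under the hood, fitting a VAR(p) like this reduces to equation-by-equation OLS on lagged regressors. A minimal, self-contained sketch of that idea (an illustration only, not the package's actual implementation; the name fit_var is hypothetical):

```julia
# Standalone OLS sketch of VAR(p) estimation.
# Not the package's code: fit_var is a hypothetical helper for illustration.
function fit_var(y::Matrix{Float64}, p::Int; intercept::Bool=true)
    T, K = size(y)
    Y = y[p+1:end, :]                               # dependent observations
    X = hcat([y[p+1-j:end-j, :] for j in 1:p]...)   # stacked lagged regressors
    intercept && (X = hcat(ones(T - p), X))         # prepend constant if requested
    β = X \ Y                                       # OLS coefficients
    ϵ = Y - X * β                                   # residuals
    Σ = ϵ' * ϵ / (T - p - size(X, 2))               # residual covariance
    (Y=Y, X=X, β=β, ϵ=ϵ, Σ=Σ)
end
```

The returned named tuple mirrors the ingredients (β, ϵ, Σ, …) that the VAR struct above stores.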


Issues

Cholesky identification

Hi Luca,

Sorry to bother you again but I tried to use your package to replicate some Cholesky identified IRFs generated using old Matlab codes of mine and I didn't succeed. I then noticed that in the irf_chol() function you have the following line:
https://github.com/lucabrugnolini/VAR.jl/blob/2f0f8e51e32c92494df1598b1095b5718811cf2b/src/VAR.jl#L548
This is puzzling, as full(cholfact(A)) just reconstructs the matrix A; I think what you need is the lower triangular factor cholfact(A)[:L]. See the following example:

```julia
A = [2.0 2.0; 2.0 2.0]

cholfact(A)
# Base.LinAlg.Cholesky{Float64,Array{Float64,2}} with factor:
# [1.41421 0.0; 1.41421 2.10734e-8]

cholfact(A)[:L]
# 2×2 LowerTriangular{Float64,Array{Float64,2}}:
#  1.41421   ⋅
#  1.41421  2.10734e-8

full(cholfact(A))
# 2×2 Array{Float64,2}:
#  2.0  2.0
#  2.0  2.0
```

If I replace line 548 by mSigma = cholfact(V.Σ)[:L], I can replicate my results. I'm probably missing something and it would be great if you could help me out.

Best, Benjamin
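
For readers on Julia ≥ 1.0, where cholfact no longer exists: the equivalent of the [:L] access above is cholesky(A).L from the LinearAlgebra standard library. A small example with a positive-definite matrix (Cholesky requires one):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]  # positive definite
F = cholesky(A)         # modern replacement for cholfact
L = F.L                 # lower triangular factor, as cholfact(A)[:L] was
L * L' ≈ A              # L reconstructs A
```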

Add stability check

Suggestion: add a method that checks the eigenvalues of the companion matrix and returns whether the VAR is stable.

From a quick look at the code, there is already a function that generates the companion form, so this should be simple to implement.

If you want, I can add it myself and open a PR.
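
Such a check could look like the following sketch, assuming the slope coefficients are stacked as B = [A1 … Ap] (a hypothetical helper, not the package's API):

```julia
using LinearAlgebra

# Hypothetical sketch: a VAR(p) is stable iff all eigenvalues of its
# companion matrix lie strictly inside the unit circle.
# B is the K×(K*p) matrix [A1 A2 ... Ap] of slope coefficients (assumed layout).
function is_stable(B::AbstractMatrix, K::Int, p::Int)
    # Companion matrix: [A1 ... Ap; I 0], with an identity block shifting lags down
    C = vcat(B, hcat(Matrix{Float64}(I, K*(p-1), K*(p-1)), zeros(K*(p-1), K)))
    maximum(abs.(eigvals(C))) < 1.0
end
```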

Renaming the package

Hi Luca,

It's kind of unfortunate that the package and the module have different names: VAR vs. VARs. I think the Julia convention is to keep them the same. We cannot rename the module to VAR, as that clashes with the function and type VAR. But what about naming both the package and the module VectorAutoregressions.jl? The name is a bit more descriptive for non-macroeconomists, which might also help when registering the package. I also think it's not too long to type once in a using statement.

Let me know what you think. Best,

Benjamin
