
Comments (9)

pbeckman commented on June 28, 2024

The issue is that in the Newton iterations at gaussjacobi.jl:382, the recursive formula innerjacobi_rec is used instead of the boundary asymptotics, which are implemented as feval_asy2 in Chebfun. An equivalent feval_asy2 routine is not implemented in this package, so I now have a fork here with the low-order terms in these asymptotics implemented. This gives us back 6 digits or so. See below.
[figure: jacobi_error_a99_new]
However, to get 16-digit accuracy, we need the higher-order terms. The higher-order coefficients $A_m$ and $B_m$ in equation (3.24) of Hale and Townsend 2012 are computed numerically in Chebfun from recursive integro-differential formulae using Chebyshev interpolation, integration, and differentiation. But these routines are not readily available in FastGaussQuadrature.jl.
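For context on what the Newton step computes, here is a minimal sketch (hypothetical helper names, not the package's internals): evaluate $P_n^{(\alpha,\beta)}$ by the standard three-term recurrence and its derivative via $\frac{d}{dx}P_n^{(\alpha,\beta)}(x) = \frac{n+\alpha+\beta+1}{2}P_{n-1}^{(\alpha+1,\beta+1)}(x)$, then iterate Newton on a starting guess:

```julia
# Evaluate the Jacobi polynomial P_n^(a,b)(x) by the standard three-term recurrence.
function jacobi_rec(n, a, b, x)
    n == 0 && return 1.0
    pm1, p = 1.0, (a + b + 2)/2 * x + (a - b)/2      # P_0 and P_1
    for k in 2:n
        c1 = 2k*(k + a + b)*(2k + a + b - 2)
        c2 = (2k + a + b - 1)*(a^2 - b^2)
        c3 = (2k + a + b - 1)*(2k + a + b)*(2k + a + b - 2)
        c4 = 2*(k + a - 1)*(k + b - 1)*(2k + a + b)
        pm1, p = p, ((c2 + c3*x)*p - c4*pm1)/c1
    end
    return p
end

# Derivative via d/dx P_n^(a,b)(x) = (n+a+b+1)/2 * P_{n-1}^(a+1,b+1)(x).
djacobi_rec(n, a, b, x) = (n + a + b + 1)/2 * jacobi_rec(n - 1, a + 1, b + 1, x)

# Newton iteration for a single Gauss-Jacobi node from an initial guess x0.
function newton_node(n, a, b, x0; tol=1e-15, maxiter=20)
    x = x0
    for _ in 1:maxiter
        dx = jacobi_rec(n, a, b, x) / djacobi_rec(n, a, b, x)
        x -= dx
        abs(dx) < tol && break
    end
    return x
end
```

The point of the issue is the evaluator, not the iteration: near $x = \pm 1$ the recurrence loses accuracy, which is where feval_asy2-style boundary asymptotics come in.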

Do these Chebyshev routines exist elsewhere in JuliaApproximation? Or (at the risk of bloating this package) should we implement them here?

from fastgaussquadrature.jl.

pbeckman commented on June 28, 2024

Thanks for your responses!

Unless I'm mistaken, it doesn't appear that Chebyshev integration is currently implemented in ClassicalOrthogonalPolynomials.jl. As an alternative, ApproxFunOrthogonalPolynomials.jl has both Chebyshev integration and differentiation implemented. However, it would similarly create a circular dependency with this package...

In my view, part of the value of FastGaussQuadrature.jl is that it is extremely lightweight, with relatively few dependencies. I can use ApproxFunOrthogonalPolynomials.jl to solve this issue, but it's a heavy (not to mention circular) dependency, so from a software engineering perspective it doesn't seem like a great path forward. It's a hefty price to pay for some basic Chebyshev routines on $N = 10$ points.

The alternative is to use some lookups and reimplement very barebones routines for the necessary computations. This is the path I've taken in my fork. I've saved the 10-point Chebyshev integration and differentiation matrices from Chebfun as constants in gaussjacobi.jl, and reimplemented only the very short barycentric interpolation routine. This is pretty painless and keeps the repository lightweight.
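For reference, the barycentric routine in question really is short. A sketch (hypothetical names, assuming Chebyshev points of the second kind) is:

```julia
# Chebyshev points of the second kind: x_j = cos(j*pi/n), j = 0..n.
cheb2_points(n) = [cospi(j/n) for j in 0:n]

# Barycentric interpolation at x through values fs on the points xs.
# For Chebyshev points of the second kind the weights are (-1)^j, with the
# two endpoint weights halved (a global sign cancels in the quotient).
function bary(x, xs, fs)
    n = length(xs) - 1
    num, den = 0.0, 0.0
    for (j, (xj, fj)) in enumerate(zip(xs, fs))
        x == xj && return fj                 # exact hit on a node
        w = isodd(j) ? 1.0 : -1.0            # alternating signs
        (j == 1 || j == n + 1) && (w /= 2)   # halve endpoint weights
        t = w / (x - xj)
        num += t * fj
        den += t
    end
    return num / den
end
```

Since a degree-$n$ interpolant on $n+1$ Chebyshev points reproduces polynomials of degree $\le n$ exactly, this is easy to sanity-check on, say, $f(x) = x^2$.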

@dlfivefifty as a contributor to all the involved libraries, let me know if you think this is a reasonable solution. If so, I'll try to iron out the remaining numerical issues (all the existing package tests pass, but I still get ~11 digits on worst-case tests related to this issue, not 16) and submit a pull request.


dlfivefifty commented on June 28, 2024

It certainly is implemented:

julia> using ClassicalOrthogonalPolynomials

julia> T = ChebyshevT()
ChebyshevT()

julia> cumsum(T*[1; zeros(∞)])
ChebyshevT() * [1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, …]

It's done by applying an ∞-almost banded matrix:

https://github.com/JuliaApproximation/ClassicalOrthogonalPolynomials.jl/blob/bd5d352eabf7898879f0b3247d86dd8d8c80ccd0/src/classical/chebyshev.jl#L301

But I think this is overkill for what we need.

Here's the equivalent routine in ApproxFunOrthogonalPolynomials.jl:

https://github.com/JuliaApproximation/ApproxFunOrthogonalPolynomials.jl/blob/427faaf2296c0fe48c84741046e4aaf0dbe9da66/src/ultraspherical.jl#L27

This is more usable. I think the best option is to make a new package ChebyshevTransforms.jl that contains:

  1. The Chebyshev transforms in https://github.com/JuliaApproximation/FastTransforms.jl/blob/master/src/chebyshevtransform.jl
  2. chebyshevintegration! and chebyshevdifferentiation!

FastChebInterp.jl could depend on these packages if they want. But I would avoid depending on that package directly, as it introduces a notion of a Chebyshev expansion that is an alternative to both ApproxFun.jl and ClassicalOrthogonalPolynomials.jl. So having a more "core" package like ChebyshevTransforms.jl that doesn't actually implement convenience structs makes sense here.
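For reference, the coefficient-space recurrences such routines would implement are short. A sketch (hypothetical names, not a proposed API), with a[k+1] storing the coefficient of $T_k$ in $f = \sum_k a_k T_k$:

```julia
# Differentiation: b_{k-1} = b_{k+1} + 2k*a_k, running k = n down to 1,
# with b_n = b_{n+1} = 0 and b_0 halved at the end.
function chebdiff(a)
    n = length(a) - 1
    b = zeros(n + 1)
    for k in n:-1:1
        b[k] = (k + 2 <= n + 1 ? b[k + 2] : 0.0) + 2k * a[k + 1]
    end
    b[1] /= 2
    return b
end

# Indefinite integration: b_k = (a_{k-1} - a_{k+1})/(2k) for k >= 1
# (with a_0 entering doubled at k = 1), and b_0 fixed so that F(-1) = 0.
function chebint(a)
    n = length(a) - 1
    b = zeros(n + 2)
    for k in 1:n+1
        am1 = k == 1 ? 2a[1] : a[k]
        ap1 = k + 2 <= n + 1 ? a[k + 2] : 0.0
        b[k + 1] = (am1 - ap1) / (2k)
    end
    b[1] = -sum(b[k + 1] * (-1)^k for k in 1:n+1)   # T_k(-1) = (-1)^k
    return b
end
```

For example, $T_3' = 3T_0 + 6T_2$ and $\int_{-1}^x t\,dt = \tfrac14(T_2(x) - 1)$, both of which these recurrences reproduce.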


pbeckman commented on June 28, 2024

If we run the following MATLAB code, which is an equivalent test using the implementation of the Hale and Townsend 2012 algorithm in Chebfun, we see much better results, even using many nodes with a strong singularity $\alpha = -0.99$:

a = -0.99;
ns = 2.^(1:16);
I_true = 2^(a+1)/(a+1);

Is_quad = zeros(length(ns), 1);
for i=1:length(ns)
    [~, w] = jacpts(ns(i), a, 0);
    Is_quad(i) = sum(w);
end

rel_errs = (I_true - Is_quad) / I_true;

n        relative error
------   --------------
2        -4.516e-15
4         1.552e-15
8         4.145e-13
16        7.042e-13
32        1.437e-12
64        7.099e-12
128      -3.387e-15
256       3.810e-15
512       0.000e+00
1024     -3.528e-15
2048     -1.411e-15
4096     -2.258e-15
8192      4.234e-16
16384    -7.056e-16
32768     4.234e-16
65536     1.694e-15

This seems to suggest an issue in the Julia implementation.


pbeckman commented on June 28, 2024

If we take the Chebfun weights as ground truth (given that they integrate $f \equiv 1$ to 15 digits) and plot the relative error between the Chebfun and FastGaussQuadrature.jl weights, we see that the error is highly concentrated near the endpoints. For example, with $\alpha = -0.99$ and $n = 2^{16},$ we have
[figure: jacobi_error_a99]
Markers are sized and colored by error size for emphasis. This is a more exaggerated case of exactly what @JamesCBremerJr noted in the aforementioned issue.
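The comparison itself is just an elementwise relative error against the reference weights; e.g. (hypothetical helpers, any pair of weight vectors with nonzero reference entries):

```julia
# Elementwise relative error of weights w against reference weights wref
# (assumes all wref entries are nonzero).
relerr(w, wref) = abs.(w .- wref) ./ abs.(wref)

# Index and size of the worst-case weight error.
function worst_weight_error(w, wref)
    e = relerr(w, wref)
    i = argmax(e)
    return i, e[i]
end
```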


hyrodium commented on June 28, 2024

Thank you for the detailed comments!
PRs are always welcome.

Do these Chebyshev routines exist elsewhere in JuliaApproximation?

I'm not very familiar with Chebyshev polynomials, but maybe FastChebInterp.jl could help here?


dlfivefifty commented on June 28, 2024

It should all be doable with ClassicalOrthogonalPolynomials.jl, but that package depends on this one.


pbeckman commented on June 28, 2024

Ah, I see - I was looking for an integrate-like function instead of a cumsum.

Your ChebyshevTransforms.jl solution makes good sense to me. Given my lack of familiarity with the relevant interfaces in JuliaApproximation, I've in the meantime submitted a pull request that resolves this issue and #58 in a way that's completely internal to this package. It uses saved integration and differentiation matrices from Chebfun, as I mentioned above. It should be easy to substitute more elegant Chebyshev routines once the restructuring of the JuliaApproximation package ecosystem that you suggest takes place. But I imagine you may want to engineer that yourself as the maintainer of those packages.


hyrodium commented on June 28, 2024

Fixed by #131.

