Comments (9)
The coefficient function is sampled from a Gaussian random field. The parameters tau, alpha, and sigma describe the scaling of the Gaussian random field; these parameters are fixed (the values are given in the paper). For different realizations we sample xi
and scale it by the coefficients. Let me know if you have any further questions.
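As a concrete illustration of "sample xi and scale it by the coefficients", here is a minimal sketch of that construction (my own code, not from the repository; it assumes a cosine basis, matching the Neumann boundary conditions used for Darcy, and the tau/alpha coefficient form quoted later in this thread):

```python
import numpy as np
from scipy.fft import idctn

def sample_grf(n=64, alpha=2.0, tau=3.0, seed=0):
    # Square roots of the covariance eigenvalues in a cosine (Neumann) basis:
    # tau^(alpha-1) * (pi^2 (k1^2 + k2^2) + tau^2)^(-alpha/2)
    k1, k2 = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    coef = tau ** (alpha - 1) * (np.pi**2 * (k1**2 + k2**2) + tau**2) ** (-alpha / 2)

    # Sample xi as i.i.d. standard normals, one per mode, and scale mode-by-mode.
    xi = np.random.default_rng(seed).standard_normal((n, n))
    spec = n * coef * xi
    spec[0, 0] = 0.0                 # drop the constant mode -> mean-zero field

    # Transform back to physical space on an n x n grid.
    return idctn(spec, norm="ortho")

field = sample_grf()
```

Different draws of xi give different realizations of the field, while coef (fixed by tau and alpha) controls the decay of the spectrum.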
from neuraloperator.
Okay.
But how do you decide the scaling factor? In Darcy you scale by
tau^(alpha-1).*(pi^2*(K1.^2+K2.^2) + tau^2).^(-alpha/2)
while in Navier-Stokes you scale by
(size**2)*math.sqrt(2.0)*sigma*((4*(math.pi**2)*(k_x**2 + k_y**2) + tau**2)**(-alpha/2.0))
So how do you decide these scaling factors, i.e. how do you arrive at them?
And how do you get the initial condition? w0(x) is generated according to w0 ∼ µ, where µ = N(0, 7^(3/2) * (−∆ + 49I)^(−2.5)). How do you arrive at this? If I want to generate samples for some other problem, how should I decide the scaling factor, and how do I get the initial-condition measure, like the µ = N(0, 7^(3/2) * (−∆ + 49I)^(−2.5)) you obtained for Navier-Stokes?
In short: how do I obtain the scaling factor and that µ for any new problem?
from neuraloperator.
I see. It depends on what problem you are interested in. Ideally, the data you generate should be close to the real problem you want to solve. For example, if you want to solve a Helmholtz equation arising from geology, you may want to generate the coefficient with a structure similar to the earth's. If you want to study fluid mechanics with a certain initial condition, then it is best to generate the data with respect to it. If you know the initial condition will be large in norm, then it is better to use larger scaling factors such as sigma and tau, and vice versa. In this case, we simply generated the data from a Gaussian random field as a general showcase, so that it covers a variety of possibilities.
from neuraloperator.
So the scaling factor you chose for Darcy's equation, tau^(alpha-1).*(pi^2*(K1.^2+K2.^2) + tau^2).^(-alpha/2),
can be changed, meaning I can scale it as I want?
And how does this scaling factor depend on the initial condition in fluid mechanics? How do I generalise it to a particular case, say flow around a cylinder? Do I just change the values of tau, sigma, and alpha in your scaling factor, or define a new scaling factor, and if a new one, on what basis, or by random guess?
from neuraloperator.
How do you define the covariance function for different equations? And how do you decide the scaling on the basis of the covariance function?
from neuraloperator.
For the mean we basically always use 0, but you could use any function; 0 is just convenient. If you have data, you can estimate the mean just by computing the pointwise mean.
The covariance is slightly trickier. We use a Matern-type covariance of the form sigma^2 (-Delta + tau^2 I)^-alpha. Sigma basically controls the size: the larger it is, the larger the values your functions take. Tau controls the scale: larger tau means more multiscale behavior. Alpha controls the smoothness: larger alpha means smoother functions. You can also choose the boundary conditions for the Laplacian -Delta; we've used periodic for most problems and Neumann for Darcy.
If you have a real dataset or an equation, you can certainly estimate all of this from data, but how depends on what kind of knowledge you assume to have. If you assume nothing, the easiest thing to do is PCA on your data. You can then use the approximate eigenfunctions to estimate the rate of decay of the coefficients, and use them in a KL expansion to generate new data.
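A toy sketch of that PCA-then-KL recipe (my own illustration on synthetic 1-D data; none of these variable names come from the library):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "dataset": 200 functions sampled on a 128-point grid.
n_samples, n_grid = 200, 128
x = np.linspace(0.0, 1.0, n_grid)
data = np.stack([sum(rng.standard_normal() / (j + 1) ** 2 *
                     np.sin(2 * np.pi * j * x) for j in range(1, 9))
                 for _ in range(n_samples)])

# PCA: eigen-decomposition of the empirical covariance matrix.
mean = data.mean(axis=0)
cov = np.cov(data, rowvar=False)                # 128 x 128
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
eigvals = np.clip(eigvals, 0.0, None)           # guard tiny negative eigenvalues

# KL expansion: new sample = mean + sum_j sqrt(lambda_j) * xi_j * phi_j.
m = 8                                           # number of modes to retain
xi = rng.standard_normal(m)
new_sample = mean + eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * xi)
```

The decay of eigvals estimated from real data plays the same role as the analytic coefficient decay in the Darcy/Navier-Stokes formulas above.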
from neuraloperator.
Thanks, that was very good information.
self.sqrt_eig = (size**2)*math.sqrt(2.0)*sigma*((4*(math.pi**2)*(k_x**2 + k_y**2) + tau**2)**(-alpha/2.0))
Is this the square root of the eigenvalues of the covariance function? I don't see how you arrive at this formula. Is it derived, or chosen experimentally? Sometimes you use sqrt(2), sometimes pi**2, and sometimes only pi.
Can you suggest some resources that would be helpful for building this kind of dataset, i.e. from which I could derive that self.sqrt_eig formula?
from neuraloperator.
For sampling a Gaussian random field via FFT, you may take a look at this short introduction: Generating stationary Gaussian random fields.
This package covers more advanced implementations for simulating Gaussian random fields: GaussianRandomFields.jl
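To connect those resources with the self.sqrt_eig line quoted earlier, here is a minimal FFT-based sampler for the periodic case (a sketch, not the library's implementation; I take tau = 7, alpha = 2.5, sigma = 7^(3/2) to mirror the Navier-Stokes measure discussed above, and I read the sqrt(2) factor as compensating for keeping only the real part of a complex sample):

```python
import math
import numpy as np

def sample_periodic_grf(size=64, alpha=2.5, tau=7.0, sigma=7.0**1.5, seed=0):
    rng = np.random.default_rng(seed)

    # Integer wavenumbers on the periodic grid: 0, 1, ..., -2, -1.
    k = np.fft.fftfreq(size, d=1.0 / size)
    k_x, k_y = np.meshgrid(k, k, indexing="ij")

    # Square-root eigenvalues of sigma^2 (-Delta + tau^2 I)^(-alpha)
    # in the Fourier basis, with the same form as the quoted sqrt_eig line.
    sqrt_eig = (size**2) * math.sqrt(2.0) * sigma * \
        (4 * (math.pi**2) * (k_x**2 + k_y**2) + tau**2) ** (-alpha / 2.0)
    sqrt_eig[0, 0] = 0.0                         # zero out the mean mode

    # Complex white noise, scaled mode-by-mode; the real part is the field.
    xi = rng.standard_normal((size, size)) + 1j * rng.standard_normal((size, size))
    return np.fft.ifft2(sqrt_eig * xi).real

w0 = sample_periodic_grf()
```

The size**2 factor offsets the 1/size**2 normalisation inside np.fft.ifft2, so the field's magnitude is resolution-independent.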
from neuraloperator.
Hi, I am doing this for a college project: I have to create vortex formation around a cylinder, and I am unable to create a Gaussian random field for the cylinder geometry. Could you please help with this?
It would be of really great help if you could help me out.
from neuraloperator.