
dl_heat_eq_inversion

Exploring numerical inverse modeling with deep learning from a "simplified" perspective

Background

I started exploring this in response to some questions about the feasibility of deep-learning-based inverse modeling techniques for a much more complex topic. Setting that context aside, I'm aiming to understand some of the limitations and possibilities of combining numerical forward modeling, deep-learning inverse modeling, and physically guided constraints.

Problem statement: heat equation

The 1D heat equation is a partial differential equation that describes the distribution of heat in a system over time. The equation is given by:

$$\frac{\partial u(x, t)}{\partial t} = D(x) \frac{\partial^2 u(x, t)}{\partial x^2}$$

where $u(x, t)$ represents the temperature at position $x$ and time $t$, and $D(x)$ is the thermal diffusivity. For this experiment we consider a diffusivity that is constant in time, though it may vary in space. The domain of the system is fixed to $x\in[0,1]$. To complete the problem description we require boundary conditions at $x=0$ and $x=1$.

For the sake of simplicity we tie the boundary conditions and the form of the diffusivity function together, making it easy to generate random problem instances. To generate a diffusivity function we sample a number of Gaussian curves with random means and variances and use them as the diffusivity. When more than a single Gaussian is sampled, the overall result is averaged at each point in the domain. The boundary conditions are set by the minimum and maximum of the diffusivity, so that $u(x=0)=\min(D)$ and $u(x=1)=\max(D)$.
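A minimal sketch of this sampling procedure is shown below. The number of Gaussians and the parameter ranges (means in $[0,1]$, standard deviations in $[0.05, 0.3]$) are illustrative assumptions, not the repository's exact values.

```python
import numpy as np

def sample_diffusivity(x, n_gaussians=3, rng=None):
    """Build a spatially varying diffusivity D(x) by averaging random Gaussian curves."""
    rng = np.random.default_rng() if rng is None else rng
    means = rng.uniform(0.0, 1.0, size=n_gaussians)
    stds = rng.uniform(0.05, 0.3, size=n_gaussians)
    # Evaluate each Gaussian on the grid and average point-wise
    curves = np.exp(-0.5 * ((x[None, :] - means[:, None]) / stds[:, None]) ** 2)
    return curves.mean(axis=0)

x = np.linspace(0.0, 1.0, 100)
D = sample_diffusivity(x)

# Boundary conditions tied to the diffusivity field:
# u(x=0) = min(D), u(x=1) = max(D)
u_left, u_right = D.min(), D.max()
```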

We solve the heat equation via the Crank-Nicolson implicit method, with a timestep of $dt=0.001$ and end time of $t_{end}=10.0$. One hundred gridpoints are used in the spatial discretization.
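The following sketch shows one way a Crank-Nicolson stepper for $u_t = D(x)\,u_{xx}$ with Dirichlet boundaries could look, assuming a uniform grid on $[0,1]$ and an initial profile whose endpoints carry the boundary values; the repository's solver may differ in these details.

```python
import numpy as np

def solve_heat_cn(D, u0, dt=0.001, t_end=10.0):
    """Crank-Nicolson time stepping for u_t = D(x) u_xx with Dirichlet BCs."""
    nx = len(u0)
    dx = 1.0 / (nx - 1)
    # Second-difference operator scaled by the local diffusivity;
    # boundary rows are left zero so the boundary values stay fixed.
    A = np.zeros((nx, nx))
    for i in range(1, nx - 1):
        A[i, i - 1] = D[i] / dx**2
        A[i, i] = -2.0 * D[i] / dx**2
        A[i, i + 1] = D[i] / dx**2
    I = np.eye(nx)
    # (I - dt/2 A) u^{n+1} = (I + dt/2 A) u^n, precomputed as one step matrix
    step = np.linalg.solve(I - 0.5 * dt * A, I + 0.5 * dt * A)
    u = u0.copy()
    for _ in range(int(round(t_end / dt))):
        u = step @ u
    return u
```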

Experimental setup

Basic setup for the "complete" inversion model

To set up the baseline model we use a simple one-dimensional CNN with nothing special going on. The inverse model that we wish to construct is:

$$ D(x) \approx f(u(x)) $$

The second notebook sets up the learning process for a first attempt at this, which works quite well.
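A sketch of the kind of one-dimensional CNN described above is shown here. The layer widths, kernel size, and activation are illustrative assumptions, not the notebook's exact configuration.

```python
import torch
import torch.nn as nn

class InverseCNN(nn.Module):
    """A plain 1D CNN mapping a temperature profile u(x) to a diffusivity D(x)."""
    def __init__(self, channels=32, kernel_size=5):
        super().__init__()
        pad = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Conv1d(channels, 1, kernel_size, padding=pad),
        )

    def forward(self, u):
        # u: (batch, 1, nx) temperature profiles -> (batch, 1, nx) diffusivities
        return self.net(u)

model = InverseCNN()
u_batch = torch.randn(8, 1, 100)   # a batch of simulated temperature profiles
d_pred = model(u_batch)            # predicted diffusivity fields
```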
