
A novel neural network for effective learning of highly impulsive/oscillatory dynamic systems by jointly utilizing low-order derivatives

Home Page: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4296911

License: MIT License

data-driven-model dynamics-simulation impulse-response multibody-dynamics nonlinear-dynamics pytorch surrogate-modeling surrogate-models


cNN-DP: Composite neural network with differential propagation

Lee, Hyeonbeen; Han, Seongji; Choi, Hee-Sun; and Kim, Jin-Gyun, "cNN-DP: Composite Neural Network with Differential Propagation for Impulsive Nonlinear Dynamics." Available at SSRN: https://ssrn.com/abstract=4296911

What is this for?


We propose a novel composite neural network for effectively learning highly impulsive, oscillatory, and chaotic dynamic systems. We treat the target as a high-order derivative and additionally exploit its low-order derivatives, which previous approaches discarded or used only passively.
It is effective specifically for:

  • Modeling oscillatory solutions (derivatives) of differential equations
  • Learning noisy and impulsive time series measurements

The examples we present in our paper cover the following systems:

  • Solutions of chaotic dynamic systems
  • Earthquake measurements
  • Rigid body contact
  • Industrial-level vehicle simulation

What's the advantage of using this?

  • Highly precise generalization on oscillatory, impulsive, and chaotic systems
  • Fast training and inference time
  • Low VRAM usage
  • Robust to data quality and hyperparameters
  • Does not require any physical equations to use
  • Expandable to any domains, such as functions of space, frequency, or any arbitrary variables

How does it work?


Dynamics tend to become more 'impulsive' in higher-order derivatives; conversely, lower-order derivatives are 'simpler'.
So why not let the neural network also refer to the 'simple' while it is learning the 'impulsive', rather than learning the 'impulsive' alone?

We let the neural network learn the 'simple' and the 'impulsive' simultaneously by interconnecting multiple MLP subnetworks through losses on the corresponding derivative orders. The preceding subnets predict the 'simple' low-order derivatives, and their outputs are fed into the inputs of the subsequent subnets. This provides richer context when learning impulsive or chaotic systems as functions of time and design variables.

In torch-style pseudocode, the process can be expressed as:

n_dp0 = MLP1()
n_dp1 = MLP2()
n_dp2 = MLP3()

def forward(x):
    y = n_dp0(x)
    yDot = n_dp1(x, y)
    yDDot = n_dp2(x, y, yDot)
    return y, yDot, yDDot
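The pseudocode above can be fleshed out into a minimal runnable PyTorch sketch. The layer widths, depths, and activations here are illustrative assumptions, not the paper's exact hyperparameters:

```python
import torch
import torch.nn as nn


def mlp(in_dim, out_dim, hidden=64):
    # Small illustrative MLP; the paper's subnets may differ in depth/width.
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.Tanh(),
        nn.Linear(hidden, hidden), nn.Tanh(),
        nn.Linear(hidden, out_dim),
    )


class CNNDP(nn.Module):
    """Composite network: each subnet also receives the outputs of the
    preceding (lower-order) subnets as extra input features."""

    def __init__(self, in_dim=1, out_dim=1):
        super().__init__()
        self.n_dp0 = mlp(in_dim, out_dim)                # predicts y
        self.n_dp1 = mlp(in_dim + out_dim, out_dim)      # predicts y' from (x, y)
        self.n_dp2 = mlp(in_dim + 2 * out_dim, out_dim)  # predicts y'' from (x, y, y')

    def forward(self, x):
        y = self.n_dp0(x)
        y_dot = self.n_dp1(torch.cat([x, y], dim=-1))
        y_ddot = self.n_dp2(torch.cat([x, y, y_dot], dim=-1))
        return y, y_dot, y_ddot


net = CNNDP()
x = torch.randn(8, 1)  # batch of 8 one-dimensional inputs (e.g. time samples)
y, y_dot, y_ddot = net(x)
```

Training then minimizes a loss over all three orders at once (e.g. the sum of per-order MSE terms), so the low-order 'simple' targets regularize the high-order 'impulsive' one.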

This simple composition of subnetworks, together with the augmented multi-order derivatives, produces strong improvements while keeping training and inference fast and cheap.
Although we use only three subnets in our paper, the number of subnets in the cNN-DP is not strictly limited; in theory it can be anywhere from 2 to infinity.

What is the auto-gradient network?

It is a baseline we compare against the proposed cNN-DP; it uses automatic differentiation to learn multi-order derivatives.
Suppose we have data for multiple orders of derivatives and target the highest order, just as in the cNN-DP. The idea of the auto-gradient network is to use automatic differentiation (torch.autograd.grad) to compute the network's high-order predictions.

It first outputs the lowest-order prediction. Then we repeatedly differentiate the network with respect to the time variable (which must be included in the input) until we reach the highest-order prediction.

This approach is a reasonable alternative; however, it turns out to be extremely expensive and slow in both training and inference.
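A minimal sketch of this auto-gradient idea, assuming a scalar time input and a single small network (the architecture here is illustrative, not the paper's):

```python
import torch
import torch.nn as nn

# Illustrative network mapping time t -> y(t).
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

t = torch.linspace(0.0, 1.0, 16).reshape(-1, 1).requires_grad_(True)
y = net(t)

# First derivative dy/dt. create_graph=True keeps the computation graph so
# we can differentiate again -- this repeated graph retention is what makes
# the approach memory-hungry and slow for high orders.
y_dot = torch.autograd.grad(
    y, t, grad_outputs=torch.ones_like(y), create_graph=True
)[0]

# Second derivative d2y/dt2, obtained by differentiating the first.
y_ddot = torch.autograd.grad(
    y_dot, t, grad_outputs=torch.ones_like(y_dot), create_graph=True
)[0]
```

All three tensors (y, y_dot, y_ddot) can then be fitted against the corresponding derivative-order data, analogously to the cNN-DP's multi-order loss.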

How do I use the codes?

The examples directory contains the three examples presented in the paper, each with data generation, training, and visualization code.

Simply running datagen.py will automatically construct the data files in the data directory.

train.py trains the network models and saves them to the models directory. Predictions from a saved model can be obtained through the predict method of the architectures.interface.NetInterface class, as in the following pseudocode:

n_dp = NetInterface('models/SAVED_MODEL.pt')
y, yDot, yDDot = n_dp.predict(input)

Citing the paper (BibTeX)

@article{lee4296911cnn,
  title={cNN-DP: Composite Neural Network with Differential Propagation for Impulsive Nonlinear Dynamics},
  author={Lee, Hyeonbeen and Han, Seongji and Choi, Hee-Sun and Kim, Jin-Gyun},
  journal={Available at SSRN 4296911}
}

cnn-dp's People

Contributors: hyeonbeenlee

