Based in Braunschweig, Germany.
- I'm a Staff Scientist at the German Aerospace Center (DLR)
- Currently working on reduced-order models for fluid dynamics via machine learning
DeepCFD: Efficient Steady-State Laminar Flow Approximation with Deep Convolutional Neural Networks
License: MIT License
First of all, thank you for making this repository available. I have learned a lot from it so far. However, I found something troubling while looking at the code on this line.
I would say that computing channel weights from all the available data causes information leakage from the test/validation set into the model. Ideally, the channel weights should be computed from the training set only, since the model is supposed to generalize to the test set having seen only the training data.
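To illustrate the point, here is a minimal, self-contained sketch (with synthetic data; the array shapes and variable names are illustrative, not the repository's) of computing per-channel normalization statistics from the training split only and then reusing them on the test split:

```python
import numpy as np

# Synthetic stand-in for a dataset of shape (N, C, H, W)
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=(100, 3, 16, 16))

train, test = data[:80], data[80:]

# Statistics come from `train` alone, so nothing leaks from `test`.
mean = train.mean(axis=(0, 2, 3), keepdims=True)  # shape (1, C, 1, 1)
std = train.std(axis=(0, 2, 3), keepdims=True)

train_norm = (train - mean) / std
test_norm = (test - mean) / std  # reuse the training statistics
```

The test-set statistics will not be exactly zero-mean/unit-variance under this scheme, and that is expected: the point is that the transformation is fixed before the test data is ever seen.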
Extend AU/U-Net models and examples to 3D (WIP at https://github.com/carpemonf/DeepCFD/tree/14-create-3d-models)
Create a physics-informed NN model for problems governed by partial differential equations.
Hi there,
Thanks for uploading this. I am exploring the code now and was wondering if you could provide the finalized weights of the model. Unfortunately, I don't have the hardware to run the training, but I would like to explore the results.
Josh
This branch will be used to investigate different PINN model strategies without affecting the development version in the "13-create-pinn-model" branch.
GNNs (graph neural networks) should be implemented and evaluated as data-driven models and/or used as learned representations for PINNs.
Hi there,
Thank you for publishing this! I have been looking through the code and noticed that the number of layers per block was not specified in your DeepCFD.py script.
model = UNetEx(3, 3, filters=filters, kernel_size=kernel_size, batch_norm=bn, weight_norm=wn)
This results in the number of layers per block defaulting to 3, as defined in the UNetEx class. I was just wondering whether this was intended, since it differs from the 2 layers per block described in your paper. Thank you again!
Kind regards,
Sean
Dear sir,
Thank you so much for publishing this fantastic work. I went through the code and have a small concern:
How do you compute the multi-region label? Did you do it directly in OpenFOAM, or in some other software?
Once again, thank you for considering my question. I would really appreciate it if you could guide me through it. Wishing you all the best.
Create an MLP model to allow the solution of other types of non-image data.
Hello,
First of all, thank you very much for sharing your model.
I've read the paper and tested the model with the data you've provided, but I wonder what the results would be if the model were trained on different or larger datasets.
In the paper you state that the dataset was generated with the simpleFoam solver, but you also used a random geometry generator and a shell script to save the data in the form that is then passed to the model.
Is there any chance you could also share the data generation scripts or some guidelines?
Best regards,
Adam
Hello, and thanks for sharing your model.
I have a question about its generality: how do you represent the inlet and outlet conditions of the fluid, for example the exact velocity or pressure at the inlet and outlet? Do I have to regenerate the data and retrain the model whenever these conditions change, or is there a way to encode them as inputs to the model?
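One common workaround for this (a hedged sketch, not something described in the paper) is to append the boundary values as extra constant input channels, so a single network can be trained across varying inlet conditions. The channel layout and function name below are illustrative, not the repository's actual format:

```python
import numpy as np

def add_bc_channel(geometry, inlet_velocity):
    """geometry: (C, H, W) array; returns (C+1, H, W) with a constant
    channel broadcasting the scalar inlet velocity over the domain."""
    c, h, w = geometry.shape
    bc = np.full((1, h, w), inlet_velocity, dtype=geometry.dtype)
    return np.concatenate([geometry, bc], axis=0)

# e.g. three geometry channels (SDF, region labels, ...) plus one BC channel
sample = np.zeros((3, 172, 79), dtype=np.float32)
augmented = add_bc_channel(sample, inlet_velocity=1.5)
```

The training set would then need to sample a range of inlet velocities for the network to interpolate between them; a model trained at a single condition still cannot be expected to extrapolate.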
Hey guys,
I was exploring your code and found a strong imbalance in the dataset: your test split contains 2 samples that contribute almost all of the MSE. Leaving those two out would reduce the reported MSE of 2.04 to around 0.3. The CFD solutions for those samples look completely different from the rest, and in total there are about 6 or 7 such samples in the dataset. Since I guess this is not intended, I wanted to let you know :)
Best regards
Alex
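Outliers like these are easy to spot by ranking per-sample errors instead of averaging over the whole split. A minimal sketch with synthetic stand-in arrays (the shapes and the injected outlier are illustrative, not the DeepCFD data):

```python
import numpy as np

# Synthetic predictions/ground truth of shape (N, C, H, W)
rng = np.random.default_rng(1)
truth = rng.normal(size=(10, 3, 8, 8))
pred = truth + rng.normal(scale=0.1, size=truth.shape)
pred[2] += 5.0  # inject one outlier sample

# MSE per sample: average over channels and spatial dims only
per_sample_mse = ((pred - truth) ** 2).mean(axis=(1, 2, 3))
worst = np.argsort(per_sample_mse)[::-1]  # sample indices, worst first
```

Comparing `per_sample_mse[worst[0]]` against the median makes it immediately clear when a handful of samples dominate the aggregate metric.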
Hi, mdribeiro:
I really appreciate your excellent work. May I ask whether the result.json file contains the trained model's weights?
Thank you very much
Monica
Tests should be created to make sure the repository can grow safely.