
optimism's Introduction

OptimiSM: Computational solid mechanics made easy with Jax


What is OptimiSM?

OptimiSM is a library for posing and solving problems in solid mechanics using the finite element method. The central theme of this project is exploring how to get better performance and robustness by taking advantage of the tools of variational calculus. OptimiSM uses Lagrangian field theory to pose hard nonlinear solid mechanics problems as optimization problems, and then uses powerful optimization methods to solve them efficiently and reliably.

To do this, OptimiSM relies on Google's JAX library for automatic differentiation and just-in-time compiling for performance.
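
The flavor of this approach can be sketched in a few lines of plain JAX. This is only an illustration of the idea (a toy scalar potential and a hand-rolled Newton loop), not the OptimiSM API:

# A minimal sketch of the idea, not the OptimiSM API: pose equilibrium as
# minimization of a potential energy and let JAX differentiate it.
import jax

stiffness = 10.0
applied_force = 2.0

def potential_energy(u):
    # toy stored energy of a nonlinear spring, minus the work of the applied load
    return 0.5 * stiffness * u**2 + 0.1 * u**4 - applied_force * u

residual = jax.grad(potential_energy)       # equilibrium: dPi/du = 0
tangent = jax.grad(residual)                # second derivative for Newton's method

u = 0.0
for _ in range(10):
    u = u - residual(u) / tangent(u)        # Newton iteration on the scalar problem
print(u, residual(u))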

Why use OptimiSM?

These days, there are lots of finite element software libraries out there. Why would you want to use OptimiSM?

  • OptimiSM is for rapid development: OptimiSM is written in Python and uses the NumPy/SciPy stack. This means that it's easy to read, understand, and extend. If you're like us, and you prefer working in Python/NumPy to C++, you'll find OptimiSM a more pleasant place to work (and play) than heavily abstracted finite element libraries, even the well-designed ones.
    OptimiSM makes use of Jax's just-in-time compilation to get good performance, so the simplicity of Python coding doesn't condemn you to toy problems.
  • OptimiSM provides robust solvers: OptimiSM takes a different approach than most finite element libraries. All problems are formulated by encoding them in a scalar-valued functional and then minimizing that functional. This includes nonlinear phenomena like finite deformations and contact, and even irreversible (dissipative) phenomena like plasticity and viscoelasticity. A big motivation for creating this library was proving to others (and ourselves) that real-world, complex problems could be written in this way, and that it could pay off for solving hard problems. By imposing a minimization structure, the OptimiSM solvers can avoid stagnating in hard problems and also avoid converging to spurious unstable configurations. In other words, OptimiSM helps you find the solutions that should be out there and prevents you from finding "solutions" that really aren't solutions. Check out the examples to see some cases that are difficult or impossible to solve correctly even with commercial codes.
  • OptimiSM gives sensitivities for design optimization, inverse analysis, and training of machine learning models (see the sketch after this list).
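
As a rough illustration of the last point, automatic differentiation makes design sensitivities cheap once a quantity of interest is written as a differentiable function of the design variables. The function below is a toy stand-in, not an OptimiSM interface:

# A hedged sketch of design sensitivities via automatic differentiation; the
# names here are illustrative, not OptimiSM functions.
import jax

def compliance(thickness, load):
    # stand-in for "solve the forward problem, then evaluate a quantity of interest"
    displacement = load / thickness**3      # toy beam-like response
    return load * displacement

# d(compliance)/d(thickness), usable as the gradient in a design-optimization loop
sensitivity = jax.grad(compliance, argnums=0)
print(sensitivity(2.0, 5.0))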

Installation instructions

At the moment, OptimiSM is meant to be used as a development package. First, fork and clone the code repository from GitHub. Next, you have a choice: a basic installation, which requires only a minimal set of dependencies, or the recommended installation, which requires some additional packages. The main difference is that the recommended installation includes the scikit-sparse package, which provides a sparse Cholesky preconditioner. This is needed if you want to run large-scale problems; without it, you'll only be able to use a dense matrix preconditioner (which is slower and uses far more memory).

  • Basic installation: If you just want to try some examples out and test-drive OptimiSM, install the basic installation by navigating into the base project directory and executing
pip install -e .
  • Recommended installation: The scikit-sparse package requires the SuiteSparse library to be present. If you have access to a package manager on your system, this is the easiest way to get it. On a Mac platform, this would be done with MacPorts by running
sudo port install SuiteSparse

or with Homebrew by

brew install suite-sparse

On a Fedora system, you would run

sudo dnf install suitesparse-devel

Of course, you could compile the source yourself if you wish. Check to make sure the version you download is supported by the scikit-sparse package. The source is available from the SuiteSparse website (a GitHub link is also provided there).

Once the SuiteSparse library is in place, navigate into the optimism directory and execute

pip install -e ".[sparse]"

Note that you can always start with the basic installation, and if you want to switch to the recommended version later, you can just get SuiteSparse and run the above recommended installation command to get the additional functionality. You don't need to remove the basic package first.

Sample Installation on OSX using Homebrew

From the optimism directory (adjust the suite-sparse version in the paths below to match the one Homebrew installed):

brew install suite-sparse
brew install python-tk 

INC=/usr/local/Cellar/suite-sparse/5.11.0/include
LIB=/usr/local/Cellar/suite-sparse/5.11.0/lib
pip=/usr/local/opt/python/bin/pip3
SUITESPARSE_INCLUDE_DIR=$INC SUITESPARSE_LIBRARY_DIR=$LIB $pip install -e ".[sparse]"

Installation using spack

Using spack can alleviate some of the steps and headaches encountered in the build described above. To use spack to build optimism in a development environment, follow the instructions below.

If you don't already have spack, you can clone its Git repository with the following command:

git clone https://github.com/spack/spack.git

Once you have spack you can do the following in the optimism directory

source /path/to/spack/share/spack/setup-env.sh
spack env activate .
spack concretize -f
spack install

The above will install all dependencies needed for optimism (including suite sparse and testing dependencies).

Finally, you can install optimism via pip with

pip install -e ".[sparse,test]"

Note that in each new terminal you will need to source the setup-env.sh script from spack and activate the environment in the optimism folder.

Citing OptimiSM

If you use OptimiSM in your research, please cite

@software{OptimiSM,
  author = {Michael R. Tupek and Brandon Talamini},
  title = {{OptimiSM}},
  url = {https://github.com/sandialabs/optimism},
  version = {0.0.1},
  year = {2021},
}

TODO: add citation for contact paper

Reference documentation

For details about the OptimiSM API, see the documentation.

Contact

OptimiSM was created and is maintained by Michael Tupek [email protected] and Brandon Talamini [email protected].

SCR#: 2709.0

optimism's Issues

Multi-material tutorial is broken

The notebook optimism/examples/multi_material_tutorial/optimism_tutorial.ipynb is broken, and so is Mechanics.py. We should look into testing the Jupyter notebooks.

Self contact issues

Self contact is currently broken. If you desire a sideset to contact itself, the current workaround is to split that edge into two edges.

Shape function derivatives use unsupported jax types

The error `unsupported operand type(s) for *: 'list' and 'DeviceArray'` in legendre.py causes failures in the 1D quadrature tests. This does not show up on Mac systems, possibly because an older jax version is used there.

Fix tolerancing in material test example

The example in optimism/examples/material_test runs correctly but appears to have errors because the requested error tolerance is below floating point precision (at least that is what it looks like to me).

Make sparse cholesky optional again

We cut a corner earlier and removed all other preconditioning options in favor of the scikit-sparse Cholesky, which is supposed to be an optional package. We need to restore the same functionality with the SciPy dense Cholesky routines (which we had at one point), and make that the default when scikit-sparse is not present.
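
A possible shape for the fallback, assuming a generic factor-and-solve interface (the names are illustrative, and A is taken to be a symmetric positive-definite SciPy sparse matrix):

# Sketch only: prefer the scikit-sparse Cholesky when it is importable,
# otherwise fall back to a SciPy dense factorization.
import numpy as onp
import scipy.linalg
from scipy.sparse.linalg import LinearOperator

def make_preconditioner(A):
    try:
        from sksparse.cholmod import cholesky
        solve = cholesky(A.tocsc())          # Factor objects are callable solvers
    except ImportError:
        # dense fallback: slower and memory-hungry, but dependency-free
        c, low = scipy.linalg.cho_factor(A.toarray())
        solve = lambda b: scipy.linalg.cho_solve((c, low), b)
    return LinearOperator(A.shape, matvec=lambda b: solve(onp.asarray(b)))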

Check if inherited methods on `ScaledObjective` are correct

The ScaledObjective class inherits functions from Objective for computing gradients, Hessian-vector products, etc. It looks like some of these methods may not account for the scaling that ScaledObjective applies to the unknowns.
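
For reference, the chain-rule factors at stake look like the following with a generic diagonal scaling. This is only an illustration of the concern, not the ScaledObjective implementation:

# If the solver works with scaled unknowns y = u / s, gradients with respect
# to y must pick up a factor of s relative to gradients with respect to u.
import jax
import jax.numpy as jnp

def objective(u):
    return jnp.sum(u**4) + jnp.dot(u, u)

s = jnp.array([1.0, 10.0, 100.0])            # per-unknown scaling
scaled_objective = lambda y: objective(s * y)

u = jnp.array([1.0, 2.0, 3.0])
y = u / s

grad_u = jax.grad(objective)(u)
grad_y = jax.grad(scaled_objective)(y)
print(jnp.allclose(grad_y, s * grad_u))      # True: the scaling must be applied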

Missing patch test for dynamics

The tests of the Newmark method only check that it can integrate rigid motions. We should also be checking against an exact solution for a problem with deformation.

Unit test multi-materials

I updated the multi-material stuff in the Mechanics module, but didn't test it. It probably needs a redesign - it's weird to have it stand apart from the single-material version. Needing to test and maintain both separately is a burden. See #25 for some ideas.

Need to specify CPU/GPU explicitly in jax dependence

Seems that now pip install --upgrade jax doesn't install the jaxlib dependency. It works if you instead use pip install --upgrade "jax[cpu]".

Change our build dependencies to do this automatically, at least for the standard build. We can add an option later for GPU.
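
One way to express this is a setuptools dependency on the CPU extra; the excerpt below is hypothetical, and the project's actual packaging configuration may differ:

# Hypothetical setup.py excerpt, for illustration only.
from setuptools import setup, find_packages

setup(
    name="optimism",
    packages=find_packages(),
    # "jax[cpu]" pulls in a matching jaxlib wheel; a GPU extra could be added later
    install_requires=["jax[cpu]"],
)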

Fix deprecation warning when calling scipy conjugate gradient

The calls to the CG solver are using a deprecated interface:

DeprecationWarning: scipy.sparse.linalg.cg called without specifying atol. The default value will be changed in a future release. For compatibility, specify a value for atol explicitly, e.g., cg(..., atol=0), or to retain the old behavior cg(..., atol='legacy')
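
The fix is to pass atol explicitly. For example, on a toy system (not the actual solver code):

# Passing atol explicitly silences the DeprecationWarning.
import numpy as onp
from scipy.sparse import identity
from scipy.sparse.linalg import cg

A = identity(100, format="csr") * 4.0
b = onp.ones(100)
x, info = cg(A, b, atol=0)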

Scipy 1.12.0 breaks GMRES usage

Something breaks gmres when going from scipy 1.11.4 to 1.12.0. Below is the error trace:

Traceback (most recent call last):
  File "test_NewtonGlobalization.py", line 136, in test_newton_step
    xl += newton_step(residual, lambda v: linear_op(xl,v), xl)[0]
  File "NewtonSolver.py", line 45, in newton_step
    dx, exitcode = gmres(A, r, rtol=relTol*rNorm, atol=0, callback_type='legacy', callback=callback, maxiter=maxIters)
  File "lib/scipy/_lib/deprecation.py", line 213, in inner_f
    return f(*args, **kwargs)
  File "lib/scipy/sparse/linalg/_isolve/iterative.py", line 785, in gmres
    w -= tmp * v[k, :]
ValueError: output array is read-only

Documentation

  • Add docstrings, at least to the API functions.
  • Use Sphinx to generate documentation pages like NumPy and JAX do.

Ensure elements in blocks are consecutive

The element IDs in blocks are not enforced to be consecutive, which is bad for performance. We could sort the elements by block once the mesh is read in or generated. Look into the indices_are_sorted and unique_indices parameters on the jax ndarray indexing tools (a sketch is given below).

Another possibility is to make the block another global array index, so that the loop over blocks becomes vmap-able. For example, the shape function array would have 3 indices: [block][element][element node].
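
A hedged sketch of the first idea, using made-up connectivity data: sort the elements so that each block's IDs are contiguous, then tell JAX that the gather indices are sorted and unique:

# Illustrative only; the array names are not OptimiSM data structures.
import numpy as onp
import jax.numpy as jnp

conns = onp.array([[0, 1, 2], [2, 3, 0], [1, 4, 2], [3, 4, 2]])  # element connectivity
block_of_element = onp.array([1, 0, 1, 0])                       # block id per element

order = onp.argsort(block_of_element, kind="stable")  # group elements by block
conns_sorted = jnp.asarray(conns[order])

block0_elements = jnp.arange(2)   # after sorting, block 0 occupies elements 0..1
block0_conns = conns_sorted.at[block0_elements].get(indices_are_sorted=True,
                                                    unique_indices=True)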

Avoid array copy when applying preconditioner

The sparse Cholesky preconditioner makes a copy of the residual vector when it is a Jax-numpy array:

b = onp.array(b)

I think the residual is always a Jax-numpy array, so the expense of the copy may be significant. The preconditioner is applied on every CG iteration, which compounds the expense. We may want to use the copy=False argument in the numpy array constructor: see here.
