
frontistr / frontistr

This is the official GitHub mirror repository of FrontISTR, an open-source large-scale parallel FEM program for nonlinear structural analysis. Active development of FrontISTR is hosted at https://gitlab.com/FrontISTR-Commons/FrontISTR.

Home Page: https://www.frontistr.com/

License: MIT License

Makefile 1.09% C 19.67% Shell 0.38% Fortran 20.48% C++ 1.43% Lex 0.15% Perl 0.04% CMake 0.47% Dockerfile 0.08% HTML 0.01% Roff 56.22%
fem finite-element-analysis finite-element-methods mechanical-engineering computational-mechanics structural-engineering high-performance-computing parallel-computing

frontistr's Introduction

README of FrontISTR on HEC-MW

FrontISTR Commons

Comments, Questions, Problems etc.

e-mail : [email protected]

Programs in this archive

FrontISTR : Open-Source Large-Scale Parallel FEM Program for Nonlinear Structural Analysis

Detailed manuals

Files in this directory

README.md : README (in English : this file)
README.ja.md : README (in Japanese)
VERSION : version information
setup.sh & setup_fistr.sh : shell scripts to create makefiles
Makefile.am : base file of makefile for installation
Makefile.conf : setting file for users
Makefile.dev : setting file for developers

doc/ : documents
tutorial/ : tutorial data
examples/ : some examples
fistr1/ : FrontISTR
hecmw1/ : HEC-MW
etc/ : various setting files

NOTICE

Please read "License.txt" carefully BEFORE you start to use this software.

frontistr's People

Contributors

cae-yoshino, chikasuiro, getwelsim, highlandvalley, hillyuan, hiroki2805, kameko-haya, kazuya-goto, kinagaki, kinagaki-fj, lcheng9, luzpaz, michioga, mitsumega-idaj, noriyukikushida, nqomorita, sakurano, sundaydeveloper, t-hishinuma, termoshtt, tokunaga-advance, tvgez


frontistr's Issues

Failed to run in parallel

I tried to run mpiexec -np 4 fistr1, but it failed. The error information is below:

##################################################################
#                         FrontISTR                              #
##################################################################
---
version:    5.1.0
git_hash:   acab000c8c633b7b9d596424769e14363f720841
build:
  date:     2020-09-28T09:23:01Z
  MPI:      enabled
  OpenMP:   enabled
  option:   "-p --with-tools --with-refiner --with-metis --with-mumps --with-lapack --with-ml "
  HECMW_METIS_VER: 5
execute:  
  date:       2020-09-28T19:41:09+0800
  processes:  1
  threads:    32
  cores:      32
  host:
    0: l-Pro-WS-C621-64L-SAGE-Series
---
...
[the same banner is printed three more times by the other launched instances, each also reporting processes: 1]
 Step control not defined! Using default step=1
 Step control not defined! Using default step=1
 Step control not defined! Using default step=1
 fstr_setup: OK
 fstr_setup: OK
 Step control not defined! Using default step=1
 fstr_setup: OK
 fstr_setup: OK
 Start visualize PSF 1 at timestep 0
 Start visualize PSF 1 at timestep 0
 Start visualize PSF 1 at timestep 0
 Start visualize PSF 1 at timestep 0
 
 loading step=    1
 
 loading step=    1
 sub_step= 1,   current_time=  0.0000E+00, time_inc=  0.1000E+01
 loading_factor=    0.0000000   1.0000000
 sub_step= 1,   current_time=  0.0000E+00, time_inc=  0.1000E+01
 loading_factor=    0.0000000   1.0000000
Fatal error in PMPI_Isend: Invalid rank, error stack:
PMPI_Isend(149): MPI_Isend(buf=0x55e9eeff4be0, count=1374, MPI_DOUBLE_PRECISION, dest=1, tag=0, comm=0x84000000, request=0x55e9eefedb60) failed
PMPI_Isend(97).: Invalid rank has value 1 but must be nonnegative and less than 1
[unset]: Fatal error in PMPI_Isend: Invalid rank, error stack:
PMPI_Isend(149): MPI_Isend(buf=0x562909761be0, count=1374, MPI_DOUBLE_PRECISION, dest=1, tag=0, comm=0x84000000, request=0x56290975ab60) failed
PMPI_Isend(97).: Invalid rank has value 1 but must be nonnegative and less than 1
write_line error; fd=-1 buf=:cmd=abort exitcode=805943046
:
system msg for write_line failure : Bad file descriptor
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=134854406
:
system msg for write_line failure : Bad file descriptor
 
 loading step=    1
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
Fatal error in PMPI_Isend: Invalid rank, error stack:
PMPI_Isend(149): MPI_Isend(buf=0x561587e06be0, count=1374, MPI_DOUBLE_PRECISION, dest=1, tag=0, comm=0x84000000, request=0x561587dffb60) failed
PMPI_Isend(97).: Invalid rank has value 1 but must be nonnegative and less than 1
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=269072134
:
system msg for write_line failure : Bad file descriptor
 sub_step= 1,   current_time=  0.0000E+00, time_inc=  0.1000E+01
 loading_factor=    0.0000000   1.0000000
 
 loading step=    1
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[31226,1],2]
  Exit code:    6 

And this is my computer configuration:

l@l-Pro-WS-C621-64L-SAGE-Series
OS: Ubuntu 18.04.4 LTS x86_64
Host: Pro WS C621-64L SAGE Series
Kernel: 5.4.0-48-generic
Uptime: 1 hour, 28 mins
Packages: 1960
Shell: bash 4.4.20
Resolution: 1920x1080
DE: GNOME 3.28.4
WM: GNOME Shell
WM Theme: Adwaita
Theme: Ambiance [GTK2/3]
Icons: Ubuntu-mono-dark [GTK2/3]
Terminal: gnome-terminal
CPU: Intel Xeon W-3245M (32) @ 4.400GHz
GPU: NVIDIA Corporation Device 1e81
Memory: 1411MiB / 515420MiB
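The abort is consistent with the error text: MPI_Isend rejects dest=1 because every fistr1 banner reports processes: 1, meaning each instance owns a communicator of size 1, so there is no rank 1 to send to. The following is an illustration only (not FrontISTR code) of the bounds check MPI applies:

```shell
# Illustration only, not FrontISTR code: MPI_Isend requires
# 0 <= dest < comm_size. The log above shows comm_size 1 with dest 1.
rank_ok() {  # usage: rank_ok DEST COMM_SIZE
  if [ "$1" -ge 0 ] && [ "$1" -lt "$2" ]; then
    echo "ok"
  else
    echo "invalid rank $1 for communicator of size $2"
  fi
}
rank_ok 1 1   # the failing case from the log
rank_ok 1 4   # legal once four ranks actually share one communicator
```

When each instance still prints processes: 1 under mpiexec -np 4, that usually indicates either that the fistr1 binary and the mpiexec launcher come from different MPI installations, or that a serial (unpartitioned-mesh) setup was launched under mpiexec.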

Output result error (CONTACT_STATE etc.) in dynamic explicit method (central difference method)

When I compute something with FrontISTR using the dynamic explicit method, I found that some results stay 0.000000000000E+0 in all result files, namely CONTACT_STATE, CONTACT_NFORCE, etc. For CONTACT_STATE, I found a function call set_contact_state_vector( fstrSOLID%contacts(i), dt, fstrSOLID%CONT_RELVEL, fstrSOLID%CONT_STATE ) in fstr_scan_contact_state but not in fstr_scan_contact_state_exp; when I added it, CONTACT_STATE changed. But for CONTACT_NFORCE, I do not know how to fix it. Is the explicit method not supposed to output this contact information, or is this a bug?

msys2/mingw and built-in packages

I tried to compile FrontISTR with these built-in packages and I got these errors:

  1. I have to add the flag -DMUMPS_LIBRARIES=${MINGW_PREFIX}/lib so CMake can find MUMPS, but after this:
[710/710] Linking CXX executable fistr1\fistr1.exe
FAILED: fistr1/fistr1.exe
cmd.exe /C "cd . && C:\msys64\mingw64\bin\c++.exe -march=x86-64 -mtune=generic -O2 -pipe -fopenmp -O3 -DNDEBUG -pipe -fopenmp fistr1/CMakeFiles/fistr1.dir/src/main/fistr_main.f90.obj fistr1/CMakeFiles/fistr1.dir/src/main/main.c.obj -o fistr1\fistr1.exe -Wl,--out-implib,fistr1\libfistr1.dll.a -Wl,--major-image-version,0,--minor-image-version,0  fistr1/libfistr.a  hecmw1/libhecmw.a  -lscalapack  -lparmetis  -lmetis  -lopenblas  -lopenblas  -lws2_32  -lgfortran  -lmingw32  -lgcc_s  -lgcc  -lmoldname  -lmingwex  -lkernel32  -lquadmath  -lm  -lmingw32  -lgcc_s  -lgcc  -lmoldname  -lmingwex  -lkernel32  -lpthread  -ladvapi32  -lshell32  -luser32  -lkernel32  -lmingw32  -lgcc_s  -lgcc  -lmoldname  -lmingwex  -lkernel32  -lquadmath  -lm  -lpthread  -ladvapi32  -lshell32  -luser32  -lgfortran  -lquadmath  -lm  -lkernel32 -luser32 -lgdi32 -lwinspool -lshell32 -lole32 -loleaut32 -luuid -lcomdlg32 -ladvapi32 && cd ."
C:/msys64/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/10.3.0/../../../../x86_64-w64-mingw32/bin/ld.exe: hecmw1/libhecmw.a(hecmw_MUMPS_wrapper.F90.obj):hecmw_MUMPS_wrapper.F90:(.text+0x2df): undefined reference to `dmumps_'
C:/msys64/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/10.3.0/../../../../x86_64-w64-mingw32/bin/ld.exe: hecmw1/libhecmw.a(hecmw_MUMPS_wrapper.F90.obj):hecmw_MUMPS_wrapper.F90:(.text+0x3c7): undefined reference to `dmumps_'
collect2.exe: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.
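The undefined dmumps_ reference suggests the MUMPS libraries themselves never made it onto the link line; pointing MUMPS_LIBRARIES at a directory only helps discovery. A hedged sketch of passing the actual archives instead (libdmumps, libmumps_common, and libpord are the standard MUMPS library names; whether FrontISTR's CMake accepts a semicolon list here is an assumption):

```shell
# Sketch, not verified against FrontISTR's CMakeLists: pass the MUMPS
# archives themselves (standard MUMPS library names) instead of the lib dir.
cmake -G Ninja \
  -DMUMPS_LIBRARIES="${MINGW_PREFIX}/lib/libdmumps.a;${MINGW_PREFIX}/lib/libmumps_common.a;${MINGW_PREFIX}/lib/libpord.a" \
  .
```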
  2. CMake finds MS-MPI:
-- Found MPI_C: C:/msys64/mingw64/lib/libmsmpi.a (found version "2.0")
-- Found MPI_CXX: C:/msys64/mingw64/lib/libmsmpi.a (found version "2.0")
-- Found MPI_Fortran: C:/msys64/mingw64/lib/libmsmpi.a (found version "2.0")
-- Found MPI: TRUE (found version "2.0")

but I get this error:

[264/710] Building Fortran object hecmw1/CMakeFiles/hecmw.dir/src/common/hecmw_util_f.F90.obj
FAILED: hecmw1/CMakeFiles/hecmw.dir/src/common/hecmw_util_f.F90.obj hecmw1/hecmw_util.mod
C:\msys64\mingw64\bin\gfortran.exe -IC:\msys64\usr\local\pkg_frontISTR\src\FrontISTR-v5.2\hecmw1\src\common -I. -IC:/msys64/mingw64/include -IC:/msys64/usr/local/pkg_frontISTR/src/FrontISTR-v5.2/hecmw1/src/common -IC:/msys64/usr/local/pkg_frontISTR/src/FrontISTR-v5.2/hecmw1/src/solver/precond/33 -IC:/msys64/usr/local/pkg_frontISTR/src/FrontISTR-v5.2/hecmw1/src/solver/precond/nn -IC:/msys64/usr/local/pkg_frontISTR/src/FrontISTR-v5.2/hecmw1/src/hecmw -IC:/msys64/usr/local/pkg_frontISTR/src/FrontISTR-v5.2/hecmw1/src/visualizer -fallow-argument-mismatch -fopenmp -fno-range-check -O3 -DNDEBUG -O3 -Jhecmw1 -fpreprocessed -c hecmw1/CMakeFiles/hecmw.dir/src/common/hecmw_util_f.F90-pp.f90 -o hecmw1/CMakeFiles/hecmw.dir/src/common/hecmw_util_f.F90.obj
mpif.h:227:36:

Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see '-fno-allow-invalid-boz']
mpif.h:303:27:

Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see '-fno-allow-invalid-boz']
mpif.h:305:36:

Built-in packages in msys2/mingw64
https://packages.msys2.org/package/mingw-w64-x86_64-mumps?repo=mingw64
https://packages.msys2.org/package/mingw-w64-x86_64-msmpi?repo=mingw64
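For the BOZ errors: gfortran 10 and newer reject the BOZ literal constants that MS-MPI's mpif.h uses, and the error message itself names the controlling flag. The usual workaround (assuming FrontISTR passes CMAKE_Fortran_FLAGS through to the compiler unchanged) is to allow them:

```shell
# -fallow-invalid-boz is the inverse of the -fno-allow-invalid-boz flag
# cited in the error; it downgrades the BOZ diagnostics so mpif.h compiles.
cmake -G Ninja -DCMAKE_Fortran_FLAGS="-fallow-invalid-boz" .
```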

About Coupled Temperature-Displacement analysis type

Dear tech. support of FrontISTR,

Good afternoon, here is TAN.

I would like to try to implement coupled temperature-displacement analysis within FrontISTR; however, heat transfer and static/quasi-static analysis are separate at the moment, if I am not mistaken.

It would take a lot of time to fully comprehend the source code (the new coupled element, the new time stepping involving the (nonlinear) load stepping, and, most difficult, the information communication interface between the two processes, and so on...).

Could you give me some suggestions about this?

Thanks so much.

Best regards,
TAN

mpi and hybrid tests fail on macOS PPC

If someone could suggest what the problem is, that would be great.

50% tests passed, 50 tests failed out of 100

Label Time Summary:
analysis/dynamic/explicit,hybrid    =   0.78 sec*proc (1 test)
analysis/dynamic/explicit,mpi       =   0.93 sec*proc (1 test)
analysis/dynamic/explicit,openmp    =  75.94 sec*proc (1 test)
analysis/dynamic/explicit,serial    =  80.01 sec*proc (1 test)
analysis/dynamic/implicit,hybrid    =   0.79 sec*proc (1 test)
analysis/dynamic/implicit,mpi       =   0.99 sec*proc (1 test)
analysis/dynamic/implicit,openmp    = 306.39 sec*proc (1 test)
analysis/dynamic/implicit,serial    = 128.01 sec*proc (1 test)
analysis/eigen/exJ,hybrid           =   3.42 sec*proc (1 test)
analysis/eigen/exJ,mpi              =   4.38 sec*proc (1 test)
analysis/eigen/exJ,openmp           =  97.96 sec*proc (1 test)
analysis/eigen/exJ,serial           =  40.45 sec*proc (1 test)
analysis/eigen/exK,hybrid           =   4.12 sec*proc (1 test)
analysis/eigen/exK,mpi              =   5.64 sec*proc (1 test)
analysis/eigen/exK,openmp           =  86.77 sec*proc (1 test)
analysis/eigen/exK,serial           =  41.22 sec*proc (1 test)
analysis/heat/exM,hybrid            =   2.17 sec*proc (1 test)
analysis/heat/exM,mpi               =   2.54 sec*proc (1 test)
analysis/heat/exM,openmp            =   5.68 sec*proc (1 test)
analysis/heat/exM,serial            =   4.22 sec*proc (1 test)
analysis/heat/exN,hybrid            =   3.73 sec*proc (1 test)
analysis/heat/exN,mpi               =   3.90 sec*proc (1 test)
analysis/heat/exN,openmp            =  20.96 sec*proc (1 test)
analysis/heat/exN,serial            =   8.64 sec*proc (1 test)
analysis/heat/exO,hybrid            =   3.77 sec*proc (1 test)
analysis/heat/exO,mpi               =   4.33 sec*proc (1 test)
analysis/heat/exO,openmp            =  22.45 sec*proc (1 test)
analysis/heat/exO,serial            =   7.84 sec*proc (1 test)
analysis/heat/exP,hybrid            =   3.73 sec*proc (1 test)
analysis/heat/exP,mpi               =   3.90 sec*proc (1 test)
analysis/heat/exP,openmp            =  22.67 sec*proc (1 test)
analysis/heat/exP,serial            =   8.18 sec*proc (1 test)
analysis/heat/exQ,hybrid            =   3.75 sec*proc (1 test)
analysis/heat/exQ,mpi               =   3.87 sec*proc (1 test)
analysis/heat/exQ,openmp            =  21.91 sec*proc (1 test)
analysis/heat/exQ,serial            =   7.68 sec*proc (1 test)
analysis/heat/exQ2,hybrid           =   0.34 sec*proc (1 test)
analysis/heat/exQ2,mpi              =   0.36 sec*proc (1 test)
analysis/heat/exQ2,openmp           =   3.94 sec*proc (1 test)
analysis/heat/exQ2,serial           =   1.72 sec*proc (1 test)
analysis/heat/exR,hybrid            =   3.76 sec*proc (1 test)
analysis/heat/exR,mpi               =   3.98 sec*proc (1 test)
analysis/heat/exR,openmp            =  26.93 sec*proc (1 test)
analysis/heat/exR,serial            =   9.21 sec*proc (1 test)
analysis/heat/exS,hybrid            =   3.76 sec*proc (1 test)
analysis/heat/exS,mpi               =   3.86 sec*proc (1 test)
analysis/heat/exS,openmp            =  24.99 sec*proc (1 test)
analysis/heat/exS,serial            =   7.99 sec*proc (1 test)
analysis/heat/exT,hybrid            =   0.32 sec*proc (1 test)
analysis/heat/exT,mpi               =   0.34 sec*proc (1 test)
analysis/heat/exT,openmp            =   1.09 sec*proc (1 test)
analysis/heat/exT,serial            =   0.68 sec*proc (1 test)
analysis/heat/exU,hybrid            =   4.81 sec*proc (1 test)
analysis/heat/exU,mpi               =   5.14 sec*proc (1 test)
analysis/heat/exU,openmp            = 469.30 sec*proc (1 test)
analysis/heat/exU,serial            = 117.13 sec*proc (1 test)
analysis/heat/exU2,hybrid           =   0.38 sec*proc (1 test)
analysis/heat/exU2,mpi              =   0.42 sec*proc (1 test)
analysis/heat/exU2,openmp           =   3.73 sec*proc (1 test)
analysis/heat/exU2,serial           =   1.91 sec*proc (1 test)
analysis/heat/exV,hybrid            =   0.94 sec*proc (1 test)
analysis/heat/exV,mpi               =   1.01 sec*proc (1 test)
analysis/heat/exV,openmp            = 274.92 sec*proc (1 test)
analysis/heat/exV,serial            =  49.52 sec*proc (1 test)
analysis/static/exA,hybrid          =   5.23 sec*proc (1 test)
analysis/static/exA,mpi             =   5.95 sec*proc (1 test)
analysis/static/exA,openmp          =  22.15 sec*proc (1 test)
analysis/static/exA,serial          =  16.56 sec*proc (1 test)
analysis/static/exB,hybrid          =   3.94 sec*proc (1 test)
analysis/static/exB,mpi             =   4.14 sec*proc (1 test)
analysis/static/exB,openmp          =  18.19 sec*proc (1 test)
analysis/static/exB,serial          =  13.69 sec*proc (1 test)
analysis/static/exC,hybrid          =   3.92 sec*proc (1 test)
analysis/static/exC,mpi             =   4.20 sec*proc (1 test)
analysis/static/exC,openmp          =  18.19 sec*proc (1 test)
analysis/static/exC,serial          =  13.73 sec*proc (1 test)
analysis/static/exD,hybrid          =   3.96 sec*proc (1 test)
analysis/static/exD,mpi             =   4.54 sec*proc (1 test)
analysis/static/exD,openmp          =  17.81 sec*proc (1 test)
analysis/static/exD,serial          =  14.69 sec*proc (1 test)
analysis/static/exE,hybrid          =   3.93 sec*proc (1 test)
analysis/static/exE,mpi             =   4.26 sec*proc (1 test)
analysis/static/exE,openmp          =  17.97 sec*proc (1 test)
analysis/static/exE,serial          =  13.63 sec*proc (1 test)
analysis/static/exF,hybrid          =   3.30 sec*proc (1 test)
analysis/static/exF,mpi             =   3.76 sec*proc (1 test)
analysis/static/exF,openmp          =  16.14 sec*proc (1 test)
analysis/static/exF,serial          =  13.48 sec*proc (1 test)
analysis/static/exG,hybrid          =   3.93 sec*proc (1 test)
analysis/static/exG,mpi             =   4.37 sec*proc (1 test)
analysis/static/exG,openmp          =  19.43 sec*proc (1 test)
analysis/static/exG,serial          =  13.51 sec*proc (1 test)
analysis/static/exI,hybrid          =   2.93 sec*proc (1 test)
analysis/static/exI,mpi             =   3.38 sec*proc (1 test)
analysis/static/exI,openmp          = 127.12 sec*proc (1 test)
analysis/static/exI,serial          =  41.66 sec*proc (1 test)
nonlinear/contact_iter,hybrid       =   2.51 sec*proc (1 test)
nonlinear/contact_iter,mpi          =   2.74 sec*proc (1 test)
nonlinear/contact_iter,openmp       =  82.63 sec*proc (1 test)
nonlinear/contact_iter,serial       =  13.25 sec*proc (1 test)

Total Test time (real) = 2631.52 sec

The following tests FAILED:
	 26 - test_mpi_analysis/dynamic/explicit (Failed)
	 27 - test_mpi_analysis/dynamic/implicit (Failed)
	 28 - test_mpi_analysis/eigen/exJ (Failed)
	 29 - test_mpi_analysis/eigen/exK (Failed)
	 30 - test_mpi_analysis/heat/exM (Failed)
	 31 - test_mpi_analysis/heat/exN (Failed)
	 32 - test_mpi_analysis/heat/exO (Failed)
	 33 - test_mpi_analysis/heat/exP (Failed)
	 34 - test_mpi_analysis/heat/exQ (Failed)
	 35 - test_mpi_analysis/heat/exQ2 (Failed)
	 36 - test_mpi_analysis/heat/exR (Failed)
	 37 - test_mpi_analysis/heat/exS (Failed)
	 38 - test_mpi_analysis/heat/exT (Failed)
	 39 - test_mpi_analysis/heat/exU (Failed)
	 40 - test_mpi_analysis/heat/exU2 (Failed)
	 41 - test_mpi_analysis/heat/exV (Failed)
	 42 - test_mpi_analysis/static/exA (Failed)
	 43 - test_mpi_analysis/static/exB (Failed)
	 44 - test_mpi_analysis/static/exC (Failed)
	 45 - test_mpi_analysis/static/exD (Failed)
	 46 - test_mpi_analysis/static/exE (Failed)
	 47 - test_mpi_analysis/static/exF (Failed)
	 48 - test_mpi_analysis/static/exG (Failed)
	 49 - test_mpi_analysis/static/exI (Failed)
	 50 - test_mpi_nonlinear/contact_iter (Failed)
	 76 - test_hybrid_analysis/dynamic/explicit (Failed)
	 77 - test_hybrid_analysis/dynamic/implicit (Failed)
	 78 - test_hybrid_analysis/eigen/exJ (Failed)
	 79 - test_hybrid_analysis/eigen/exK (Failed)
	 80 - test_hybrid_analysis/heat/exM (Failed)
	 81 - test_hybrid_analysis/heat/exN (Failed)
	 82 - test_hybrid_analysis/heat/exO (Failed)
	 83 - test_hybrid_analysis/heat/exP (Failed)
	 84 - test_hybrid_analysis/heat/exQ (Failed)
	 85 - test_hybrid_analysis/heat/exQ2 (Failed)
	 86 - test_hybrid_analysis/heat/exR (Failed)
	 87 - test_hybrid_analysis/heat/exS (Failed)
	 88 - test_hybrid_analysis/heat/exT (Failed)
	 89 - test_hybrid_analysis/heat/exU (Failed)
	 90 - test_hybrid_analysis/heat/exU2 (Failed)
	 91 - test_hybrid_analysis/heat/exV (Failed)
	 92 - test_hybrid_analysis/static/exA (Failed)
	 93 - test_hybrid_analysis/static/exB (Failed)
	 94 - test_hybrid_analysis/static/exC (Failed)
	 95 - test_hybrid_analysis/static/exD (Failed)
	 96 - test_hybrid_analysis/static/exE (Failed)
	 97 - test_hybrid_analysis/static/exF (Failed)
	 98 - test_hybrid_analysis/static/exG (Failed)
	 99 - test_hybrid_analysis/static/exI (Failed)
	100 - test_hybrid_nonlinear/contact_iter (Failed)
Errors while running CTest

Full log from tests:
frontistr_tests_log.txt
Build log:
frontistr_build_log.txt

Failed to compile tools

Description

Compiling the tools fails with the following message:

[ 49%] Building C object hecmw1/tools/CMakeFiles/hec2rcap.dir/hec2rcap/hec2rcap.c.o
Linking CXX executable hec2rcap
/usr/bin/ld: cannot open output file hec2rcap: Is a directory

...
Linking CXX executable neu2fstr
/usr/bin/ld: cannot open output file neu2fstr: Is a directory

Steps

  1. Check out from this repository.
  2. Run the following commands.
cmake .
make

Possible Implementation

This seems to be caused by the source directory name and the binary name being the same, which confuses CMake.

So I suggest one of the following:

  1. rename hec2rcap and neu2fstr, or
  2. output the binaries to another directory, such as bin.
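Option 2 can be expressed directly in CMake. A sketch, with target names taken from the log above (RUNTIME_OUTPUT_DIRECTORY is a standard CMake target property; where exactly this belongs in FrontISTR's CMakeLists is an assumption):

```cmake
# Standard CMake property: emit the tool executables into <build>/bin so
# they no longer collide with their same-named source directories.
set_target_properties(hec2rcap neu2fstr PROPERTIES
  RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/bin)
```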

Environment

CentOS 6

Failed to Compile

I'm trying to compile with almost all options.

-- The following OPTIONAL packages have been found:

 * MPI
 * OpenMP
 * BLAS
 * Threads
 * LAPACK
 * Metis
 * Mumps
 * Refiner
 * Scalapack
 * Trilinos
 * Doxygen

-- The following OPTIONAL packages have not been found:

 * Parmetis
 * Revocap

When compiling with make, I was confronted with an error.

[  0%] Building Fortran object hecmw1/CMakeFiles/hecmw.dir/src/solver/mumps/hecmw_MUMPS_wrapper.F90.o
/path/to/FrontISTR/hecmw1/src/solver/mumps/hecmw_MUMPS_wrapper.F90:12: Error: Can't open included file 'dmumps_struc.h'
hecmw1/CMakeFiles/hecmw.dir/build.make:1544: recipe for target 'hecmw1/CMakeFiles/hecmw.dir/src/solver/mumps/hecmw_MUMPS_wrapper.F90.o' failed
make[2]: *** [hecmw1/CMakeFiles/hecmw.dir/src/solver/mumps/hecmw_MUMPS_wrapper.F90.o] Error 1
CMakeFiles/Makefile2:92: recipe for target 'hecmw1/CMakeFiles/hecmw.dir/all' failed
make[1]: *** [hecmw1/CMakeFiles/hecmw.dir/all] Error 2
Makefile:162: recipe for target 'all' failed
make: *** [all] Error 2

The message says dmumps_struc.h doesn't exist, though it certainly exists in MUMPS_INCLUDE_PATH or /usr/include.

$> ll /usr/include/dmumps_struc.h 
-rw-r--r-- 1 root root 11437 Oct  2  2017 /usr/include/dmumps_struc.h
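Since the header does exist in /usr/include, the include path apparently never reached the compile line. Explicitly passing it at configure time may help; a sketch (MUMPS_INCLUDE_PATH is the variable named above, and treating it as a CMake cache variable here is an assumption):

```shell
# Assumption: FrontISTR's CMake consumes MUMPS_INCLUDE_PATH as a cache var.
cmake -DMUMPS_INCLUDE_PATH=/usr/include .
```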
