alpine-dav / ascent
A flyweight in situ visualization and analysis runtime for multi-physics HPC simulations
Home Page: https://alpine-dav.github.io/ascent/
License: Other
It passes everything to uber as one string.
as an additional smoke check we can use in CI
Need to fix this:
#ifdef ASCENT_USE_OPENMP
#pragma omp parallel for
#endif
for (int i = 0; i < num_shapes; ++i)
{
shapes.GetPortalControl().Set(i, shape_value);
num_indices.GetPortalControl().Set(i, indices_value);
}
}
"parr me maties!"
it overwrites the same file each render
I am starting to think about how to generalize the runtime a bit more.
As we add more filters / capability, I would like to move away from having vtkh filter / API names hard-coded in the runtime. This would avoid if statements that translate from the API name (iso_volume) to the filter name (vtkh_iso_volume). As a start, we could place all of the API-to-filter names into a separate file, or somehow add this to the filter itself and then add it to the lookup when we add the filter to the graph.
Add the capability to connect a filter's output to the input parameters of another filter. For example, if we have a contour tree that outputs interesting isovalues, then we could connect that output to the 'params/iso_values' of the contour filter. Maybe we could achieve this some other way.
More as I think of them.
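A possible shape for the name-lookup idea, with hypothetical names (FilterRegistry and the registration call sites are illustrative, not the actual Ascent API): each filter registers its API name alongside its internal type name, and the runtime consults the registry instead of branching.

```cpp
#include <cassert>
#include <map>
#include <stdexcept>
#include <string>

// Hypothetical registry mapping public API names (e.g. "iso_volume") to
// internal filter type names (e.g. "vtkh_iso_volume"), so the runtime no
// longer needs hard-coded if statements to translate between the two.
class FilterRegistry
{
public:
    // Registration could live next to each filter's implementation.
    void add(const std::string &api_name, const std::string &filter_type)
    {
        m_map[api_name] = filter_type;
    }

    const std::string &lookup(const std::string &api_name) const
    {
        auto it = m_map.find(api_name);
        if (it == m_map.end())
            throw std::runtime_error("unknown filter: " + api_name);
        return it->second;
    }

private:
    std::map<std::string, std::string> m_map;
};
```

When a filter is added to the graph, the runtime would call lookup() instead of translating names inline, and unknown names fail with a single clear error.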
VTK-m is transitioning into a library from its previous header-only state. Currently, if we include vtkm/cont/DataSet.h, then the compiler will eventually find TBB code for the device adapter. This means that we must include logic to find the TBB headers, which complicates the build system and can lead to build errors. For example, if VTK-m is configured with TBB and Alpine is not, then we will be unable to find the headers.
I assume this will eventually go away.
-- Found Threads: TRUE
-- Could NOT find OpenGL (missing: OPENGL_gl_LIBRARY OPENGL_INCLUDE_DIR)
-- Could NOT find GLEW (missing: GLEW_INCLUDE_DIR GLEW_LIBRARY)
-- Failed to configure VTK-m component OpenGL: !OPENGL_FOUND
-- Could NOT find EGL (missing: EGL_LIBRARY EGL_opengl_LIBRARY EGL_gldispatch_LIBRARY EGL_INCLUDE_DIR)
-- Failed to configure VTK-m component EGL: !VTKm_OpenGL_FOUND
-- Could NOT find GLFW (missing: GLFW_INCLUDE_DIR GLFW_LIBRARY)
-- Failed to configure VTK-m component GLFW: !VTKm_OpenGL_FOUND
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
/tmp/root/spack-stage/spack-stage-hqiA2v/vtk-m/vtkm/rendering/OPENGL_INCLUDE_DIR
   used as include directory in directory /tmp/root/spack-stage/spack-stage-hqiA2v/vtk-m/vtkm/rendering
-- Configuring incomplete, errors occurred!
Rename alpine_data_adaptor to alpine_vtkh_data_adaptor.
Rename the vtkm runtime to the alpine vtkm runtime.
Determining the root rank has a bug.
add support to consume Blueprint w/ AMR info (still under development, LLNL/conduit#285)
nesting will be handled with a ghost-like indicator field and a threshold before rendering and extracts
When using ccmake there is no menu entry to specify the location of conduit. It would be helpful to have this so that ccmake does not need to be called using:
ccmake -DCONDUIT_DIR=$CONDUITLOCATION
and ability to copy necessary hosted assets to that web root dir
We need to add an option to specify the actions file. This will enable triggers / extracts to use inception without the worry of infinite inception.
Here the par denotes MPI, but it's ambiguous (the non-MPI version also includes shared-memory parallelism).
Use VTKm_ENABLE_GL_CONTEXT when merged (https://gitlab.kitware.com/vtk/vtk-m/merge_requests/1043)
What should the parallel strategy be for ranks without data? For example, say we have 10 domains distributed on 10 ranks. After a clip, we only have 5 ranks with data sets that contain anything. This would be a problem if a code path has a MPI_Barrier.
Should the parallel code handle this, or should we detect this and create an MPI communicator for just the ranks that have domains?
as a flexible interface for capturing performance info
There is incorrect code setting the nodal velocities in the input data.
Currently, default actions exist in CloverLeaf source, as shown below:
!CALL conduit_node_set_path_char8_str(add_scene_act,"action", "add_scenes")
!scenes = conduit_node_fetch(add_scene_act,"scenes")
!CALL conduit_node_set_path_char8_str(scenes,"s1/plots/p1/type", "volume")
!CALL conduit_node_set_path_char8_str(scenes,"s1/plots/p1/params/field", "energy")
Resetting the actions at the top of the ascent_actions.json file does not remove the defaults.
I was trying to render an int32 field and received an error about failing to access it as float64.
We could simplify very basic setups. Here is an example of the current form:
conduit::Node &add_scene = actions.append();
add_scene["action"] = "add_scenes";
add_scene["scenes/scene1/plots/plt1/type"] = "pseudocolor";
add_scene["scenes/scene1/plots/plt1/params/field"] = "braid";
add_scene["scenes/scene1/image_prefix"] = output_file;
actions.print();
And a possible simplified form:
conduit::Node &add_scene = actions.append();
add_scene["action"] = "add_scene";
add_scene["scene1/plots/plt1/type"] = "pseudocolor";
add_scene["scene1/plots/plt1/params/field"] = "braid";
add_scene["scene1/image_prefix"] = output_file;
actions.print();
#68 added join_path into the utils code, but this call exists as join_file_path in conduit v0.3.0. How are the spack builds succeeding? Are they using conduit master (which is what I had to do to get it to build)? Maybe I just need to update the build documentation with the correct conduit version.
Currently, the runtime has to be aware of a pipeline before a plot can consume it. Rework this so the graph processing can take place in any order.
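One way to rework this, sketched with hypothetical names (Runtime, add_pipeline, add_plot are illustrative, not the actual Ascent API): record plots unresolved and defer the pipeline lookup to execute time, after all registrations have arrived.

```cpp
#include <cassert>
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Plots keep only the *name* of the pipeline they consume; the lookup is
// deferred to execute(), so add_plot may arrive before add_pipeline.
struct Runtime
{
    std::map<std::string, std::string> pipelines; // name -> filter graph (stub)
    std::vector<std::string> plots;               // pipeline names, unresolved

    void add_pipeline(const std::string &name, const std::string &graph)
    {
        pipelines[name] = graph;
    }

    void add_plot(const std::string &pipeline_name)
    {
        plots.push_back(pipeline_name); // no lookup here
    }

    // Resolve every plot -> pipeline edge only once the graph is complete.
    void execute() const
    {
        for (const auto &name : plots)
            if (pipelines.find(name) == pipelines.end())
                throw std::runtime_error("plot references unknown pipeline: " + name);
        // ... build and run the flow graph here ...
    }
};
```

With this shape, action ordering no longer matters, and a genuinely missing pipeline is still reported with a clear error at execute time.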
We should support this. VTK-m might have evolved past the point where rectilinear grids must be float_default.
add support for MFEM to the build system; set up a low-order refine path to enable rendering with VTK-m
I ran into this error with the color_map settings in alpine_actions.json (copied below). It was looking for "type": "alpha", but I had "type": "Alpha". It would be helpful if the error message were formatted like the desired format (i.e., with quotation marks) in the actions file, or if the error message printed out what the valid options are.
terminate called after throwing an instance of 'conduit::Error'
what():
{
"file": "/usr/workspace/wsa/labasan1/quartzdat/alpine-generate-perf-model-data/src/alpine/pipelines/alpine_vtkm_renderer.cpp",
"line": 756,
"message": "Unknown control point type Alpha"
}
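A hedged sketch of building such an error message (the list of valid types here is assumed for illustration; only "alpha" is confirmed by the issue):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Build an error message that echoes the bad value and lists what is valid,
// so a case mismatch like "Alpha" vs "alpha" is obvious from the output.
std::string unknown_type_message(const std::string &given,
                                 const std::vector<std::string> &valid)
{
    std::ostringstream oss;
    oss << "Unknown control point type \"" << given << "\". Valid types are:";
    for (const auto &v : valid)
        oss << " \"" << v << "\"";
    return oss.str();
}
```

For example, unknown_type_message("Alpha", {"alpha"}) yields: Unknown control point type "Alpha". Valid types are: "alpha"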
We should add a few blurbs about how and why we are evolving Strawman into Alpine.
We should also keep the Strawman license info, and adapt it to say Alpine is based on Strawman.
it would be nice to have an option to hide the axes and the legend
This is sort of a note to myself. I believe the 2d failure is potentially due to a VTK-m bug. What is happening is that the incoming data set has a field zero-copied. The data set is contained in a RenderContainer and is deleted by flow when it is not needed, but VTK-m is trying to delete the underlying storage it does not own. Since it cannot, the pointer is not null, which triggers:
caught ErrorBadValue : User allocated arrays cannot be reallocated.
t_ascent_render_2d: /usr/workspace/wsb/larsen30/pascal/alpine/cuda_build/vtk-m/vtkm/cont/StorageBasic.cxx:257: void vtkm::cont::internal::StorageBasicBase::ReleaseResources(): Assertion `this->Array == nullptr' failed.
VTKM_ASSERT(this->Array == nullptr);
change Filter::output_port to Filter::has_output_port
git clone ://github.com/Alpine-DAV/vtk-h.git
should be
git clone https://github.com/Alpine-DAV/vtk-h.git
with conditional compilation for the vtk-h / vtk-m stuff.
we want some automated check of the alpine lib file size, so we can know if things grow too quickly due to changes in vtk-m, or in how we are using it
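A minimal sketch of such a check (the library path and the threshold are assumptions; in practice a CI step would run this after the build and fail on size growth):

```cpp
#include <cassert>
#include <cstdio>
#include <filesystem>

// Return true if the library at `path` is within `max_bytes`; a CI step can
// fail the build when this returns false, flagging unexpected size growth
// from VTK-m changes or from how we instantiate it.
bool lib_size_ok(const std::filesystem::path &path, std::uintmax_t max_bytes)
{
    std::uintmax_t size = std::filesystem::file_size(path);
    std::printf("%s is %ju bytes (limit %ju)\n",
                path.string().c_str(),
                static_cast<uintmax_t>(size),
                static_cast<uintmax_t>(max_bytes));
    return size <= max_bytes;
}
```

The threshold would need occasional manual bumps for legitimate growth; the point is making the growth visible, not forbidding it.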
t_ascent_mpi_render_2d and t_python_ascent_mpi_render both fail when CUDA is enabled.
Poking a bit, in ascent_vtkh_data_adapter.cpp:VTKHDataAdapter::UniformBlueprintToVTKmDataSet, something downstream isn't happy with the structured cell set for the 2d case.
Here is the logic which creates it. @mclarsen traced this a bit, and I believe it hit an assert failure about being able to resize an array.
if(is_2d)
{
    // 2d case: build a 2-d structured cell set from the i/j dims
    vtkm::Id2 dims2(dims[0], dims[1]);
    vtkm::cont::CellSetStructured<2> cell_set(topo_name.c_str());
    cell_set.SetPointDimensions(dims2);
    result->AddCellSet(cell_set);
}
else
{
    // 3d case: use all three point dimensions
    vtkm::cont::CellSetStructured<3> cell_set(topo_name.c_str());
    cell_set.SetPointDimensions(dims);
    result->AddCellSet(cell_set);
}
for scenes, allow something like:
image_prefix = "my_file_name_%07d"
Wire up the logic to "do the right thing" when an existing file exists, etc.
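A sketch of expanding such a prefix per render (using the render counter as the number is an assumption; the fallback behavior for prefixes without a format is also assumed):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Expand an image prefix like "my_file_name_%07d" with the current render
// counter. Assumes the prefix contains at most one integer conversion; if
// there is no '%', the counter is simply appended.
std::string expand_image_prefix(const std::string &prefix, int counter)
{
    if (prefix.find('%') != std::string::npos)
    {
        char buf[512];
        std::snprintf(buf, sizeof(buf), prefix.c_str(), counter);
        return std::string(buf);
    }
    return prefix + std::to_string(counter);
}
```

For example, expand_image_prefix("my_file_name_%07d", 42) yields "my_file_name_0000042", so each render writes a distinct file instead of overwriting the last one.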
make -j does not always work. When starting from a clean build, the build system tries to copy the Fortran module before it is created, and all subsequent makes fail. To resolve, you must run make clean and then make. We should throw an error message instead.
at a basic level, we need to support extracts w/o the vtkh components.
lodepng is still being built as a shared library when build type is set to static.
to fix: shift where the flags from MFEM's config.mk are bootstrapped into CMake (put them in the libs argument of the BLT register-lib call)
When ~shared, we do not build Python, but the CMake config file output still writes out ENABLE_PYTHON ON, since +python is still technically in the spec.