tudo-astroparticlephysics / PROPOSAL
Monte Carlo Simulation propagating charged Leptons through Media as C++ Library
License: GNU Lesser General Public License v3.0
CMake fails if doxygen is not installed
A complete list of build dependencies would be really handy to get started.
For Ubuntu 16.10 it seems to be
$ apt install cmake \
doxygen \
liblog4cplus-dev \
libgtest-dev \
libboost-dev \
libboost-python-dev \
libboost-program-options-dev
GTest is not optional, although it is listed as not required in the README.
I'm having trouble when I pass a secondaries array to a function, so I'm going to show a simplified version of my code. Let us assume that I propagate a particle and store the secondaries in an array:
secondaries = propagator.propagate(propagation_length)
And then I pass this array to a function that loops over the secondaries and prints their x coordinate.
def print_x_secondaries(secondaries):
    for sec in secondaries:
        print(sec.position.x)
print_x_secondaries(secondaries)
I get the error:
NameError: name 'x' is not defined,
whereas if I skip the function and loop directly, the x coordinates are accessed correctly. Any idea what is going on? Thank you.
There is an option to store only the losses inside the detector volume, because an experiment is not interested in secondaries outside of its detector.
But highly energetic hadronic cascades can produce muons that can travel to the detector, as can muonic tau decays or muon pair production.
If these processes occur before the detector, their products should also be stored.
So the flag should be changed to relevant_losses.
Not mentioned in the CMake file or in the README, but required: compilation failed at 43%.
pyPROPOSAL just looks strange, and py is unnecessary for a module.
py is often in the repo name but almost never in the module name.
Opinions?
Enabling parallel propagation of particles could improve PROPOSAL's runtime.
log4cplus:ERROR could not open file /tmp/pip-req-build-79oqjkru/resources/log4cplus.conf
log4cplus:ERROR No appenders could be found for logger (PROPOSAL).
log4cplus:ERROR Please initialize the log4cplus system properly.
The path is the temporary build path.
Any ideas how to adapt this so that a valid path is found after installation?
Maybe a straightforward approach would be to handle the continuous randomization as an optional dependency of the displacement object.
Some adjustments in the propagation routine would then be necessary.
To make the tests run faster and make breakage easier to spot,
we should split the test suite into short-running unit tests and long-running physics regression tests.
For this, we could also use Travis staged builds, so that e.g. the regression tests only run after the unit tests have passed.
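Travis build stages run sequentially, and a stage only starts once every job in the previous stage has passed. A rough sketch of such a configuration (stage names and script paths are placeholders, not the project's actual targets):

```yaml
# Sketch only; stage names and script paths are placeholders.
jobs:
  include:
    - stage: unit tests
      script: ./build/tests/UnitTests
    - stage: regression tests
      script: ./build/tests/PhysicsRegressionTests
```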
When propagating particles, the following warning is printed often (many thousands of times) to the terminal:
WARN - Maximum number of iteration exeeded in NewtonRaphson
This clutters the command line and makes log files very large.
It looks like it comes from a warn-level message in the math methods.
It would be nice if the Python interface provided a way to adjust the logging level, so that people who install PROPOSAL through PyPI can control it. If that is already possible and I missed it, sorry!
Also, perhaps this indicates that the logging level of that particular message should be turned down to INFO?
Installing PROPOSAL, e.g. via pip install or an AUR package for arch users would be favorable.
At the moment, only the phase space is considered when calculating tau decays. This could be improved by considering specific matrix elements for the decay channels.
I have installed following the instructions for standalone install, but then when I try doing
#include "PROPOSAL/PROPOSAL.h"
I get:
In file included from /usr/local/include/PROPOSAL/PROPOSAL.h:104,
from test.cpp:2:
/usr/local/include/PROPOSAL/Output.h:50:14: fatal error: icetray/I3Logging.h: No such file or directory
#include <icetray/I3Logging.h>
^~~~~~~~~~~~~~~~~~~~~
compilation terminated.
I installed GTest on Ubuntu using apt install libgtest-dev, but cmake still cannot find it:
$ cmake ../PROPOSAL/
-- Boost version: 1.61.0
-- Found the following Boost libraries:
-- program_options
-- Enabled to build the python wrapper library.
-- Looking for Root...
-- Looking for Root... - found /home/maxnoe/.local/root6/bin/root
-- Looking for Root... - version 6.08/00
-- Could NOT find GTest (missing: GTEST_LIBRARY GTEST_MAIN_LIBRARY)
-- No tests will be build.
ln: failed to create symbolic link '/home/maxnoe/Uni/proposal-build/resources/resources': File exists
-- Configuring done
-- Generating done
-- Build files have been written to: /home/maxnoe/Uni/proposal-build
Charged particles can be deflected in magnetic fields. This is especially relevant for electrons and positrons in air showers, which are deflected by the Earth's magnetic field.
try it out:
cd resources/resources/resources/resources/resources/resources/resources/resources/resources/
It's awesome
I've found a bug concerning the direction of particles coming from a decay in the Python wrapper. Let's say I create a propagator with a direction (px, py, pz):
propagator.particle.direction = pp.Vector3D(px, py, pz)
Then I propagate it and store the secondary particles in an array.
secondaries = propagator.propagate(propagation_length)
With propagation_length equal to 100 km, for instance. The resulting secondaries have the correct direction if they are datatypes, but particles show the opposite direction, that is (-px, -py, -pz). Have you encountered anything similar in the past? Do you know what's going on?
Thank you.
At the moment, PROPOSAL (silently) uses MeV and cm as default units.
This is not completely transparent to the user and may cause misunderstandings.
A proper handling of units in PROPOSAL would be favorable.
I've had a strange error. In my Python code, I create a dictionary containing propagators. Using the propagators in this dictionary, I propagate either taus or antitaus. If one of the decay products is a muon, I pass the properties of this muon to the corresponding propagator and then propagate it to see if it creates particle showers. The weird thing is that this propagation randomly gives segmentation faults.
I can provide you with a simple code that reproduces this error.
$ tar xzf tests/TestFiles.tar.gz -C /tmp
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
The current implementations of the interpolation and integration routines are very static and could be vastly improved by rewriting them in a polymorphic way. This would not only improve the runtime but especially the readability of the code.
When propagating in inhomogeneous media, the LPM effect is not treated correctly.
It has to be evaluated whether there are other sources where properties have to be adapted for inhomogeneous media.
When a Propagator is created, all CrossSections or PropagationUtilities check the path used to store the interpolation tables, which is not necessary if it is checked once for all of them.
In addition, storing the interpolation tables in memory is currently requested by passing an empty string as the path. All the path checks then fail, because the path does not exist, and the output is spammed with warnings. Instead, there should be a flag to store the interpolation tables in memory.
This would remove the compilation on Ubuntu.
Should be easy, as this project already uses CMake and googletest.
CMake has a builtin mechanism for build types and for managing C++ flags for them.
We should use that instead of the current DEBUG option.
make uninstall
does not remove the created folder structure in the installation path.
It's considered bad style to explicitly set compiler arguments such as --std=c++11
instead of using cmake options such as:
target_compile_features(PROPOSAL PUBLIC cxx_std_11)
set_target_properties(PROPOSAL PROPERTIES CXX_EXTENSIONS OFF)
We should switch to this.
Although fully functional, the Vector3D class in PROPOSAL is not entirely intuitive to use. Users mishandling a Vector3D object may get undesired behavior. This especially occurs when converting cartesian coordinates to spherical coordinates and vice versa.
Do we need to save both versions of the coordinates in a Vector3D object? If we do, we should ensure that the user doesn't have to call a transformation method (CalculateCartesianFromSpherical or CalculateSphericalCoordinates) before reading the position. This should be handled automatically by the Vector3D class.
The Propagator is long and hard to read; small and clear methods are preferable. Up to now, the Propagator hands the particle to the specific sectors, which propagate it. This is counterintuitive, and the assignment to the sectors and of the length to propagate is not useful. This is related to the border problem, issue #33.
The Propagator should be restructured and the Propagation Utility should be upgraded, while the Sector becomes redundant.
PROPOSAL uses $HOME/.local/share/PROPOSAL/tables for interpolation tables.
Currently, the user has to create this directory to enable table creation.
I think it would be better to use $HOME/.cache/PROPOSAL, as it fits this purpose better, and to create the directory automatically if it does not exist.
At the moment, we are "stuck" with version v2.2.4 of pybind11. If we use a newer version of pybind11 and try to import pyPROPOSAL, the error ImportError: generic_type: type "MuMinusDef" does not have a non-default holder type while its base "PROPOSAL::ParticleDef" does
occurs. This is related to the "issue" pybind/pybind11#1317
I think the CMake files could get a lot more readable and a lot shorter using a modern cmake version, at least something > 3, see e.g. this post:
https://gist.github.com/mbinna/c61dbb39bca0e4fb7d1f73b0d66a4fd1
Muons are particles that are, for the most part, deflected by multiple scattering. In most cases, their direction of propagation can be approximated as a straight line, because their losses cause only small deflections d, as shown in the figure.
Since the distance calculation to the next border is quite expensive, algorithms have been developed that compute it as rarely as possible.
They are based on the assumption that the propagated distance through a sector can be fully described by a straight line. The largest sector is chosen and the distance from the particle position to the expected border is calculated. Propagation steps are taken in that sector until the supposed border is reached, and only then is the distance to the next sector border calculated.
If a particle leaves the sector earlier than described by the straight line, the particle is propagated partly through the other sector with the wrong conditions (path 2 in the figure). The same failure causes more corner clippers to be measured than simulated. This can be avoided by oversizing the detector.
Description
I found a memory leak when testing the ionization and photonuclear processes of the tau. After inspection with valgrind, I think the leak is caused by the UtilityDecorator. In this special case, the program frequently triggers the function at PropagationUtilityInterpolant.cxx:604:
return UtilityIntegralContRand(utility_).Calculate(ei, ef, rnd);
which generates a temporary UtilityIntegralContRand. However, its base class, UtilityDecorator, has a memory leak: the utility_ member is cloned in the constructor of UtilityDecorator, but it is not released during the destruction of UtilityDecorator, because it is stored as a reference.
How to reproduce
Here's the config file and the sample code I used.
Workaround
The problem can be solved by adding a delete &utility_ in the destructor of UtilityDecorator:
UtilityDecorator::~UtilityDecorator() {
delete &utility_;
}
But deleting a reference through its address is discouraged by the C++ standard. I thought of changing it from a reference to a normal member variable, that is, from const Utility& utility_; to const Utility utility_;. However, this introduces one more copy each time a UtilityDecorator is constructed, which may slow down the program a little. I think it may be better to store utility_ as a (smart) pointer, but that would require a larger modification of the code, so I didn't do it.
Description
During my inspection of the problem above, I found a minor memory leak at Medium.cxx:341:
void Medium::SetDensityDistribution(Density_distr& density_distr) {
dens_distr_ = density_distr.clone();
}
After setting the new density distribution, the original one is never released. Therefore, I added a line of code that deletes the original one to fix this leak:
void Medium::SetDensityDistribution(Density_distr& density_distr) {
delete dens_distr_;
dens_distr_ = density_distr.clone();
}
Description
Take AnnihilationFactory as an example. When the program finds the corresponding enum of the parameterization,
return new AnnihilationIntegral(*it->second(particle_def, medium, def.multiplier));
the call it->second(particle_def, medium, def.multiplier) generates a pointer to an annihilation parameterization. After the parameterization is passed on to create an AnnihilationIntegral, this pointer is never released. I think this problem can be solved by using a smart pointer:
std::unique_ptr<Annihilation> param(it->second(particle_def, medium, def.multiplier));
return new AnnihilationIntegral(*param);
When we look at the energy spectra of individual particles from decays, there are inconsistencies between the different decay models (ManyBodyDecay vs. LeptonicDecay) available in PROPOSAL.
While the charged lepton spectra (electron for muon decay, muon for tau decay) are consistent, the neutrino spectra are inconsistent. We believe that the ManyBodyDecay spectra are correct (since they look similar to theoretical expectations).
Interpolant::Exp is defined as:
double Interpolant::Exp(double x)
{
if (x <= aBigNumber_)
{
return 0;
} else
{
return std::exp(x);
}
}
Documentation says:
/**
* Exp(x) with CutOff.
*
* if x > exp(aBigNumber): exp(x) \n
* else: 0
*
* \param x
* \return exp(x) OR 0;
*/
What is aBigNumber anyways?
Hi,
I am wondering if I can use PROPOSAL to calculate the Cherenkov photon production of a muon and its secondaries.
PROPOSAL can calculate the secondary particles produced by the muon along its track, which is great because those secondary particles also produce Cherenkov light.
My main concern is that if I set an energy cut for secondary production, the low-energy secondaries that are classified as continuous energy loss will no longer be tracked. However, those low-energy secondaries do contribute to Cherenkov photon production in the real physics world.
Do you have any suggestions?
What exactly is meant by the "secondaries"? What is sec.energy, and why does it start nearly at the primary particle energy and then continuously decrease?
It seems most of the information in sec is a copy of the information in sec.particles (for example, sec.energy == [i.energy for i in sec.particles] returns True). Is there information about the propagating particle stored anywhere?
And what is meant by parent particle? Is there a relationship between sec.energy and sec.parent_particle_energy (for example, sec.energy[i] == sec.parent_particle_energy[i + 1] returns False)?
PROPOSAL has one required dependency, log4cplus >= 2, and gtest is required for the tests.
log4cplus >= 2 is in no major package manager on any of the current distributions.
It should be vendored; gtest should also be built as part of the current project, not used globally.
Two different approaches could be made:
Add the code to this repository:
Pro: easy
Con: bloats history and repository
Use git submodules
Con: users must remember --recursive when cloning
I would prefer the second option, I think.
Any solution based on CMake's ability to download dependencies should be avoided, since building would then require an internet connection.
It takes really long to produce these, and they should be large but static, right?
So maybe provide them in this repository using Git LFS:
https://git-lfs.github.com/
We want to use a different logger than log4cplus. More precisely, we want to provide some kind of interface that lets users plug in their own logger, which can be helpful, or even a requirement, for PROPOSAL to be used in the simulation chains of bigger experiments.
The main page of the HTML documentation contains no links whatsoever to the other pages.