proposal's Issues

Provide package list for common distributions

This would be really handy to get started.

For Ubuntu 16.10 it seems to be

$ apt install cmake \
    doxygen \
    liblog4cplus-dev \
    libgtest-dev \
    libboost-dev \
    libboost-python-dev \
    libboost-program-options-dev

Secondaries array is not correctly passed to a function in the Python wrapper

I'm having trouble when I pass a secondaries array to a function, so I'm going to show a simplified version of my code. Let us assume that I propagate a particle and store the secondaries in an array:

secondaries = propagator.propagate(propagation_length)

And then I pass this array to a function that loops over the secondaries and prints their x coordinate.

def print_x_secondaries(secondaries):
    for sec in secondaries:
        print(sec.position.x)

print_x_secondaries(secondaries)

I get the error:

NameError: name 'x' is not defined,

whereas if I don't call the function and loop directly, the x coordinates are accessed correctly. Any idea what is going on? Thank you.

store relevant losses

There is an option to store only the losses inside the detector volume, because an experiment is not interested in secondaries outside of its detector.
However, highly energetic hadronic cascades can produce muons that can travel to the detector, as can muonic tau decays or muon pair production.
If these processes occur before the detector, their products should also be stored.
The flag should therefore be renamed to relevant_losses.

Rename pyPROPOSAL → proposal

pyPROPOSAL just looks strange and the py prefix is unnecessary for a module.

py is often in the repo name but almost never in the module name.

Opinions?

Installed python package does not find log4cplus config

log4cplus:ERROR could not open file /tmp/pip-req-build-79oqjkru/resources/log4cplus.conf
log4cplus:ERROR No appenders could be found for logger (PROPOSAL).
log4cplus:ERROR Please initialize the log4cplus system properly.

The path is the temporary build path.

Any ideas how to adapt this so it finds the correct path after installation?

Split between unit and regression tests

To make the tests run faster and to make breakage easier to spot,
we should split the test suite into short-running unit tests and long-running physics regression tests.

For this, we could also use Travis staged builds, so that e.g. the regression tests only run after the unit tests have passed.

verbose math method and python interface to logging level

When propagating particles, the following error prints often (many thousands of times) to the terminal:

WARN - Maximum number of iteration exeeded in NewtonRaphson

Which clutters the command line and makes log files very large.

It looks like it comes from a warn level message in the math methods:

https://github.com/tudo-astroparticlephysics/PROPOSAL/blob/master/private/PROPOSAL/math/MathMethods.cxx#L270

It would be nice if the Python interface provided a way to adjust the logging level, so that people who install proposal through PyPI can control it. If that option exists and I missed it, sorry!
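
For illustration, the kind of binding being asked for could look roughly like this; set_log_level and the level name are hypothetical and not part of the current pyPROPOSAL API:

import pyPROPOSAL as pp

# Hypothetical call: no such function exists yet in the bindings. It would map a
# level name onto the underlying log4cplus logger, so that warnings such as the
# NewtonRaphson one above could be silenced from Python.
pp.set_log_level("ERROR")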

Also, perhaps this indicates that the logging level of that particular message should be turned down to INFO?

Tau decay matrix elements

At the moment, only the phase space is considered when calculating tau decays. This could be improved by considering specific matrix elements for the decay channels.
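
For reference, the leptonic channels have a well-known spin-averaged matrix element analogous to muon decay; this is the standard electroweak result quoted as context, not a description of PROPOSAL's current implementation:

$$
\overline{|\mathcal{M}|^2}\left(\tau^- \to \ell^-\,\bar{\nu}_\ell\,\nu_\tau\right)
  = 64\, G_F^2\, \left(p_\tau \cdot p_{\bar{\nu}_\ell}\right)\left(p_\ell \cdot p_{\nu_\tau}\right),
$$

which, for massless final states, yields the Michel spectrum $\mathrm{d}\Gamma/\mathrm{d}x \propto x^2(3-2x)$ for the charged lepton, with $x = 2E_\ell/m_\tau$.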

Standalone installation seems to still require icetray

I installed PROPOSAL following the instructions for the standalone install, but then when I try

#include "PROPOSAL/PROPOSAL.h"

I get:

In file included from /usr/local/include/PROPOSAL/PROPOSAL.h:104,
                 from test.cpp:2:
/usr/local/include/PROPOSAL/Output.h:50:14: fatal error: icetray/I3Logging.h: No such file or directory
     #include <icetray/I3Logging.h>
              ^~~~~~~~~~~~~~~~~~~~~
compilation terminated.

GTest not found, although installed

I installed GTest on ubuntu using apt install libgtest-dev, but cmake still cannot find it:

$ cmake ../PROPOSAL/
-- Boost version: 1.61.0
-- Found the following Boost libraries:
--   program_options
-- Enabled to build the python wrapper library.
-- Looking for Root...
-- Looking for Root... - found /home/maxnoe/.local/root6/bin/root
-- Looking for Root... - version 6.08/00 
-- Could NOT find GTest (missing:  GTEST_LIBRARY GTEST_MAIN_LIBRARY) 
-- No tests will be build.
ln: failed to create symbolic link '/home/maxnoe/Uni/proposal-build/resources/resources': File exists
-- Configuring done
-- Generating done
-- Build files have been written to: /home/maxnoe/Uni/proposal-build

Magnetic field deflection

Charged particles are deflected in magnetic fields. This is especially relevant for electrons and positrons in air showers, which are deflected by the Earth's magnetic field.
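
As a rough scale estimate (standard textbook relation, not PROPOSAL-specific), the gyroradius of a singly charged particle with transverse momentum $p_\perp$ in a field $B$ is

$$
r\,[\mathrm{m}] \approx \frac{p_\perp\,[\mathrm{GeV}/c]}{0.3\, B\,[\mathrm{T}]},
$$

so a 1 GeV electron in the Earth's field of about 50 µT has $r$ on the order of 70 km, which is not negligible over the extent of an air shower.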

Opposite direction for decay particles

I've found a bug concerning the direction of the particles coming from a decay in the Python wrapper. Let's say I create a propagator with a direction (px, py, pz):

propagator.particle.direction = pp.Vector3D(px, py, pz)

Then I propagate it and store the secondary particles in an array.

secondaries = propagator.propagate(propagation_length)

With propagation_length equal to 100 km, for instance. The resulting secondaries have the correct direction if they are datatypes, but particles have the opposite direction, i.e. (-px, -py, -pz). Have you encountered anything similar in the past? Do you know what's going on?

Thank you.

PROPOSAL unit handling

At the moment, PROPOSAL (silently) uses MeV and cm as default units.
This is not completely transparent for the user and may cause misunderstandings.

A proper handling of units in PROPOSAL would be favorable.
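
As a sketch of the manual conversion this currently forces on users (the conversion factors are the point here; propagator and the particle energy attribute are assumed to behave like the snippets quoted in the other issues):

# PROPOSAL implicitly expects MeV and cm, so user-facing values in GeV and metres
# have to be converted by hand before every call.
GEV_TO_MEV = 1.0e3
M_TO_CM = 1.0e2

propagator.particle.energy = 100.0 * GEV_TO_MEV    # 100 GeV in MeV; attribute assumed analogous to .direction above
propagation_length = 10.0e3 * M_TO_CM              # 10 km expressed in cm
secondaries = propagator.propagate(propagation_length)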

Segmentation fault when propagating decay particles

I've had a strange error. In my Python code, I create a dictionary containing propagators. Using the propagators in this dictionary, I propagate either taus or antitaus. If one of the decay products is a muon, I pass the properties of this muon to the corresponding propagator and then propagate it to see if it creates particle showers. The weird thing is that this propagation randomly gives segmentation faults.

I can provide you with a simple code that reproduces this error.

TestFiles.tar.gz broken

$ tar xzf tests/TestFiles.tar.gz -C /tmp

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now

Make integration routines polymorphic

The current implementations of the Interpolation and Integration routines are very static and could be vastly improved by rewriting them in a polymorphic way. This would not only improve the runtime but, above all, the readability of the code.

LPM effect in inhomogeneous media

When propagating in inhomogeneous media, the LPM effect is not treated correctly.

It has to be evaluated whether there are other sources where properties have to be adapted for inhomogeneous media.

more intuitive way to store the interpolation tables

When a Propagator is created, every CrossSection and PropagationUtility checks the path used to store the interpolation tables, which is unnecessary if it is checked once for all of them.

In addition, storing the interpolation tables in memory is currently requested by passing an empty string as the path. Because all the resulting path checks fail (the path does not exist), the output is spammed with warnings. A dedicated flag to store the interpolation tables in memory should be introduced instead.

Use cmake properties instead of setting cxx flags

It's considered bad style to explicitly set compiler arguments such as --std=c++11 instead of using cmake options such as:

target_compile_features(PROPOSAL PUBLIC cxx_std_11)
set_target_properties(PROPOSAL PROPERTIES CXX_EXTENSIONS OFF)

We should switch to this.

More intuitive usage of Vector3D

Although completely functional, the Vector3D class in PROPOSAL is not entirely intuitive to use. Users mishandling a Vector3D object may get undesired behavior, especially when converting Cartesian coordinates to spherical coordinates and vice versa.

Do we need to keep both versions of the coordinates in a Vector3D object? If we do, we should make sure the user does not have to call a transformation method (CalculateCartesianFromSpherical or CalculateSphericalCoordinates) before reading the position; this should be handled automatically by the Vector3D class.

Restructure Propagator/Sector/PropagationUtility

The Propagator is long and hard to read; small, clear methods would be preferable. Up to now, the Propagator assigns the particle to specific sectors, which then propagate it. This is counterintuitive, and neither the assignment to sectors nor the propagation length passed to them is useful. This is related to the border problem, issue #33.
The Propagator should be restructured and the PropagationUtility upgraded so that the Sector becomes redundant.

Use `.cache/PROPOSAL/` for proposal tables and create directory

Proposal uses $HOME/.local/share/PROPOSAL/tables for interpolation tables.
Currently, the user has to create this directory to enable table creation.

I think it would be better to use $HOME/.cache/PROPOSAL, which fits this purpose better, and to create the directory automatically if it does not exist.
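
Creating the directory on first use is a one-liner with the standard library; a minimal sketch of what the package could do (the path follows the suggestion above):

import os

# Create the suggested cache directory for interpolation tables if it is missing.
table_dir = os.path.expanduser("~/.cache/PROPOSAL")
os.makedirs(table_dir, exist_ok=True)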

Enable usage of newest pybind11 version

At the moment, we are "stuck" with version v2.2.4 of pybind11. If we use a newer version of pybind11 and try to import pyPROPOSAL, we get the error ImportError: generic_type: type "MuMinusDef" does not have a non-default holder type while its base "PROPOSAL::ParticleDef" does. This is related to pybind/pybind11#1317.

Border problem

Muons are particles that are deflected mostly by multiple scattering. In most cases their direction of propagation can, as an approximation, be assumed to be a straight line, because their losses cause only small deflections d, as shown in the figure.
Because the distance calculation to the next border is quite expensive, algorithms have been developed that perform it as rarely as possible.
They are based on the assumption that the propagated distance through a sector can be fully described by a straight line. The sector with the highest priority is chosen and the distance from the particle position to the expected border is calculated. Propagation steps are taken in that sector until the supposed border is reached, and only then is the distance to the next sector border calculated.
If a particle leaves the sector earlier than predicted by the straight line, part of its path is propagated through the other sector with the wrong conditions (path 2 in the figure). The same failure causes more corner clippers to be measured than were actually simulated. This can be avoided by oversizing the detector.

(Figure: border_problem)

Memory leak issue

In UtilityDecorator

Description
I found a memory leak when testing the ionization and photonuclear processes of the tau. After inspection with valgrind, I think the leak is caused by the UtilityDecorator. In this particular case, the program frequently triggers the call at PropagationUtilityInterpolant.cxx:604:

return UtilityIntegralContRand(utility_).Calculate(ei, ef, rnd);

which generates a temporary UtilityIntegralContRand. However, its base class, UtilityDecorator, has a memory leak: the utility_ member is cloned in the constructor of UtilityDecorator but never released in its destructor, because it is stored as a reference.

How to reproduce
Here's the config file and the sample code I used.

Workaround
The problem can be solved by adding a delete &utility_ in the destructor of UtilityDecorator.

UtilityDecorator::~UtilityDecorator() {
    delete &utility_;
}

But deleting an object through the address of a reference is discouraged. I thought about changing it from a reference to a normal member variable, i.e.

from const Utility& utility_; to const Utility utility_;

However, this introduces one more copy each time a UtilityDecorator is constructed, which may slow down the program a little. I think it would be better to store utility_ as a (smart) pointer, but that requires a larger modification of the code, so I did not do it.

In Medium.cxx

Description
During my inspection of the problem above, I found a minor memory leak in Medium.cxx:341:

void Medium::SetDensityDistribution(Density_distr& density_distr) {
    dens_distr_ = density_distr.clone();
}

After setting the new density distribution, the original one is never released. Therefore, I added a line of code that deletes the original one to fix the leak:

void Medium::SetDensityDistribution(Density_distr& density_distr) {
    delete dens_distr_;  // release the previous distribution before replacing it
    dens_distr_ = density_distr.clone();
}

In XXXXXXXFactory.cxx

Description
Take AnnihilationFactory as an example: when the program finds the corresponding enum of the parameterization,

return new AnnihilationIntegral(*it->second(particle_def, medium, def.multiplier));

the call it->second(particle_def, medium, def.multiplier) returns a pointer to an annihilation parameterization. After the parameterization is used to create an AnnihilationIntegral, this pointer is never released. I think this can be solved by using a smart pointer:

std::unique_ptr<Annihilation> param(it->second(particle_def, medium, def.multiplier));  // released automatically when param goes out of scope
return new AnnihilationIntegral(*param);

My modifications of the code related to the issues above can be found here.

Inconsistent neutrino spectrum in particle decays

When we look at the energy spectra of the individual particles produced in decays, there are inconsistencies between the different decay models available in PROPOSAL (ManyBodyDecay vs. LeptonicDecay).

While the charged lepton spectra (electron for muon decay, muon for tau decay) are consistent, the neutrino spectra are inconsistent. We believe that the ManyBodyDecay spectra are correct (since they look similar to theoretical expectations).
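
For comparison, the theoretical shapes in the massless limit (the standard muon-decay result, applying analogously to leptonic tau decays, quoted here only as the reference mentioned above) are, for the decay $L^- \to \ell^-\,\bar{\nu}_\ell\,\nu_L$:

$$
\frac{\mathrm{d}N}{\mathrm{d}x}\bigg|_{\ell,\,\nu_L} \propto x^2(3-2x),
\qquad
\frac{\mathrm{d}N}{\mathrm{d}x}\bigg|_{\bar{\nu}_\ell} \propto x^2(1-x),
\qquad x = \frac{2E}{m_L}.
$$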

Lying docs of Interpolant::Exp

Interpolant::Exp is defined as:

double Interpolant::Exp(double x)
{
    if (x <= aBigNumber_)
    {
        return 0;
    } else
    {
        return std::exp(x);
    }
}

Documentation says:

    /**
     * Exp(x) with CutOff.
     *
     * if x > exp(aBigNumber): exp(x) \n
     * else: 0
     *
     * \param    x
     * \return   exp(x) OR 0;
     */

What is aBigNumber anyways?

Scientific question: Can I calculate Cherenkov photons generation with PROPOSAL?

Hi,
I am wondering if I can use PROPOSAL to calculate the Cherenkov photon production of a muon and its secondaries.
PROPOSAL can calculate the secondary particles produced by a muon along its track, which is great because those secondary particles can also produce Cherenkov light.

My main concern is that if I set an energy cut for secondary production, the low-energy secondaries that are classified as continuous energy loss will no longer be tracked. However, those low-energy secondaries do contribute to Cherenkov photon production in the real world.
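
For context on why that cut matters, the standard Cherenkov condition and the resulting threshold (textbook values, not PROPOSAL output) are

$$
\cos\theta_C = \frac{1}{n\beta}, \qquad \beta > \frac{1}{n},
$$

so in water or ice ($n \approx 1.33$) electrons radiate above a total energy of roughly $0.8\,\mathrm{MeV}$, far below typical energy cuts for explicit secondary production.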

Do you have any suggestions?

Secondaries Terminology Clarifications

What exactly is meant by the "secondaries"? What is sec.energy, and why does it start nearly at the primary particle energy and then continuously decrease?

It seems most of the information in sec is a copy of the information in sec.particles (for example, sec.energy == [i.energy for i in sec.particles] returns True). Is there information about the propagating particle stored anywhere?

And what is meant by parent particle? Is there a relationship between sec.energy and sec.parent_particle_energy (for example, sec.energy[i] == sec.parent_particle_energy[i + 1] returns False)?

Vendor dependencies

PROPOSAL has one required dependency, log4cplus >= 2, and gtest is required for the tests.

log4cplus >= 2 is not available in any major package manager on current distributions.
It should therefore be vendored, and gtest should also be built as part of the project, not used from a global installation.

Two different approaches could be made:

  • Add the code to this repository:
    Pro: easy
    Con: bloats history and repository

  • Use git submodules

    • Pro: just a submodule
    • Con: not automatically packaged into the tar ball by GitHub, and cloning requires --recursive.
      The first can be dealt with by publishing source tar balls via Travis, the second through the README.

I think I would prefer the second approach.
Any solution based on CMake's ability to download dependencies should be avoided, since building would then require an internet connection.

Revise Logging in PROPOSAL

We want to use a different logger than log4cplus. More precisely, we want to provide some kind of interface that lets users plug in their own logger, which can be helpful or even a requirement for PROPOSAL to be used in the simulation chains of bigger experiments.

Documentation

The main page of the HTML documentation contains no links whatsoever to the other pages.
