
meneco's Introduction

meneco is part of bioasp, the BioASP software collection.

meneco's People

Contributors

aribad, cfrioux, davidjsherman, sthiele


meneco's Issues

Problem for reconstruction of some targets?

I wanted to run some tests with meneco and Ectodata. For faster computation, I reduced the set of targets to a single compound ("ARG" or "GLT"). According to the program it is not producible but can be reconstructed with the Ectodata MetaCyc DB. There are no essential reactions associated with the target, and there are also no reactions in the optimal solution because no ireactions are found.
This leads to the following output:

meneco -d Ectodata/ectocyc.sbml -s Ectodata/seeds.sbml -t Ectodata/single_target.sbml -r Ectodata/metacyc_16-5.sbml
Reading draft network ...
Draft network file: Ectodata/ectocyc.sbml
Reading seeds ...
Seeds file: Ectodata/seeds.sbml
Reading targets ...
Targets file: Ectodata/single_target.sbml

Checking draftnet for unproducible targets
1 unproducible targets:
	ARG

Reading repair db ...
Repair db file: Ectodata/metacyc_16-5.sbml

Warning: RXN__45__13206 listOfProducts=None
Checking draftnet + repairnet for unproducible targets
Still 0 unreconstructable targets:


1 reconstructable targets:
	ARG

Computing essential reactions for ARG
0 essential reactions for target ARG:


Overall 0 essential reactions found:


Adding essential reactions to network
Computing one minimal completion to produce all targets
/var/folders/76/1vd3yqtd5z1g3rg4t239wrrr0000gp/T/meneco__x018tes.lp
One minimal completion of size 0:


Computing common reactions in all completion with size 0
Intersection of cardinality minimal completions:


Computing union of reactions from all completion with size 0
Union of cardinality minimal completions:

If I test with another target, "CPD__45__8120", I get essential reactions, one solution, a union and an intersection. To be sure, I tested very old versions of meneco and the result is the same, so this is not caused by the recent changes to meneco.
I will continue to investigate; maybe I am missing something obvious, but I am opening the issue already in case you understand the behaviour and I don't.

Enable calling meneco from ASP data in addition to SBMLs

So far, importing meneco in a Python program requires SBML files as input.
It would be useful to be able to call the program with the draftnet, repairnet, seeds and targets already processed as ASP facts and atoms. That would help integration into other software.
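A sketch of what such an entry point could look like (the function name and fact format here are purely illustrative, not meneco's existing API):

```python
# Hypothetical sketch of the requested interface: accept networks already
# encoded as ASP facts (strings like 'reaction("r1","draft").') and assemble
# them into a fact program for the solver, skipping the SBML-parsing step.
# The name run_meneco_from_facts is an assumption, not meneco's actual API.
def run_meneco_from_facts(draftnet_facts, repairnet_facts, seed_facts, target_facts):
    """Combine pre-built ASP facts into one logic program string."""
    program = "\n".join(
        list(draftnet_facts)
        + list(repairnet_facts)
        + list(seed_facts)
        + list(target_facts)
    )
    # In a real implementation this program would be handed to the ASP
    # solver together with the query encodings; here we just return it.
    return program
```

This would let callers that already hold their networks as atoms (for example from another clyngor-based tool) bypass SBML entirely.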

Path not found error

When running from a notebook or the command line with Meneco v2.0.2, I get the following error:

> meneco -d draft.sbml -s seeds.sbml -t targets.sbml -r bigg_universal_v16.sbml
Reading draft network ...
Draft network file: draft.sbml
Reading seeds ...
Seeds file: seeds.sbml
Reading targets ...
Targets file: targets.sbml

Checking draftnet for unproducible targets ...
Traceback (most recent call last):
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\JanZ\Anaconda3\envs\meneco\Scripts\meneco.exe\__main__.py", line 7, in <module>
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\meneco\__main__.py", line 12, in main_meneco
    cmd_meneco(sys.argv[1:])
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\meneco\meneco.py", line 71, in cmd_meneco
    args.json,
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\meneco\meneco.py", line 137, in run_meneco
    model = query.get_unproducible(draftnet, seeds, targets)
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\meneco\query.py", line 43, in get_unproducible
    models = clyngor.solve(prg, options=options, use_clingo_module=False)
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\clyngor\solving.py", line 64, in solve
    files = tuple(map(cleaned_path, files) if clean_path else files)
  File "C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\clyngor\utils.py", line 272, in cleaned_path
    open(path)  # will raise FileExistsError
FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\JanZ\\Anaconda3\\envs\\meneco\\lib\\site-packages\\meneco\\query.py\\encodings\\unproducible_targets.lp'

It looks like an incorrectly formatted path string with 'query.py' in the middle, since the path 'C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\meneco\encodings\unproducible_targets.lp' does exist:

ls C:\\Users\\JanZ\\Anaconda3\\envs\\meneco\\lib\\site-packages\\meneco\\encodings


   Directory: C:\Users\JanZ\Anaconda3\envs\meneco\lib\site-packages\meneco\encodings


Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----         1/24/2023   6:51 PM           3402 card_min_completions_all_targets.lp
-a----         1/24/2023   6:51 PM           2967 card_min_completions_all_targets_with_bounds.lp
-a----         1/24/2023   6:51 PM           3321 completions_all_targets.lp
-a----         1/24/2023   6:51 PM            102 heuristic.lp
-a----         1/24/2023   6:51 PM           2088 ireactions.lp
-a----         1/24/2023   6:51 PM           1012 unproducible_targets.lp
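The traceback path suggests the .lp file is resolved by joining onto the module file itself rather than onto its containing directory. A sketch of the likely pattern and its fix (the helper name is illustrative, not meneco's actual code):

```python
import os

def encoding_path(module_file, name):
    """Resolve an encoding shipped alongside a module.

    The buggy pattern implied by the traceback joins onto the module *file*,
    e.g. os.path.join(__file__, 'encodings', name), which yields
    '...\\meneco\\query.py\\encodings\\unproducible_targets.lp'.
    Joining onto the module's directory instead gives the existing path.
    """
    here = os.path.dirname(os.path.abspath(module_file))
    return os.path.join(here, "encodings", name)
```

Inside `query.py` this would be called as `encoding_path(__file__, 'unproducible_targets.lp')`, producing the path that the `ls` output above confirms exists.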

Character encoding can cause issues in Meneco 2.0.0 installation, fixed by modifying setup.py

Installation of Meneco 2.0.0 was causing the following error on my system:

UnicodeDecodeError: 'charmap' codec can't decode byte 0x9d in position 2511: character maps to <undefined>

Fixed by specifying the encoding in setup.py: change long_description=open('README.md').read() to long_description=open('README.md', encoding="utf8").read().

I am opening this issue for any other users who may have run into the same problem during installation.
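The failure mode and the fix can be checked in isolation: without an explicit encoding, open() falls back to the platform's locale codec (often cp1252 on Windows), and byte 0x9d, which appears inside the UTF-8 encoding of curly quotes, is undefined in cp1252:

```python
import os
import tempfile

# A README containing a right double quote (U+201D) encodes in UTF-8 to
# the bytes E2 80 9D; byte 0x9D is undefined in cp1252, so reading the file
# back with the Windows locale codec raises the 'charmap' UnicodeDecodeError.
text = "curly quotes: \u201cexample\u201d"
path = os.path.join(tempfile.mkdtemp(), "README.md")
with open(path, "w", encoding="utf8") as f:
    f.write(text)

# The one-line fix from this issue: pass the encoding explicitly so the
# read is platform-independent.
long_description = open(path, encoding="utf8").read()
assert long_description == text
```

With the explicit `encoding="utf8"`, the read behaves identically on every platform regardless of locale.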

update docs

docs are still on version 1.5.0

  • update docs

Create JSON output incremental

Currently, all solutions are collected before the JSON output is printed. This leads to excessive memory consumption on data with many solutions. The JSON output should therefore be printed incrementally.
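A minimal sketch of the requested behaviour (illustrative, not meneco's actual code): emit each solution as it is enumerated instead of accumulating the full list first, so memory stays bounded by one solution at a time.

```python
import json
import sys

def write_solutions_incrementally(solutions, out=sys.stdout):
    """Write an iterable of solutions as a JSON array, one element at a time.

    Only the current solution is held in memory; the iterable can be a
    generator that enumerates answer sets lazily.
    """
    out.write("[")
    for i, solution in enumerate(solutions):
        if i:
            out.write(", ")
        json.dump(solution, out)  # serialize just this one solution
    out.write("]")
```

The output is still a single valid JSON array, so downstream consumers are unaffected.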

Locale error from gringo

Since the only gringo repository I could find hasn't been updated in 5 years, I'm just going to post this here.

When running meneco on xubuntu I get the following error:

OSError: got error -6 from gringo:
gringo3: loadlocale.c:130: _nl_intern_locale_data: Assertion `cnt < (sizeof (_nl_value_type_LC_TIME) / sizeof (_nl_value_type_LC_TIME[0]))' failed.

This is rather easily fixed by running export LC_ALL=C before starting Python, or by appending that to .bashrc, but that feels rather hacky. Do you know where this error might come from, and whether there is a way to fix it without changing my locale settings?
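For what it's worth, the override can also be scoped to a single invocation rather than exported shell-wide in .bashrc; the variable then applies only to that one process (python3 stands in for the meneco call here):

```shell
# Per-command locale override: LC_ALL is set only for the prefixed command,
# leaving the shell's own locale settings untouched.
LC_ALL=C python3 -c 'import os; print(os.environ["LC_ALL"])'
# prints: C
```

This avoids changing the locale for anything else running in the same shell.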

Cheers

Performance issues parsing large networks

Versions used:

clyngor    | 0.3.31
Meneco     | 2.0.1

This is probably more of a clyngor issue, but I'll post it here because this is where it affected me :)

I noticed a heavy performance drop when working with large models: an E. coli SBML file with ~4,207 reactions parsed in roughly 7 seconds, while a MetaCyc SBML file with 27,742 reactions (both containing some duplicates) took 10 minutes! In comparison, both together took roughly 5 seconds on meneco 1.5.3.

I used the smaller file to profile where the performance regression comes from

import cProfile
cProfile.run('readSBMLnetwork("ecoli.sbml", "draft")')

and noticed the following oddity (I'll spare you the rest of the output)

160214 function calls in 7.715 seconds

Ordered by: standard name

ncalls  tottime  percall  cumtime  percall filename:lineno(function)
...
24724    6.163    0.000    6.178    0.000 as_pyasp.py:78(add)
....

So a whopping 6 of the 7 seconds are spent in the TermSet.add method. I then ran a quick check on how this function scales with input size, and it appears to be superlinear (roughly quadratic overall).

(plot of TermSet.add runtime vs. input size, attached in the original issue)

I don't know exactly why frozensets are used there, but the line below creates two new ones every time the function is called, which probably explains the runtime:

def add(self, atom: Atom):
    self._terms |= frozenset({atom})
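The quadratic behaviour is easy to reproduce in isolation (illustrative code, not clyngor's):

```python
# Rebuilding a frozenset on every insertion copies all existing elements,
# so n insertions cost O(n^2) in total, whereas set.add is amortized O(1)
# per call and O(n) overall.
def add_via_frozenset(n):
    terms = frozenset()
    for i in range(n):
        terms |= frozenset({i})  # allocates a brand-new frozenset of size i + 1
    return terms

def add_via_set(n):
    terms = set()
    for i in range(n):
        terms.add(i)  # in-place update, no copying
    return frozenset(terms)

# Both build the same set; only the cost differs.
assert add_via_frozenset(1000) == add_via_set(1000)
```

Timing the two for growing n makes the quadratic-vs-linear difference obvious, which matches the profile above where nearly all the time is spent in TermSet.add.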

I don't know if you can work around this (I haven't checked your source yet) or if the clyngor folks would be open to changing it, but in the meantime I will have to downgrade our moped software, because some of our users ran into problems due to this.

Hope we can quickly find a fix for this, if I can help just let me know, I'd be happy to.

Cheers
