
clodius's Introduction


Introduction

HiGlass is a web-based viewer for datasets too large to view at once. It features synchronized navigation of multiple views as well as continuous zooming and panning for navigation across genomic loci and resolutions. It supports visual comparison of genomic (e.g., Hi-C, ChIP-seq, or bed annotations) and other data (e.g., geographic maps, gigapixel images, or abstract 1D and 2D sequential data) from different experimental conditions and can be used to efficiently identify salient outcomes of experimental perturbations, generate new hypotheses, and share the results with the community.

A live instance can be found at https://higlass.io. A Docker container is available for running an instance locally, although we recommend using the higlass-manage package to start, stop and configure local instances.

For documentation about how to use and install HiGlass, please visit https://docs.higlass.io.

Citation

Kerpedjiev, P., Abdennur, N., Lekschas, F., McCallum, C., Dinkla, K., Strobelt, H., ... & Gehlenborg, N. HiGlass: Web-based Visual Exploration and Analysis of Genome Interaction Maps. Genome Biology (2018): 19:125. https://doi.org/10.1186/s13059-018-1486-1

Example

Development

To run HiGlass from its source code, simply run the following:

npm clean-install   # use --legacy-peer-deps if you get peer dependency errors
npm run start

This starts a server in development mode at http://localhost:5173/.

Warning: The following examples need to be migrated to the latest build. Once started, a list of the examples can be found at http://localhost:8080/examples.html. Template viewconfs located at /docs/examples/viewconfs can be viewed directly at URLs such as http://localhost:8080/apis/svg.html?/viewconfs/overlay-tracks.json.

Tests

The tests for the React components and API functions are located in the test directory. Tests are run with web-test-runner; you can learn more about its CLI in the documentation.

Useful commands:

  • Run all tests: npm test
  • Run all tests in interactive "watch" mode: npm test -- --watch
  • Run a specific test or "glob" of tests: npm test -- test/APITests.js [--watch]
  • Manually run individual tests in an open browser window: npm test -- --manual

Troubleshooting:

  • If the installation fails due to sharp > node-gyp, try installing the node packages using Python 2:

    npm ci --python=/usr/bin/python2 && rm -rf node_modules/node-sass && npm ci
    

API

HiGlass provides an API for controlling the component from JavaScript. Below is a minimal working example to get started; the complete documentation is available at docs.higlass.io.

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Minimal Working Example &middot; HiGlass</title>
  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
  <link rel="stylesheet" href="https://unpkg.com/[email protected]/dist/hglib.css">

  <style type="text/css">
    html, body {
      width: 100vw;
      height: 100vh;
      overflow: hidden;
    }
  </style>

  <script crossorigin src="https://unpkg.com/react@16/umd/react.production.min.js"></script>
  <script crossorigin src="https://unpkg.com/react-dom@16/umd/react-dom.production.min.js"></script>
  <script crossorigin src="https://unpkg.com/pixi.js@5/dist/pixi.min.js"></script>
  <script crossorigin src="https://unpkg.com/[email protected]/dist/react-bootstrap.min.js"></script>
  <script crossorigin src="https://unpkg.com/[email protected]/dist/hglib.min.js"></script>
</head>
<body>
<script>
const hgApi = window.hglib.viewer(
  document.body,
  'https://higlass.io/api/v1/viewconfs/?d=default',
  { bounded: true },
);
</script>
</body>
</html>

Related

diagram of related tools

  • HiGlass Clodius - Package that provides implementations for aggregation and tile generation for many common 1D and 2D data types
  • HiGlass Python - Python bindings to HiGlass for tile serving, view config generation, and Jupyter Notebook + Lab integration.
  • HiGlass Manage - Easy to use interface for deploying a local HiGlass instance
  • HiGlass Docker - Build an image containing all the components necessary to deploy HiGlass
  • HiGlass Server - Server component for serving multi-resolution data
  • HiGlass App - The code for the web application hosted at https://higlass.io
  • Cooler - Package for efficient storage of and access to sparse 2D data

License

HiGlass is provided under the MIT License.

clodius's People

Contributors

101arrowz, alexander-veit, alexpreynolds, flekschas, keller-mark, liz-is, lmercadom, manzt, maxwellsh, mccalluc, mildewey, ngehlenborg, nvictus, pkerpedjiev, scottx611x, zhqu1148980644


clodius's Issues

mrmatrix: col/row header handling

Pete was considering symmetric matrices, so it actually only keeps the column header, I think.

  • Not all matrices are symmetric: Consider saving row headers, too.
  • Not all matrices have row/column labels: For raster images it's just a distraction. Make optional?
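
For illustration, here is a minimal h5py sketch of what optional row and column labels could look like. The dataset names (col_labels, row_labels) are hypothetical, not the actual mrmatrix layout:

    import h5py
    import numpy as np

    def write_labels(path, col_labels=None, row_labels=None):
        """Store optional row/column labels next to the matrix values.
        Both are skipped for label-free data such as raster images."""
        with h5py.File(path, "a") as f:
            dt = h5py.special_dtype(vlen=str)
            if col_labels is not None:
                f.create_dataset("col_labels",
                                 data=np.array(col_labels, dtype=object), dtype=dt)
            if row_labels is not None:
                # Asymmetric matrices need their own row labels.
                f.create_dataset("row_labels",
                                 data=np.array(row_labels, dtype=object), dtype=dt)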

Pypi in documentation

Between using the Docker image and working with the source, is there a story for the README about pulling this from PyPI? Pushing to PyPI is mentioned at the end, but it's not clear to me what depends on pulling this.

Release process: Docker + Pypi?

Currently, Docker is using the same strategy as higlass-server, and new images are being pushed to docker-hub on successful travis runs. It might make sense to get these releases in sync with the pypi release process described at the end of the README. Maybe they both should be automatic, or they both should be manual? Or not.

Dependencies are not pulled automatically

> pip install clodius

Collecting clodius
  Downloading clodius-0.7.4.tar.gz (72kB)
    100% |████████████████████████████████| 81kB 3.8MB/s 
    ...
    pkg_resources.DistributionNotFound: The 'pyBigWig' distribution was not found and is required by the application
    
    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/1q/43l11bl1573fx_sb848qnwmr0000gp/T/pip-build-aunst0qk/clodius/```

Getting h5py TypeError exceptions when converting epilogos BED to multivec

In preparation for ingesting epilogos tilesets, I tried converting two test epilogos BED files to multivec format using conda (Python v3.6.4) and clodius v0.7.4. The host is running Ubuntu Xenial v16.04.

I get two different types of TypeError exceptions when attempting conversion, which appear to be thrown by h5py.

Here is the first test dataset:

$ wget -qO- https://epilogos.altiusinstitute.org/assets/epilogos/v06_16_2017/hg19/15/group/all.KL.bed.gz > /tmp/all.KL.bed.gz

Here is the row-infos text file I am using:

$ cat > ~/epilogos_hg38_observed_states.txt
Active TSS
Flanking Active TSS
Transcription at gene 5p and 3p
Strong transcription
Weak transcription
Genic enhancers
Enhancers
ZNF genes + repeats
Heterochromatin
Bivalent/Poised TSS
Flanking Bivalent TSS/Enh
Bivalent Enhancer
Repressed PolyComb
Weak Repressed PolyComb
Quiescent/Low

I installed conda via Anaconda:

$ wget https://repo.anaconda.com/archive/Anaconda3-5.1.0-Linux-x86_64.sh
$ bash ./Anaconda3-5.1.0-Linux-x86_64.sh
$ python --version
Python 3.6.4 :: Anaconda, Inc.

I then installed libz and pybigwig dependencies, and then installed clodius from the develop branch:

$ sudo apt-get install zlib1g-dev
$ conda update -n base conda
$ conda install -c bioconda pybigwig
$ cd ~/github/hms-dbmi
$ git clone https://github.com/hms-dbmi/clodius.git
$ cd clodius
$ git branch
* develop
$ python setup.py develop
...
$ which clodius
/home/ubuntu/anaconda3/bin/clodius

When I attempt to convert the epilogos file all.KL.bed.gz, here is the error message I get:

$ clodius convert bedfile_to_multivec /tmp/all.KL.bed.gz \
--assembly hg38 \
--starting-resolution 200 \
--row-infos-filename /home/ubuntu/epilogos_hg38_observed_states.txt \
--num-rows 15 \
--format epilogos
temporary dir: /tmp/tmpjrk4ruzt
dumping batch: chr1 100000
dumping batch: chr1 200000
dumping batch: chr1 300000
dumping batch: chr1 400000
dumping batch: chr1 500000
dumping batch: chr1 600000
dumping batch: chr1 700000
dumping batch: chr1 800000
dumping batch: chr1 900000
dumping batch: chr1 1000000
dumping batch: chr1 1100000
dumping batch: chr1 1200000
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/bin/clodius", line 11, in <module>
    load_entry_point('clodius', 'console_scripts', 'clodius')()
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/home/ubuntu/github/hms-dbmi/clodius/clodius/cli/convert.py", line 227, in bedfile_to_multivec
    format, row_infos_filename, tile_size)
  File "/home/ubuntu/github/hms-dbmi/clodius/clodius/cli/convert.py", line 101, in _bedgraph_to_multivec
    starting_resolution, has_header, chunk_size);
  File "/home/ubuntu/github/hms-dbmi/clodius/clodius/multivec.py", line 43, in bedfile_to_multivec
    f_out[prev_chrom][batch_start_index:batch_start_index+len(batch)] = np.array(batch)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/dataset.py", line 631, in __setitem__
    for fspace in selection.broadcast(mshape):
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/selections.py", line 299, in broadcast
    raise TypeError("Can't broadcast %s -> %s" % (target_shape, count))
TypeError: Can't broadcast (46253, 15) -> (44782, 15)

I get the same TypeError exception and error message thrown if I use virtualenv to run clodius, in place of conda.

The second test dataset is derived from the first, where I excise chrX and try to convert just that chromosome to a tileset:

$ conda install -c bioconda bedops
$ gunzip -c /tmp/all.KL.bed.gz > /tmp/all.KL.bed
$ bedextract chrX /tmp/all.KL.bed > /tmp/all.KL.chrX.bed
$ gzip -c /tmp/all.KL.chrX.bed > /tmp/all.KL.chrX.bed.gz
$ clodius convert bedfile_to_multivec /tmp/all.KL.chrX.bed.gz \
--assembly hg38 \
--starting-resolution 200 \
--row-infos-filename /home/ubuntu/epilogos_hg38_observed_states.txt \
--num-rows 15 \
--format epilogos
temporary dir: /tmp/tmpiw7ebd50
dumping batch: chrX 100000
dumping batch: chrX 200000
dumping batch: chrX 300000
dumping batch: chrX 400000
dumping batch: chrX 500000
dumping batch: chrX 600000
dumping batch: chrX 700000
output_file: /tmp/all.KL.chrX.bed.multires.mv5
creating new dataset
array_data.shape (1244782, 15)
copy start: 0 100000
copy start: 100000 100000
copy start: 200000 100000
copy start: 300000 100000
copy start: 400000 100000
copy start: 500000 100000
copy start: 600000 100000
copy start: 700000 100000
copy start: 800000 100000
copy start: 900000 100000
copy start: 1000000 100000
copy start: 1100000 100000
copy start: 1200000 100000
…
creating new dataset
array_data.shape (4, 15)
copy start: 0 4
start: 0
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/bin/clodius", line 11, in <module>
    load_entry_point('clodius', 'console_scripts', 'clodius')()
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/home/ubuntu/github/hms-dbmi/clodius/clodius/cli/convert.py", line 227, in bedfile_to_multivec
    format, row_infos_filename, tile_size)
  File "/home/ubuntu/github/hms-dbmi/clodius/clodius/cli/convert.py", line 124, in _bedgraph_to_multivec
    row_infos=row_infos)
  File "/home/ubuntu/github/hms-dbmi/clodius/clodius/multivec.py", line 257, in create_multivec_multires
    f['resolutions'][str(curr_resolution)]['values'][chrom][start/2:start/2+chunk_size/2] = new_data
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/dataset.py", line 609, in __setitem__
    selection = sel.select(self.shape, args, dsid=self.id)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/selections.py", line 94, in select
    sel[args]
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/selections.py", line 261, in __getitem__
    start, count, step, scalar = _handle_simple(self.shape,args)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/selections.py", line 447, in _handle_simple
    x,y,z = _translate_slice(arg, length)
  File "/home/ubuntu/anaconda3/lib/python3.6/site-packages/h5py/_hl/selections.py", line 480, in _translate_slice
    start, stop, step = exp.indices(length)
TypeError: slice indices must be integers or None or have an __index__ method

In both cases, it appears the exception is thrown from the h5py library. I have v2.7.1 of this library installed.

I was wondering if there is a specific version of this library I should use, or other changes I should make to my Python environment, that could help these conversions succeed.
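
For what it's worth, the second traceback points at the slice start/2:start/2+chunk_size/2 in multivec.py: under Python 3 the / operator returns a float, and h5py slice indices must be integers. A minimal sketch of the kind of change that avoids this (floor division; the variables below are stand-ins, not the real clodius code):

    import numpy as np

    start, chunk_size = 0, 8
    new_data = np.zeros((chunk_size // 2, 15))
    dataset = np.zeros((100, 15))

    # Fails on Python 3: start/2 is a float, and slice indices must be integers.
    # dataset[start/2:start/2 + chunk_size/2] = new_data

    # Works: floor division keeps the slice bounds as ints.
    dataset[start // 2:start // 2 + chunk_size // 2] = new_data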

installation error

Hey guys, I tried to install clodius on my computer using pip and could not finish the installation. Could you help solve the problem? Thanks! Here are the error messages:
pip install clodius
Collecting clodius
Requirement already satisfied: cooler>=0.8.5 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (0.8.6.post0)
Requirement already satisfied: slugid in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (2.0.0)
Requirement already satisfied: numpy in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (1.17.2)
Requirement already satisfied: pandas in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (0.25.1)
Requirement already satisfied: dask in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (2.5.2)
Requirement already satisfied: nose in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (1.3.7)
Requirement already satisfied: Click>=7 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (7.0)
Requirement already satisfied: h5py in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (2.9.0)
Requirement already satisfied: negspy in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (0.2.23)
Collecting pybbi>=0.2.0 (from clodius)
Using cached https://files.pythonhosted.org/packages/ef/39/d1dab1bd79e118237a8de87c12b7a8c99964c005de5ca31ee4f60683099e/pybbi-0.2.0.tar.gz
Requirement already satisfied: sortedcontainers in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (2.1.0)
Requirement already satisfied: pysam in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (0.15.3)
Requirement already satisfied: requests in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from clodius) (2.22.0)
Requirement already satisfied: asciitree in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (0.3.3)
Requirement already satisfied: pypairix in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (0.3.7)
Requirement already satisfied: scipy>=0.16 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (1.3.1)
Requirement already satisfied: pyyaml in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (5.1.2)
Requirement already satisfied: pyfaidx in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (0.5.5.2)
Requirement already satisfied: biopython in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (1.75)
Requirement already satisfied: six in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (1.12.0)
Requirement already satisfied: cytoolz in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (0.10.0)
Requirement already satisfied: multiprocess in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cooler>=0.8.5->clodius) (0.70.9)
Requirement already satisfied: pytz>=2017.2 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from pandas->clodius) (2019.3)
Requirement already satisfied: python-dateutil>=2.6.1 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from pandas->clodius) (2.8.0)
Requirement already satisfied: certifi>=2017.4.17 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from requests->clodius) (2019.9.11)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from requests->clodius) (3.0.4)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from requests->clodius) (1.24.2)
Requirement already satisfied: idna<2.9,>=2.5 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from requests->clodius) (2.8)
Requirement already satisfied: setuptools>=0.7 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from pyfaidx->cooler>=0.8.5->clodius) (41.4.0)
Requirement already satisfied: toolz>=0.8.0 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from cytoolz->cooler>=0.8.5->clodius) (0.10.0)
Requirement already satisfied: dill>=0.3.1 in /Users/jiangxu/anaconda3/lib/python3.7/site-packages (from multiprocess->cooler>=0.8.5->clodius) (0.3.1.1)
Building wheels for collected packages: pybbi
Building wheel for pybbi (setup.py) ... error
ERROR: Command errored out with exit status 1:
command: /Users/jiangxu/anaconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/setup.py'"'"'; file='"'"'/private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' bdist_wheel -d /private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-wheel-4i3t_8eq --python-tag cp37
cwd: /private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/
...
1 error generated.
make[1]: *** [udc.o] Error 1
make: *** [src/x86_64/libkent.a] Error 2
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/setup.py", line 152, in <module>
    'build_ext': build_ext
  File "/Users/jiangxu/anaconda3/lib/python3.7/site-packages/setuptools/__init__.py", line 145, in setup
    return distutils.core.setup(**attrs)
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/dist.py", line 966, in run_commands
    self.run_command(cmd)
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/Users/jiangxu/anaconda3/lib/python3.7/site-packages/setuptools/command/install.py", line 61, in run
    return orig.install.run(self)
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/command/install.py", line 545, in run
    self.run_command('build')
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/Users/jiangxu/anaconda3/lib/python3.7/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/setup.py", line 86, in run
    check_call(['make', 'build-c'])
  File "/Users/jiangxu/anaconda3/lib/python3.7/subprocess.py", line 347, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['make', 'build-c']' returned non-zero exit status 2.
----------------------------------------
ERROR: Command errored out with exit status 1: /Users/jiangxu/anaconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/setup.py'"'"'; file='"'"'/private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-install-wuedhxyq/pybbi/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /private/var/folders/yy/87g5h_vd1s51f7ghkv2y9bnw0000gn/T/pip-record-a7uczv4x/install-record.txt --single-version-externally-managed --compile Check the logs for full command output.

Error Clodius aggregate: RuntimeError: Invalid interval bounds!

Hi ,
I am trying to aggregate a bigWig and clodius send me the following error:

position: 167772161 progress: 0.05 elapsed: 106.55 remaining: 1885.81
len(data_buffers[curr_zoom]) 16777216
positions[curr_zoom]: 167772160
Traceback (most recent call last):
  File "/home/samuel/programs/bin/clodius", line 11, in <module>
    load_entry_point('clodius==0.7.0', 'console_scripts', 'clodius')()
  File "/usr/local/src/anaconda2/lib/python2.7/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/src/anaconda2/lib/python2.7/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/usr/local/src/anaconda2/lib/python2.7/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/src/anaconda2/lib/python2.7/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/src/anaconda2/lib/python2.7/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/src/anaconda2/lib/python2.7/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "build/bdist.linux-x86_64/egg/clodius/cli/aggregate.py", line 1104, in bigwig
  File "build/bdist.linux-x86_64/egg/clodius/cli/aggregate.py", line 669, in _bigwig
RuntimeError: Invalid interval bounds!

(it is still on chr1).
Do you have any idea what it could be? My bigwig seems fine with other programs...
Thanks!

error installation

I am getting this error while trying to install clodius.

sudo pip install clodius

Collecting clodius
Downloading https://files.pythonhosted.org/packages/7f/f4/9409fa3e1917cd8de859943de7dd77286391873c7aa0f55bccfdfad86cb2/clodius-0.9.3.tar.gz (90kB)
100% |████████████████████████████████| 92kB 2.1MB/s
Requirement already satisfied: cython in /usr/local/lib64/python3.6/site-packages (from clodius) (0.28.5)
Requirement already satisfied: numpy in /usr/local/lib64/python3.6/site-packages (from clodius) (1.15.1)
Requirement already satisfied: negspy in /usr/local/lib/python3.6/site-packages (from clodius) (0.2.20)
Requirement already satisfied: pysam in /usr/local/lib64/python3.6/site-packages (from clodius) (0.15.0)
Requirement already satisfied: requests in /usr/lib/python3.6/site-packages (from clodius) (2.18.4)
Requirement already satisfied: h5py in /usr/local/lib64/python3.6/site-packages (from clodius) (2.8.0)
Requirement already satisfied: pandas in /usr/local/lib64/python3.6/site-packages (from clodius) (0.23.4)
Requirement already satisfied: slugid in /usr/local/lib/python3.6/site-packages (from clodius) (1.0.7)
Requirement already satisfied: sortedcontainers in /usr/local/lib/python3.6/site-packages (from clodius) (2.0.5)
Requirement already satisfied: nose in /usr/local/lib/python3.6/site-packages (from clodius) (1.3.7)
Requirement already satisfied: Click in /usr/local/lib64/python3.6/site-packages (from clodius) (6.7)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/lib/python3.6/site-packages (from requests->clodius) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in /usr/lib/python3.6/site-packages (from requests->clodius) (2.5)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in /usr/lib/python3.6/site-packages (from requests->clodius) (1.22)
Requirement already satisfied: six in /usr/lib/python3.6/site-packages (from h5py->clodius) (1.11.0)
Requirement already satisfied: pytz>=2011k in /usr/lib/python3.6/site-packages (from pandas->clodius) (2017.2)
Requirement already satisfied: python-dateutil>=2.5.0 in /usr/local/lib/python3.6/site-packages (from pandas->clodius) (2.7.3)
Installing collected packages: clodius
Running setup.py install for clodius ... error
Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;file='/tmp/pip-install-t95klw2h/clodius/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-record-vpzh88ol/install-record.txt --single-version-externally-managed --compile:
packages: ['clodius', 'scripts', 'test', 'clodius.cli']
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.6
creating build/lib.linux-x86_64-3.6/clodius
copying clodius/describe_dataset.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/save_tiles.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/__init__.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/tiles.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/hdf_tiles.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/fpark.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/higlass_getter.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/chromosomes.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/db_tiles.py -> build/lib.linux-x86_64-3.6/clodius
copying clodius/multivec.py -> build/lib.linux-x86_64-3.6/clodius
creating build/lib.linux-x86_64-3.6/clodius/cli
copying clodius/cli/utils.py -> build/lib.linux-x86_64-3.6/clodius/cli
copying clodius/cli/__init__.py -> build/lib.linux-x86_64-3.6/clodius/cli
copying clodius/cli/convert.py -> build/lib.linux-x86_64-3.6/clodius/cli
copying clodius/cli/aggregate.py -> build/lib.linux-x86_64-3.6/clodius/cli
running build_ext
building 'clodius.fast' extension
creating build/temp.linux-x86_64-3.6
creating build/temp.linux-x86_64-3.6/clodius
gcc -pthread -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -mcet -fcf-protection -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python3.6m -I/usr/local/lib64/python3.6/site-packages/numpy/core/include -I/usr/local/lib64/python3.6/site-packages/numpy/core/include -c clodius/fast.c -o build/temp.linux-x86_64-3.6/clodius/fast.o
clodius/fast.c:24:10: fatal error: Python.h: No such file or directory
#include "Python.h"
^~~~~~~~~~
compilation terminated.
error: command 'gcc' failed with exit status 1

----------------------------------------

Command "/usr/bin/python3 -u -c "import setuptools, tokenize;file='/tmp/pip-install-t95klw2h/clodius/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-record-vpzh88ol/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-install-t95klw2h/clodius/

Clodius aggregate with custom assembly: TypeError: Can't broadcast

Hi there,

I'm working with Drosophila data, aligned to dm6 from Flybase, which is in an Ensembl-like format (i.e., no 'chr' prefix). Because negspy only has UCSC-like assemblies included, using --assembly dm6 I get errors like KeyError: 'X'.

So, I'm using --chromsizes-filename to specify a file that contains chrom sizes for my genome version and for only the main chromosomes, since my bedgraph has already been filtered to have only the main chromosomes. Here's the command I'm running and the output:

clodius aggregate bedgraph test_Rep1_10kb_corrected_pc.eigenvector.bed \
--output-file test_Rep1_10kb_corrected_pc.eigenvector.hitile \
--chromosome-col 1 --from-pos-col 2 --to-pos-col 3 --value-col 5 \
--chromsizes-filename dm6_chrom_sizes_sanitized.txt  --nan-value nan --no-header
output file: test_Rep1_10kb_corrected_pc.eigenvector.hitile
assembly_size: 137547960
assembly: hg19
assembly size (max-length) 137547960
max-width 268435456
max_zoom: 18
chunk-size: 16777216
chrom-order [b'2L' b'2R' b'3L' b'3R' b'4' b'X' b'Y']
len(values): 110458336 16777216
line: X	1	120000	A	0.0	.

position: 1 progress: 0.00 elapsed: 8.87 remaining: 1220465716.46
len(data_buffers[curr_zoom]) 16777216
positions[curr_zoom]: 0
len(values): 93681120 16777216
line: X	1	120000	A	0.0	.

[some output removed]

Traceback (most recent call last):
  File "/home/research/vaquerizas/liz/test/env/bin/clodius", line 11, in <module>
    load_entry_point('clodius==0.10.8', 'console_scripts', 'clodius')()
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/clodius/cli/aggregate.py", line 1322, in bedgraph
    chromsizes_filename, zoom_step)
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/clodius/cli/aggregate.py", line 938, in _bedgraph
    values[:chunk_size], nan_values[:chunk_size]
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/clodius/cli/aggregate.py", line 842, in add_values_to_data_buffers
    dsets[curr_zoom][curr_pos:curr_pos+chunk_size] = curr_chunk
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/h5py/_hl/dataset.py", line 707, in __setitem__
    for fspace in selection.broadcast(mshape):
  File "/home/research/vaquerizas/liz/test/env/lib/python3.7/site-packages/h5py/_hl/selections.py", line 299, in broadcast
    raise TypeError("Can't broadcast %s -> %s" % (target_shape, self.mshape))
TypeError: Can't broadcast (16777216,) -> (3330232,)

Any suggestions would be appreciated! I was wondering if this is also related to #87 ?
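
Not a definitive diagnosis, but the traceback suggests a fixed-size chunk (16777216 values) is being written into a dataset that is shorter than the chunk for this small genome. A generic numpy sketch of clamping a write to the destination length; the names are illustrative, not the actual aggregate.py code:

    import numpy as np

    def write_chunk(dset, curr_pos, curr_chunk):
        """Clamp the write so it never extends past the destination dataset,
        avoiding the "Can't broadcast (N,) -> (M,)" TypeError seen above
        (chunk of 16777216 values vs. a 3330232-long dataset)."""
        n = min(len(curr_chunk), dset.shape[0] - curr_pos)
        dset[curr_pos:curr_pos + n] = curr_chunk[:n]
        return curr_pos + n

    # Plain numpy arrays standing in for the h5py datasets:
    dest = np.zeros(10)
    chunk = np.ones(16)
    write_chunk(dest, 0, chunk)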

Make sure rectangular data is not truncated

tsv_to_mrmatrix.py:

    if top_n is None:
        top_n = len(parts) - 1
        # TODO: If it's taller than it is wide, it will be truncated to a square,
        # unless an explicit top_n is provided.

mrmatrix: coarsen should create nan_values at each resolution?

@pkerpedjiev : Am I understanding it correctly that there should be a nan_values dataset at each resolution? It is not producing that right now. Without that, it would still need to consult the large matrix at resolution 1 to calculate averages. Assign back to me if there's work to do.

tsv_to_mrmatrix_test.py:

class ParseTest(unittest.TestCase):
    def test_parse(self):
        ...
        self.assertEqual(list(hdf5['resolutions']['1'].keys()), ['nan_values', 'values'])
        ...
        self.assertEqual(list(hdf5['resolutions']['2'].keys()), ['values'])
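
As a rough illustration of the idea (not the existing clodius code), here is a numpy sketch in which NaN counts are coarsened alongside the values, so every resolution gets its own nan_values and the full-resolution matrix never has to be re-read:

    import numpy as np

    def coarsen_with_nan_counts(values, nan_counts):
        """Halve both dimensions by summing 2x2 blocks, carrying the NaN
        counts along to the coarser resolution."""
        v = np.nan_to_num(values)
        coarse_values = (v[0::2, 0::2] + v[1::2, 0::2]
                         + v[0::2, 1::2] + v[1::2, 1::2])
        coarse_nans = (nan_counts[0::2, 0::2] + nan_counts[1::2, 0::2]
                       + nan_counts[0::2, 1::2] + nan_counts[1::2, 1::2])
        return coarse_values, coarse_nans

    values = np.arange(16.0).reshape(4, 4)
    nan_counts = np.isnan(values).astype(int)
    v2, n2 = coarsen_with_nan_counts(values, nan_counts)  # resolution 2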

docker run example in README does not work for me

(clodius) clodius$ curl https://raw.githubusercontent.com/hms-dbmi/clodius/develop/test/sample_data/geneAnnotationsExonsUnions.short.bed \
   > /tmp/clodius/input/sample.short.bed

(clodius) clodius$ docker stop clodius
Error response from daemon: No such container: clodius

(clodius) clodius$ docker pull gehlenborglab/clodius
Using default tag: latest
latest: Pulling from gehlenborglab/clodius
....
Digest: sha256:69c09cc20bdfa0d102a7888d502713a9cd0e16fb9637250dd5f6bdb3dc5c1ef3
Status: Downloaded newer image for gehlenborglab/clodius:latest

(clodius) clodius$ docker run -v /tmp/clodius/input:/tmp/ \
>            gehlenborglab/clodius \
>            clodius aggregate bigwig /tmp/file.bigwig
Traceback (most recent call last):
  File "/usr/local/bin/clodius", line 11, in <module>
    load_entry_point('clodius==0.9.4', 'console_scripts', 'clodius')()
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "build/bdist.linux-x86_64/egg/clodius/cli/aggregate.py", line 1372, in bigwig
NameError: global name '_bigwig' is not defined

Warning from h5py

I got this warning while running clodius aggregate bigwig

/usr/local/lib/python2.7/dist-packages/h5py/__init__.py:36: 
FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` 
is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters

Keep clodius docs under clodius

The clodius docs are currently in the higlass repo. They should be moved here, and a sphinx build should be set up, with the results pushed to an S3 bucket. Once it's in place, at least one inbound link should be fixed.

@pkerpedjiev : Could you confirm that this is what you want? I'm also wondering if that complexity is really necessary. I guess you wanted a distinction between the release docs and the under-construction docs, but I feel that just looking at the markdown on github on the branch you prefer does that well enough.

Split documentation into smaller chunks

Follow up for #70

   Non-genomic raster data
   Genomic data
      Bed Files
      Bedpe-like Files
      BedGraph files
      bigWig files
      Chromosome Sizes
      Gene Annotation Tracks
      Hitile files
      Cooler files
      Multivec Files (Maybe multivec is a single page?)
         Epilogos Data (multivec)
         States Data (multivec)
         Other Data (multivec)

Validate mrmatrix on load

  • Confirm that resolutions are powers of two
  • Confirm that higher resolutions have smaller dimensions
  • Not sure what's up with nan_values: #62
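
A minimal sketch of the first two checks, assuming the resolutions/<r>/values layout mentioned elsewhere in this tracker (the nan_values question is left open, see #62):

    import h5py

    def validate_mrmatrix(path):
        """Sanity-check an mrmatrix file on load."""
        with h5py.File(path, "r") as f:
            resolutions = sorted(int(r) for r in f["resolutions"])
            for r in resolutions:
                # Resolutions should be powers of two.
                assert r & (r - 1) == 0, "resolution %d is not a power of two" % r
            shapes = [f["resolutions"][str(r)]["values"].shape for r in resolutions]
            for prev, curr in zip(shapes, shapes[1:]):
                # Coarser (higher) resolutions should have smaller dimensions.
                assert all(c <= p for c, p in zip(curr, prev)), (
                    "resolution shapes do not shrink: %s -> %s" % (prev, curr))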

libcurl-devel is required for install

Looks like pip install clodius tried to install pyBigWig but it couldn't finish without the libcurl dev package. I installed libcurl3-dev and it works, though it looks like it's a known issue on pyBigWig (deeptools/pyBigWig#9)

Collecting clodius
  Downloading clodius-0.7.4.tar.gz (72kB)
    100% |████████████████████████████████| 81kB 3.5MB/s 
    Complete output from command python setup.py egg_info:
    Either libcurl isn't installed, it didn't come with curl-config, or curl-config isn't in your $PATH. pyBigWig will be installed without support for remote files.
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'classifier'
      warnings.warn(msg)
    ('packages:', ['clodius', 'clodius.cli'])
    pyBigWig.c: In function ‘canAppend’:
    pyBigWig.c:911:11: warning: unused variable ‘foo’ [-Wunused-variable]
         void *foo;
               ^~~
    pyBigWig.c: In function ‘PyAddIntervals’:
    pyBigWig.c:1004:11: warning: unused variable ‘foo’ [-Wunused-variable]
         void *foo;
               ^~~
    pyBigWig.c:998:15: warning: unused variable ‘tmp’ [-Wunused-variable]
         PyObject *tmp;
                   ^~~
    /usr/bin/ld: cannot find -lcurl
    collect2: error: ld returned 1 exit status
    /usr/bin/ld: cannot find -lcurl
    collect2: error: ld returned 1 exit status

Error in aggregate: slugid no longer needs decoding

Hi there,
I'm getting the following error trying to aggregate a bedpe file:

output_file: consensus_loops_all_peaks.hitile
plain
Traceback (most recent call last):
  File "/home/lingsim/.linuxbrew/bin/clodius", line 10, in <module>
    sys.exit(cli())
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/clodius/cli/aggregate.py", line 1548, in bedpe
    chr2_col=chr2_col-1, from2_col=from2_col-1, to2_col=to2_col-1
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/clodius/cli/aggregate.py", line 261, in _bedpe
    entries = [line_to_dict(first_line)]
  File "/home/lingsim/.linuxbrew/opt/python/lib/python3.7/site-packages/clodius/cli/aggregate.py", line 216, in line_to_dict
    d['uid'] = slugid.nice().decode('utf-8')
AttributeError: 'str' object has no attribute 'decode'

This appears to be because slugid recently changed to return a string, so it no longer needs decoding.

(I'm happy to make a pull request with a fix if it's as simple as removing the decode, but I'm not familiar with what's necessary for compatibility with both Python2 and Python3, if that's a concern here.)
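
A small, version-tolerant sketch of the change described above (illustrative only; the actual fix in clodius may simply drop the decode):

    import slugid

    def nice_uid():
        """slugid.nice() returned bytes in older releases and str in newer
        ones; decode only when needed so both work."""
        uid = slugid.nice()
        return uid.decode("utf-8") if isinstance(uid, bytes) else uid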

aggregate bedgraph keeps being killed

I'm running the command

clodius aggregate bedgraph data/bedgraph_chrx_2e6_5e6.bg.bedgraph.sorted --output-file data/bedgraph_chrx_2e6_5e6.bg.bedgraph.out --chromosome-col 1 --from-pos-col 2 --to-pos-col 3 --value-col 4 --assembly hg19 --nan-value NA --chromsizes-filename negspy/negspy/data/hg19/chromSizes.tsv

and it appears to start up properly every time:

output file: data/bedgraph_chrx_2e6_5e6.bg.bedgraph.out
assembly_size: 3095693983
assembly: hg19
assembly size (max-length) 3095693983
max-width 4294967296
max_zoom: 22
chunk-size: 16777216
chrom-order [b'chr1' b'chr2' b'chr3' b'chr4' b'chr5' b'chr6' b'chr7' b'chr8' b'chr9'
 b'chr10' b'chr11' b'chr12' b'chr13' b'chr14' b'chr15' b'chr16' b'chr17'
 b'chr18' b'chr20' b'chr19' b'chr21' b'chr22' b'chrX' b'chrY' b'chrM']

But then it just returns "killed" and exits. How do I fix this/tell what is going wrong?

Support bounds for mrmatrix

Pete says:

In the future, it should work like in npmatrix. The reasoning: a matrix, as implemented programmatically, is necessarily a 2D array whose indices go from (0, 0) to (len-1, len-1). Imagine that we want that matrix to represent rasterized values for a geographical region. Bounds would indicate the upper-left and lower-right corner latitude/longitude pairs for the rasterized region. So you could have a 1000x1000 matrix represent a region that goes from (10 degrees, 10 degrees) to (10.1 degrees, 10.1 degrees), for example.
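
A sketch of how such bounds might be stored and used, with made-up attribute names (this is only the idea from the comment above, not the npmatrix or mrmatrix format):

    import h5py

    def write_bounds(path, min_x, min_y, max_x, max_y):
        # Hypothetical attribute holding the geographic extent of the matrix.
        with h5py.File(path, "a") as f:
            f.attrs["bounds"] = [min_x, min_y, max_x, max_y]

    def index_to_coord(i, j, shape, bounds):
        """Map matrix indices (0..len-1) to coordinates inside the bounds,
        e.g. a 1000x1000 matrix covering (10, 10) to (10.1, 10.1) degrees."""
        min_x, min_y, max_x, max_y = bounds
        x = min_x + (max_x - min_x) * j / shape[1]
        y = min_y + (max_y - min_y) * i / shape[0]
        return x, y

    print(index_to_coord(500, 500, (1000, 1000), (10.0, 10.0, 10.1, 10.1)))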

'Invalid line' message when aggregating bed file

Hi,

I am trying to use clodius aggregate bedfile on a bed file I made myself, and I keep getting the following message printed to stdout:

Invalid line: 1	15543353	15543354
Invalid line: 1	31706778	31706779

Invalid line: 1	31732787	31732788

Invalid line: 1	31872749	31872750

Invalid line: 1	31920245	31920246

Invalid line: 1	33496638	33496639

Invalid line: 1	47684932	47684933

Invalid line: 1	47695094	47695095

this repeats for every line in my bed file.

Does this have to do with the format of my file? I am giving it the --delimiter '\t' option. When I use $'\t' as suggested in the HiGlass manual, clodius is unable to parse my file. I also have a description column in my original bed file, but I removed it to see if it was causing the issue. Since I got the error message without the description column, I'm not sure what's causing this.

Thanks in advance for your help!

clodius aggregate bigwig throws "RuntimeError: Unable to create attribute (Object header message is too large)" for large genomes

I am running clodius aggregate on a bigwig that was mapped to a poorly assembled genome with 23k contigs. I intend to show only <40 chromosomes, and I created a negspy genome using the reduced set of chromsizes. However, aggregate still writes the full chromsizes from the bigwig header into the hitile, which overflows the limit set by HDF5 (see the error below at aggregate.py line 542). I believe this error could be fixed by writing the chromsizes from the negspy genome instead?

me@mymachine:$ clodius aggregate bigwig -a bigGenomeReducedTo40Chroms data.fc.signal.bw

Traceback (most recent call last):
  File "/home/golobor/miniconda3/bin/clodius", line 11, in <module>
    sys.exit(cli())
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/clodius/cli/aggregate.py", line 1026, in bigwig
    _bigwig(filepath, chunk_size, zoom_step, tile_size, output_file, assembly, chromosome)
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/clodius/cli/aggregate.py", line 542, in _bigwig
    d.attrs['chrom-names'] = [s.encode('utf-8') for s in bwf.chroms().keys()]
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1490028290543/work/h5py/_objects.c:2846)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1490028290543/work/h5py/_objects.c:2804)
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/h5py/_hl/attrs.py", line 93, in __setitem__
    self.create(name, data=value, dtype=base.guess_dtype(value))
  File "/home/golobor/miniconda3/lib/python3.6/site-packages/h5py/_hl/attrs.py", line 188, in create
    attr = h5a.create(self._id, self._e(tempname), htype, space)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1490028290543/work/h5py/_objects.c:2846)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1490028290543/work/h5py/_objects.c:2804)
  File "h5py/h5a.pyx", line 47, in h5py.h5a.create (/home/ilan/minonda/conda-bld/h5py_1490028290543/work/h5py/h5a.c:2075)
RuntimeError: Unable to create attribute (Object header message is too large)
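
A hedged sketch of the suggestion above: write only the chromosomes from the reduced assembly into the chrom-names attribute, rather than everything in the bigWig header. The function below is illustrative, not the actual aggregate.py code:

    import h5py

    def write_chrom_attrs(group, bigwig_chroms, keep_chroms):
        """bigwig_chroms: dict of name -> length from the bigWig header;
        keep_chroms: the reduced (e.g. negspy) chromosome list to show."""
        names = [c for c in keep_chroms if c in bigwig_chroms]
        group.attrs["chrom-names"] = [c.encode("utf-8") for c in names]
        # Any chromosome-size attributes would be restricted the same way.

    # Tiny in-memory demonstration:
    with h5py.File("demo.h5", "w", driver="core", backing_store=False) as f:
        chroms = {"chr1": 100, "chr2": 90, "contig_12345": 10}
        write_chrom_attrs(f, chroms, ["chr1", "chr2"])
        print(list(f.attrs["chrom-names"]))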

Error when importance column is not an integer

Hi,

When using clodius to aggregate a bed file with a non-integer importance column, I get the following error:

[lingsim@node-10-04 boundaries]$ clodius aggregate bedfile boundaries.bed --output-file boundaries.hitile --assembly hg38 --importance-column 5 --no-header

/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/h5py-2.7.0-py3.6-linux-x86_64.egg/h5py/__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
delimiter: None
Traceback (most recent call last):
  File "/home/lingsim/.linuxbrew/bin/clodius", line 11, in <module>
    sys.exit(cli())
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/clodius/cli/aggregate.py", line 1448, in bedfile
    offset
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/clodius/cli/aggregate.py", line 484, in _bedfile
    dset += [line_to_np_array(line_parts)]
  File "/home/lingsim/.linuxbrew/opt/python3/lib/python3.6/site-packages/clodius/cli/aggregate.py", line 451, in line_to_np_array
    importance = int(line[int(importance_column)-1])
ValueError: invalid literal for int() with base 10: '0.07065425316492718'

It looks like importance = int(line[int(importance_column)-1]) in line_to_np_array
should be importance = float(line[int(importance_column)-1]), similar to line_to_dict.

Edit: Since this is such a small change I realised I could just submit a pull request with it - hope that's okay :)
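
For reference, a tiny sketch of the parsing change described above (the real change belongs in line_to_np_array in clodius/cli/aggregate.py, as noted):

    def parse_importance(line_parts, importance_column):
        """Accept non-integer importance values such as '0.07065425316492718'
        by parsing them as floats rather than ints."""
        return float(line_parts[int(importance_column) - 1])

    print(parse_importance(["chr1", "0", "1000", "x", "0.07065425316492718"], 5))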

Typo in make_tiles.py

There is a typo on line 48 in make_tiles.py. The first sentence does not make sense. Can you correct the statement?

help='The that has the value of each point. Used for aggregation and display')

h5py KeyError exception with clodius 0.9.4

Running clodius version 0.9.4 within a conda virtual environment raised the following error when doing bedfile-to-multivec conversion:

Traceback (most recent call last):
  File "/home/areynolds/.conda/envs/clodius/bin/clodius", line 11, in <module>
    sys.exit(cli())
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/clodius/cli/convert.py", line 275, in bedfile_to_multivec
    format, row_infos_filename, tile_size, method)
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/clodius/cli/convert.py", line 162, in _bedgraph_to_multivec
    row_infos=row_infos)
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/clodius/multivec.py", line 142, in create_multivec_multires
    print("array_data:", array_data['segment1'][-20:])
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/areynolds/.conda/envs/clodius/lib/python3.6/site-packages/h5py/_hl/group.py", line 177, in __getitem__
    oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5o.pyx", line 190, in h5py.h5o.open
KeyError: "Unable to open object (object 'segment1' doesn't exist)"

While multivec conversion failed, a test aggregation of a small BED file did not fail (clodius aggregate bedfile etc.). I did not try other dataset types.

Here is the conversion environment setup and breakdown, where I'm using defaults of what pip makes available:

>&2 echo "CREATE ${ENVIRONMENT}"
conda create -n ${ENVIRONMENT} python=3.6 --no-default-packages --yes

>&2 echo "ACTIVATE ${ENVIRONMENT}"
source activate ${ENVIRONMENT}

>&2 echo "INSTALL ${ENVIRONMENT}"
pip install --upgrade pip
pip install --ignore-installed numpy
pip install --ignore-installed scipy
pip install --ignore-installed h5py
pip install clodius

clodius convert -h

clodius convert bedfile-to-multivec ${RWD}/data/E001.bed.gz \
    --assembly ${ASSEMBLY} \
    --starting-resolution ${RESOLUTION} \
    --row-infos-filename ${RWD}/data/${ASSEMBLY}.${STATES}.txt \
    --num-rows ${STATES} \
    --format epilogos

>&2 echo "DEACTIVATE ${ENVIRONMENT}"
source deactivate ${ENVIRONMENT}

>&2 echo "REMOVE ${ENVIRONMENT}"
conda remove -n ${ENVIRONMENT} --all --yes

This script installed the following libraries (among other dependencies):

h5py-2.8.0
numpy-1.15.3
scipy-1.1.0

Reverting to version 0.9.3 of clodius and specifying older versions of h5py and numpy dependencies seemed to resolve this issue (I also specified scipy, even though it is current with the default):

...
pip install --ignore-installed numpy==1.12.0
pip install --ignore-installed scipy==1.1.0
pip install --ignore-installed h5py==2.6.0
pip install clodius==0.9.3
...

I was able to generate multivec output from this combination of packages.

tiff -> mrmatrix?

It's on our plate to be able to handle tiff files. I do remember your wariness about adding support for more matrix forms beyond TSV but...

  • For our audience, we want to be able to say that it can be handled out of the box.
  • Big TIFFs could get very, very big if translated into an intermediate TSV
  • Getting the details of the tiling right is finicky... Now that it's done, I feel like the TSV parser itself should really just parse to a numpy array, and then do the same thing for tiffs with this: https://pypi.org/project/tifffile/

If this is going to happen, would you rather the work be done in clodius, or should there be a new clodius-imaging where it would go?

Also, just a gut check: You do feel confident that we're not getting too far into wheel reinvention here? I think there's an alternate route, where we work on the client side to handle a wider range of image tiling formats.

Would a DAG diagram of filetypes -> tiletypes -> tracktypes be useful?
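
If it does go ahead, a minimal sketch of the tifffile route suggested above, assuming the TSV parser is refactored to hand a numpy array to the shared tiling code:

    import numpy as np
    import tifffile

    def tiff_to_array(path):
        """Load a TIFF into a numpy array so it can go through the same
        coarsening/tiling path as a TSV-derived matrix."""
        return np.asarray(tifffile.imread(path), dtype=float)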

aggregate bedpe memory issue

I am running the following on a cluster (allocating ~96GB to the Job):
clodius aggregate bedpe --assembly mm10 --chr1-col 1 --from1-col 2 --to1-col 3 --chr2-col 4 --from2-col 5 --to2-col 6 --chromosome chr22 GSM.bedpe
...where GSM.bedpe has the format:
chr start end chr start end
...and I am getting the following error:
slurmstepd: error : Exceeded step memory limit at some point.
So my question is: how much memory would be appropriate for a job like this? Or is there some other thing I am doing wrong?

clodius aggregate: NameError: global name '_bigwig' is not defined

On ubuntu xenial with a fresh python 2.7.12 install, I get the following:

root@609c5b8e4115:/$ clodius aggregate bigwig HistonH3_cre_trt_rep1.bw
/usr/local/lib/python2.7/dist-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Traceback (most recent call last):
  File "/usr/local/bin/clodius", line 11, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/clodius/cli/aggregate.py", line 1347, in bigwig
    _bigwig(
NameError: global name '_bigwig' is not defined

installation issue with python3.6 in conda

pip install clodius installation under conda python3.6 was giving me this (already after conda install pyBigWig):

Collecting clodius==0.7.4
  Using cached clodius-0.7.4.tar.gz
Requirement already satisfied: cython in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Requirement already satisfied: numpy in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Collecting negspy (from clodius==0.7.4)
  Using cached negspy-0.2.20.tar.gz
Requirement already satisfied: pysam in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Requirement already satisfied: requests in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Requirement already satisfied: h5py in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Requirement already satisfied: pandas in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Collecting slugid (from clodius==0.7.4)
  Using cached slugid-1.0.7.tar.gz
Requirement already satisfied: sortedcontainers in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Requirement already satisfied: nose in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Requirement already satisfied: pyBigWig in /home/venevs/miniconda3/lib/python3.6/site-packages (from clodius==0.7.4)
Exception:
Traceback (most recent call last):
  File "/home/venevs/miniconda3/lib/python3.6/site-packages/pip/basecommand.py", line 215, in main
    status = self.run(options, args)
  File "/home/venevs/miniconda3/lib/python3.6/site-packages/pip/commands/install.py", line 335, in run
    wb.build(autobuilding=True)
  File "/home/venevs/miniconda3/lib/python3.6/site-packages/pip/wheel.py", line 749, in build
    self.requirement_set.prepare_files(self.finder)
  File "/home/venevs/miniconda3/lib/python3.6/site-packages/pip/req/req_set.py", line 380, in prepare_files
    ignore_dependencies=self.ignore_dependencies))
  File "/home/venevs/miniconda3/lib/python3.6/site-packages/pip/req/req_set.py", line 666, in _prepare_file
    check_dist_requires_python(dist)
  File "/home/venevs/miniconda3/lib/python3.6/site-packages/pip/utils/packaging.py", line 48, in check_dist_requires_python
    feed_parser.feed(metadata)
  File "/home/venevs/miniconda3/lib/python3.6/email/feedparser.py", line 175, in feed
    self._input.push(data)
  File "/home/venevs/miniconda3/lib/python3.6/email/feedparser.py", line 103, in push
    self._partial.write(data)
TypeError: string argument expected, got 'NoneType'

Everything worked after manually installing negspy and slugid first (pip install negspy slugid) and then forcing clodius to be installed without its dependencies:
pip install 'clodius==0.7.4' --force-reinstall --no-deps

Some typos in aggregate bedpe

Options:
  -o, --output-file TEXT      The default output file name to use. If this
                              isn'tspecified, clodius will replace the current
                              extensionwith .bed2db
  -a, --assembly TEXT         The genome assembly that this file was created
                              against
  --importance-column TEXT    The column (1-based) containing information
                              about how importantthat row is. If it's absent,
                              then use the length of the region.If the value
                              is equal to `random`, then a random value will
                              beused for the importance (effectively leading
                              to random sampling)

for -o:

  • isn'tspecified (space missing)
  • extensionwith (space missing)
  • bed2db (d missing)

for --importance-column TEXT

  • TEXT should be INTEGER?
  • importantthat (space missing)
  • beused (space missing)

Clodius process killed when converting bigWig to hitile

In preparation for ingesting bigWig signal tracks, I tried converting a test bigWig file to hitile format using conda (Python v3.6.4) and clodius v0.7.4. The host is a t2.micro EC2 instance running Ubuntu Xenial v16.04.

The aggregation proceeds up to the point of processing chr1 and the process is then killed.

The file /tmp/LN43287.75_20.normalized.GRCh38_no_alts.bw is a bigWig file containing DNase I density signal. I can make this available privately, if useful.

Here is the log of my conversion attempt:

$ clodius aggregate bigwig /tmp/LN43287.75_20.normalized.GRCh38_no_alts.bw \
--output-file /tmp/LN43287.75_20.normalized.GRCh38_no_alts.bw.hitile \
--assembly hg38
there
chrom_order: ['chr1', 'chr2', 'chr3', 'chr4', 'chr5', 'chr6', 'chr7', 'chr8', 'chr9', 'chr10', 'chr11', 'chr12', 'chr13', 'chr14', 'chr15', 'chr16', 'chr17', 'chr18', 'chr19', 'chr20', 'chr21', 'chr22', 'chrX', 'chrY', 'chrM', 'chr15_KI270905v1_alt', 'chr6_GL000256v2_alt', 'chr6_GL000254v2_alt', 'chr6_GL000251v2_alt', 'chr6_GL000253v2_alt', 'chr6_GL000250v2_alt', 'chr6_GL000255v2_alt', 'chr6_GL000252v2_alt', 'chr17_KI270857v1_alt', 'chr16_KI270853v1_alt', 'chr16_KI270728v1_random', 'chr17_GL000258v2_alt', 'chr5_GL339449v2_alt', 'chr14_KI270847v1_alt', 'chr17_KI270908v1_alt', 'chr14_KI270846v1_alt', 'chr5_KI270897v1_alt', 'chr7_KI270803v1_alt', 'chr19_GL949749v2_alt', 'chr19_KI270938v1_alt', 'chr19_GL949750v2_alt', 'chr19_GL949748v2_alt', 'chr19_GL949751v2_alt', 'chr19_GL949746v1_alt', 'chr19_GL949752v1_alt', 'chr8_KI270821v1_alt', 'chr1_KI270763v1_alt', 'chr6_KI270801v1_alt', 'chr19_GL949753v2_alt', 'chr19_GL949747v2_alt', 'chr8_KI270822v1_alt', 'chr4_GL000257v2_alt', 'chr12_KI270904v1_alt', 'chr4_KI270925v1_alt', 'chr15_KI270852v1_alt', 'chr15_KI270727v1_random', 'chr9_KI270823v1_alt', 'chr15_KI270850v1_alt', 'chr1_KI270759v1_alt', 'chr12_GL877876v1_alt', 'chrUn_KI270442v1', 'chr17_KI270862v1_alt', 'chr15_GL383555v2_alt', 'chr19_GL383573v1_alt', 'chr4_KI270896v1_alt', 'chr4_GL383528v1_alt', 'chr17_GL383563v3_alt', 'chr8_KI270810v1_alt', 'chr1_GL383520v2_alt', 'chr1_KI270762v1_alt', 'chr15_KI270848v1_alt', 'chr17_KI270909v1_alt', 'chr14_KI270844v1_alt', 'chr8_KI270900v1_alt', 'chr10_GL383546v1_alt', 'chr13_KI270838v1_alt', 'chr8_KI270816v1_alt', 'chr22_KI270879v1_alt', 'chr8_KI270813v1_alt', 'chr11_KI270831v1_alt', 'chr15_GL383554v1_alt', 'chr8_KI270811v1_alt', 'chr18_GL383567v1_alt', 'chrX_KI270880v1_alt', 'chr8_KI270812v1_alt', 'chr19_KI270921v1_alt', 'chr17_KI270729v1_random', 'chr17_JH159146v1_alt', 'chrX_KI270913v1_alt', 'chr6_KI270798v1_alt', 'chr7_KI270808v1_alt', 'chr22_KI270876v1_alt', 'chr15_KI270851v1_alt', 'chr22_KI270875v1_alt', 'chr1_KI270766v1_alt', 'chr19_KI270882v1_alt', 'chr3_KI270778v1_alt', 'chr15_KI270849v1_alt', 'chr4_KI270786v1_alt', 'chr12_KI270835v1_alt', 'chr17_KI270858v1_alt', 'chr19_KI270867v1_alt', 'chr16_KI270855v1_alt', 'chr8_KI270926v1_alt', 'chr5_GL949742v1_alt', 'chr3_KI270780v1_alt', 'chr17_GL383565v1_alt', 'chr2_KI270774v1_alt', 'chr4_KI270790v1_alt', 'chr11_KI270927v1_alt', 'chr19_KI270932v1_alt', 'chr11_KI270903v1_alt', 'chr2_KI270894v1_alt', 'chr14_GL000225v1_random', 'chrUn_KI270743v1', 'chr11_KI270832v1_alt', 'chr7_KI270805v1_alt', 'chr4_GL000008v2_random', 'chr7_KI270809v1_alt', 'chr19_KI270887v1_alt', 'chr4_KI270789v1_alt', 'chr3_KI270779v1_alt', 'chr19_KI270914v1_alt', 'chr19_KI270886v1_alt', 'chr11_KI270829v1_alt', 'chr14_GL000009v2_random', 'chr21_GL383579v2_alt', 'chr11_JH159136v1_alt', 'chr19_KI270930v1_alt', 'chrUn_KI270747v1', 'chr18_GL383571v1_alt', 'chr19_KI270920v1_alt', 'chr6_KI270797v1_alt', 'chr3_KI270935v1_alt', 'chr17_KI270861v1_alt', 'chr15_KI270906v1_alt', 'chr5_KI270791v1_alt', 'chr14_KI270722v1_random', 'chr16_GL383556v1_alt', 'chr13_KI270840v1_alt', 'chr14_GL000194v1_random', 'chr11_JH159137v1_alt', 'chr19_KI270917v1_alt', 'chr7_KI270899v1_alt', 'chr19_KI270923v1_alt', 'chr10_KI270825v1_alt', 'chr19_GL383576v1_alt', 'chr19_KI270922v1_alt', 'chrUn_KI270742v1', 'chr22_KI270878v1_alt', 'chr19_KI270929v1_alt', 'chr11_KI270826v1_alt', 'chr6_KB021644v2_alt', 'chr17_GL000205v2_random', 'chr1_KI270765v1_alt', 'chr19_KI270916v1_alt', 'chr19_KI270890v1_alt', 'chr3_KI270784v1_alt', 'chr12_GL383551v1_alt', 'chr20_KI270870v1_alt', 
'chrUn_GL000195v1', 'chr1_GL383518v1_alt', 'chr22_KI270736v1_random', 'chr10_KI270824v1_alt', 'chr14_KI270845v1_alt', 'chr3_GL383526v1_alt', 'chr13_KI270839v1_alt', 'chr22_KI270733v1_random', 'chrUn_GL000224v1', 'chr10_GL383545v1_alt', 'chrUn_GL000219v1', 'chr5_KI270792v1_alt', 'chr17_KI270860v1_alt', 'chr19_GL000209v2_alt', 'chr11_KI270830v1_alt', 'chr9_KI270719v1_random', 'chrUn_GL000216v2', 'chr22_KI270928v1_alt', 'chr1_KI270712v1_random', 'chr6_KI270800v1_alt', 'chr1_KI270706v1_random', 'chr2_KI270776v1_alt', 'chr18_KI270912v1_alt', 'chr3_KI270777v1_alt', 'chr5_GL383531v1_alt', 'chr3_JH636055v2_alt', 'chr14_KI270725v1_random', 'chr5_KI270796v1_alt', 'chr9_GL383541v1_alt', 'chr19_KI270885v1_alt', 'chr19_KI270919v1_alt', 'chr19_KI270889v1_alt', 'chr19_KI270891v1_alt', 'chr19_KI270915v1_alt', 'chr19_KI270933v1_alt', 'chr19_KI270883v1_alt', 'chr19_GL383575v2_alt', 'chr19_KI270931v1_alt', 'chr12_GL383550v2_alt', 'chr13_KI270841v1_alt', 'chrUn_KI270744v1', 'chr18_KI270863v1_alt', 'chr18_GL383569v1_alt', 'chr12_GL877875v1_alt', 'chr21_KI270874v1_alt', 'chr3_KI270924v1_alt', 'chr1_KI270761v1_alt', 'chr3_KI270937v1_alt', 'chr22_KI270734v1_random', 'chr18_GL383570v1_alt', 'chr5_KI270794v1_alt', 'chr4_GL383527v1_alt', 'chrUn_GL000213v1', 'chr3_KI270936v1_alt', 'chr3_KI270934v1_alt', 'chr9_GL383539v1_alt', 'chr3_KI270895v1_alt', 'chr22_GL383582v2_alt', 'chr3_KI270782v1_alt', 'chr1_KI270892v1_alt', 'chrUn_GL000220v1', 'chr2_KI270767v1_alt', 'chr2_KI270715v1_random', 'chr2_KI270893v1_alt', 'chrUn_GL000218v1', 'chr18_GL383572v1_alt', 'chr8_KI270817v1_alt', 'chr4_KI270788v1_alt', 'chrUn_KI270749v1', 'chr7_KI270806v1_alt', 'chr7_KI270804v1_alt', 'chr18_KI270911v1_alt', 'chrUn_KI270741v1', 'chr17_KI270910v1_alt', 'chr19_KI270884v1_alt', 'chr19_GL383574v1_alt', 'chr19_KI270888v1_alt', 'chr3_GL000221v1_random', 'chr11_GL383547v1_alt', 'chr2_KI270716v1_random', 'chr12_GL383553v2_alt', 'chr6_KI270799v1_alt', 'chr22_KI270731v1_random', 'chrUn_KI270751v1', 'chrUn_KI270750v1', 'chr8_KI270818v1_alt', 'chrX_KI270881v1_alt', 'chr21_KI270873v1_alt', 'chr2_GL383521v1_alt', 'chr8_KI270814v1_alt', 'chr12_GL383552v1_alt', 'chrUn_KI270519v1', 'chr2_KI270775v1_alt', 'chr17_KI270907v1_alt', 'chrUn_GL000214v1', 'chr8_KI270901v1_alt', 'chr2_KI270770v1_alt', 'chr16_KI270854v1_alt', 'chr8_KI270819v1_alt', 'chr17_GL383564v2_alt', 'chr2_KI270772v1_alt', 'chr8_KI270815v1_alt', 'chr5_KI270795v1_alt', 'chr5_KI270898v1_alt', 'chr20_GL383577v2_alt', 'chr1_KI270708v1_random', 'chr7_KI270807v1_alt', 'chr5_KI270793v1_alt', 'chr6_GL383533v1_alt', 'chr2_GL383522v1_alt', 'chr19_KI270918v1_alt', 'chr12_GL383549v1_alt', 'chr2_KI270769v1_alt', 'chr4_KI270785v1_alt', 'chr12_KI270834v1_alt', 'chr7_GL383534v2_alt', 'chr20_KI270869v1_alt', 'chr21_GL383581v2_alt', 'chr3_KI270781v1_alt', 'chr17_KI270730v1_random', 'chrUn_KI270438v1', 'chr4_KI270787v1_alt', 'chr18_KI270864v1_alt', 'chr2_KI270771v1_alt', 'chr1_GL383519v1_alt', 'chr2_KI270768v1_alt', 'chr1_KI270760v1_alt', 'chr3_KI270783v1_alt', 'chr17_KI270859v1_alt', 'chr11_KI270902v1_alt', 'chr18_GL383568v1_alt', 'chr22_KI270737v1_random', 'chr13_KI270843v1_alt', 'chr22_KI270877v1_alt', 'chr5_GL383530v1_alt', 'chr11_KI270721v1_random', 'chr22_KI270738v1_random', 'chr22_GL383583v2_alt', 'chr2_GL582966v2_alt', 'chrUn_KI270748v1', 'chrUn_KI270435v1', 'chr5_GL000208v1_random', 'chrUn_KI270538v1', 'chr17_GL383566v1_alt', 'chr16_GL383557v1_alt', 'chr17_JH159148v1_alt', 'chr5_GL383532v1_alt', 'chr21_KI270872v1_alt', 'chrUn_KI270756v1', 'chr6_KI270758v1_alt', 'chr12_KI270833v1_alt', 
'chr6_KI270802v1_alt', 'chr21_GL383580v2_alt', 'chr22_KB663609v1_alt', 'chr22_KI270739v1_random', 'chr9_GL383540v1_alt', 'chrUn_KI270757v1', 'chr2_KI270773v1_alt', 'chr17_JH159147v1_alt', 'chr11_KI270827v1_alt', 'chr1_KI270709v1_random', 'chrUn_KI270746v1', 'chr16_KI270856v1_alt', 'chr21_GL383578v2_alt', 'chrUn_KI270753v1', 'chr19_KI270868v1_alt', 'chr9_GL383542v1_alt', 'chr20_KI270871v1_alt', 'chr12_KI270836v1_alt', 'chr19_KI270865v1_alt', 'chr1_KI270764v1_alt', 'chrUn_KI270589v1', 'chr14_KI270726v1_random', 'chr19_KI270866v1_alt', 'chr22_KI270735v1_random', 'chr1_KI270711v1_random', 'chrUn_KI270745v1', 'chr1_KI270714v1_random', 'chr22_KI270732v1_random', 'chr1_KI270713v1_random', 'chrUn_KI270754v1', 'chr1_KI270710v1_random', 'chr12_KI270837v1_alt', 'chr9_KI270717v1_random', 'chr14_KI270724v1_random', 'chr9_KI270720v1_random', 'chr14_KI270723v1_random', 'chr9_KI270718v1_random', 'chrUn_KI270317v1', 'chr13_KI270842v1_alt', 'chrY_KI270740v1_random', 'chrUn_KI270755v1', 'chr8_KI270820v1_alt', 'chr1_KI270707v1_random', 'chrUn_KI270579v1', 'chrUn_KI270752v1', 'chrUn_KI270512v1', 'chrUn_KI270322v1', 'chrUn_GL000226v1', 'chrUn_KI270311v1', 'chrUn_KI270366v1', 'chrUn_KI270511v1', 'chrUn_KI270448v1', 'chrUn_KI270521v1', 'chrUn_KI270581v1', 'chrUn_KI270582v1', 'chrUn_KI270515v1', 'chrUn_KI270588v1', 'chrUn_KI270591v1', 'chrUn_KI270522v1', 'chrUn_KI270507v1', 'chrUn_KI270590v1', 'chrUn_KI270584v1', 'chrUn_KI270320v1', 'chrUn_KI270382v1', 'chrUn_KI270468v1', 'chrUn_KI270467v1', 'chrUn_KI270362v1', 'chrUn_KI270517v1', 'chrUn_KI270593v1', 'chrUn_KI270528v1', 'chrUn_KI270587v1', 'chrUn_KI270364v1', 'chrUn_KI270371v1', 'chrUn_KI270333v1', 'chrUn_KI270374v1', 'chrUn_KI270411v1', 'chrUn_KI270414v1', 'chrUn_KI270510v1', 'chrUn_KI270390v1', 'chrUn_KI270375v1', 'chrUn_KI270420v1', 'chrUn_KI270509v1', 'chrUn_KI270315v1', 'chrUn_KI270302v1', 'chrUn_KI270518v1', 'chrUn_KI270530v1', 'chrUn_KI270304v1', 'chrUn_KI270418v1', 'chrUn_KI270424v1', 'chrUn_KI270417v1', 'chrUn_KI270508v1', 'chrUn_KI270303v1', 'chrUn_KI270381v1', 'chrUn_KI270529v1', 'chrUn_KI270425v1', 'chrUn_KI270396v1', 'chrUn_KI270363v1', 'chrUn_KI270386v1', 'chrUn_KI270465v1', 'chrUn_KI270383v1', 'chrUn_KI270384v1', 'chrUn_KI270330v1', 'chrUn_KI270372v1', 'chrUn_KI270548v1', 'chrUn_KI270580v1', 'chrUn_KI270387v1', 'chrUn_KI270391v1', 'chrUn_KI270305v1', 'chrUn_KI270373v1', 'chrUn_KI270422v1', 'chrUn_KI270316v1', 'chrUn_KI270338v1', 'chrUn_KI270340v1', 'chrUn_KI270583v1', 'chrUn_KI270334v1', 'chrUn_KI270429v1', 'chrUn_KI270393v1', 'chrUn_KI270516v1', 'chrUn_KI270389v1', 'chrUn_KI270466v1', 'chrUn_KI270388v1', 'chrUn_KI270544v1', 'chrUn_KI270310v1', 'chrUn_KI270412v1', 'chrUn_KI270395v1', 'chrUn_KI270376v1', 'chrUn_KI270337v1', 'chrUn_KI270335v1', 'chrUn_KI270378v1', 'chrUn_KI270379v1', 'chrUn_KI270329v1', 'chrUn_KI270419v1', 'chrUn_KI270336v1', 'chrUn_KI270312v1', 'chrUn_KI270539v1', 'chrUn_KI270385v1', 'chrUn_KI270423v1', 'chrUn_KI270392v1', 'chrUn_KI270394v1']
assembly size (max-length) 3209286105
max-width 4294967296
max_zoom: 22
chunk-size: 16777216
chrom-order [b'chr1' b'chr2' b'chr3' b'chr4' b'chr5' b'chr6' b'chr7' b'chr8' b'chr9'
 b'chr10' ... b'chrUn_KI270392v1' b'chrUn_KI270394v1']
chrom: chr1

Killed

For this test, I am using v0.3.11 of the pyBigWig library, which appears to be the current version.

I am wondering whether there are specific versions of Python or libraries I should be using for data conversion, or whether there are memory requirements for this conversion that would be satisfied by a host with more memory.

Set up error

Hi,

I am trying to set up the tool for HiGlass analysis, but it gives an error when the command is run:

module load python/cpu/2.7.15
clodius

Traceback (most recent call last):
  File "/gpfs/share/apps/python/cpu/2.7.15-ES/bin/clodius", line 5, in <module>
    from clodius.cli.aggregate import cli
  File "/gpfs/share/apps/python/cpu/2.7.15-ES/lib/python2.7/site-packages/clodius/cli/__init__.py", line 18, in <module>
    from . import (  # noqa: F401
  File "/gpfs/share/apps/python/cpu/2.7.15-ES/lib/python2.7/site-packages/clodius/cli/aggregate.py", line 539
    f"Unable to find {line_parts[0]} in the list of chromosome sizes. "
    ^
SyntaxError: invalid syntax
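
As a side note, and only as my reading of the traceback: the flagged line is an f-string, which is valid syntax only on Python 3.6+, so importing the module under Python 2.7.15 fails before any code runs. A minimal illustration:

# f-strings are a SyntaxError on Python 2
line_parts = ["chrZ"]
print(f"Unable to find {line_parts[0]} in the list of chromosome sizes.")
# Python 2.7: SyntaxError: invalid syntax; Python 3.6+: prints the message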

clodius aggregate bedgraph requires the same sorting order in .bedgraph and negspy's chromOrder/chromInfo

Hi,
clodius requires the chromosomes in bedgraphs to be sorted the same way as in negspy's chromOrder/chromInfo. Because of this requirement it's hard to use genomes with an arbitrary sorting order, since it's hard to re-sort bedgraphs in any order other than the lexicographic one. Would it be possible to change clodius to accept lexicographically sorted bedgraphs and re-sort the aggregated data using the order from negspy?
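
For context, a minimal pandas sketch (not part of clodius) of what re-sorting a lexicographically sorted bedgraph into a negspy-style chromosome order could look like; the file names and the four-column layout are assumptions:

import pandas as pd

# chromOrder.txt is assumed to list one chromosome name per line in the desired order
order = [line.split()[0] for line in open("chromOrder.txt") if line.strip()]

bg = pd.read_csv("signal.bedgraph", sep="\t", header=None,
                 names=["chrom", "start", "end", "value"])

# An ordered categorical makes sorting follow the negspy order;
# rows are then sorted by start coordinate within each chromosome.
bg["chrom"] = pd.Categorical(bg["chrom"], categories=order, ordered=True)
bg = bg.sort_values(["chrom", "start"])
bg.to_csv("signal.sorted.bedgraph", sep="\t", header=False, index=False)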

Introduce dataframe file type to replace / supplement beddb and bed2ddb

The goal of this PR is to discuss the introduction of a new file type that replaces the beddb and bed2ddb formats. This file type will be able to store any type of data and be used as backing for gene annotations, bed-like regions, arbitrary points, etc...

Questions to address:

  1. Tile API: the current API takes a zoom level, start and end position. It works right now because any genomic data is converted to a linearized representation where chromosomes are concatenated using a given chromosome order.

Dataframe-backed files will not have this limitation. The tile API will have to have a chromosome order associated with it to indicate which data should be retrieved between coordinates x0 and x1.

Example API:

def get_1D_tile_data(
  filename='my_file.tsv', 
  tile_position=[1,0],
  group_column=['chr'], 
  position_columns=['start', 'end'],
  group_order=[('chr1', 1000), ('chr2', 5000), ('chrX', 4000), ('chrM', 3000)]
)
  2. Column to use as the index: A dataframe may have the start and end positions in arbitrary columns. The request should include an indicator of which columns to use for the positions of the data.

Use cases

  1. Replacing the current beddb and bed2ddb tile formats.

Performance

Filtering a 970K-line file takes about 200 ms. It may be possible to improve this through parallelization, sorting, indexing, or subdividing the file into sections (e.g., chromosomes).
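
To make the lookup concrete, here is a rough sketch only (not the proposed implementation) of filtering a dataframe for a 1D interval after linearizing per-chromosome coordinates with a group order like the one in the example call above; the column names are assumptions:

import pandas as pd

def get_1d_tile(df, x0, x1, group_order, pos_columns=("start", "end")):
    # Build cumulative offsets so per-chromosome coordinates map onto one linear axis.
    offsets, total = {}, 0
    for chrom, length in group_order:
        offsets[chrom] = total
        total += length
    abs_start = df["chr"].map(offsets) + df[pos_columns[0]]
    abs_end = df["chr"].map(offsets) + df[pos_columns[1]]
    # Keep rows overlapping the requested interval [x0, x1).
    return df[(abs_end > x0) & (abs_start < x1)]

# df = pd.read_csv("my_file.tsv", sep="\t")
# tile = get_1d_tile(df, 0, 4096, [("chr1", 1000), ("chr2", 5000), ("chrX", 4000), ("chrM", 3000)])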


aggregate bedpe not working when output file is not specified

According to the help text

  -o, --output-file TEXT      The default output file name to use. If this
                              isn'tspecified, clodius will replace the current
                              extensionwith .bed2db

but when I do

clodius aggregate bedpe -a hg38 --has-header outrageously-awesome-stuff.bedpe

I only see

output_file: None
plain

while I would expect to get

outrageously-awesome-stuff.bed2ddb

but instead I get

outrageously-awesome-stuff.bedpe.multires.db

therefore I am

🤔

Failed wheel build in Python 3

Installing via pip install clodius in a conda Python 3 environment:

...
gcc -pthread -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/nezar/miniconda3/lib/python3.5/site-packages/numpy/core/include -I/home/nezar/miniconda3/lib/python3.5/site-packages/numpy/core/include -c clodius/fast.c -o build/temp.linux-x86_64-3.5/clodius/fast.o
  clodius/fast.c:19:20: fatal error: Python.h: No such file or directory
  compilation terminated.
  error: command 'gcc' failed with exit status 1
  
  ----------------------------------------
  Failed building wheel for clodius

Probably some build config issue in setup.py.

mean +/- error

Hi,

Is there a way of visualising mean +/- error of tracks?

Thanks for your help
Best
Zhan

bigwig_test.py fails with "numpy.ufunc size changed"

Running the tests, I get this error. I'm on the branch for PR #71. I've installed dependencies with pip install -r requirements.txt.

ERROR: Failure: ValueError (numpy.ufunc size changed, may indicate binary incompatibility. Expected 216 from C header, got 192 from PyObject)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/site-packages/nose-1.3.7-py3.6.egg/nose/failure.py", line 39, in runTest
    raise self.exc_val.with_traceback(self.tb)
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/site-packages/nose-1.3.7-py3.6.egg/nose/loader.py", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/site-packages/nose-1.3.7-py3.6.egg/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/site-packages/nose-1.3.7-py3.6.egg/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/imp.py", line 235, in load_module
    return load_source(name, filename, file)
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/imp.py", line 172, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 684, in _load
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/Users/chuck/github/higlass/clodius/test/tiles/bigwig_test.py", line 1, in <module>
    import clodius.tiles.bigwig as hgbi
  File "/Users/chuck/github/higlass/clodius/clodius/tiles/bigwig.py", line 1, in <module>
    import bbi
  File "/Users/chuck/anaconda3/envs/clodius/lib/python3.6/site-packages/pybbi-0.2.0-py3.6-macosx-10.7-x86_64.egg/bbi/__init__.py", line 3, in <module>
    from .cbbi import (
  File ".eggs/Cython-0.29.6-py3.6-macosx-10.7-x86_64.egg/Cython/Includes/numpy/__init__.pxd", line 918, in init bbi.cbbi
ValueError: numpy.ufunc size changed, may indicate binary incompatibility. Expected 216 from C header, got 192 from PyObject

How to prepare a special BED file for HiGlass

Hey guys, I have downloaded a special BED file. The file contains information about the subcompartments of the genome (the fourth column), which I believe are encoded with the red/green/blue colors in the last column. How should I prepare this kind of data for HiGlass? Thanks a lot!
chr19 0 200000 NA 0 . 0 200000 255,255,255
chr19 200000 500000 B1 -1 . 200000 500000 220,20,60
chr19 500000 3800000 A1 2 . 500000 3800000 34,139,34
chr19 3800000 3900000 B1 -1 . 3800000 3900000 220,20,60

Clodius aggregate with new assembly

Hi all,
I'm trying to use clodius aggregate bedgraph to generate TADs.

sudo docker exec gibcus-chicken2.0-higlass clodius aggregate bedgraph \
> /tmp/WT-G2-S4-R5.insulation.boundaries.bed \
> --output-file /tmp/WT-G2-S4-R5.insulation.boundaries.bed2ddb \
> --chromosome-col 1 \
> --from-pos-col 2 \
> --to-pos-col 3 \
> --value-col 6 \
> --assembly galGal6 \
> --nan-value NA \
> --has-header

Clodius does not have the galGal6 assembly, so I get this error:

Error: Invalid value for "--assembly" / "-a": invalid choice: galGal6. (choose from b37, mm9, GCF_000005845.2_ASM584v2_genomic, test, dm3, dm6, grch37, GCA_000001215.4_Release_6_plus_ISO1_MT_genomic, GCA_000001405.15_GRCh38_genomic, grch37-lite, test3chroms, mm10, danRer10, hg19, hg38)

I recently added galGal6 to negspy, but it seems that Clodius has a different source for its assemblies. Is there a way (for me) to add galGal6 to Clodius?
