torch-hdf5's Introduction

torch-hdf5

This package allows you to read and write Torch data from and to HDF5 files. The format is fast, flexible, and supported by a wide range of other software - including MATLAB, Python, and R.

Usage

For further information, please consult the user manual.

torch-hdf5's People

Contributors

akfidjeland, alschua, anibali, colesbury, d11, didw, ffmpbgrnn, georgostrovski, ili3p, iskra-vitaly, ivendrov, jagapiou, jashmenn, kadamwhite, mightybigcar, paulgwamanda, rdhindsa14, rewonc, soumith

torch-hdf5's Issues

[CentOS6.5] can't load hdf5

luarocks make succeeds, but the library can't be loaded.
Any idea?
I'm using torch-distro, by the way.

th> require 'hdf5'
...himada/torch-distro/install/share/lua/5.1/trepl/init.lua:354: .../shimada/torch-distro/install/share/lua/5.1/hdf5/ffi.lua:29: /usr/local/lib/libhdf5.a: invalid ELF header
stack traceback:
    [C]: in function 'error'
    ...himada/torch-distro/install/share/lua/5.1/trepl/init.lua:354: in function 'f'
    [string "local f = function() return require 'hdf5' en..."]:1: in main chunk
    [C]: in function 'xpcall'
    ...himada/torch-distro/install/share/lua/5.1/trepl/init.lua:620: in function 'repl'
    ...rch-distro/install/lib/luarocks/rocks/trepl/scm-1/bin/th:185: in main chunk
    [C]: at 0x004057d0  

OS X: runtime error: dyld: lazy symbol binding failed: Symbol not found: _H5_init_library

Hi!

I am running OS X 10.11.4 and I followed the steps described in the user manual, with the exception of the git clone command. It didn't work for me, so I replaced it with the following one:
git clone https://github.com/deepmind/torch-hdf5.git

Everything seemed to run smoothly without errors; however, whenever I try to run anything using hdf5, I get the following error:

dyld: lazy symbol binding failed: Symbol not found: _H5_init_library
  Referenced from: /Users/binus/homebrew/Cellar/hdf5/1.8.16_1/lib/libhdf5.dylib
  Expected in: flat namespace

dyld: Symbol not found: _H5_init_library
  Referenced from: /Users/binus/homebrew/Cellar/hdf5/1.8.16_1/lib/libhdf5.dylib
  Expected in: flat namespace

Trace/BPT trap: 5

However, the library seems to be fine, as nm lists the symbol:

nm /Users/binus/homebrew/Cellar/hdf5/1.8.16_1/lib/libhdf5.dylib | grep H5_init
0000000000001a78 T _H5_init_library

The command-line tools within /Users/binus/homebrew/Cellar/hdf5/1.8.16_1/ also seem to work correctly.
Do you have any idea what might be wrong? I've already spent quite a lot of time on this and I'm at a loss.
Thanks a lot!

Running out of RAM when writing tensors

Hey, I'm running into an issue where the more write operations I do, the more RAM is used. I assume this is because the dataset is kept in RAM; however, I thought closing the file would flush it to disk (correct me if I'm wrong!).

A small test that causes RAM to be allocated as the file gets larger:

require 'hdf5'

local tensor = torch.randn(100, 129)
for x = 1, 1000 do
    local tensorFile = hdf5.open('tensors.h5', 'a')
    tensorFile:write("/data/" .. x, tensor)
    tensorFile:close()
    print(x)
end

I'm sure I'm not understanding something correctly, so any help would be great!
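
One workaround that might be worth trying (a minimal sketch only, not a confirmed fix, using just the calls shown above): keep a single file handle open, write every dataset through it, and close once at the end, so the file is only opened and flushed a single time.

require 'hdf5'

local tensor = torch.randn(100, 129)
-- open once, write all datasets through the same handle, close once at the end
local tensorFile = hdf5.open('tensors.h5', 'w')
for x = 1, 1000 do
    tensorFile:write("/data/" .. x, tensor)
    print(x)
end
tensorFile:close()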

String not supported

I have this piece of code where I am trying to insert a dataset with string values:

require 'hdf5';
label = {"a", "b"}

myFile = hdf5.open('t.h5', 'w')
myFile:write('label', label)
myFile:close()

And I am getting this error:

/home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:222: torch-hdf5: writing data of type string is not supported
stack traceback:
    [C]: in function 'error'
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:222: in function '_writeData'
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:292: in function '_write_or_append'
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:255: in function </home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:254>
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:282: in function '_write_or_append'
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:255: in function </home/mlagunas/torch/install/share/lua/5.1/hdf5/group.lua:254>
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/file.lua:69: in function '_write_or_append'
    /home/mlagunas/torch/install/share/lua/5.1/hdf5/file.lua:43: in function 'f'
    [string "local f = function() return myFile:write('lab..."]:1: in main chunk
    [C]: in function 'xpcall'
    /home/mlagunas/torch/install/share/lua/5.1/itorch/main.lua:209: in function </home/mlagunas/torch/install/share/lua/5.1/itorch/main.lua:173>
    /home/mlagunas/torch/install/share/lua/5.1/lzmq/poller.lua:75: in function 'poll'
    .../mlagunas/torch/install/share/lua/5.1/lzmq/impl/loop.lua:307: in function 'poll'
    .../mlagunas/torch/install/share/lua/5.1/lzmq/impl/loop.lua:325: in function 'sleep_ex'
    .../mlagunas/torch/install/share/lua/5.1/lzmq/impl/loop.lua:370: in function 'start'
    /home/mlagunas/torch/install/share/lua/5.1/itorch/main.lua:381: in main chunk
    [C]: in function 'require'
    (command line):1: in main chunk
    [C]: at 0x00406670

Is there any way to store strings using HDF5?

Thanks in advance!
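
One possible workaround, sketched under the assumption that ByteTensor is among the supported tensor types: encode each string as a ByteTensor of character codes before writing, and decode it again after reading.

require 'hdf5'

-- encode the string "ab" as a ByteTensor of character codes (workaround sketch)
local label = "ab"
local encoded = torch.ByteTensor(#label)
for i = 1, #label do
    encoded[i] = label:byte(i)
end

local myFile = hdf5.open('t.h5', 'w')
myFile:write('label', encoded)
myFile:close()

-- after reading the tensor back, the string can be rebuilt with string.char,
-- e.g. string.char(encoded[1], encoded[2])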

Installing for Lua 5.2

The install works fine with LuaJIT 2.0, but I get the following error when requiring the package in Lua 5.2 (and 5.1).
I have installed both luaffifb and luabitop for compatibility.

lua -e require 'hdf5'
lua: .../Alex/anaconda/envs/_test/share/lua/5.1/hdf5/ffi.lua:56: expected 'static const int' on line 208
stack traceback:
    [C]: in function 'cdef'
    .../Alex/anaconda/envs/_test/share/lua/5.1/hdf5/ffi.lua:56: in function 'loadHDF5Header'
    .../Alex/anaconda/envs/_test/share/lua/5.1/hdf5/ffi.lua:60: in main chunk
    [C]: in function 'dofile'
    ...lex/anaconda/envs/_test/share/lua/5.1/torch/init.lua:54: in function 'include'
    ...Alex/anaconda/envs/_test/share/lua/5.1/hdf5/init.lua:29: in main chunk
    [C]: in function 'require'
    (command line):1: in main chunk
    [C]: ?

Support for partial writing

In the docs there is a section on partial reading:

local myFile = hdf5.open('/path/to/read.h5', 'r')
-- Specify the range for each dimension of the dataset.
local data = myFile:read('/path/to/data'):partial({start1, end1}, {start2, end2})
myFile:close()

It would be great if there were also support for partial writing, for building datasets that don't fit in RAM. For instance, ideally something along the lines of:

local data = myFile:write('/path/to/data', data):partial({start1, end1}, {start2, end2})

I don't believe this is currently supported; at least it isn't documented.
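
Until something like that exists, a hedged workaround sketch (the paths and block sizes below are made up) is to split the data into blocks that do fit in RAM and write each block as its own dataset:

require 'hdf5'

-- write ten blocks, each small enough to fit in RAM, as separate datasets
local myFile = hdf5.open('/path/to/write.h5', 'w')
for i = 1, 10 do
    local block = torch.rand(1000, 40)  -- stands in for one block of the large dataset
    myFile:write('/data/block_' .. i, block)
end
myFile:close()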

Manual not up to date about reading files

The instructions don't work for me: the hdf5 object has no open(), only open_file(). Also, what should I do after opening the file?

th> h = require "hdf5"
[0.0001s]
th> h.op
opaque open_file

th> h.open_file( 'train.hdf5' )
cdata<struct 3519>: 0x412f8808
[0.0002s]
th> f = h.open_file( 'train.hdf5' )
[0.0002s]

Can torch-hdf5 support variable-length types?

h5py supports variable-length types. For example, my data contains time series of different lengths,
with sizes (6, 40), (10, 40), (15, 40), and so on; the first dimension varies.

Can torch-hdf5 read in and process such data? If so, how? Thanks.
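
Torch tensors themselves are rectangular, so one workaround sketch (not a confirmed answer about variable-length HDF5 types) is to store each series as its own dataset under a common group and read them back individually:

require 'hdf5'

-- each series has a different first dimension, so store them as separate datasets
local series = { torch.rand(6, 40), torch.rand(10, 40), torch.rand(15, 40) }
local f = hdf5.open('series.h5', 'w')
for i, s in ipairs(series) do
    f:write('/series/' .. i, s)
end
f:close()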

error in getting size of dataset

I have an hdf5 file with structure { data = { target=DoubleTensor - size: 1x1, sentence = DoubleTensor - size 6x8 } }
local testData = { sentence = torch.rand(4, 6), target = torch.rand(1,1)}
local writeFile = hdf5.open('/Users/ddua/Desktop/try.h5', 'w')
local options = hdf5.DataSetOptions()
options:setChunked(1,1)
options:setDeflate()
writeFile:write("data", testData, options)
writeFile:append("data", testData, options)
writeFile:close()

When I try to get the size of the dataset:
local dim = readFile:read("data"):dataspaceSize()

I get this error:

attempt to call method 'dataspaceSize' (a nil value)
stack traceback:
[string "..."]:8: in main chunk
[C]: in function 'xpcall'
./itorch/main.lua:209: in function <./itorch/main.lua:173>
/Users/ddua/torch/install/share/lua/5.1/lzmq/poller.lua:75: in function 'poll'
/Users/ddua/torch/install/share/lua/5.1/lzmq/impl/loop.lua:307: in function 'poll'
/Users/ddua/torch/install/share/lua/5.1/lzmq/impl/loop.lua:325: in function 'sleep_ex'
/Users/ddua/torch/install/share/lua/5.1/lzmq/impl/loop.lua:370: in function 'start'
./itorch/main.lua:381: in main chunk
[C]: in function 'require'
(command line):1: in main chunk
[C]: at 0x01010bfbd0

installation problem

I had a problem installing the library using luarocks; I think the library it downloads is not yours.
I managed to solve this by cloning the repository and installing directly from the rockspec file.

No "open" function for default version

Hi,

I tried to import an .h5 file into Torch following the steps shown in the user manual, but I got an error message saying there is no "open" function in the default installed version.

The luarocks installation pulls in the v0.0 version, which is different from the source code. When I compiled the source code, it worked.

However, even with the source-code version there are still some problems. When I tried to run "tests/testData.lua", I got this error message:

/usr/local/bin/luajit: cannot open /usr/local/share/lua/5.1/totem/asserts.lua: No such file or directory
stack traceback:
[C]: in function 'dofile'
/usr/local/share/lua/5.1/torch/init.lua:41: in function 'include'
/usr/local/share/lua/5.1/totem/init.lua:17: in main chunk
[C]: in function 'require'
testData.lua:11: in main chunk
[C]: in function 'dofile'
/usr/local/lib/luarocks/rocks/trepl/scm-1/bin/th:129: in main chunk
[C]: at 0x004061d0

And for "/tests/benchmark/benchmark.lua", the error message is shown as:

Size torch.save hdf5
/usr/local/bin/luajit: benchmark.lua:14: attempt to call method 'set' (a nil value)
stack traceback:
benchmark.lua:14: in main chunk
[C]: in function 'dofile'
/usr/local/lib/luarocks/rocks/trepl/scm-1/bin/th:129: in main chunk
[C]: at 0x004061d0

Please address this problem. Thank you!

lua/5.2/hdf5/ffi.lua:56: expected 'static const int' on line 208

Hello

I am trying to use neuraltalk2, which depends on Torch and on this project. When I try to run
require('hdf5') in the Torch interpreter, I run into the error below. I have followed the various install procedures on OS X and Ubuntu 14.04 and end up with the same error.

I lack any Lua experience, so I do not see where to go from here. I believe I have the latest version of the hdf5 library:

"Warning: homebrew/science/hdf5-1.8.16 already installed"

Here is the error, followed by a list of the versions of the various installed packages.

th> require('hdf5')
.../lua-extra/torch/install/share/lua/5.2/trepl/init.lua:383: .../lua-extra/torch/install/share/lua/5.2/hdf5/ffi.lua:56: expected 'static const int' on line 208
stack traceback:
.../lua-extra/torch/install/share/lua/5.2/trepl/init.lua:500: in function <.../lua-extra/torch/install/share/lua/5.2/trepl/init.lua:493>
[C]: in function 'error'
.../lua-extra/torch/install/share/lua/5.2/trepl/init.lua:383: in function 'require'
[string "_RESULT={require('hdf5')}"]:1: in main chunk
[C]: in function 'xpcall'
.../lua-extra/torch/install/share/lua/5.2/trepl/init.lua:650: in function 'repl'
...xtra/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:199: in main chunk
[C]: in ?

I have h5py (2.5.0) (python 2.7)

~ lua -v
Lua 5.2.3 Copyright (C) 1994-2013 Lua.org, PUC-Rio

~ luarocks list

Installed rocks:

argcheck
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

cwrap
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

dok
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

env
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

fftw3
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

gnuplot
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

graph
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

graphicsmagick
1.scm-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

hdf5
0-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

image
1.1.alpha-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

itorch
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

lbase64
20120807-3 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

lua-cjson
2.1.0-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

luabitop
1.0.2-2 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

luacrypto
0.3.2-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

luaffi
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

luafilesystem
1.6.3-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

lzmq
0.4.3-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

nn
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

nngraph
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

nnx
0.1-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

optim
1.0.5-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

paths
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

penlight
1.3.2-2 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

signal
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

sundown
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

sys
1.1-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

threads
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

torch
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

totem
0-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

trepl
scm-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

uuid
0.2-1 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

xlua
1.0-0 (installed) - /Users/niko/workspace/lua-extra/torch/install/lib/luarocks/rocks

Any idea where to go from here to fix that error?

Thanks a lot,

Nicolas

Cannot get the installation to work

Hi

I am a Ubuntu 14.04 user.

I am using the same commands that you have given for installing the rock package, but I am constantly getting this error:

Error: Could not find expected file libhdf5.a, or libhdf5.so, or libhdf5.so.* for hdf5 -- you may have to install hdf5 in your system and/or pass hdf5_DIR or hdf5_LIBDIR to the luarocks command. Example: luarocks install hdf5 hdf5_DIR=/usr/local

But I know that I am giving the library path when I use:
sudo luarocks install hdf5 LIBHDF5_LIBDIR="/usr/lib/x86_64-linux-gnu/"

What can be the problem?

Can't open hdf5 file on mac or linux

Trying to open a file, I get this error on Ubuntu 14.04:

> local myFile = hdf5.open('write.hd5','w')
/usr/local/share/lua/5.1/hdf5/group.lua:48: attempt to index global 'ffi' (a nil value)
stack traceback:
        /usr/local/share/lua/5.1/hdf5/group.lua:48: in function '__init'
        /usr/local/share/lua/5.1/torch/init.lua:43: in function </usr/local/share/lua/5.1/torch/init.lua:39>
        [C]: in function 'HDF5Group'
        /usr/local/share/lua/5.1/hdf5/init.lua:67: in function '_loadObject'
        /usr/local/share/lua/5.1/hdf5/file.lua:18: in function '__init'
        /usr/local/share/lua/5.1/torch/init.lua:43: in function </usr/local/share/lua/5.1/torch/init.lua:39>
        [C]: in function 'open'
        [string "local myFile = hdf5.open('write.hd5','w')..."]:1: in main chunk
        [C]: in function 'xpcall'
        /usr/local/share/lua/5.1/trepl/init.lua:522: in function </usr/local/share/lua/5.1/trepl/init.lua:429>

And on Mac OS X 10.9.3:

local myFile = hdf5.open('write.hd5','w')
/usr/local/share/lua/5.1/hdf5/group.lua:48: attempt to index global 'ffi' (a nil value)
stack traceback:
        /usr/local/share/lua/5.1/hdf5/group.lua:48: in function '__init'
        /usr/local/share/lua/5.1/torch/init.lua:43: in function </usr/local/share/lua/5.1/torch/init.lua:39>
        [C]: in function 'HDF5Group'
        /usr/local/share/lua/5.1/hdf5/init.lua:67: in function '_loadObject'
        /usr/local/share/lua/5.1/hdf5/file.lua:18: in function '__init'
        /usr/local/share/lua/5.1/torch/init.lua:43: in function </usr/local/share/lua/5.1/torch/init.lua:39>
        [C]: in function 'open'
        [string "local myFile = hdf5.open('write.hd5','w')..."]:1: in main chunk
        [C]: in function 'xpcall'
        /usr/local/share/lua/5.1/trepl/init.lua:519: in function </usr/local/share/lua/5.1/trepl/init.lua:426>

Error: Could not satisfy dependency: totem

torch-hdf5$ luarocks make hdf5-0-0.rockspec

Missing dependencies for hdf5:
penlight 
totem 
torch >= 7.0

Using https://luarocks.org/penlight-1.3.2-2.rockspec... switching to 'build' mode

Missing dependencies for penlight:
luafilesystem 

Using https://luarocks.org/luafilesystem-1.6.3-2.src.rock... switching to 'build' mode
env MACOSX_DEPLOYMENT_TARGET=10.8 gcc -O2 -fPIC -I/usr/local/include -c src/lfs.c -o src/lfs.o
env MACOSX_DEPLOYMENT_TARGET=10.8 gcc -bundle -undefined dynamic_lookup -all_load -o lfs.so -L/usr/local/lib src/lfs.o
Updating manifest for /usr/local/lib/luarocks/rocks-5.2
luafilesystem 1.6.3-2 is now built and installed in /usr/local (license: MIT/X11)

Archive:  penlight-1.3.2-core.zip
  inflating: penlight-1.3.2/LICENSE.md  
   creating: penlight-1.3.2/lua/
   creating: penlight-1.3.2/lua/pl/
  inflating: penlight-1.3.2/lua/pl/lexer.lua  
  inflating: penlight-1.3.2/lua/pl/dir.lua  
  inflating: penlight-1.3.2/lua/pl/func.lua  
  inflating: penlight-1.3.2/lua/pl/Set.lua  
  inflating: penlight-1.3.2/lua/pl/compat.lua  
  inflating: penlight-1.3.2/lua/pl/utils.lua  
  inflating: penlight-1.3.2/lua/pl/luabalanced.lua  
  inflating: penlight-1.3.2/lua/pl/comprehension.lua  
  inflating: penlight-1.3.2/lua/pl/file.lua  
  inflating: penlight-1.3.2/lua/pl/init.lua  
  inflating: penlight-1.3.2/lua/pl/template.lua  
  inflating: penlight-1.3.2/lua/pl/MultiMap.lua  
  inflating: penlight-1.3.2/lua/pl/xml.lua  
  inflating: penlight-1.3.2/lua/pl/stringio.lua  
  inflating: penlight-1.3.2/lua/pl/pretty.lua  
  inflating: penlight-1.3.2/lua/pl/path.lua  
  inflating: penlight-1.3.2/lua/pl/lapp.lua  
  inflating: penlight-1.3.2/lua/pl/class.lua  
  inflating: penlight-1.3.2/lua/pl/config.lua  
  inflating: penlight-1.3.2/lua/pl/permute.lua  
  inflating: penlight-1.3.2/lua/pl/stringx.lua  
  inflating: penlight-1.3.2/lua/pl/Map.lua  
  inflating: penlight-1.3.2/lua/pl/types.lua  
  inflating: penlight-1.3.2/lua/pl/Date.lua  
  inflating: penlight-1.3.2/lua/pl/operator.lua  
  inflating: penlight-1.3.2/lua/pl/import_into.lua  
  inflating: penlight-1.3.2/lua/pl/app.lua  
  inflating: penlight-1.3.2/lua/pl/tablex.lua  
  inflating: penlight-1.3.2/lua/pl/test.lua  
  inflating: penlight-1.3.2/lua/pl/text.lua  
  inflating: penlight-1.3.2/lua/pl/input.lua  
  inflating: penlight-1.3.2/lua/pl/url.lua  
  inflating: penlight-1.3.2/lua/pl/seq.lua  
  inflating: penlight-1.3.2/lua/pl/array2d.lua  
  inflating: penlight-1.3.2/lua/pl/OrderedMap.lua  
  inflating: penlight-1.3.2/lua/pl/data.lua  
  inflating: penlight-1.3.2/lua/pl/List.lua  
  inflating: penlight-1.3.2/lua/pl/sip.lua  
  inflating: penlight-1.3.2/lua/pl/strict.lua  
Updating manifest for /usr/local/lib/luarocks/rocks-5.2
penlight 1.3.2-2 is now built and installed in /usr/local (license: MIT/X11)


Error: Could not satisfy dependency: totem 

Iterating through data

I am using torch-hdf5 to exchange big chunks of data between Python and Lua. I want to iterate through the data without loading everything first. I am able to do this with the code below, but I am directly accessing (undocumented) member variables. Is there a better way?

h5 = hdf5.open('file.h5', 'r')
for k,c in pairs(h5._rootGroup._children) do
  local d = h5:read(k):all()
end

Unable to write table to hdf5 file

Any idea why a simple test case like the one below fails? Thanks.

questions = {{22,33}, {44,55}}
h5 = hdf5.open( 'train_data.h5', 'w')
h5:write('/trainset', questions )
h5:close()
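
A hedged sketch of one way to make a case like this work, assuming the table contains only numbers: convert the nested Lua table into a Tensor before writing it.

require 'hdf5'

-- convert the nested Lua table of numbers to a 2x2 Tensor before writing
local questions = torch.Tensor({{22, 33}, {44, 55}})
local h5 = hdf5.open('train_data.h5', 'w')
h5:write('/trainset', questions)
h5:close()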

error installing hdf5

I've been having a lot of difficulty getting data into Torch. I've managed to bring the data in using CSV, but it would be great if I could get hdf5 working. Is this working currently? Here's what I run into when I try to install on Ubuntu; I have similar issues on Mac.

(Even though I do have libhdf5-serial-dev and hdf5-tools installed, it doesn't find them, so I built them in a new folder, but no dice. Here's what I tried.)

shashi@buntu:~$ sudo luarocks install hdf5
Installing https://raw.github.com/torch/rocks/master/hdf5-0-0.rockspec...
Using https://raw.github.com/torch/rocks/master/hdf5-0-0.rockspec... switching to 'build' mode

Error: Could not find expected file libhdf5.a, or libhdf5.so, or libhdf5.so.* for hdf5 -- you may have to install hdf5 in your system and/or pass hdf5_DIR or hdf5_LIBDIR to the luarocks command. Example: luarocks install hdf5 hdf5_DIR=/usr/local
shashi@buntu:~$ sudo luarocks install hdf5 hdf5_LIBDIR=/home/shashi/Applications/hdf5_dir/hdf5/lib

Error: Invalid assignment: hdf5_LIBDIR=/home/shashi/Applications/hdf5_dir/hdf5/lib

Installing on Lua 5.3

Currently, there is an incompatible dependency in BitOp for Lua 5.3. Is bitop strictly necessary?

Open method returns nil

Hi all,
I am trying to read an HDF5 file using Torch.
I tried with the sample code:

require 'hdf5'
local myFile = hdf5.open('/path/to/read.h5', 'r')
local data = myFile:read('/path/to/data'):all()
myFile:close()

The second line works without any error but it returns nil.

When I use this it is okay:

hdf5.open('test.hdf5')

[HDF5File: (16777223 / FILE) /pfs/bh/proj/EvolvingAI/mnorouzz/Serengiti/test.hdf5]

But I cannot put it in a variable.
I tried to reinstall the package, but I get the same results.
What is the problem?
Thanks
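
One guess at the cause (not a confirmed diagnosis): in the interactive th prompt, each input line is compiled as its own chunk, so a variable declared with local on one line is out of scope on the next line and reads back as nil. Dropping the local keyword at the prompt, or putting the snippet in a script file, sidesteps that:

-- at the th prompt, use globals (or run this as a script, where locals are fine)
require 'hdf5'
myFile = hdf5.open('/path/to/read.h5', 'r')
data = myFile:read('/path/to/data'):all()
myFile:close()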

module 'util' not found

I tried installing on a fresh Docker image based on Ubuntu 14.04. I followed these instructions, which succeeded without incident:

sudo apt-get install libhdf5-serial-dev hdf5-tools
git clone git@github.com:deepmind/torch-hdf5.git
cd torch-hdf5
luarocks make hdf5-0-0.rockspec LIBHDF5_LIBDIR="/usr/lib/x86_64-linux-gnu/"

In a th session, I then get the following error:

require 'hdf5'
/usr/local/share/lua/5.1/fn/init.lua:1: module 'util' not found:
no field package.preload['util']
no file './util.lua'
no file '/usr/local/share/luajit-2.0.3/util.lua'
no file '/usr/local/share/lua/5.1/util.lua'
no file '/usr/local/share/lua/5.1/util/init.lua'
no file './util.so'
no file '/usr/local/lib/lua/5.1/util.so'
no file '/usr/local/lib/lua/5.1/loadall.so'
stack traceback:
[C]: in function 'require'
/usr/local/share/lua/5.1/fn/init.lua:1: in main chunk
[C]: in function 'require'
/usr/local/share/lua/5.1/logroll/init.lua:6: in main chunk
[C]: in function 'require'
/usr/local/share/lua/5.1/hdf5/init.lua:13: in main chunk
[C]: in function 'require'
stdin:1: in main chunk
[C]: at 0x004061d0

Support CentOS 6?

As the title says, does it support CentOS 6.4? The manual only mentions Ubuntu and OS X.

File mode r+

It's unclear from the docs how to append to an existing HDF5 file. In the Python library h5py, this is done by opening a file in mode r+, but here it results in a write error (Lua installed via the current default Torch distro):

/Users/Alex/torch/install/share/lua/5.1/hdf5/group.lua:115: Error writing data b to [HDF5Group 33554433 /]
stack traceback:
    [C]: in function 'error'
    /Users/Alex/torch/install/share/lua/5.1/hdf5/group.lua:115: in function '_writeData'
    /Users/Alex/torch/install/share/lua/5.1/hdf5/group.lua:292: in function '_write_or_append'
    /Users/Alex/torch/install/share/lua/5.1/hdf5/group.lua:255: in function </Users/Alex/torch/install/share/lua/5.1/hdf5/group.lua:254>
    /Users/Alex/torch/install/share/lua/5.1/hdf5/file.lua:69: in function '_write_or_append'
    /Users/Alex/torch/install/share/lua/5.1/hdf5/file.lua:43: in function 'write'
    [string "_RESULT={f:write("b", torch.zeros(3))}"]:1: in main chunk
    [C]: in function 'xpcall'
    /Users/Alex/torch/install/share/lua/5.1/trepl/init.lua:650: in function 'repl'
    ...Alex/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:199: in main chunk
    [C]: at 0x010dcbbbd0

I also tried mode "a", but it just didn't write anything. Suggestions? I'm happy to work on the library itself if necessary; just point me to where to dive in.
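
For reference, other issues on this page use an :append call together with chunked dataset options; a hedged sketch along those lines (the exact chunk sizes and append semantics are assumptions) would be:

require 'hdf5'

local options = hdf5.DataSetOptions()
options:setChunked(32, 10)   -- chunking is what allows an HDF5 dataset to be extended
options:setDeflate()

local f = hdf5.open('data.h5', 'w')
f:write('/data', torch.rand(100, 10), options)
f:append('/data', torch.rand(100, 10), options)  -- assumed to extend the existing dataset
f:close()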

Reading fails for blosc-compressed files

I was trying to read from an HDF5 file I created with PyTables, specifically from an EArray that was compressed with blosc. I get the following error message:

HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 140702278149952:
  #000: ../../../src/H5Dio.c line 173 in H5Dread(): can't read data
    major: Dataset
    minor: Read failed
  #001: ../../../src/H5Dio.c line 545 in H5D__read(): can't read data
    major: Dataset
    minor: Read failed
  #002: ../../../src/H5Dchunk.c line 1861 in H5D__chunk_read(): unable to read raw data chunk
    major: Low-level I/O
    minor: Read failed
  #003: ../../../src/H5Dchunk.c line 2891 in H5D__chunk_lock(): data pipeline read failed
    major: Data filters
    minor: Filter operation failed
  #004: ../../../src/H5Z.c line 1358 in H5Z_pipeline(): required filter 'blosc' is not registered
    major: Data filters
    minor: Read failed
  #005: ../../../src/H5PL.c line 297 in H5PL_load(): search in paths failed
    major: Plugin for dynamically loaded library
    minor: Can't get value
  #006: ../../../src/H5PL.c line 401 in H5PL__find(): can't open directory
    major: Plugin for dynamically loaded library
    minor: Can't open directory or file

The tests provided for reading and writing work fine, and I'm the owner of the file. I can read the dimensions of the data but not the data itself.

Using hdf5 to train a network

I have saved my file as .h5; how can I use it to train a network? What is the difference between an hdf5 file and hdf5 data?

Bugs when running torch-hdf5 on CentOS 6 with HDF5 1.8.15

Hi, I am running torch-hdf5 on CentOS 6 with HDF5 1.8.15 to load the ImageNet 2012 dataset, but I am hitting two bugs.

Bug one

When the program exits improperly (Ctrl-C or a crash), starting it again immediately hits a

Segmentation fault

with no other information after the first partial read (although the data size can still be read correctly).

Bug two

When running the program, errors like the ones below occur at random.

HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5T.c line 1846 in H5Tget_class(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5T.c line 2104 in H5Tget_size(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5Tnative.c line 112 in H5Tget_native_type(): not a data type
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5T.c line 1846 in H5Tget_class(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5T.c line 2104 in H5Tget_size(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5Tnative.c line 112 in H5Tget_native_type(): not a data type
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5T.c line 1846 in H5Tget_class(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.15-patch1) thread 0:
  #000: H5T.c line 2104 in H5Tget_size(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type
Epoch: Testing [2][24576/25600] 
/om/user/chengxuz/torch/install/bin/luajit: ...chengxuz/torch/install/share/lua/5.1/threads/threads.lua:264: 
[thread 1 callback] /om/user/chengxuz/torch/install/share/lua/5.1/hdf5/ffi.lua:335: Reading data of class NO_CLASS(50331743) is unsupported
stack traceback:
    [C]: in function 'error'
    /om/user/chengxuz/torch/install/share/lua/5.1/hdf5/ffi.lua:335: in function '_getTorchType'
    ...er/chengxuz/torch/install/share/lua/5.1/hdf5/dataset.lua:85: in function 'getTensorFactory'
    ...er/chengxuz/torch/install/share/lua/5.1/hdf5/dataset.lua:135: in function 'partial'
    ...engxuz/robustness/imagenet-multiGPU.torch/dlprovider.lua:100: in function 'sample'
    ...ser/chengxuz/robustness/imagenet-multiGPU.torch/test.lua:39: in function <...ser/chengxuz/robustness/imagenet-multiGPU.torch/test.lua:38>
    [C]: in function 'xpcall'
    ...chengxuz/torch/install/share/lua/5.1/threads/threads.lua:231: in function 'callback'
    ...r/chengxuz/torch/install/share/lua/5.1/threads/queue.lua:65: in function <...r/chengxuz/torch/install/share/lua/5.1/threads/queue.lua:41>
    [C]: in function 'pcall'
    ...r/chengxuz/torch/install/share/lua/5.1/threads/queue.lua:40: in function 'dojob'
    [string "  local Queue = require 'threads.queue'..."]:15: in main chunk

As can be seen from the error output, I am combining reading from hdf5 cache files with imagenet-multiGPU.torch. The bug happens randomly, but since I am reading a large dataset batch by batch, it is essentially guaranteed to occur at some point during training.

module 'bit' not found

Hi,

I've been trying to switch my Torch install from LuaJIT to Lua 5.2 because of LuaJIT's 2GB memory limit. However, after switching, torch-hdf5 won't load anymore, complaining that the module 'bit' is not present.

.../torch/install/share/lua/5.2/trepl/init.lua:384: module 'bit' not found:No LuaRocks module found for bit

Is it possible I'm doing something wrong?

Thanks

no hdf5 after require

After I require 'hdf5', there's no hdf5 variable (Ubuntu 14.04):

th> require 'hdf5'
{
create_type : function: 0x40daffb8
fortran_s1 : cdata<struct 3517>: 0x40202088
u64le : cdata<struct 3517>: 0x40d25658
create_plist : function: 0x41bb38c8
uint64 : cdata<struct 3517>: 0x411aabe8
schar : cdata<struct 3517>: 0x411ad4a0
i16be : cdata<struct 3517>: 0x41a0c390
i32le : cdata<struct 3517>: 0x41bf0808
ullong : cdata<struct 3517>: 0x40d34098
b32le : cdata<struct 3517>: 0x41bdf3d8
short : cdata<struct 3517>: 0x41bab578
b8le : cdata<struct 3517>: 0x41baa450
int32 : cdata<struct 3517>: 0x41be3fc0
long : cdata<struct 3517>: 0x41a18060
u8le : cdata<struct 3517>: 0x40d37350
u64be : cdata<struct 3517>: 0x40d1fd40
ref_dsetreg : cdata<struct 3517>: 0x41bbf500
ushort : cdata<struct 3517>: 0x41a23390
i8le : cdata<struct 3517>: 0x41a13808
double : cdata<struct 3517>: 0x411ab1e8
uchar : cdata<struct 3517>: 0x41bf4198
i16le : cdata<struct 3517>: 0x411aaf30
i64be : cdata<struct 3517>: 0x41bc3940
uint32 : cdata<struct 3517>: 0x40db6420
create_simple_space : function: 0x40f44160
c_s1 : cdata<struct 3517>: 0x41bc4320
char : cdata<struct 3517>: 0x40d25880
llong : cdata<struct 3517>: 0x40d2d3f8
f32le : cdata<struct 3517>: 0x41bdc940
b8be : cdata<struct 3517>: 0x411c60c8
int64 : cdata<struct 3517>: 0x40dc7f28
f64le : cdata<struct 3517>: 0x41bb6570
i32be : cdata<struct 3517>: 0x41a19078
u32be : cdata<struct 3517>: 0x41be2970
u16be : cdata<struct 3517>: 0x41a17308
int16 : cdata<struct 3517>: 0x4120b968
i64le : cdata<struct 3517>: 0x40d222e8
float : cdata<struct 3517>: 0x411aaea0
int8 : cdata<struct 3517>: 0x40d352b8
uint8 : cdata<struct 3517>: 0x4120b190
ref_obj : cdata<struct 3517>: 0x40dc8260
vlen_reclaim : function: 0x40f43bd0
int : cdata<struct 3517>: 0x41a0b068
u32le : cdata<struct 3517>: 0x41bdefc8
get_libversion : function: 0x40f3ad08
f64be : cdata<struct 3517>: 0x41be1df8
b32be : cdata<struct 3517>: 0x41bdc668
b16le : cdata<struct 3517>: 0x41bdef60
b16be : cdata<struct 3517>: 0x411ac2b8
create_space : function: 0x40f43eb8
b32 : cdata<struct 3517>: 0x41baff70
is_hdf5 : function: 0x40f3af10
opaque : cdata<struct 3517>: 0x40dc11d0
uint16 : cdata<struct 3517>: 0x41ba95f8
b16 : cdata<struct 3517>: 0x41bb0c78
u16le : cdata<struct 3517>: 0x41a0f8a8
ulong : cdata<struct 3517>: 0x41a0f538
u8be : cdata<struct 3517>: 0x41a0e690
b64 : cdata<struct 3517>: 0x41bb7060
b64be : cdata<struct 3517>: 0x411aaac8
b8 : cdata<struct 3517>: 0x41bdcba0
open_file : function: 0x40f3aec0
i8be : cdata<struct 3517>: 0x40d36820
f32be : cdata<struct 3517>: 0x401f6c10
uint : cdata<struct 3517>: 0x41206ca8
create_file : function: 0x40f3adf0
b64le : cdata<struct 3517>: 0x40db5f20
}
[0.0123s]
th> hdf5
[0.0001s]
th>

extraneous warning on opening HDF5 library

Every time the HDF5 library is opened, it gives a message like this:

Mon Mar 24 14:55:16 2014 WARN Logger: changing loglevel from DEBUG to WARN

This clutters up log files and is pretty much useless to users. It would be nice to suppress this message.

Segfault when loading HDF5 package from worker thread

The following snippet causes a segfault:

-- crashes if following line is commented out
--require('hdf5')
local threads = require 'threads'
threadPool = threads.Threads(1,function(threadid) require('hdf5') end )
threadPool:addjob(function()  -- do work
end )
threadPool:synchronize()
threadPool:terminate()

"Could NOT find HDF5"

I get the following error when trying to execute 'luarocks make hdf5-0-0.rockspec'. hdf5 was installed with brew here: /usr/local/Cellar/hdf5/1.8.16_1

cmake -E make_directory build;
cd build;
cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH="/Users/letterbomb/torch/install/bin/.." -DCMAKE_INSTALL_PREFIX="/Users/letterbomb/torch/install/lib/luarocks/rocks/hdf5/0-0"; 
make

-- Found Torch7 in /Users/letterbomb/torch/install
-- HDF5: Using hdf5 compiler wrapper to determine C configuration
-- HDF5: Using hdf5 compiler wrapper to determine CXX configuration
CMake Error at /usr/local/Cellar/cmake/3.6.0/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:148 (message):
  Could NOT find HDF5 (missing: HDF5_HL_LIBRARIES) (found suitable version
  "1.8.16", minimum required is "1.8")
Call Stack (most recent call first):
  /usr/local/Cellar/cmake/3.6.0/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:388 (_FPHSA_FAILURE_MESSAGE)
  /usr/local/Cellar/cmake/3.6.0/share/cmake/Modules/FindHDF5.cmake:707 (find_package_handle_standard_args)
  CMakeLists.txt:4 (FIND_PACKAGE)


-- Configuring incomplete, errors occurred!
See also "/Users/letterbomb/torch/torch-hdf5/build/CMakeFiles/CMakeOutput.log".
make: *** No targets specified and no makefile found.  Stop.

Error: Build error: Failed building.

Unsupported HDF5 version 1.10.0

I get the error "Unsupported HDF5 version 1.10.0" when importing hdf5 with torch or luajit. Does the package still only support 1.8? Will it be updated anytime soon?

read/write error on Mac OS X

Hello,
I'm having some issues writing to and reading from hdf5 files.

I installed using the instructions (I also tried hdf5-20-0 from luarocks install). HDF5 is version 1.8.16 (I also tried 1.8.14).
Below is the exact input I've used:


Aleksandrs-MacBook-Pro:~ aleksandrspiridonov$ luajit
LuaJIT 2.1.0-beta1 -- Copyright (C) 2005-2015 Mike Pall. http://luajit.org/


[Torch ASCII-art banner]

JIT: ON SSE2 SSE3 SSE4.1 fold cse dce fwd dse narrow loop abc sink fuse
th> require 'torch'
th> require 'hdf5'
th> local myFile = hdf5.open('/Users/aleksandrspiridonov/t7test/h5test6.h5','w')
th> myFile:write('dataset1', torch.rand(5, 5))
stdin:1: attempt to index global 'myFile' (a nil value)
stack traceback:
stdin:1: in main chunk
[C]: at 0x0109fb1ba0
th>


myFile:close() throws the same error. The files do get created but have zero size.
Has anyone come across this before, or does anyone recognize the error?

Error on write: missing declaration for symbol 'H5P_CLS_DATASET_CREATE_g'

Reading is fine, but I get the following error when writing (myFile:write) on Mac OS.

Example Code:
require 'hdf5'
local myFile = hdf5.open('write.h5', 'w')
myFile:write("/data/abc", torch.rand(5, 5))
myFile:close()

Error Message:
/usr/local/bin/luajit: /usr/local/share/lua/5.1/hdf5/datasetOptions.lua:21: missing declaration for symbol 'H5P_CLS_DATASET_CREATE_g'
stack traceback:
[C]: in function '__index'
/usr/local/share/lua/5.1/hdf5/datasetOptions.lua:21: in function '__init'
/usr/local/share/lua/5.1/torch/init.lua:50: in function </usr/local/share/lua/5.1/torch/init.lua:46>
[C]: in function 'DataSetOptions'
/usr/local/share/lua/5.1/hdf5/group.lua:70: in function '_writeData'
/usr/local/share/lua/5.1/hdf5/group.lua:175: in function 'write'
/usr/local/share/lua/5.1/hdf5/file.lua:60: in function 'write'
scripts/working.lua:49: in main chunk
[C]: in function 'dofile'
/usr/local/lib/luarocks/rocks/trepl/scm-1/bin/th:129: in main chunk
[C]: at 0x010b1c5100

Loading HDF5 is Slow

I'm using hdf5-format files to transfer data between Python and Torch on Ubuntu 14.04.
Somehow, just opening a big (~2GB) hdf5 file takes ages in the th environment
(it's not loading any data into memory, just f = hdf5.open('file.h5', 'r')).

Is this only the case for me? Any suggestions?

Out of IDs after reading many files

I'm using many data files in hdf5 format to train a neural network. After running for many epochs over a few hours, it crashes with this error:

HDF5-DIAG: Error detected in HDF5 (1.8.16) thread 140335388788608:
  #000: H5F.c line 608 in H5Fopen(): unable to atomize file handle
    major: Object atom
    minor: Unable to register new atom
  #001: H5I.c line 921 in H5I_register(): no IDs available in type
    major: Object atom
    minor: Out of IDs for group

It seems to be a known(?) bug, and exists in both 1.8.14 and 1.8.16
https://stackoverflow.com/questions/35522633/hdf5-consumes-all-resource-ids-for-dataspaces-and-exits-c-api

I can reproduce it with this if I just let it run for a while (to be precise, around 2^24 = 16777216 iterations)

require 'hdf5'
require 'xlua'
local N = 20000000
local n = '/tmp/test.h5'
local f = hdf5.open(n, 'w')
f:write('/data', torch.rand(1))
f:close()

for i=1,N do
  local f = hdf5.open(n, 'r')
  f:read('/data'):all()
  f:close()
  xlua.progress(i, N)
end

Any ideas? Should I just not use hdf5?

A way to check whether an entry exists in hdf5?

Hi all! Is there a way to check whether an entry exists in an hdf5 file? I went through the source code but didn't find one. Any link would be very helpful. Anyway, one of the simplest ways would be to just return nil from the all/partial functions if status < 0.
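
A hedged sketch of one way to check for a top-level entry today, reusing the undocumented _rootGroup._children field that appears in the "Iterating through data" issue above (internal fields may change without notice):

require 'hdf5'

-- returns true if the opened file has a top-level child with the given name
-- (relies on undocumented internals; treat as a stopgap)
local function hasEntry(h5file, name)
    return h5file._rootGroup._children[name] ~= nil
end

local f = hdf5.open('file.h5', 'r')
print(hasEntry(f, 'data'))
f:close()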
