kaldi-onnx's Issues

ValueError: could not convert string to float: '<BiasParams>'

Hi, when converting models for use with MACE, most models convert successfully, but this one fails.

[Info] nnet with left_context|right_context: 0|0
[Info] convert the kaldi type model ==> onnx model
2021-01-08 20:19:56,786 root INFO frames per chunk: 51, left-context: 0, right-context: 0, modulus: 1
Traceback (most recent call last):
  File "/home/test/workstation/kaldi-onnx/converter/utils.py", line 268, in read_matrix
    f = float(tok)
ValueError: could not convert string to float: '<BiasParams>'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "converter/convert.py", line 362, in <module>
    main()
  File "converter/convert.py", line 321, in main
    onnx_model, configs, trans_model = converter.run()
  File "converter/convert.py", line 81, in run
    self.parse_configs()
  File "converter/convert.py", line 173, in parse_configs
    self._components, self._transition_model = parser.run()

final.mace.zip

convert nnet3 model error!

finished parsing nnet3 (110) components.
left_context: 1, right context: 1
model has 146 nodes.
start making ONNX model.
Traceback (most recent call last):
  File "converter/convert.py", line 202, in <module>
    main()
  File "converter/convert.py", line 188, in main
    onnx_model = converter.run()
  File "converter/convert.py", line 71, in run
    onnx_model = g.run()
  File "/data/aiwork/kaldi-onnx/converter/graph.py", line 113, in run
    onnx_model = self.make_model()
  File "/data/aiwork/kaldi-onnx/converter/graph.py", line 747, in make_model
    value_info=internal_inputs)
TypeError: make_graph() got an unexpected keyword argument 'value_info'

Convert failed

When I convert an nnet3 model, this problem comes up.
It seems the converter cannot handle a component correctly.

Traceback (most recent call last):
  File "/home/kaldi-onnx/converter/utils.py", line 268, in read_matrix
    f = float(tok)
ValueError: could not convert string to float: ''

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "converter/convert.py", line 364, in <module>
    main()
  File "converter/convert.py", line 324, in main
    onnx_model, configs, trans_model = converter.run()
  File "converter/convert.py", line 81, in run
    self.parse_configs()
  File "converter/convert.py", line 173, in parse_configs
    self._components, self._transition_model = parser.run()
  File "/home/kaldi-onnx/converter/parser.py", line 617, in run
    self.parse_component_lines()
  File "/home/kaldi-onnx/converter/parser.py", line 1073, in parse_component_lines
    component_type)
  File "/home/kaldi-onnx/converter/parser.py", line 1109, in read_component
    action_dict)
  File "/home/kaldi-onnx/converter/parser.py", line 1148, in read_generic
    obj, pos = func(line, pos, line_buffer)
  File "/home/kaldi-onnx/converter/utils.py", line 273, in read_matrix
    .format(sys.argv[0], pos, tok), file=sys.stderr)
  File "/usr/lib/python3.5/logging/__init__.py", line 1308, in error
    self._log(ERROR, msg, args, **kwargs)
TypeError: _log() got an unexpected keyword argument 'file'

Question about the convert parameters

--left-context
--right-context
What are these two parameters, and how should they be passed? The project does not provide a relevant example. Thanks for any reply.

ReplaceIndex and Splice

I have used the Kaldi-ONNX Converter to make an ONNX model, and viewed it in an ONNX model viewer (see the attached figure).

I want to know what processing the ReplaceIndex and Splice functions do to turn the ivector from 1x3x100 into 1x244x100 and the input from 1x320x40 into 1x244x120.

Here is my ONNX model: model_name_244.zip.
Thanks.
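For intuition, here is a rough sketch of how a single Splice changes shapes, under the usual nnet3 convention that splicing over a set of offsets concatenates one copy of the features per offset (the helper below is hypothetical, not converter code): the feature dimension multiplies by the number of offsets, and the usable frame count shrinks by the context span. ReplaceIndex pins the ivector's time index to a constant, so one ivector row can be reused for every output frame.

```python
def splice_shape(frames, dim, offsets):
    """Hypothetical helper: output shape of a Splice over `offsets`.

    Splice concatenates one feature copy per offset, so the feature
    dimension multiplies; frames at the edges that lack full context
    are dropped, so the frame count shrinks by the context span.
    """
    out_frames = frames - (max(offsets) - min(offsets))
    out_dim = dim * len(offsets)
    return out_frames, out_dim

# One splice with offsets [-1, 0, 1] over a 1x320x40 input:
print(splice_shape(320, 40, [-1, 0, 1]))  # (318, 120)
```

Why 320 shrinks all the way to 244 (and the ivector grows to 244 frames) depends on the stacked contexts of the whole network, not any single splice.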

ubuntu install error

root@waytronic-server:~/下载/kaldi-onnx# sudo pip install -r requirements.txt
/usr/local/lib/python2.7/dist-packages/pip/_vendor/requests/__init__.py:83: RequestsDependencyWarning: Old version of cryptography ([1, 2, 3]) may cause slowdown.
warnings.warn(warning, RequestsDependencyWarning)
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already satisfied: setuptools==18.5 in /usr/local/lib/python2.7/dist-packages (from -r requirements.txt (line 1)) (18.5)
Requirement already satisfied: six==1.11.0 in /usr/local/lib/python2.7/dist-packages (from -r requirements.txt (line 2)) (1.11.0)
Requirement already satisfied: numpy==1.14.1 in /usr/local/lib/python2.7/dist-packages (from -r requirements.txt (line 3)) (1.14.1)
Collecting onnx==1.5.0 (from -r requirements.txt (line 4))
Using cached https://files.pythonhosted.org/packages/1a/a7/a7aecbfe51f01f754b579ab3fbefaa53f03abdd8cc205cc9b0996089df34/onnx-1.5.0.tar.gz
Requirement already satisfied: protobuf in /usr/local/lib/python2.7/dist-packages (from onnx==1.5.0->-r requirements.txt (line 4)) (3.8.0)
Requirement already satisfied: typing>=3.6.4 in /usr/local/lib/python2.7/dist-packages (from onnx==1.5.0->-r requirements.txt (line 4)) (3.6.6)
Requirement already satisfied: typing-extensions>=3.6.2.1 in /usr/local/lib/python2.7/dist-packages (from onnx==1.5.0->-r requirements.txt (line 4)) (3.7.2)
Building wheels for collected packages: onnx
Building wheel for onnx (setup.py) ... error
ERROR: Complete output from command /usr/bin/python -u -c 'import setuptools, tokenize;__file__='"'"'/tmp/pip-install-BUJ5mr/onnx/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-udi8fk --python-tag cp27:
ERROR: fatal: Not a git repository (or any of the parent directories): .git
running bdist_wheel
running build
running build_py
running create_version
running cmake_build
-- The C compiler identification is GNU 5.4.0
-- The CXX compiler identification is GNU 5.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
CMake Error at CMakeLists.txt:226 (message):
Protobuf compiler not found
Call Stack (most recent call first):
CMakeLists.txt:257 (relative_protobuf_generate_cpp)

-- Configuring incomplete, errors occurred!
See also "/tmp/pip-install-BUJ5mr/onnx/.setuptools-cmake-build/CMakeFiles/CMakeOutput.log".
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-BUJ5mr/onnx/setup.py", line 331, in <module>
    'backend-test-tools = onnx.backend.test.cmd_tools:main',
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.7/dist-packages/wheel/bdist_wheel.py", line 179, in run
    self.run_command('build')
  File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.7/distutils/command/build.py", line 128, in run
    self.run_command(cmd_name)
  File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/tmp/pip-install-BUJ5mr/onnx/setup.py", line 206, in run
    self.run_command('cmake_build')
  File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/tmp/pip-install-BUJ5mr/onnx/setup.py", line 192, in run
    subprocess.check_call(cmake_args)
  File "/usr/lib/python2.7/subprocess.py", line 541, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '[u'/usr/bin/cmake', u'-DPYTHON_INCLUDE_DIR=/usr/include/python2.7', u'-DPYTHON_EXECUTABLE=/usr/bin/python', u'-DBUILD_ONNX_PYTHON=ON', u'-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', u'-DONNX_NAMESPACE=onnx', u'-DPY_EXT_SUFFIX=', u'-DCMAKE_BUILD_TYPE=Release', u'-DONNX_ML=1', '/tmp/pip-install-BUJ5mr/onnx']' returned non-zero exit status 1

ERROR: Failed building wheel for onnx
Running setup.py clean for onnx
Failed to build onnx
Installing collected packages: onnx
Running setup.py install for onnx ... error
ERROR: Complete output from command /usr/bin/python -u -c 'import setuptools, tokenize;__file__='"'"'/tmp/pip-install-BUJ5mr/onnx/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-Kj6olq/install-record.txt --single-version-externally-managed --compile:
ERROR: fatal: Not a git repository (or any of the parent directories): .git
running install
running build
running build_py
running create_version
running cmake_build
CMake Error at CMakeLists.txt:226 (message):
Protobuf compiler not found
Call Stack (most recent call first):
CMakeLists.txt:257 (relative_protobuf_generate_cpp)

-- Configuring incomplete, errors occurred!
See also "/tmp/pip-install-BUJ5mr/onnx/.setuptools-cmake-build/CMakeFiles/CMakeOutput.log".
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-BUJ5mr/onnx/setup.py", line 331, in <module>
    'backend-test-tools = onnx.backend.test.cmd_tools:main',
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/usr/local/lib/python2.7/dist-packages/setuptools/command/install.py", line 61, in run
    return orig.install.run(self)
  File "/usr/lib/python2.7/distutils/command/install.py", line 601, in run
    self.run_command('build')
  File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.7/distutils/command/build.py", line 128, in run
    self.run_command(cmd_name)
  File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/tmp/pip-install-BUJ5mr/onnx/setup.py", line 206, in run
    self.run_command('cmake_build')
  File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/tmp/pip-install-BUJ5mr/onnx/setup.py", line 192, in run
    subprocess.check_call(cmake_args)
  File "/usr/lib/python2.7/subprocess.py", line 541, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '[u'/usr/bin/cmake', u'-DPYTHON_INCLUDE_DIR=/usr/include/python2.7', u'-DPYTHON_EXECUTABLE=/usr/bin/python', u'-DBUILD_ONNX_PYTHON=ON', u'-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', u'-DONNX_NAMESPACE=onnx', u'-DPY_EXT_SUFFIX=', u'-DCMAKE_BUILD_TYPE=Release', u'-DONNX_ML=1', '/tmp/pip-install-BUJ5mr/onnx']' returned non-zero exit status 1
----------------------------------------

ERROR: Command "/usr/bin/python -u -c 'import setuptools, tokenize;__file__='"'"'/tmp/pip-install-BUJ5mr/onnx/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-Kj6olq/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-install-BUJ5mr/onnx/

Sum Descriptor parser error

We use "," to split the config in the parse_sum_descp function. This fails if the descriptor refers to another descriptor, for example:

Sum(Scale(0.66, tdnnf7.dropout), tdnnf8.dropout)
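A minimal sketch of a comma split that respects parentheses, which would handle nested descriptors like the one above (the function name is illustrative, not the converter's actual code):

```python
def split_top_level(config):
    """Split a descriptor config on commas at paren depth 0 only."""
    parts, depth, start = [], 0, 0
    for i, ch in enumerate(config):
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
        elif ch == ',' and depth == 0:
            parts.append(config[start:i].strip())
            start = i + 1
    parts.append(config[start:].strip())
    return parts

print(split_top_level("Scale(0.66, tdnnf7.dropout), tdnnf8.dropout"))
# ['Scale(0.66, tdnnf7.dropout)', 'tdnnf8.dropout']
```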

Problem with compiling offset feedback to fast-lstmp layer

Let's say I have an nnet3 model that looks something like this:

input-node name=input dim=40
component-node name=tdnn1.affine component=lda.tdnn1.affine input=Append(Offset(input, -1), input, Offset(input, 1))
component-node name=tdnn1.relu component=tdnn1.relu input=tdnn1.affine
component-node name=tdnn1.batchnorm component=tdnn1.batchnorm input=tdnn1.relu
component-node name=tdnn2.affine component=tdnn2.affine input=Append(Offset(tdnn1.batchnorm, -1), tdnn1.batchnorm, Offset(tdnn1.batchnorm, 1))
component-node name=tdnn2.relu component=tdnn2.relu input=tdnn2.affine
component-node name=tdnn2.batchnorm component=tdnn2.batchnorm input=tdnn2.relu
dim-range-node name=lstm1.c input-node=lstm1.lstm_nonlin dim-offset=0 dim=512
dim-range-node name=lstm1.m input-node=lstm1.lstm_nonlin dim-offset=512 dim=512
component-node name=lstm1.rp component=lstm1.W_rp input=lstm1.m
dim-range-node name=lstm1.r input-node=lstm1.rp dim-offset=0 dim=128
component-node name=output.affine component=output.affine input=lstm1.rp
component-node name=output.log-softmax component=output.log-softmax input=output.affine
output-node name=output input=Offset(output.log-softmax, 5) objective=linear
component-node name=lstm1.W_all component=lstm1.W_all input=Append(tdnn2.batchnorm, IfDefined(Offset(lstm1.r, -3)))
component-node name=lstm1.lstm_nonlin component=lstm1.lstm_nonlin input=Append(lstm1.W_all, IfDefined(Offset(lstm1.c, -3)))

When converting this model, graph compilation gets stuck in the reorder_nodes method in converter/graph.py. The reason is that, for example, the IfDefined node lstm1.c-3 of the offset feedback cannot be reordered, since there is a looped dependency with the lstm1.lstm_nonlin node. Since I don't really understand the logic at line 407 of converter/graph.py, I was wondering if anyone can help me understand this problem better.

Thanks in advance.

convert error

Hi,
I get an error when I run the following command:
python converter/convert.py --input=models/kaldi/final.mdl --output=models/onnx/final.onnx --chunk-size=20 --nnet-type=2

The error message is:
......
  File "/root/kaldi-onnx/converter/parser.py", line 195, in run
    line = next(self._line_buffer)
  File "/usr/lib/python3.6/codecs.py", line 321, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 233: invalid start byte

My Python version is 3.6, and final.mdl is Kaldi's example TIMIT model from https://github.com/XiaoMi/mace-models/tree/master/kaldi-models.

How can I fix it?

Please help me, thanks.
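The 0xff byte suggests the .mdl file is in Kaldi's binary format, while the converter reads text models. One way to check, assuming only that Kaldi binary-mode files begin with the two-byte marker "\0B" (the helper name below is illustrative):

```python
def looks_like_kaldi_binary(path):
    """Heuristic: Kaldi binary-mode files start with the bytes b'\\x00B'."""
    with open(path, 'rb') as f:
        return f.read(2) == b'\x00B'
```

If this returns True, converting the model to text first (for example with Kaldi's nnet3-copy --binary=false) should avoid the decode error.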

run in linux x86_64 server

If I want to run the CVTE model on a Linux x86_64 server, will it work?

Does MACE support running on a Linux x86_64 server?

Undefined local variable

At line 155 of converter/graph.py, there is an undefined local variable "input"? I do not believe this! Can someone help me?

spec augment component not supported

Hi, it seems the tool does not support the SpecAugment component yet. Models trained with it fail to convert, and the error message looks like:
2022-10-17 14:30:43,254 parser INFO Component: SpecAugmentTimeMaskComponent not supported yet.
2022-10-17 14:30:43,254 parser ERROR ../kaldi-onnx/converter/convert.py: error reading component with name spec-augment.time-mask at position 15
Is this project still being maintained, and will this component be supported in the future? Thanks.


Conversion fail for subsampling > 1

python converter/convert.py --input=text.mdl --output=final.onnx --trans-model=final.trans --conf=final.conf --chunk-size=50 --left-context=39 --right-context=39 --modulus=3 --subsample-factor=3 --nnet-type=3
2021-02-25 13:42:05,502 root INFO frames per chunk: 51, left-context: 39, right-context: 39, modulus: 3
2021-02-25 13:42:43,973 parser INFO finished parsing nnet3 (105) components.
2021-02-25 13:42:44,071 graph INFO Prepare Graph.
2021-02-25 13:42:44,090 graph INFO Inference dependencies.
2021-02-25 13:42:44,094 graph INFO Inference indexes
Traceback (most recent call last):
  File "converter/convert.py", line 364, in <module>
    main()
  File "converter/convert.py", line 324, in main
    onnx_model, configs, trans_model = converter.run()
  File "converter/convert.py", line 99, in run
    onnx_model = g.run()
  File "/home/ricky/kaldi-onnx/converter/graph.py", line 121, in run
    self.add_subsample_nodes()
  File "/home/ricky/kaldi-onnx/converter/graph.py", line 155, in add_subsample_nodes
    subsample_name = input + '.subsample.' + node.name
UnboundLocalError: local variable 'input' referenced before assignment
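This is the classic only-assigned-in-some-branches pattern: `input` is presumably bound inside conditionals in add_subsample_nodes that do not cover this node type. A minimal reproduction with illustrative names (not the converter's real branches):

```python
def make_subsample_name(node_type, node_name):
    """Mirrors the failing pattern: `input` is only bound in some branches."""
    if node_type == 'Append':
        input = 'append_in'
    elif node_type == 'Splice':
        input = 'splice_in'
    # any other node type falls through with `input` unbound
    return input + '.subsample.' + node_name

try:
    make_subsample_name('Offset', 'tdnn1')
except UnboundLocalError as e:
    print('reproduced:', e)
```

Initializing `input` before the branches, or raising a descriptive error for unhandled node types, would turn this into an actionable message.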

A check error

Traceback (most recent call last):
  File "converter/convert.py", line 200, in <module>
    main()
  File "converter/convert.py", line 186, in main
    onnx_model = converter.run()
  File "converter/convert.py", line 56, in run
    self.parse_configs()
  File "converter/convert.py", line 79, in parse_configs
    self._components = parser.run()
  File "/data1/rukuang/github/kaldi-onnx/converter/parser.py", line 581, in run
    self.parse_configs()
  File "/data1/rukuang/github/kaldi-onnx/converter/parser.py", line 610, in parse_configs
    parsed_input = self.parse_input_descriptor(input)
  File "/data1/rukuang/github/kaldi-onnx/converter/parser.py", line 646, in parse_input_descriptor
    input_name, comp = self.parse_descriptor(type, input_str, sub_components)
  File "/data1/rukuang/github/kaldi-onnx/converter/parser.py", line 686, in parse_descriptor
    return self.parse_append_descp(input, sub_components)
  File "/data1/rukuang/github/kaldi-onnx/converter/parser.py", line 723, in parse_append_descp
    splice_indexes = splice_continous_numbers(offset_indexes)
  File "/data1/rukuang/github/kaldi-onnx/converter/utils.py", line 74, in splice_continous_numbers
    kaldi_check(nums >= 2,
TypeError: unorderable types: list() >= int()

It seems the len() call around nums is missing.
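For illustration, a sketch of the corrected guard, with simplified stand-ins for kaldi_check and the splice helper (the run-grouping body is a plausible reading, not the repository's exact code):

```python
def kaldi_check(cond, msg):
    if not cond:
        raise Exception(msg)

def splice_continous_numbers(nums):
    """Group a sorted offset list into runs of consecutive integers."""
    kaldi_check(len(nums) >= 2,          # was: kaldi_check(nums >= 2, ...)
                "need at least two offsets to splice")
    runs, start = [], nums[0]
    for prev, cur in zip(nums, nums[1:]):
        if cur != prev + 1:
            runs.append([start, prev])
            start = cur
    runs.append([start, nums[-1]])
    return runs

print(splice_continous_numbers([-1, 0, 1, 3]))  # [[-1, 1], [3, 3]]
```

With the list itself on the left of `>=`, Python 3 raises the TypeError above instead of performing the intended length check.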

Check ONNX model after generate it

I found that input info is added twice in the make_model function.

input_with_initializers = []
initializers_names = []

for initializer in initializers:
    val = helper.make_tensor_value_info(initializer.name,
                                        initializer.data_type,
                                        self.make_onnx_shape(
                                            initializer.dims))
    input_with_initializers.append(val)  # first, add input info for every initializer
    initializers_names.append(initializer.name)

input_with_initializers.extend(list(self._model_inputs.values()))  # second, add the model inputs; unfortunately this adds all input info again, including names already covered above
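A minimal sketch of one possible fix, filtering out model inputs whose names are already covered by initializers (plain stand-in objects rather than the real onnx helper types):

```python
from collections import namedtuple

ValueInfo = namedtuple('ValueInfo', 'name')

def collect_graph_inputs(initializer_infos, model_inputs):
    """Add initializer inputs first, then only the model inputs
    that are not already covered by an initializer."""
    names = {vi.name for vi in initializer_infos}
    inputs = list(initializer_infos)
    inputs.extend(vi for vi in model_inputs if vi.name not in names)
    return inputs

inits = [ValueInfo('lda.W'), ValueInfo('tdnn1.bias')]
model = [ValueInfo('input'), ValueInfo('lda.W')]  # 'lda.W' would be duplicated
print([vi.name for vi in collect_graph_inputs(inits, model)])
# ['lda.W', 'tdnn1.bias', 'input']
```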

Convert nnet3 encountered infinite loop

Hi,
When I converted an nnet3 model I hit this problem. I followed the usage commands and no error occurred, but the convert.py script produced no results after more than 10 hours. Debugging the code revealed an infinite loop; I don't know which step went wrong.
- Steps run:
1. pip install -r requirements.txt
2. ./nnet3-copy --binary=false --prepare-for-test=true AM/final.mdl AM/text.mdl
3. python converter/convert.py --input=./AM/text.mdl --output=./AM/text.onnx --trans-model=./AM/text.trans --conf=./AM/configuration.conf --chunk-size=40 --left-context=20 --right-context=0 --modulus=1 --subsample-factor=1 --nnet-type=3

note: the AM/text.mdl model structure is 2 LSTM layers + 2 NN layers.

- The infinite-loop code:

def reorder_nodes(self, ifdefine=True):
    while len(nodes_need_check) > 0:
        for node in nodes_need_check:
            depend_inputs = [input for input in node.inputs
                             if input not in node.consts]
            if set(depend_inputs) <= set(checked_names)\
                    or (node.type == KaldiOpType.IfDefined.name
                        and ifdefine and 'IfDefined' in node.inputs[0]):
                updated_nodes.append(node)
                checked_names.append(node.name)
                nodes_need_check.remove(node)
    self._nodes = updated_nodes
    for node in self._nodes:
        del node.nexts[:]
    for node in self._nodes:
        self._nodes_by_name[node.name] = node
        for input in node.inputs:
            if input in self._nodes_by_name:
                input_node = self._nodes_by_name[input]
                input_node.nexts.append(node.name)
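One way this loop can spin forever: if a full pass removes no node from nodes_need_check (for example, every remaining node waits on an input that is itself unchecked because of a feedback cycle the IfDefined special case does not match), the while condition never changes. A minimal sketch of a progress guard, using simplified dict nodes rather than the converter's real classes:

```python
def reorder_nodes(nodes):
    """Topological-style reorder that fails loudly instead of looping forever."""
    pending = list(nodes)
    ordered, checked = [], set()
    while pending:
        progressed = False
        for node in list(pending):
            # a node is ready once all of its inputs have been checked
            if set(node['inputs']) <= checked:
                ordered.append(node)
                checked.add(node['name'])
                pending.remove(node)
                progressed = True
        if not progressed:
            stuck = [n['name'] for n in pending]
            raise Exception('dependency cycle, cannot reorder: %s' % stuck)
    return ordered

nodes = [{'name': 'b', 'inputs': ['a']}, {'name': 'a', 'inputs': []}]
print([n['name'] for n in reorder_nodes(nodes)])  # ['a', 'b']
```

With a guard like this, a 2-layer LSTM whose recurrent edges are not matched by the IfDefined exception would raise immediately instead of running for hours.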

How should --left-context and --right-context be set when converting a model?

My experiment runs the xvector recipe in sre16/v2, and network.xconfig uses the defaults. When converting the model, are --left-context=7 and --right-context=7 correct?

How to get ivector input

I noticed that there are two inputs to the ONNX model ("input" and "ivector"). I was wondering how I can obtain/extract the ivector tensor from my Kaldi model as a NumPy array.

Thanks in advance.

Untitled

UnicodeDecodeError: 'ascii' codec can't decode byte 0x80

How do I set the parameters when converting? I don't have a separate transition model; everything is in the .mdl file. Also, what do the config file and modulus mean?

Traceback (most recent call last):
  File "converter/convert.py", line 364, in <module>
    main()
  File "converter/convert.py", line 324, in main
    onnx_model, configs, trans_model = converter.run()
  File "converter/convert.py", line 81, in run
    self.parse_configs()
  File "converter/convert.py", line 173, in parse_configs
    self._components, self._transition_model = parser.run()
  File "/root/kaldi-onnx/converter/parser.py", line 614, in run
    line = self.parse_transition_model()
  File "/root/kaldi-onnx/converter/parser.py", line 625, in parse_transition_model
    line = next(self._line_buffer)
  File "/usr/lib64/python3.6/encodings/ascii.py", line 26, in decode
    return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0x80 in position 544: ordinal not in range(128)

Error when validate model with DynamicLSTM node

I have used the Kaldi-ONNX Converter to make an ONNX model successfully.
Below is my model:
test1.onnx

But when I validate this model, I get an error:
F mace/core/net_def_adapter.cc:458] Check failed: ws_->GetTensor(op_def->input(i)) != nullptr && ws_->GetTensor(op_def->input(i))->is_weight() Tensor lstm1.r_trunc.IfDefined of lstm1.lstm_nonlin.fused is not allocated by Workspace ahead

Exception Error: invalid output indexes values

Hello, when I run the converter I get an error. My command and the error are below:
python converter/convert.py --input=./test/text.mdl
--output=./test/final.onnx
--trans-model=./test/transition_model.mdl
--conf=./test/configuration.conf
--chunk-size=20
--left-context=3
--right-context=3
--modulus=1
--subsample-factor=3
--nnet-type=3

Traceback (most recent call last):
  File "converter/convert.py", line 364, in <module>
    main()
  File "converter/convert.py", line 324, in main
    onnx_model, configs, trans_model = converter.run()
  File "converter/convert.py", line 99, in run
    onnx_model = g.run()
  File "/home/kli/kaldi-onnx/converter/graph.py", line 119, in run
    self.inference_dependencies()
  File "/home/kli/kaldi-onnx/converter/graph.py", line 331, in inference_dependencies
    self.infer_node_dependencies(node, output_indexes)
  File "/home/kli/kaldi-onnx/converter/graph.py", line 321, in infer_node_dependencies
    current_dependencies)
  File "/home/kli/kaldi-onnx/converter/graph.py", line 304, in infer_node_dependencies
    self._subsample_factor)
  File "/home/kli/kaldi-onnx/converter/node.py", line 303, in inference_dependencies
    kaldi_check(len(output_indexes) > 0, "invalid output indexes values.")
  File "/home/kli/kaldi-onnx/converter/utils.py", line 28, in kaldi_check
    raise Exception(msg)
Exception: invalid output indexes values

Any help would be appreciated.

No module named onnx

kaldi-onnx# pip3 install -r requirements.txt
Requirement already satisfied: setuptools==18.5 in /usr/local/lib/python3.5/dist-packages (from -r requirements.txt (line 1)) (18.5)
Requirement already satisfied: six==1.11.0 in /usr/local/lib/python3.5/dist-packages (from -r requirements.txt (line 2)) (1.11.0)
Requirement already satisfied: numpy==1.14.1 in /usr/local/lib/python3.5/dist-packages (from -r requirements.txt (line 3)) (1.14.1)
Requirement already satisfied: onnx==1.5.0 in /usr/local/lib/python3.5/dist-packages (from -r requirements.txt (line 4)) (1.5.0)
Requirement already satisfied: typing-extensions>=3.6.2.1 in /usr/local/lib/python3.5/dist-packages (from onnx==1.5.0->-r requirements.txt (line 4)) (3.7.2)
Requirement already satisfied: typing>=3.6.4 in /usr/local/lib/python3.5/dist-packages (from onnx==1.5.0->-r requirements.txt (line 4)) (3.6.6)
Requirement already satisfied: protobuf in /usr/local/lib/python3.5/dist-packages (from onnx==1.5.0->-r requirements.txt (line 4)) (3.8.0)
root@waytronic-server:~/下载/kaldi-onnx# python ./converter/convert.py --input=./mymodels/final-txt.mdl --output=./mymodels/final.onnx --chunk-size=20 --nnet-type=2
Traceback (most recent call last):
  File "./converter/convert.py", line 30, in <module>
    from graph import Graph
  File "/root/下载/kaldi-onnx/converter/graph.py", line 23, in <module>
    from onnx import defs, helper, checker, numpy_helper, onnx_pb
ImportError: No module named onnx
root@waytronic-server:~/下载/kaldi-onnx# 

Exception: Concat inputs' chunk size are not match

Hello,
When I ran convert.py, an exception was raised: "Exception: Concat inputs' chunk size are not match."

If I print the chunk sizes being compared, I get:
output_chunk=60
input_shape[-2]=60
output_chunk=56
input_shape[-2]=56
output_chunk=44
input_shape[-2]=47

I trained the nnet3 model following the librispeech recipe, so I think the model's architecture should be fairly typical.
Does anyone have any ideas?


2019-11-19 14:26:51,443 parser INFO finished parsing nnet3 (55) components.
2019-11-19 14:26:51,543 graph INFO left_context: 30, right context: 12
Traceback (most recent call last):
  File "converter/convert.py", line 218, in <module>
    main()
  File "converter/convert.py", line 199, in main
    onnx_model = converter.run()
  File "converter/convert.py", line 73, in run
    onnx_model = g.run()
  File "/home/henrik.chen/github/kaldi-onnx/converter/graph.py", line 111, in run
    self.infer_shapes()
  File "/home/henrik.chen/github/kaldi-onnx/converter/graph.py", line 201, in infer_shapes
    node.infer_shape(self._shapes)
  File "/home/henrik.chen/github/kaldi-onnx/converter/node.py", line 343, in infer_shape
    "Concat inputs' chunk size are not match.")
  File "/home/henrik.chen/github/kaldi-onnx/converter/utils.py", line 24, in kaldi_check
    raise Exception(msg)
Exception: Concat inputs' chunk size are not match.
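For reference, the failing check compares the chunk (time) axis of each Concat input against the output chunk, so a mismatch like 44 vs 47 means one branch of the Append keeps more frames than its siblings, typically because their contexts differ. A simplified stand-in for the check (not the converter's exact infer_shape code):

```python
def check_concat_chunks(output_chunk, input_shapes):
    """Every Concat input must share the output's chunk (time) size."""
    for shape in input_shapes:
        if shape[-2] != output_chunk:
            raise Exception("Concat inputs' chunk size are not match.")

check_concat_chunks(60, [[1, 60, 40], [1, 60, 100]])   # passes
# check_concat_chunks(44, [[1, 47, 40]]) raises, matching the printout above
```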
