Traceback (most recent call last):
File "nck/entrypoint.py", line 86, in <module>
app()
File "/.../nautilus-connectors-kit/nautilus-env/lib/python3.8/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/.../nautilus-connectors-kit/nautilus-env/lib/python3.8/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/.../nautilus-connectors-kit/nautilus-env/lib/python3.8/site-packages/click/core.py", line 1164, in invoke
return _process_result(rv)
File "/.../nautilus-connectors-kit/nautilus-env/lib/python3.8/site-packages/click/core.py", line 1101, in _process_result
value = ctx.invoke(self.result_callback, value,
File "/.../nautilus-connectors-kit/nautilus-env/lib/python3.8/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "nck/entrypoint.py", line 68, in run
writer.write(stream)
File "/.../nautilus-connectors-kit/nck/writers/console_writer.py", line 44, in write
buffer = file.read(1024)
File "/.../nautilus-connectors-kit/nck/streams/stream.py", line 114, in readinto
chunk = self.leftover or encode(next(iterable))
File "/.../nautilus-connectors-kit/nck/utils/file_reader.py", line 36, in sdf_to_njson_generator
for line in dict_reader:
File "/usr/local/Cellar/python@3.8/3.8.5/Frameworks/Python.framework/Versions/3.8/lib/python3.8/csv.py", line 111, in __next__
row = next(self.reader)
_csv.Error: field larger than field limit (131072)
From the error message, I was able to set a csv.field_size_limit above the default limit of 131072.
The following line was added to the nck/utils/file_reader.py file (replace 10000000 with another limit, to be discussed):
csv.field_size_limit(10000000)
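Instead of hardcoding an arbitrary constant, a common pattern is to ask for the largest limit the platform accepts. This is a sketch of that pattern, not code from the repo: csv.field_size_limit() raises OverflowError on some platforms (notably Windows) when passed sys.maxsize, so the value is halved until it is accepted.

```python
import csv
import sys

def raise_csv_field_size_limit():
    """Set the csv field size limit as high as the platform allows.

    Returns the limit that was successfully applied.
    """
    limit = sys.maxsize
    while True:
        try:
            csv.field_size_limit(limit)
            return limit
        except OverflowError:
            # The C long backing the limit is too small for this value;
            # try a smaller one.
            limit //= 2
```

Calling this once at import time in nck/utils/file_reader.py would remove the need to pick a specific number up front.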
Even though this worked, I noticed that one field contained an outrageous number of ids. Before settling on a new csv.field_size_limit, it would be worth checking whether a mistake upstream in the process is causing a field to contain far more ids than it really should.