Hi, are you using the default `data/` directory in the project root, or have you followed the instructions here for setting up your own `data/` path?
from superpoint_transformer.
The reason for this error is that, unless specified otherwise, the program will use the `data/` directory in the project root. To find the absolute path to your project, the `pyrootutils` library is used. As stated in your error message:
FileNotFoundError: Project root directory not found. Indicators: ['.git', 'pyproject.toml']
the `pyrootutils` library needs to find some specific files to identify your root directory:
.git
pyproject.toml
This probably failed because you do not have a `.git` in your repository. I am guessing you did not download the code using `git clone ...`?
I just updated the code for `pyrootutils` to also identify your project root directory based on the `README.md`. Can you please let me know if this solves your issue?
Thank you for your reply!
The previous error is resolved, but I still don't know which directory the dataset should be placed in within your project. I noticed an error on `import src.data`. Should I create a `data` directory inside `src/` and place the dataset there?
Traceback (most recent call last):
File "src/train.py", line 49, in <module>
from src import utils
File "/home/zhaojing/code/superpoint_transformer-master/src/__init__.py", line 2, in <module>
import src.data
ModuleNotFoundError: No module named 'src.data'
There was a hotfix an hour ago; pull a fresh version of the repo and this bug should be fixed.
Thank you!
The previous issue is resolved, but a required module is still missing:
Traceback (most recent call last):
File "src/train.py", line 49, in <module>
from src import utils
File "/home/zhaojing/code/superpoint_transformer-master/src/__init__.py", line 2, in <module>
import src.data
File "/home/zhaojing/code/superpoint_transformer-master/src/data/__init__.py", line 1, in <module>
from .csr import *
File "/home/zhaojing/code/superpoint_transformer-master/src/data/csr.py", line 5, in <module>
from src.utils import tensor_idx, is_sorted, indices_to_pointers, \
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/__init__.py", line 11, in <module>
from .neighbors import *
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/neighbors.py", line 3, in <module>
from src.dependencies.FRNN import frnn
ModuleNotFoundError: No module named 'src.dependencies'
Hi @jing-zhao9, thanks a lot for being our first beta-tester!
I just pushed a commit which should fix the `src/dependencies` error. Would you mind pulling the latest version and checking again?
There are still no Python files in `src/dependencies`, and I am still getting an error:
Traceback (most recent call last):
File "/home/zhaojing/code/superpoint_transformer-master/src/train.py", line 49, in <module>
from src import utils
File "/home/zhaojing/code/superpoint_transformer-master/src/__init__.py", line 2, in <module>
import src.data
File "/home/zhaojing/code/superpoint_transformer-master/src/data/__init__.py", line 1, in <module>
from .csr import *
File "/home/zhaojing/code/superpoint_transformer-master/src/data/csr.py", line 5, in <module>
from src.utils import tensor_idx, is_sorted, indices_to_pointers, \
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/__init__.py", line 11, in <module>
from .neighbors import *
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/neighbors.py", line 3, in <module>
from src.dependencies.FRNN import frnn
ModuleNotFoundError: No module named 'src.dependencies.FRNN'
This is the expected behaviour: `src/dependencies` is initially empty before you run `install.sh`. The `FRNN` library is missing because it was installed while the `src/dependencies` directory structure was missing.
Please delete your `spt` conda env and re-run `install.sh`.
While training, I encountered the following issue:
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/multiprocessing/pool.py", line 125, in worker
result = (True, func(*args, **kwds))
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/multiprocessing/pool.py", line 51, in starmapstar
return list(itertools.starmap(args[0], args[1]))
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/multiprocessing.py", line 41, in apply_args_and_kwargs
return fn(*args, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 199, in read_s3dis_room
alignments = pd.read_csv(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 912, in read_csv
return _read(filepath_or_buffer, kwds)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 577, in _read
parser = TextFileReader(filepath_or_buffer, **kwds)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 1407, in __init__
self._engine = self._make_engine(f, self.engine)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 1661, in _make_engine
self.handles = get_handle(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pandas/io/common.py", line 859, in get_handle
handle = open(
FileNotFoundError: [Errno 2] No such file or directory: '/home/zhaojing/code/superpoint_transformer-master/data/s3dis/raw/Area_1/Area_1_alignmentAngle.txt'
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 45, in wrap
metric_dict, object_dict = task_func(cfg=cfg)
File "src/train.py", line 114, in train
trainer.fit(model=model, datamodule=datamodule, ckpt_path=cfg.get("ckpt_path"))
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 531, in fit
call._call_and_handle_interrupt(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 42, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 570, in _fit_impl
self._run(model, ckpt_path=ckpt_path)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 927, in _run
self._data_connector.prepare_data()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/connectors/data_connector.py", line 94, in prepare_data
call._call_lightning_datamodule_hook(trainer, "prepare_data")
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 160, in _call_lightning_datamodule_hook
return fn(*args, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datamodules/base.py", line 144, in prepare_data
self.dataset_class(
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 255, in __init__
super().__init__(*args, val_mixed_in_train=True, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 193, in __init__
super().__init__(root, transform, pre_transform, pre_filter)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/in_memory_dataset.py", line 57, in __init__
super().__init__(root, transform, pre_transform, pre_filter, log)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/dataset.py", line 97, in __init__
self._process()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 493, in _process
self.process()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 528, in process
self._process_single_cloud(p)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 544, in _process_single_cloud
data = self.read_single_raw_cloud(raw_path)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 330, in read_single_raw_cloud
return read_s3dis_area(
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 75, in read_s3dis_area
batch = Batch.from_data_list(starmap_with_kwargs(
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/multiprocessing.py", line 36, in starmap_with_kwargs
out = pool.starmap(apply_args_and_kwargs, args_for_starmap)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/multiprocessing/pool.py", line 372, in starmap
return self._map_async(func, iterable, starmapstar, chunksize).get()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/multiprocessing/pool.py", line 771, in get
raise self._value
FileNotFoundError: [Errno 2] No such file or directory: '/home/zhaojing/code/superpoint_transformer-master/data/s3dis/raw/Area_1/Area_1_alignmentAngle.txt'
[2023-06-17 10:25:52,506][src.utils.utils][INFO] - Closing loggers...
[2023-06-17 10:25:52,506][src.utils.utils][INFO] - Closing wandb!
wandb: Waiting for W&B process to finish... (success).
wandb: You can sync this run to the cloud by running:
wandb: wandb sync /home/zhaojing/code/superpoint_transformer-master/logs/train/runs/2023-06-17_10-25-24/wandb/offline-run-20230617_102527-lgqrcs8f
wandb: Find logs at: ./logs/train/runs/2023-06-17_10-25-24/wandb/offline-run-20230617_102527-lgqrcs8f/logs
Error executing job with overrides: ['experiment=s3dis', 'datamodule.fold=5']
Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
I take it that the previous issue was solved with the last commit?
The problem you are facing now is that you probably downloaded the S3DIS Aligned version. The S3DIS dataset comes in two flavors: aligned (`Stanford3dDataset_v1.2_Aligned_Version.zip`) and non-aligned (`Stanford3dDataset_v1.2.zip`). Our code expects the dataset to stem from the non-aligned `Stanford3dDataset_v1.2.zip` file, which contains the `Area_i/Area_i_alignmentAngle.txt` files. The reason is that SPT operates on whole building floors at once, not on individual rooms like most methods, and these room orientation files are needed to reconstruct the entire floors.
It is my fault for not making this clear enough; I will update the code and documentation to make this more explicit. Thanks for pointing this out 🙏!
The previous issue has been resolved, but I encountered a new issue after replacing the dataset:
[2023-06-17 20:26:06,316][src.utils.utils][ERROR] -
Traceback (most recent call last):
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/connectionpool.py", line 714, in urlopen
httplib_response = self._make_request(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/connectionpool.py", line 403, in _make_request
self._validate_conn(conn)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
conn.connect()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/connection.py", line 419, in connect
self.sock = ssl_wrap_socket(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/ssl.py", line 1040, in _create
self.do_handshake()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLZeroReturnError: TLS/SSL connection has been closed (EOF) (_ssl.c:1131)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/requests/adapters.py", line 486, in send
resp = conn.urlopen(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/connectionpool.py", line 798, in urlopen
retries = retries.increment(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='drive.google.com', port=443): Max retries exceeded with url: /uc?id=0BweDykwS9vIobkVPN0wzRzFwTDg&export=download (Caused by SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1131)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 45, in wrap
metric_dict, object_dict = task_func(cfg=cfg)
File "src/train.py", line 114, in train
trainer.fit(model=model, datamodule=datamodule, ckpt_path=cfg.get("ckpt_path"))
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 531, in fit
call._call_and_handle_interrupt(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 42, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 570, in _fit_impl
self._run(model, ckpt_path=ckpt_path)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 927, in _run
self._data_connector.prepare_data()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/connectors/data_connector.py", line 94, in prepare_data
call._call_lightning_datamodule_hook(trainer, "prepare_data")
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 160, in _call_lightning_datamodule_hook
return fn(*args, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datamodules/base.py", line 144, in prepare_data
self.dataset_class(
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 255, in __init__
super().__init__(*args, val_mixed_in_train=True, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 193, in __init__
super().__init__(root, transform, pre_transform, pre_filter)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/in_memory_dataset.py", line 57, in __init__
super().__init__(root, transform, pre_transform, pre_filter, log)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/dataset.py", line 94, in __init__
self._download()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/dataset.py", line 199, in _download
self.download()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 447, in download
self.download_dataset()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 293, in download_dataset
self.download_zip()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 323, in download_zip
gdown.download(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/gdown/download.py", line 161, in download
res = sess.get(url, stream=True, verify=verify)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/requests/sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/requests/sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/requests/adapters.py", line 517, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='drive.google.com', port=443): Max retries exceeded with url: /uc?id=0BweDykwS9vIobkVPN0wzRzFwTDg&export=download (Caused by SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1131)')))
[2023-06-17 20:26:06,318][src.utils.utils][INFO] - Closing loggers...
[2023-06-17 20:26:06,319][src.utils.utils][INFO] - Closing wandb!
wandb: Waiting for W&B process to finish... (success).
wandb: You can sync this run to the cloud by running:
wandb: wandb sync /home/zhaojing/code/superpoint_transformer-master/logs/train/runs/2023-06-17_20-25-51/wandb/offline-run-20230617_202558-4aymg4y0
wandb: Find logs at: ./logs/train/runs/2023-06-17_20-25-51/wandb/offline-run-20230617_202558-4aymg4y0/logs
Error executing job with overrides: ['experiment=s3dis', 'datamodule.fold=5']
Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
Hi, the latest commit should clarify how to download and set up the datasets. The previously-encountered SSL error should no longer appear.
Please let me know if this solves your problem or if you run into other issues. Thanks for your perseverance!
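As a side note, transient SSL drops from `drive.google.com` during large downloads sometimes succeed on a later attempt. A small generic retry wrapper (illustrative only, not part of the repo) could look like this; if retries keep failing, downloading the zip manually in a browser and placing it where the dataset class expects it bypasses the download step entirely.

```python
import time

def with_retries(fn, attempts=3, delay=5.0, exceptions=(Exception,)):
    """Call `fn`, retrying on failure; transient SSL/connection drops
    during large Google Drive downloads often succeed on a later try."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise  # out of retries: surface the original error
            time.sleep(delay)

# e.g. with_retries(lambda: gdown.download(url, output_path), attempts=5)
```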
I used your latest commit, deleted my `spt` conda env, and re-ran `install.sh`, but the following issues occurred:
cc1plus: warning: command-line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
cc1plus: warning: command-line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
cc1plus: warning: command-line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
Traceback (most recent call last):
File "scripts/setup_dependencies.py", line 100, in <module>
os.chdir(osp.join(DEPENDENCIES_DIR, 'parallel_cut_pursuit/python'))
FileNotFoundError: [Errno 2] No such file or directory: '/home/zhaojing/code/superpoint_transformer-master/src/dependencies/parallel_cut_pursuit/python'
🚀 Successfully installed SPT
(spt) zhaojing@zhaojing-System-Product-Name:~/code/superpoint_transformer-master$ # Train SPT on S3DIS Fold 5
(spt) zhaojing@zhaojing-System-Product-Name:~/code/superpoint_transformer-master$ python src/train.py experiment=s3dis datamodule.fold=5
Traceback (most recent call last):
File "src/train.py", line 49, in <module>
from src import utils
File "/home/zhaojing/code/superpoint_transformer-master/src/__init__.py", line 3, in <module>
import src.datasets
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/__init__.py", line 2, in <module>
from .base import *
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 17, in <module>
from src.transforms import NAGSelectByKey, NAGRemoveKeys, SampleXYTiling, \
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/__init__.py", line 10, in <module>
from .partition import *
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/partition.py", line 17, in <module>
from cp_kmpp_d0_dist import cp_kmpp_d0_dist
ModuleNotFoundError: No module named 'cp_kmpp_d0_dist'
Hi, the previous issue has been resolved. Should the uncompressed `Stanford3dDataset_v1.2.zip` be placed in the `s3dis` folder?
Thank you very much for your guidance. I think only the last step is left to run, but an error message indicates that CUDA is out of memory. May I ask how much CUDA memory your project requires?
Extracting /home/zhaojing/code/superpoint_transformer-master/data/s3dis/Stanford3dDataset_v1.2.zip
Processing...
0%| | 0/10 [03:41<?, ?it/s]
[2023-06-18 16:46:30,636][src.utils.utils][ERROR] -
Traceback (most recent call last):
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 45, in wrap
metric_dict, object_dict = task_func(cfg=cfg)
File "src/train.py", line 114, in train
trainer.fit(model=model, datamodule=datamodule, ckpt_path=cfg.get("ckpt_path"))
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 531, in fit
call._call_and_handle_interrupt(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 42, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 570, in _fit_impl
self._run(model, ckpt_path=ckpt_path)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 927, in _run
self._data_connector.prepare_data()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/connectors/data_connector.py", line 94, in prepare_data
call._call_lightning_datamodule_hook(trainer, "prepare_data")
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 160, in _call_lightning_datamodule_hook
return fn(*args, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datamodules/base.py", line 144, in prepare_data
self.dataset_class(
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 258, in __init__
super().__init__(*args, val_mixed_in_train=True, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 193, in __init__
super().__init__(root, transform, pre_transform, pre_filter)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/in_memory_dataset.py", line 57, in __init__
super().__init__(root, transform, pre_transform, pre_filter, log)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/dataset.py", line 97, in __init__
self._process()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 493, in _process
self.process()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 528, in process
self._process_single_cloud(p)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 559, in _process_single_cloud
nag = self.pre_transform(data)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/transforms/compose.py", line 24, in __call__
data = transform(data)
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/transforms.py", line 23, in __call__
return self._process(x)
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/sampling.py", line 150, in _process
data = _group_data(
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/sampling.py", line 255, in _group_data
hist = atomic_to_histogram(item, cluster, n_bins=n_bins)
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/metrics.py", line 73, in atomic_to_histogram
hist = scatter_add(item, idx, dim=0)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_scatter/scatter.py", line 29, in scatter_add
return scatter_sum(src, index, dim, out, dim_size)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_scatter/scatter.py", line 19, in scatter_sum
size[dim] = int(index.max()) + 1
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 4.27 GiB (GPU 0; 11.77 GiB total capacity; 8.82 GiB already allocated; 1.98 GiB free; 9.27 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
[2023-06-18 16:46:30,648][src.utils.utils][INFO] - Closing loggers...
[2023-06-18 16:46:30,649][src.utils.utils][INFO] - Closing wandb!
wandb: Waiting for W&B process to finish... (success).
wandb: You can sync this run to the cloud by running:
wandb: wandb sync /home/zhaojing/code/superpoint_transformer-master/logs/train/runs/2023-06-18_16-36-37/wandb/offline-run-20230618_163641-zlgrh3u7
wandb: Find logs at: ./logs/train/runs/2023-06-18_16-36-37/wandb/offline-run-20230618_163641-zlgrh3u7/logs
Error executing job with overrides: ['experiment=s3dis', 'datamodule.fold=5']
Traceback (most recent call last):
File "src/train.py", line 139, in main
metric_dict, _ = train(cfg)
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 48, in wrap
raise ex
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 45, in wrap
metric_dict, object_dict = task_func(cfg=cfg)
File "src/train.py", line 114, in train
trainer.fit(model=model, datamodule=datamodule, ckpt_path=cfg.get("ckpt_path"))
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 531, in fit
call._call_and_handle_interrupt(
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 42, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 570, in _fit_impl
self._run(model, ckpt_path=ckpt_path)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 927, in _run
self._data_connector.prepare_data()
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/connectors/data_connector.py", line 94, in prepare_data
call._call_lightning_datamodule_hook(trainer, "prepare_data")
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/pytorch_lightning/trainer/call.py", line 160, in _call_lightning_datamodule_hook
return fn(*args, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datamodules/base.py", line 144, in prepare_data
self.dataset_class(
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/s3dis.py", line 258, in __init__
super().__init__(*args, val_mixed_in_train=True, **kwargs)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 193, in __init__
super().__init__(root, transform, pre_transform, pre_filter)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/in_memory_dataset.py", line 57, in __init__
super().__init__(root, transform, pre_transform, pre_filter, log)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/data/dataset.py", line 97, in __init__
self._process()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 493, in _process
self.process()
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 528, in process
self._process_single_cloud(p)
File "/home/zhaojing/code/superpoint_transformer-master/src/datasets/base.py", line 559, in _process_single_cloud
nag = self.pre_transform(data)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_geometric/transforms/compose.py", line 24, in __call__
data = transform(data)
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/transforms.py", line 23, in __call__
return self._process(x)
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/sampling.py", line 150, in _process
data = _group_data(
File "/home/zhaojing/code/superpoint_transformer-master/src/transforms/sampling.py", line 255, in _group_data
hist = atomic_to_histogram(item, cluster, n_bins=n_bins)
File "/home/zhaojing/code/superpoint_transformer-master/src/utils/metrics.py", line 73, in atomic_to_histogram
hist = scatter_add(item, idx, dim=0)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_scatter/scatter.py", line 29, in scatter_add
return scatter_sum(src, index, dim, out, dim_size)
File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/torch_scatter/scatter.py", line 19, in scatter_sum
size[dim] = int(index.max()) + 1
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 4.27 GiB (GPU 0; 11.77 GiB total capacity; 8.82 GiB already allocated; 1.98 GiB free; 9.27 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
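As a first mitigation, the allocator hint embedded in the error message itself can be tried directly. This is a generic PyTorch allocator knob, not a project-specific fix, and the 128 MiB split size below is an arbitrary example value:

```shell
# Cap the allocator's split size to reduce fragmentation, as the OOM
# message suggests. 128 (MiB) is an example value, not a recommendation.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128
echo "PYTORCH_CUDA_ALLOC_CONF=$PYTORCH_CUDA_ALLOC_CONF"
# Then relaunch training, e.g.:
# python src/train.py experiment=s3dis datamodule.fold=5
```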
from superpoint_transformer.
Hi, our standard configs for this project assume a 32G GPU (e.g. NVIDIA V100). However, it is possible to run SPT on smaller GPUs. We have tested with an 11G GPU, with some small impact on time and performance.
I just pushed a new commit with configs in configs/experiments
for training on 11G GPUs. I tested preprocessing and training on my own 11G device; these should work for your 12G GPU. Make sure you do not have any other process running on your device though.
Besides, you can find a new Troubleshooting section in the README, with some tips if you run into more CUDA OOM errors.
Hope that helps!
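For context on why preprocessing alone can exhaust an 11-12G card: the failing call builds per-superpoint label histograms with scatter_add, whose intermediate tensors scale with the number of points times the number of label bins. A rough back-of-the-envelope sketch (the point count and bin count below are illustrative assumptions, not values taken from the code):

```python
# Rough estimate of the memory a scatter-based histogram needs.
# All figures are illustrative assumptions, not project constants.

def histogram_bytes(n_points: int, n_bins: int, itemsize: int = 4) -> int:
    """Memory for an (n_points, n_bins) one-hot float32 tensor that
    scatter_add reduces into per-cluster histograms (4 bytes/element)."""
    return n_points * n_bins * itemsize

# E.g. a large indoor area with ~80M points and 14 label bins:
gib = histogram_bytes(80_000_000, 14) / 2**30
print(f"{gib:.2f} GiB")
```

With these made-up numbers the single allocation already lands in the same ballpark as the 4.27 GiB request in the log above, which is why downsampling the cloud during preprocessing (as the 11G configs do) makes such a difference.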
from superpoint_transformer.
Have you managed to run the code using the new configs ?
from superpoint_transformer.
I close this issue for now. Feel free to re-open it if relevant.
from superpoint_transformer.