
Points2Poly


Introduction

Points2Poly is an implementation of the paper Reconstructing Compact Building Models from Point Clouds Using Deep Implicit Fields, which incorporates a learnable implicit surface representation into explicitly constructed geometry.

To keep this repository uncluttered, the core module is maintained separately in the abspy repository (also available as a PyPI package); this repository acts as a wrapper with additional sources and instructions, in particular for building reconstruction.

Prerequisites

The prerequisites are two-fold: abspy, which provides the vertex group, cell complex, and adjacency graph functionality; and points2surf, which facilitates occupancy estimation.

Clone this repository with submodules:

git clone --recurse-submodules https://github.com/chenzhaiyu/points2poly

In case you already cloned the repository but forgot --recurse-submodules:

git submodule update --init

Requirements from abspy

Follow the instructions to install abspy with its dependencies; abspy itself can be installed either from the local submodule or from PyPI:

# local version (stable)
pip install ./abspy

# PyPI version (latest)
pip install abspy

Requirements from points2surf

Install the dependencies for points2surf:

pip install -r points2surf/requirements.txt

For training, make sure CUDA is available and enabled. See points2surf/README.md for more details on its requirements.

In addition, install dependencies for logging:

pip install -r requirements.txt

Getting started

Reconstruction demo

Download a mini dataset of 6 buildings from the Helsinki 3D city models, and a pre-trained full-view model:

python download.py dataset_name='helsinki_mini' model_name='helsinki_fullview'

Run reconstruction on the mini dataset:

python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'

Evaluate the reconstruction results by Hausdorff distance:

python evaluate.py dataset_name='helsinki_mini'

The reconstructed building models and statistics can be found under ./outputs/helsinki_mini/reconstructed.
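For intuition, the Hausdorff distance compares a reconstructed surface against the reference by measuring how far apart the two are in the worst case. A minimal sketch on sampled point sets is shown below; it assumes NumPy and SciPy and is only an illustration, not the repository's evaluate.py.

import numpy as np
from scipy.spatial import cKDTree

def hausdorff_distance(points_a, points_b):
    # Symmetric Hausdorff distance between two (N, 3) point sets sampled from the surfaces.
    d_ab = cKDTree(points_b).query(points_a)[0].max()  # farthest that any point of A is from B
    d_ba = cKDTree(points_a).query(points_b)[0].max()  # farthest that any point of B is from A
    return max(d_ab, d_ba)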

Helsinki dataset

Download the Helsinki dataset from OneDrive, including meshes, point clouds, and queries with distances.

Custom dataset

Reconstruction from custom point clouds

  • Convert point clouds into NumPy binary files (.npy). Place point cloud files (e.g., .ply, .obj, .stl, and .off) under ./datasets/{dataset_name}/00_base_pc, then run points2surf/make_pc_dataset.py, or do the conversion manually (see the sketch after this list).

  • Extract planar primitives from point clouds with Mapple. In Mapple, use Point Cloud - RANSAC primitive extraction to extract planar primitives, then save the vertex group files (.vg or .bvg) into ./datasets/{dataset_name}/06_vertex_group.

  • Run reconstruction the same way as in the demo. Note, however, that you may need to retrain a model that matches the characteristics of your data.
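For the manual conversion mentioned in the first step, a minimal sketch is shown below. It assumes Open3D and NumPy and uses hypothetical file names; points2surf/make_pc_dataset.py remains the reference implementation, and the exact output layout it expects may differ.

import numpy as np
import open3d as o3d

# Hypothetical example: convert one .ply point cloud to a NumPy binary file under 00_base_pc.
pcd = o3d.io.read_point_cloud("datasets/my_dataset/00_base_pc/building.ply")
points = np.asarray(pcd.points, dtype=np.float32)  # (N, 3) array of XYZ coordinates
np.save("datasets/my_dataset/00_base_pc/building.npy", points)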

Make training data

Prepare meshes and place them under datasets/{dataset_name}, mimicking the structure of the provided data. Refer to these instructions for creating training data through BlenSor simulation.
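For orientation, the folder names mentioned on this page suggest a dataset layout roughly along these lines (assembled from references elsewhere in this document; the structure of the provided Helsinki data is authoritative):

datasets/{dataset_name}/
  00_base_pc/        # input point clouds (.ply/.obj/.stl/.off) and their .npy conversions
  04_pts/            # point samples used during reconstruction
  05_query_pts/      # query points with distances
  06_vertex_group/   # planar primitives saved from Mapple (.vg or .bvg)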

TODOs

  • Separate abspy/points2surf from points2poly wrappers
  • Config with hydra
  • Short tutorial on how to get started
  • Host generated data

License

MIT

Acknowledgement

The implementation of Points2Poly has greatly benefited from Points2Surf. In addition, the implementation of the abspy submodule is backed by great open-source libraries including SageMath, NetworkX, and Easy3D.

Citation

If you use Points2Poly in a scientific work, please consider citing the paper:

@article{chen2022points2poly,
  title = {Reconstructing compact building models from point clouds using deep implicit fields},
  journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
  volume = {194},
  pages = {58-73},
  year = {2022},
  issn = {0924-2716},
  doi = {10.1016/j.isprsjprs.2022.09.017},
  url = {https://www.sciencedirect.com/science/article/pii/S0924271622002611},
  author = {Zhaiyu Chen and Hugo Ledoux and Seyran Khademi and Liangliang Nan}
}


points2poly's Issues

poor result

Hi,
I'm trying to make building models using your project, but I cannot get a decent result like your examples. Can I get any advice or tips?

This is my input point cloud of the building:
[image]

These are the RANSAC parameters (all default):
[image]

Before detecting planes, I run Estimate normals with the default settings.

This is the RANSAC result; it looks good to me:
[image]

After saving the .ply and .vg files, I run the make_pc_dataset.py script and reconstruct.py. All parameters in config.yaml are default except normalize and append_bottom.

cell_complex.visualize() result:
[image]

This is the final result. All structures on the roof of the building have disappeared:
[image]

reconstructing interior walls

Hi there!

Firstly, I'd like to express my appreciation for the fantastic work on the Points2Poly—it's truly impressive!

Now, onto my main question: I recently attempted to run Points2Poly on a simple structure with a water-tight layout and one internal wall. However, the output file didn't include the internal wall. I'm curious to know if it's possible to reconstruct internal walls and components like stairs using Points2Poly. If so, could you guide me on the necessary modifications in the code and where specifically in the codebase I should make these changes?

Here is the simple model that I tried:

  1. This is my original model, made of simple geometries. [image]
  2. Here is the internal wall. [image]
  3. Here is the point cloud. [image]
  4. Here is the output with a cross section cut through the wall area. [image]

As you can see, there is no wall in the output. Could you explain this, please?

Running the command `python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'` results in the error: `ModuleNotFoundError: No module named 'sage.all'`.

The from sage.all import polytopes, QQ, RR, Polyhedron statement in complex.py is causing an error: ModuleNotFoundError: No module named 'sage.all'. My computer is running Windows 10, and I'm using PyCharm as my development platform.
The version of Sage I installed is only 0.0.0. Is this because of the Windows operating system?

Cannot find 05_query_pts folder

Hi, for the custom dataset, I am able to create the .npy files from my point clouds. Could you explain how the 05_query_pts folder should be created? make_pc_dataset.py only creates the .npy and .xyz.npy files. How would I create the files in the 05_query_pts folder? When I run reconstruct.py, it cannot find the 05_query_pts folder.
Thank you.

Can't produce result

Hi, I'm learning your method but I have some questions. Can I get any tips?
This is my environment:
[image]

(1)
I downloaded the helsinki_mini dataset and the helsinki_fullview model, and I ran reconstruct.py to get six *.obj buildings, though it reports some warnings.
[image]
reconstruct.log
But I can't get a result when I run evaluate.py.
[image]
[image]

(2)
I tried to make building models from my own point cloud.
I reconstructed a 3D model of a district using a UAV and extracted the buildings with TerraSolid. The cavities are caused by trees. This is my input building in Mapple:
[image]
This is the RANSAC result (all default):
[image]
I followed the steps in README.md to produce my dataset and ran reconstruct.py. It seems to run, but I can't get the 'reconstructed' folder. I only changed the dataset name in the config.
[image]
reconstruct2.log

Alternative to using Mapple?

Hi, this is amazing work! I had a question about the custom dataset part: is there an alternative Python library/tool to use instead of Mapple? I am using Windows, and the Mapple executable for Windows seems to be outdated. Is there any other alternative, especially one that is Python-based? Thank you.

Do I need to retrain a model to reconstruct from custom point clouds?

Your research is excellent! I'm interested in your work. When I use your code to reconstruct a real-world building from a point cloud, the code can construct the cell complex:

[INFO] - cell complex constructed: 0.12 s
[INFO] - number of planes: 6
[INFO] - number of cells: 27

but an error arises as follows:

[INFO] - cut performed: 0.00 s
[INFO] - cut_value: 0.00
[INFO] - number of extracted cells: 0
[ERROR] - no reachable cells. aborting

Is this error caused by the fact that I used your helsinki_fullview model, which may not apply to my point clouds?
Do you have any advice on how to solve the problem?

how to reconstruct my own data

Your work is excellent, and I managed to run through the examples. I would like to ask how to reconstruct my own point cloud files; at the moment I only have some .ply files.

Crash during reconstruction when creating cubes

Hi,
First of all, thank you very much for providing the code for Points2Poly!
I have been trying to use your work in a docker environment but have systematic crashes when trying to use the reconstruction script on the Helsinki dataset.

Here is the error I get

Traceback (most recent call last):
  File "reconstruct.py", line 51, in reconstruct_full
    cell_complex = create_cell_complex(filepath,
  File "/app/utils.py", line 65, in create_cell_complex
    cell_complex = CellComplex(vertex_group.planes, vertex_group.aabbs, vertex_group.obbs, vertex_group.points_grouped,
  File "/home/appuser/.local/lib/python3.8/site-packages/abspy/complex.py", line 83, in __init__
    self.cells = [self._construct_initial_cell()]  # list of QQ
  File "/home/appuser/.local/lib/python3.8/site-packages/abspy/complex.py", line 105, in _construct_initial_cell
    return polytopes.cube(intervals=[[QQ(self.initial_bound[0][i]), QQ(self.initial_bound[1][i])] for i in range(3)])
TypeError: cube() got an unexpected keyword argument 'intervals'

It seems to be correlated with SageMath, but I am not familiar with this library, and the documentation still mentions the "intervals" keyword, so I guess the real issue may be elsewhere. Do you have any hints regarding this error?

Thanks a lot!

For reproducibility, the Dockerfile and the list of commands run inside the container are shown hereafter.

Dockerfile:

FROM nvidia/cuda:11.3.1-devel-ubuntu20.04

ENV DEBIAN_FRONTEND noninteractive

WORKDIR /app
COPY . /app

RUN apt-get -y update \
    && apt-get install -y software-properties-common \
    && apt-get -y update \
    && add-apt-repository universe
RUN apt-get -y update
RUN apt-get -y install nano git python3 python3-pip wget libgl1-mesa-glx libegl1-mesa libxrandr2 libxrandr2 libxss1 libxcursor1 libxcomposite1 libasound2 libxi6 libxtst6 curl sagemath

RUN adduser -u 5678 --disabled-password --gecos "" appuser && chown -R appuser /app
USER appuser

CMD ["python3", "main.py"]

List of commands once the docker is launched:

pip install ./abspy
pip install -r points2surf/requirements.txt
pip install -r requirements.txt
python download.py dataset_name='helsinki_mini' model_name='helsinki_fullview'
python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'

Test own .ply file

I think your work is very meaningful. How can I run my own .ply file? In the demo of reconstruct.py, the dataset needs 04_pts and 06_vertex_group; if I only have a .ply file, how can I obtain the pts and vertex_group data?

Performance on indoor planes

Thank you for this amazing work.

I was wondering if this algorithm works well with indoor planes. I have an indoor dataset captured using an RGB-D camera, and I want to simplify the mesh into planar segments. I previously tried PolyFit and was only able to get a watertight planar mesh around the structure of the building, rather than indoor planar segments like walls and desks. Could you help me with this? Thank you in advance.

Use my point cloud data

Hello, I have point cloud data in PCD format. Can I convert the point cloud to PLY format and use your reconstruction model to reconstruct it?

ValueError: point coordinates are needed for plane refinement

Hi, thanks a lot for sharing such a brilliant work!
I wanted to run the demo code and test it after preparing the dependencies as the instructions describe. When I run:
python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'
then I meet:

[2023-12-13 21:30:14,735][root][INFO] - processing /home/why/points2poly/datasets/helsinki_mini/06_vertex_group/2.vg
Error executing job with overrides: ['dataset_name=helsinki_mini', 'model_name=helsinki_fullview']
Traceback (most recent call last):
  File "/home/why/points2poly/reconstruct.py", line 51, in reconstruct_full
    cell_complex = create_cell_complex(filepath,
  File "/home/why/points2poly/utils.py", line 69, in create_cell_complex
    cell_complex.refine_planes(theta=theta, epsilon=epsilon)
  File "/home/why/anaconda3/envs/point/lib/python3.10/site-packages/abspy/complex.py", line 127, in refine_planes
    raise ValueError('point coordinates are needed for plane refinement')
ValueError: point coordinates are needed for plane refinement

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.

I have no idea about this error. Could you please share your thoughts about it?

Regards.

Helsinki dataset

Helsinki dataset cannot be downloaded from OneDrive. Has the link expired?

GPU and dataset

Thank you for the great work!
What is the minimum graphics card needed to reproduce your experiments? And how should I prepare my own building point cloud dataset for training?

Trained Helsinki no-bottom model

Hello,
I would like to run the current approach on a custom dataset. My dataset of buildings contains no bottom faces. You mentioned in the paper that for the real-world Shenzhen dataset you used the Helsinki no-bottom model. Could you please provide the trained model?

Also, I expect to use it the same way as described in the README, just adding the model URL and name in the config file. Am I correct?

TypeError: Caught TypeError in DataLoader worker process 0.

(point2poly) [lilonghui_liujing@gpu01 points2poly]$ python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'
/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/sklearn/utils/multiclass.py:14: DeprecationWarning: Please use spmatrix from the scipy.sparse namespace, the scipy.sparse.base namespace is deprecated.
from scipy.sparse.base import spmatrix
/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/sklearn/utils/optimize.py:18: DeprecationWarning: Please use line_search_wolfe2 from the scipy.optimize namespace, the scipy.optimize.linesearch namespace is deprecated.
from scipy.optimize.linesearch import line_search_wolfe2, line_search_wolfe1
/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/sklearn/utils/optimize.py:18: DeprecationWarning: Please use line_search_wolfe1 from the scipy.optimize namespace, the scipy.optimize.linesearch namespace is deprecated.
from scipy.optimize.linesearch import line_search_wolfe2, line_search_wolfe1
[2023-07-04 09:36:53,885][root][INFO] - processing /share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/datasets/helsinki_mini/06_vertex_group/5.vg
[2023-07-04 09:36:54,723][root][INFO] - refining planar primitives
[2023-07-04 09:36:54,732][root][INFO] - 8 pairs of planes merged
[2023-07-04 09:36:54,732][root][INFO] - prioritising planar primitives
[2023-07-04 09:36:54,733][root][INFO] - constructing cell complex
100%|███████████████████████████████████████████████████████████████████████████████| 23/23 [00:01<00:00, 12.46it/s]
[2023-07-04 09:36:56,581][root][INFO] - cell complex constructed: 1.85 s
[2023-07-04 09:36:56,581][root][INFO] - number of planes: 23
[2023-07-04 09:36:56,581][root][INFO] - number of cells: 129
[2023-07-04 09:36:56,718][root][INFO] - processing /share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/datasets/helsinki_mini/06_vertex_group/4.vg
[2023-07-04 09:36:56,796][root][INFO] - refining planar primitives
[2023-07-04 09:36:56,804][root][INFO] - 8 pairs of planes merged
[2023-07-04 09:36:56,804][root][INFO] - prioritising planar primitives
[2023-07-04 09:36:56,805][root][INFO] - constructing cell complex
100%|███████████████████████████████████████████████████████████████████████████████| 27/27 [00:03<00:00, 6.80it/s]
[2023-07-04 09:37:00,778][root][INFO] - cell complex constructed: 3.97 s
[2023-07-04 09:37:00,779][root][INFO] - number of planes: 27
[2023-07-04 09:37:00,804][root][INFO] - number of cells: 235
[2023-07-04 09:37:00,960][root][INFO] - processing /share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/datasets/helsinki_mini/06_vertex_group/6.vg
[2023-07-04 09:37:01,037][root][INFO] - refining planar primitives
[2023-07-04 09:37:01,073][root][INFO] - 34 pairs of planes merged
[2023-07-04 09:37:01,073][root][INFO] - prioritising planar primitives
[2023-07-04 09:37:01,076][root][INFO] - constructing cell complex
100%|███████████████████████████████████████████████████████████████████████████████| 53/53 [00:06<00:00, 8.21it/s]
[2023-07-04 09:37:07,530][root][INFO] - cell complex constructed: 6.45 s
[2023-07-04 09:37:07,530][root][INFO] - number of planes: 53
[2023-07-04 09:37:07,530][root][INFO] - number of cells: 387
[2023-07-04 09:37:07,680][root][INFO] - processing /share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/datasets/helsinki_mini/06_vertex_group/3.vg
[2023-07-04 09:37:07,756][root][INFO] - refining planar primitives
[2023-07-04 09:37:07,786][root][INFO] - 24 pairs of planes merged
[2023-07-04 09:37:07,786][root][INFO] - prioritising planar primitives
[2023-07-04 09:37:07,788][root][INFO] - constructing cell complex
100%|███████████████████████████████████████████████████████████████████████████████| 31/31 [00:02<00:00, 14.68it/s]
[2023-07-04 09:37:09,901][root][INFO] - cell complex constructed: 2.11 s
[2023-07-04 09:37:09,901][root][INFO] - number of planes: 31
[2023-07-04 09:37:09,901][root][INFO] - number of cells: 116
[2023-07-04 09:37:10,020][root][INFO] - processing /share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/datasets/helsinki_mini/06_vertex_group/2.vg
[2023-07-04 09:37:10,113][root][INFO] - refining planar primitives
[2023-07-04 09:37:10,144][root][INFO] - 22 pairs of planes merged
[2023-07-04 09:37:10,144][root][INFO] - prioritising planar primitives
[2023-07-04 09:37:10,146][root][INFO] - constructing cell complex
100%|███████████████████████████████████████████████████████████████████████████████| 35/35 [00:02<00:00, 17.20it/s]
[2023-07-04 09:37:12,182][root][INFO] - cell complex constructed: 2.04 s
[2023-07-04 09:37:12,183][root][INFO] - number of planes: 35
[2023-07-04 09:37:12,183][root][INFO] - number of cells: 130
[2023-07-04 09:37:12,263][root][INFO] - processing /share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/datasets/helsinki_mini/06_vertex_group/1.vg
[2023-07-04 09:37:12,299][root][INFO] - refining planar primitives
[2023-07-04 09:37:12,307][root][INFO] - 12 pairs of planes merged
[2023-07-04 09:37:12,307][root][INFO] - prioritising planar primitives
[2023-07-04 09:37:12,308][root][INFO] - constructing cell complex
100%|███████████████████████████████████████████████████████████████████████████████| 15/15 [00:01<00:00, 14.48it/s]
[2023-07-04 09:37:13,344][root][INFO] - cell complex constructed: 1.04 s
[2023-07-04 09:37:13,344][root][INFO] - number of planes: 15
[2023-07-04 09:37:13,344][root][INFO] - number of cells: 78
evaluating on dataset helsinki_mini
Random Seed: 40938661
getting information for 6 shapes
evaluating 1075 patches
0%| | 0/6 [00:00<?, ?it/s]Warning: imp_surf_dist_ms must be converted to float32
Warning: imp_surf_dist_ms must be converted to float32
Warning: imp_surf_dist_ms must be converted to float32
Warning: imp_surf_dist_ms must be converted to float32
Warning: imp_surf_dist_ms must be converted to float32
Warning: imp_surf_dist_ms must be converted to float32
0%| | 0/6 [00:00<?, ?it/s]
Error executing job with overrides: ['dataset_name=helsinki_mini', 'model_name=helsinki_fullview']
Traceback (most recent call last):
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/reconstruct.py", line 71, in reconstruct_full
infer_sdf(cfg)
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/utils.py", line 166, in infer_sdf
points_to_surf_eval.points_to_surf_eval(cfg)
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/points2surf/source/points_to_surf_eval.py", line 358, in points_to_surf_eval
for batch_data in tqdm(dataloader):
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/tqdm/std.py", line 1176, in iter
for obj in iterable:
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 633, in next
data = self._next_data()
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1345, in _next_data
return self._process_data(data)
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1371, in _process_data
data.reraise()
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/_utils.py", line 644, in reraise
raise exception
TypeError: Caught TypeError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
data = fetcher.fetch(index)
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/environment/anaconda3/envs/point2poly/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 51, in
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/points2surf/source/data_loader.py", line 359, in getitem
get_patch_points(shape=shape, query_point=imp_surf_query_point_ms)
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/points2surf/source/data_loader.py", line 335, in get_patch_points
patch_pts_ids = point_cloud.get_patch_kdtree(
File "/share/home/lilonghui/lilonghui_liujing/zhengxin/points2poly/points2surf/source/base/point_cloud.py", line 174, in get_patch_kdtree
pts_dists_ms, patch_pts_ids = kdtree.query(x=query_point, k=points_per_patch, n_jobs=n_jobs)
File "_ckdtree.pyx", line 786, in scipy.spatial._ckdtree.cKDTree.query
File "_ckdtree.pyx", line 388, in scipy.spatial._ckdtree.get_num_workers
TypeError: Unexpected keyword argument {'n_jobs': 1}

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
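For context, this keyword error is typical of newer SciPy versions, where cKDTree.query no longer accepts n_jobs and expects workers instead. A minimal demonstration of the current keyword is shown below; the corresponding (hypothetical) change in points2surf/source/base/point_cloud.py would be to pass workers=n_jobs instead of n_jobs=n_jobs, or alternatively to pin an older SciPy version.

import numpy as np
from scipy.spatial import cKDTree

# SciPy removed the n_jobs keyword of cKDTree.query in favour of workers.
points = np.random.rand(100, 3)
tree = cKDTree(points)
dists, ids = tree.query(x=np.random.rand(5, 3), k=8, workers=1)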

How to better obtain planar primitives?

Hello, this is great work. However, I have encountered a problem while reconstructing my own data at the step "Extract planar primitives from point clouds with Mapple." When I have a lot of data to reconstruct, manually extracting planar primitives in Mapple takes a lot of time. Is there any way to automate this step instead of doing it manually? Looking forward to your answer.

python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'


from sage.all import polytopes, QQ, RR, Polyhedron

Do I need to install SageMath? It looks quite big.

(myria3d) root@e7c988bbfef7:/data/points2poly# python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'
Traceback (most recent call last):
File "/data/points2poly/reconstruct.py", line 29, in <module>
from utils import create_cell_complex, create_query_points, extract_surface, infer_sdf
File "/data/points2poly/utils.py", line 14, in <module>
from abspy import VertexGroup, CellComplex, AdjacencyGraph
File "/root/miniconda3/envs/myria3d/lib/python3.9/site-packages/abspy/__init__.py", line 5, in <module>
from .complex import *
File "/root/miniconda3/envs/myria3d/lib/python3.9/site-packages/abspy/complex.py", line 28, in <module>
from sage.all import polytopes, QQ, RR, Polyhedron
ModuleNotFoundError: No module named 'sage'
(myria3d) root@e7c988bbfef7:/data/points2poly#

Overflow

I'm getting this overflow warning, and it also happens in the helsinki_mini example:
points2poly/utils.py:32: RuntimeWarning: overflow encountered in exp
return 1 / (1 + np.exp(-x))
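For what it's worth, the warning points at the sigmoid in utils.py; a numerically stable variant is sketched below as an assumption, not the repository's actual implementation.

import numpy as np

def stable_sigmoid(x):
    # Numerically stable sigmoid: only ever exponentiates non-positive values,
    # so np.exp cannot overflow for inputs of large magnitude.
    x = np.asarray(x, dtype=np.float64)
    out = np.empty_like(x)
    pos = x >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    exp_x = np.exp(x[~pos])
    out[~pos] = exp_x / (1.0 + exp_x)
    return out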
