ucdvision / compact3d
Official implementation of "Compact3D".
Home Page: https://ucdvision.github.io/compact3d/
License: MIT License
When I'm training and reach about 23%, where save_kmeans() is called, I get a division-by-zero error on line 244 in train_kmeans.py:
244 n_bits = int(np.ceil(np.log2(len(kmeans.cls_ids))))
I checked the traceback, and kmeans.cls_ids returned an empty tensor, the same as its instantiation on line 20 in kmeans_quantize.py:
20 self.cls_ids = torch.empty(0)
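The underlying fix is making sure cluster assignment actually runs before save_kmeans(), but the crash itself can be guarded against. A minimal sketch (the helper name `safe_n_bits` is hypothetical, not from the repo):

```python
import numpy as np
import torch

def safe_n_bits(cls_ids: torch.Tensor) -> int:
    """Bits needed to index the clusters, guarding against an empty
    (never-assigned) cls_ids tensor, which makes log2 blow up."""
    n = len(cls_ids)
    if n <= 1:
        # log2(0) is undefined and log2(1) is 0; fall back to 1 bit
        # (or raise here, if an empty assignment indicates a pipeline bug).
        return 1
    return int(np.ceil(np.log2(n)))

safe_n_bits(torch.empty(0))   # no crash on the empty tensor from line 20
safe_n_bits(torch.arange(512))
```

This only papers over the symptom; an empty cls_ids at save time usually means the k-Means assignment step was skipped for that parameter group.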
Love this work! Wanted to ask if there was a checkpoint or pretrained model available.
In train_kmeans.py, lines 72 and 73:
lambda2 = args.lambda2
reg = args.reg
but args has no attributes lambda2 or reg.
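One way around this while waiting for a fix is to register the missing flags yourself. A hypothetical sketch, assuming these are meant to be command-line arguments; the defaults here are placeholders, not the paper's settings:

```python
import argparse

# Register the flags so that args.lambda2 and args.reg exist.
# The types and defaults are assumptions, not taken from the repo.
parser = argparse.ArgumentParser()
parser.add_argument("--lambda2", type=float, default=0.0)
parser.add_argument("--reg", type=str, default="none")
args = parser.parse_args([])  # empty list: fall back to the defaults

lambda2 = args.lambda2
reg = args.reg
```

In the real script these add_argument calls would go next to the existing ones, so lines 72-73 no longer raise AttributeError.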
Hello - I'd like to run Compact3D with the opacity_reg parameter after the recent update on July 31, 2024. However, this change seems to be breaking: when I run the bash script run.sh, I get the following error when execution hits the opacity regularization at the 16000th iteration:
[ITER 16000] Evaluating train: L1 0.01976204663515091 PSNR 30.19481582641602 [08/08 17:43:24]
Num Gaussians: 616981 [08/08 17:43:24]
Traceback (most recent call last):
File "train_kmeans.py", line 410, in <module>
args.checkpoint_iterations, args.start_checkpoint, args.debug_from, args)
File "train_kmeans.py", line 225, in training
gaussians.prune(0.005, scene.cameras_extent, size_threshold)
AttributeError: 'GaussianModel' object has no attribute 'prune'
It seems the version of the method with opacity regularization calls a method named prune on GaussianModel, which is not available in this repository or in the original 3D Gaussian Splatting code base.
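In the original 3DGS code the pruning logic lives inside densify_and_prune rather than in a standalone prune method. A self-contained sketch of that criterion, under the assumption that prune(min_opacity, extent, size_threshold) is meant to drop low-opacity and oversized Gaussians (the function and threshold names here are hypothetical):

```python
import torch

def prune_mask(opacity, scaling, min_opacity, extent,
               size_threshold=None, ws_scale=0.1):
    """Sketch of the 3DGS pruning criterion: mark Gaussians that are
    nearly transparent, and (when a size threshold is given) those
    whose world-space extent is too large. True = prune."""
    mask = opacity.squeeze(-1) < min_opacity
    if size_threshold is not None:
        # Assumed world-space size test, mirroring densify_and_prune.
        too_big = scaling.max(dim=1).values > ws_scale * extent
        mask = mask | too_big
    return mask

opacity = torch.tensor([[0.001], [0.5], [0.9]])
scaling = torch.tensor([[0.01, 0.01, 0.01],
                        [5.00, 0.10, 0.10],
                        [0.02, 0.02, 0.02]])
mask = prune_mask(opacity, scaling, min_opacity=0.005,
                  extent=10.0, size_threshold=20)
# Gaussian 0 is marked for low opacity; Gaussian 1 for size (5.0 > 1.0).
```

A wrapper method on GaussianModel would then pass this mask to the existing prune_points machinery; this sketch only illustrates the selection criterion, not the full bookkeeping.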
if 'scale_rot' in quantized_params:
    kmeans_scrot_q.forward_scale_rot(gaussians, assign=assign)

def forward_scale_rot(self, gaussian, assign=False):
    """Combine both scaling and rotation for a single k-Means"""
    if self.vec_dim == 0:
        self.vec_dim = gaussian._rotation.shape[1] + gaussian._scaling.shape[1]
    feat_scaled = torch.cat([self.rescale(gaussian._scaling), self.rescale(gaussian._rotation)], 1)
    feat = torch.cat([gaussian._scaling, gaussian._rotation], 1)
    if assign:
        self.cluster_assign(feat, feat_scaled)
    else:
        self.update_centers(feat)
    sampled_centers = torch.gather(self.centers, 0, self.nn_index.unsqueeze(-1).repeat(1, self.vec_dim))
    gaussian._scaling_q = gaussian._scaling - gaussian._scaling.detach() + sampled_centers[:, :3]
    gaussian._rotation_q = gaussian._rotation - gaussian._rotation.detach() + sampled_centers[:, 3:]
This configuration exists in the code. What is its effect? I think combining scaling and rotation into one codebook like this may make them difficult to quantize?
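The `x - x.detach() + center` pattern in the snippet above is a straight-through estimator (STE): the forward pass outputs the quantized value (the sampled cluster center), while gradients flow back to the unquantized parameter because the two `x` terms cancel in the forward value but only the non-detached one participates in backprop. A minimal standalone demonstration:

```python
import torch

# Unquantized parameters and their (assumed) nearest cluster centers.
x = torch.tensor([0.30, 0.70], requires_grad=True)
centers = torch.tensor([0.25, 0.75])

# Straight-through quantization, as in forward_scale_rot above:
# forward value is exactly `centers`, but gradients reach `x`.
x_q = x - x.detach() + centers
assert torch.allclose(x_q, centers)

x_q.sum().backward()
# d(x_q)/dx is the identity, so the gradient passes straight through.
assert torch.allclose(x.grad, torch.ones(2))
```

So the concatenated scale/rot codebook does not block gradient flow; whether one shared codebook quantizes both well is a separate empirical question.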
Hi,
I'd like to compare storage usage between the quantized and non-quantized models.
I was thinking of comparing the size of the checkpoint file from non-quantized training (using train.py from the original 3DGS paper) against the files loaded via the --load_quant option.
However, since the files used by --load_quant are not the only ones needed to restore the scene, I cannot compute the exact storage required to restore a scene.
So I'd like to know whether there is a way to compare storage usage between the quantized and non-quantized models.
Thanks.
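Once the full set of artifacts needed to restore each scene is known, the comparison itself is just summing file sizes on disk. A small sketch; the file names below come from the issues above and are assumptions about what --load_quant actually reads:

```python
import os

def total_size_bytes(paths):
    """Sum the on-disk size of the given files, skipping missing ones
    (hypothetical helper for comparing storage footprints)."""
    return sum(os.path.getsize(p) for p in paths if os.path.exists(p))

# Assumed artifact sets; adjust to whatever each pipeline really writes.
quant_files = ["kmeans_centers.pth", "kmeans_inds.bin", "kmeans_args.npy"]
baseline_files = ["point_cloud.ply"]

# print(f"quantized: {total_size_bytes(quant_files)/1e6:.1f} MB")
# print(f"baseline:  {total_size_bytes(baseline_files)/1e6:.1f} MB")
```

The key caveat remains the one raised above: the quantized total must include every file the loader touches, not just the codebooks.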
The default quant_params is [sh dc rot sc]. I found the model size is still large with the default quant_params, so these may not be the parameters used in the paper. Can you provide the quant_params used in the paper?
Nice work! I want to know how to monitor memory use. I have run experiments on 3DGS and CompGS, but I don't know how to compare their memory usage.
Have you tried quantizing the positions (pos)?
Hello,
Would it be possible to provide details on how to reconstruct the 3DGS model from the compressed files and view it in the SIBR viewer?
I have downloaded your MipNeRF-360 trained models, but I'm unable to view them with the SIBR viewer.
This is the command I use:
SIBR_gaussianViewer_app.exe -m "C:\Users\Downloads\compgs_4k\e041_a_kmeans_512_10it_st15000_tot30000_fr100diff_last5k\mipnerf360\bonsai\"
Note that it is not a memory issue - I managed to visualize uncompressed (larger) original 3DGS models.
Is there something I need to do before viewing the compressed trained models? Thank you
I noticed the paper says, 'Moreover, since the Gaussians are a set of non-ordered elements, we compress the representation further by sorting the Gaussians based on one of the quantized parameters and storing the indices using the Run-Length-Encoding (RLE) method.' However, I couldn't find the code for sorting the indices. Can you point me to the specific implementation of this part?
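For reference, the idea described in that sentence can be sketched in a few lines of PyTorch: sort the per-Gaussian cluster indices, then store (value, count) pairs instead of one index per Gaussian. This is my reading of the paper, not the repo's actual implementation:

```python
import torch

def run_length_encode(ids: torch.Tensor):
    """Sketch of RLE over sorted cluster indices: after sorting, equal
    indices are adjacent, so (value, count) pairs suffice. `order` is
    the permutation needed to undo the sort on decode."""
    sorted_ids, order = torch.sort(ids)
    values, counts = torch.unique_consecutive(sorted_ids, return_counts=True)
    return values, counts, order

ids = torch.tensor([3, 1, 3, 3, 1, 2])
values, counts, order = run_length_encode(ids)
# values -> [1, 2, 3], counts -> [2, 1, 3]
```

Sorting by one quantized parameter reorders the whole set of Gaussians consistently, which is valid precisely because they are unordered.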
Hey, would you mind providing the reconstruction/restoring Python script for your compression method? I mean converting point_cloud.ply, kmeans_centers.pth, kmeans_args.npy, and kmeans_inds.bin into one point_cloud.ply file, so that I can directly open the rebuilt 3DGS in the SIBR viewer.
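The core of such a script is just a codebook lookup: for each quantized parameter group, replace every per-Gaussian cluster index with its k-Means center. A hedged sketch, assuming a (K, D) codebook tensor and an (N,) index tensor per group; the actual on-disk formats of kmeans_centers.pth and kmeans_inds.bin are assumptions here:

```python
import torch

def dequantize(centers: torch.Tensor, inds: torch.Tensor) -> torch.Tensor:
    """Restore full per-Gaussian features from a k-Means codebook.
    centers: (K, D) codebook; inds: (N,) cluster id per Gaussian."""
    return centers[inds.long()]

# Toy example: 2 centers, 3 Gaussians.
centers = torch.tensor([[0.0, 0.0],
                        [1.0, 2.0]])
inds = torch.tensor([1, 0, 1])
feats = dequantize(centers, inds)
# feats -> [[1., 2.], [0., 0.], [1., 2.]]
```

Writing the restored features back into a standard point_cloud.ply would then make the scene loadable by the unmodified SIBR viewer; that serialization step depends on the 3DGS PLY attribute layout and is not shown here.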