I Need Help with Processing Large Point Cloud Data Efficiently in CloudCompare
Posted: Thu Sep 26, 2024 9:16 am
Hello there,
I have been working with CloudCompare for some time now, primarily for analyzing point cloud data from terrestrial laser scans. Lately I have been dealing with larger datasets, and I have started running into performance issues, particularly during import, rendering, and certain operations like subsampling and segmentation.
I am working on a machine with decent specs: 32 GB RAM, an NVIDIA RTX 3060 GPU, and an Intel i7 processor. Despite this, performance degrades significantly as soon as a point cloud exceeds about 100 million points. My workflow mainly involves importing point clouds, aligning them, applying a few basic transformations, and running segmentation and distance calculations between clouds.
The point clouds take a long time to load, and the display lags when I pan or zoom.
Applying subsampling methods such as random sampling to reduce the point count is often very slow.
Running cloud-to-cloud distance comparisons between large clouds takes ages (I have sketched my current batch approach below).
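For context, here is roughly how I have been trying to batch the heavy steps headlessly with a small Python script, so that rendering is taken out of the equation. This is only a sketch: the file names are placeholders, the 1 cm spacing is arbitrary, and I am writing the command-line flags from memory of the wiki, so they may not be exact.

import subprocess

# Placeholder input files -- substitute your actual scans.
scans = ["scan_01.las", "scan_02.las"]

# Headless spatial subsampling to roughly one point per 1 cm.
# -SILENT runs without the GUI; -NO_TIMESTAMP keeps output names stable.
for scan in scans:
    subprocess.run(
        ["CloudCompare", "-SILENT", "-NO_TIMESTAMP",
         "-O", scan, "-SS", "SPATIAL", "0.01"],
        check=True,
    )

# Cloud-to-cloud distance on the subsampled results.
# NOTE: the output names below are assumptions -- CloudCompare appends its
# own suffix to subsampled clouds, so adjust to whatever it actually writes.
subprocess.run(
    ["CloudCompare", "-SILENT", "-NO_TIMESTAMP",
     "-O", "scan_01_subsampled.las", "-O", "scan_02_subsampled.las",
     "-C2C_DIST"],
    check=True,
)

If batch processing like this is the sensible route for clouds this size, I would also appreciate pointers on choosing the subsampling parameters.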
Also, I have gone through this post, https://www.cloudcompare.org/forum/viewtopic.php?tableau=1414, which definitely helped me out a lot.
Does anyone have experience optimizing CloudCompare for larger datasets? Is there anything I can do to improve performance, such as specific settings within the software, or is my hardware the bottleneck here?
Thanks in advance for your help.