3DMASC paper code
-
- Posts: 5
- Joined: Thu Dec 05, 2024 10:18 am
3DMASC paper code
Hi everyone,
Is the Python code developed for the 3DMASC paper available? I have failed to find it. It would be very useful as a tutorial for getting started with 3DMASC, as well as for reproducibility and for benchmarking potential optimizations. The examples/Clouds_23_3dmasc_hands_on.ipynb notebook in the GitHub repo does not include the full workflow (e.g. parameter file, feature optimization, ...) and the description at https://lidar-platform.readthedocs.io/e ... dmasc.html doesn't point to the parameter file used in the paper.
Thanks in advance!
FG
-
- Posts: 46
- Joined: Tue Dec 01, 2020 1:21 pm
Re: 3DMASC paper code
The code you find in the lidar-platform library IS the one developed for the 3DMASC paper. If you are looking for a kind of all-inclusive script or notebook that would allow you to rebuild the results of the article, I am not sure that it exists anywhere. But the article, the dataset (https://lidar.univ-rennes.fr/3dmasc-datasets), the library and the documentation (which is not perfect but not so bad ;)) should help you make progress. And yes, we should think about building a tutorial for the plugin.
I will ask one other contributor to the project for such a script/notebook, we never know.
-
- Posts: 5
- Joined: Thu Dec 05, 2024 10:18 am
Re: 3DMASC paper code
Hi Paul,
Thanks for allocating the time/resources. Yes, I meant an all-inclusive script/notebook illustrating the workflow step by step. I believe such material would ease the onboarding to 3DMASC and resolve many doubts/questions before hitting the forum.
In the meantime, could you please point me to where I can find the final parameter file used in the published article? The parameter file, combined with the great dataset/library/documentation, should allow me to keep going ;)
Thanks!
FG
Re: 3DMASC paper code
Hello,
The Jupyter notebook actually contains the complete workflow, including feature selection, and goes through each part step by step (feature computation, classification using different algorithms, prediction confidence analysis, SHAP analysis, feature selection).
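For readers who have not opened the notebook yet, the classification, confidence and SHAP steps follow the usual scikit-learn/shap pattern; here is a minimal sketch with toy data standing in for the 3DMASC features (this is not the notebook's actual code, just the general shape of it):

import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy stand-ins for the 3DMASC feature matrix and class labels
X = np.random.rand(200, 5)
y = np.random.randint(0, 3, size=200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Classification step: a random forest, one of the algorithms tested in the notebook
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Prediction confidence: per-class probabilities from the classifier
proba = clf.predict_proba(X_test)

# SHAP analysis: per-feature contributions to each prediction,
# which is what the feature selection step builds on
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_test)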
We updated the notebook after your feedback and improved its documentation; I hope this newer version helps you get familiar with the method.
We might follow your suggestion in the future and publish a notebook to reproduce the paper's results. Until then, this notebook provides a generic tutorial to apply the method as described in the paper to any data. Combined with the code and the documentation, we think it provides a solid foundation for the advanced use of 3DMASC :)
Regarding the parameter files, we need to re-upload them, and that won't be possible before January... In the meantime, the documentation and the information in the paper—such as the final sets of features and scales—can serve as a starting point if you need to create a parameter file urgently.
-
- Posts: 5
- Joined: Thu Dec 05, 2024 10:18 am
Re: 3DMASC paper code
Hi mletard,
Thank you for the updates and for considering publishing a notebook allowing the reproduction of the paper's results!
The parameter files will be a very nice xmas present :) I look forward to them!
FG
-
- Posts: 5
- Joined: Thu Dec 05, 2024 10:18 am
Re: 3DMASC paper code
Hi again,
I am trying to reproduce the paper's results using Python. Specifically, I am playing with the AIN data.
I am using the following code to compute the features:
import os
from lidar_platform import cc  # CloudCompare wrapper from the lidar-platform library

data_path = r'D:\DATA\Pointclouds\AIN'
pc1 = os.path.join(data_path, "green_532nm_all.laz")
pc2 = os.path.join(data_path, "NIR_1064nm_all.laz")
ctx = ?  # which file from the 3DMASC datasets webpage?
pcx = ?  # which file from the 3DMASC datasets webpage?
parameters = os.path.join(data_path, "3DMASC_paper_parameter_file.txt")  # 3DMASC parameter file
clouds = (pc1, pc2, ctx, pcx)
out = cc.q3dmasc(clouds, parameters, only_features=True, verbose=True)
Could you please indicate which files from the 3DMASC datasets webpage you used as ctx and pcx in the paper?
Thanks in advance!
FG
Re: 3DMASC paper code
FernandoGalan wrote: ↑Fri Jan 24, 2025 2:45 pm
Could you please indicate which files from the 3DMASC datasets webpage you used as ctx and pcx in the paper?
For the ctx and pcx files, I used the "reference" point clouds from the 3DMASC dataset. Typically, ctx is the context cloud (like a general background scene), and pcx is the cloud you're specifically analyzing. You can find those files on the 3DMASC webpage under the specific dataset you're working with.
-
- Posts: 46
- Joined: Tue Dec 01, 2020 1:21 pm
Re: 3DMASC paper code
Hi,
In the parameter file related to the article, PC1 and PC2 are the HD data, which are used to compute features at the locations specified by PCX.
PC1, PC2, PCX and CTX are just labels, you can use the names you want (except CORE and TEST, which have specific roles).
What is important is i) the line core_points: PCX, which tells 3DMASC to compute the features at the PCX points, and ii) the way you formulate the features using the right labels.
CTX is a third cloud which is used to compute context-based features. In the present case, it is the NIR cloud with all points classified as 1. You will have to build it: open the NIR buffered HD cloud with CloudCompare and add a classification field with the value 1 for all points.
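If you prefer to build that CTX cloud in Python rather than in the CloudCompare GUI, a minimal sketch with laspy could look like this (the output file name is just an example I made up):

import laspy
import numpy as np

# Read the NIR buffered HD cloud (reading/writing .laz needs the lazrs or laszip backend)
las = laspy.read('NIR_1064nm_buffer.laz')

# Set the Classification field to 1 for all points
las.classification = np.ones(len(las.points), dtype=np.uint8)

las.write('NIR_1064nm_buffer_class1.laz')  # hypothetical output name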
So:
pc1 => the buffered version of the green HD data (lighter than all the HD points), e.g. 'green_532nm_buffer.laz'
pc2 => the buffered version of the NIR HD data (lighter than all the HD points), e.g. 'NIR_1064nm_buffer.laz'
ctx => a context cloud which must contain a Classification field (mind the upper case) with points of class 1 in the present case (because of the suffix _1 in feature: DZ10_SC0_PCX_CTX_1, which would be better formulated as feature: DZ10_SC0_CTX_1 with the latest version of the syntax)
pcx => the points where you want to compute the features (because of the line core_points: PCX), e.g. 'green_train_5classes_2000samples.laz' or 'green_train_13classes_2000samples.laz' when working with the training samples
I will check that with the authors but I think that it is OK.
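To make the structure concrete, here is a skeleton of what a parameter file along these lines could look like. Only the core_points: and feature: lines are taken from this thread; the other directive names are assumptions on my part, so check the lidar-platform documentation for the exact syntax:

# skeleton 3DMASC parameter file (illustrative only, NOT the article's file)
# cloud declarations: label => role (the cloud: directive is assumed)
cloud: PC1    # green HD buffered cloud, e.g. green_532nm_buffer.laz
cloud: PC2    # NIR HD buffered cloud, e.g. NIR_1064nm_buffer.laz
cloud: CTX    # NIR cloud with Classification = 1 everywhere
cloud: PCX    # the core points, e.g. green_train_5classes_2000samples.laz

core_points: PCX    # confirmed above: compute the features at the PCX points

# features, formulated with the labels above (example taken from this thread)
feature: DZ10_SC0_CTX_1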