qCANUPO (classifier files, etc.)

daniel
Site Admin
Posts: 7710
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: qCANUPO (classifier files, etc.)

Post by daniel »

First, here are simple tests:
- can you try with fewer scales?
- can you try on a subsampled version of the cloud?

If it works, then it's probably a memory issue... otherwise it might be a bug ;)
Daniel, CloudCompare admin
leifeiyi
Posts: 4
Joined: Fri Mar 08, 2019 5:25 pm

Re: qCANUPO (classifier files, etc.)

Post by leifeiyi »

Hello, first of all, thank you for providing such a good classifier. I recently read the article about this classifier, but I have a few doubts that I hope you can help me resolve:
1. Does the multi-scale dimensionality feature space built in the first half of the paper refer to the position of the point set inside the triangle after principal component analysis?
2. How is the feature space combined with the plane of maximum separability?
3. What is the best combination of scales in the paper, where each scale is weighted by its separability?
Regarding the construction of the classifier, my understanding is that the triangle obtained from the principal component analysis in the first half is used to find a scale with maximum separability. Then, at this scale, an SVM is used to classify the samples and find the best hyperplane. So, in my understanding, the SVM ultimately classifies the set of points in the triangle obtained from the principal component analysis. I don't know whether my understanding is correct.
I look forward to your reply, thank you very much!
daniel
Site Admin
Posts: 7710
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: qCANUPO (classifier files, etc.)

Post by daniel »

I'm not the author of the article, but I'll try to answer as best I can:
1 - the multi-scale dimensionality feature space is this position in the triangle, but for multiple scales (so you have one position, expressed as 2 barycentric coordinates inside the triangle, multiplied by the number of scales).
2 - not sure about this one ;)
3 - you have to choose scales that have different 'behaviors' (i.e. linear, planar or 3D), either for the class you want to isolate or for the others. For instance, to isolate something that is flat at all scales, you'd better take small, medium and large scales, so that other features that are flat only at some scales (like a rock) will be differentiated thanks to the other scales (where they become more '3D'). The SVM is really applied in the (2 x number of scales)-dimensional vector space. Only the hyperplane is represented in 2D, for easier manipulation/understanding.
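To make this concrete, here is a minimal sketch (Python/numpy, not the plugin's actual code) of how such per-scale features could be computed: at each scale, the PCA eigenvalues of the neighborhood are normalized into variance fractions, which sum to 1 and therefore act as barycentric coordinates in the 1D/2D/3D triangle. The function name, brute-force neighbor search and degenerate-case handling are illustrative assumptions.

import numpy as np

def dimensionality_features(cloud, point, scales):
    # Per-scale barycentric (1D/2D/3D) coordinates, stacked into one
    # vector of length 2 * number of scales (cf. answer 1 above).
    feats = []
    for r in scales:
        # brute-force radius search (the real tool uses an octree)
        d = np.linalg.norm(cloud - point, axis=1)
        nbrs = cloud[d <= r]
        if len(nbrs) < 3:
            feats += [1.0, 0.0]  # degenerate neighborhood: treat as purely 1D
            continue
        lam = np.sort(np.linalg.eigvalsh(np.cov(nbrs.T)))[::-1]  # l1 >= l2 >= l3
        p = lam / lam.sum()      # variance fractions, sum to 1
        feats += [p[0], p[1]]    # 2 barycentric coords (p3 = 1 - p1 - p2)
    return np.asarray(feats)

A linear SVM (e.g. sklearn.svm.LinearSVC) could then be trained on these stacked features, i.e. in the (2 x number of scales) space mentioned above.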
Daniel, CloudCompare admin
a1ortega
Posts: 1
Joined: Wed Jun 19, 2019 11:59 am

Re: qCANUPO (classifier files, etc.)

Post by a1ortega »

Does anyone have a link to the .prm classifier files? All the links I've found on this page so far are broken.

Thanks
daniel
Site Admin
Posts: 7710
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: qCANUPO (classifier files, etc.)

Post by daniel »

Yes, sadly it seems those files have disappeared after the reorganization of Dimitri's page (https://geosciences.univ-rennes1.fr/int ... itri-lague).

I guess you could ask him directly?
Daniel, CloudCompare admin
mdd87
Posts: 4
Joined: Thu Sep 05, 2019 7:50 am

Re: qCANUPO (classifier files, etc.)

Post by mdd87 »

Hi guys!
I read the tutorial "Classification of point clouds using the CANUPO software suite v1.2" (April 2013) and I've tried to follow the steps. I have a problem at the step "Performing the classification" because I don't have the PointCloud_Class folder, or I can't find it. Please, I need help! I'm new to CloudCompare.

Thank you so much!
daniel
Site Admin
Posts: 7710
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: qCANUPO (classifier files, etc.)

Post by daniel »

Not sure which tutorial you are referring to... Maybe it's equivalent to the 'Class' scalar field?
Daniel, CloudCompare admin
mdd87
Posts: 4
Joined: Thu Sep 05, 2019 7:50 am

Re: qCANUPO (classifier files, etc.)

Post by mdd87 »

I can't remember where I downloaded it, but it says this:

Performing the classification
In the PointCloud_Class directory you should have something like this:
The Classifiers directory contains the classifier .prm files. The .svg and .ppm files are just used to visualize the classifier and possibly modify it with a vector drawing application such as Inkscape (see the section "Create your own classifier"). With each classifier there should be a copy of the batch file that was used to create it, or a readme listing the corresponding scales to be used. Hence, for a classification you need: the name of the .prm file and the corresponding set of scales (this will likely change soon).
V1.2: now you just have to give the name of the classifier .prm file to canupo. It will automatically look up the corresponding scales. The old way of giving the scales on the command line still works. You can use prm_info.exe to get information on the classifier scales.
The Data directory should contain your DATA files and CORE files.
The Exe directory contains the various executables.
The Msc directory contains the multiscale characteristics of a point cloud. Given that building the .msc file is by far the longest part of the classification script, delete it only when you're happy with the classification. Indeed, if the classification is not correct, you can change the classifier (see "Building your own classifier") and simply reapply it to the .msc file by running only the classify.exe part of the script. This is extremely fast. The only limitation is that your new classifier must use the exact same scales as the one used for the .msc file.
The Training_sets directory is used for training the classifier (see Building your own classifier).
It also contains batch files (.bat) that you can modify by right-clicking on them and choosing Edit. If you double-click on one, it will actually run the executables.
The Classification_only.bat file is ready to operate on the provided Otira dataset.
You now need to modify the Classification_only.bat file to change the names of the DATA and CORE files, as well as the classifier name and associated scales if needed. If this is your first time, try a small dataset to get rapid results rather than throwing in a full scene, which can take some time (up to 1-2 hours for very large scenes at full resolution, depending on your computer).
Once you've made these changes, double-click on the batch file and wait...
Resulting file
In the Results directory you should now have a .txt file (the name is automatically built as a combination of the DATA and CORE names) with 8 fields that you can open in CC:
X Y Z CLASS CONFIDENCE N1 N2 ANGLE
CLASS: contains the attribute number of the class (defined during the classifier construction). Typically this will be 1 and 2 (it can be 0 if you have changed the arguments of classify.exe and directly introduced a separation of the classes based on a confidence level; see the description of the software).
CONFIDENCE: a measure of the confidence of the classification process (see Brodu and Lague, 2012 for more details). It varies from 0.5 (poor classification, i.e. random) to 1 (excellent confidence).
N1: number of neighbouring points around the point at the minimum scale used by the classifier.
N2: number of neighbouring points around the point at the maximum scale used by the classifier.
ANGLE: angle with respect to the horizontal, defined at the largest scale used by the classifier.
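For reference, such a result file is plain ASCII and can also be inspected outside CC; a minimal sketch (Python/numpy, with a hypothetical file name) mapping the 8 columns listed above:

import numpy as np

# columns: X Y Z CLASS CONFIDENCE N1 N2 ANGLE (see the list above)
data = np.loadtxt("data_core_result.txt")  # hypothetical file name
xyz, cls, confidence = data[:, 0:3], data[:, 3], data[:, 4]
n1, n2, angle = data[:, 5], data[:, 6], data[:, 7]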
Separating the classes
Load your result file in CC. Your first scalar field will contain the class attribute. Use edit->scalar field->filter (or directly the icon) to create subset clouds of your results, for instance a vegetation layer (min: 1, max: 1) and a no_veget layer (min: 2, max: 2).
Cleaning the classes
Confidence level
You can display the 2nd scalar field and visually check where the classification was efficient or not. Play with the display slider to find the confidence level you want to use to remove badly classified data. When you change this slider, points outside the slider range turn grey. Then you can directly use the scalar field filter icon to keep only the well-classified points.
Density threshold
Sometimes the classification has a high level of confidence, yet it is visually wrong. This can arise in regions of the cloud where the point density was not high enough to fully compute the multiscale parameters. In that case, filtering the data as a function of N2 can be useful (say, a minimum of 20 to 50 points to get a good result... it's something to experiment with, as it varies depending on the scene).
Batch processing of the result file
By using the filter.exe program you can easily filter your result file using different conditions, rather than manually going through the filters in CC. For instance, if you want to extract class attribute 2 (rock surface, for instance) with a confidence level of at least 0.9 and a minimum of 50 points at the largest scale, you can add the following command line at the end of your batch:
filter.exe name_of_your_resultfile.txt filtered_file.txt 4:1.9:2.1 5:0.9:1 7:50:100000
Hence, assuming that you don't use a core point file and have settled on your filtering parameters, the only thing you have to change is the name of the data file in your batch! If you've learned a bit of batch scripting, you can then launch the processing of several files automatically with exactly the same set of parameters.
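If filter.exe is not at hand, the same column:min:max conditions (columns are 1-based on the filter.exe command line) can be reproduced with a few lines; a sketch under the same assumptions as above:

import numpy as np

data = np.loadtxt("name_of_your_resultfile.txt")
# mirror 4:1.9:2.1 5:0.9:1 7:50:100000 (0-based column indices here)
mask = ((data[:, 3] >= 1.9) & (data[:, 3] <= 2.1) &
        (data[:, 4] >= 0.9) & (data[:, 4] <= 1.0) &
        (data[:, 6] >= 50)  & (data[:, 6] <= 100000))
np.savetxt("filtered_file.txt", data[mask])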
daniel
Site Admin
Posts: 7710
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: qCANUPO (classifier files, etc.)

Post by daniel »

Ah, so that's the documentation of the legacy "canupo" tool developed by N. Brodu (and D. Lague). It was a command-line tool, totally separate from CloudCompare.

Then we ported it as a plugin in CloudCompare, but some features were not kept, and the two tools are quite different today.
Daniel, CloudCompare admin
mdd87
Posts: 4
Joined: Thu Sep 05, 2019 7:50 am

Re: qCANUPO (classifier files, etc.)

Post by mdd87 »

OK! Thank you! But now I don't know how to classify my point cloud. For example, if I have a big point cloud of a mountain, can I take a part of it to try my classification? Also, I don't know what scales to use. Is there any document that explains how to choose scales? Furthermore, can I use my own classifier on another point cloud? In summary, I need to know how to use scales and how to apply a classifier file. Thank you so much.