Calculating mean value and standard deviation from Command Line Mode

michaeladamsen
Posts: 5
Joined: Sun Nov 15, 2020 6:07 pm

Calculating mean value and standard deviation from Command Line Mode

Post by michaeladamsen »

Hi :-)

I'm trying to write a script that can calculate the mean value and standard deviation of the scalar field of a cloud. Afterwards, using -FILTER_SF, all points whose scalar field values deviate from the mean by more than 3 times the standard deviation (on either side) would be extracted to one cloud, and the remaining points to another.

Is this possible in command line mode? Or is it only possible if you use the C++ library?

Best regards, Michael
daniel
Site Admin
Posts: 7717
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Calculating mean value and standard deviation from Command Line Mode

Post by daniel »

I don't think this is possible in command line mode indeed... To support it, one would need to add a specific option to the 'Filter by SF' command to filter with +/- N sigmas, I guess.
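In the meantime, the split itself can be done outside of CloudCompare. The sketch below is not CloudCompare code; it is a minimal NumPy illustration, assuming the cloud has been exported to a plain array with the scalar field as the last column (the `(x, y, z, sf)` layout and the function name `split_by_sigma` are assumptions for the example):

```python
import numpy as np

def split_by_sigma(sf_values, n_sigmas=3.0):
    """Return a boolean mask: True for values within +/- n_sigmas
    standard deviations of the scalar field's mean, False beyond it."""
    sf_values = np.asarray(sf_values, dtype=float)
    mean = sf_values.mean()
    std = sf_values.std()
    lo, hi = mean - n_sigmas * std, mean + n_sigmas * std
    return (sf_values >= lo) & (sf_values <= hi)

# Toy cloud: rows are (x, y, z, sf) -- layout assumed for illustration.
cloud = np.array([
    [0.0, 0.0, 0.0, 1.0],
    [1.0, 0.0, 0.0, 1.1],
    [0.0, 1.0, 0.0, 0.9],
    [0.0, 0.0, 1.0, 100.0],   # scalar-field outlier
])

# Note: with only 4 points, a single outlier inflates the std so much
# that a 3-sigma cut keeps everything; 1 sigma is used here to show the split.
mask = split_by_sigma(cloud[:, 3], n_sigmas=1.0)
inliers, outliers = cloud[mask], cloud[~mask]
```

The two resulting arrays (`inliers` and `outliers`) could then be saved as two separate clouds, which mirrors what the requested -FILTER_SF extension would do.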
Daniel, CloudCompare admin