How to inspect the results
Discontinuity Set Extractor (DSE) was programmed by Adrián Riquelme as part of his PhD studies. Its aim is to extract discontinuity sets from a rock mass. The input data is a 3D point cloud, which can be acquired by means of a 3D laser scanner (LiDAR or TLS), digital photogrammetry techniques (such as SfM) or synthetic data. It applies a proposed methodology to semi-automatically identify the points of an unorganised 3D point cloud that are arranged in planes in 3D space.
Discontinuity Set Extractor (DSE) is open source software programmed in MATLAB. It currently runs on Windows, macOS and Linux.
You can download the source files from its GitHub repository in order to execute them.
You are free to use them for any purpose, including commercial and educational use. Please remember to cite this software using the reference provided at the end of this website. This freedom is defined by the GNU General Public License (GPL).
The GUI is designed to follow this methodology. Buttons and boxes are initially disabled; as the user applies the method, the program enables the corresponding buttons.
DSE is designed to import ASCII files (UTF-8). The file extension is irrelevant. The file must contain one point per line (the first line must also be a point, not column headers). Columns are separated by spaces or tabs. Only the first three columns are loaded, so any extra columns in the file are ignored.
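The file format described above is easy to reproduce. As a minimal sketch (illustrative Python, not DSE's actual MATLAB code), a loader that follows the same rules could look like this:

```python
# Minimal sketch of a loader for DSE-style ASCII point files:
# one point per line, whitespace-separated columns, only the
# first three columns (x, y, z) are kept.
def load_points(path):
    points = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            cols = line.split()      # splits on any run of spaces or tabs
            if len(cols) < 3:
                continue             # skip blank or malformed lines
            x, y, z = (float(c) for c in cols[:3])
            points.append((x, y, z))  # extra columns are ignored
    return points
```

Note that a header line would make `float()` fail, which is why DSE requires the first line to already be a point.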
Another option is to tick the 'Load puntos.txt' checkbox and press the 'Load' button. This loads the file puntos.txt if it exists in the working directory.
Once the file is loaded, the program will display the number of loaded points.
For testing purposes, the software can generate artefacts. These are simple figures defined by their dimensions (a, b, c), the number of sides if needed (n), the separation between points (inc) and the standard deviation of the inserted error (error).
To generate an artefact, start a new project and tick 'create artefact'. Then configure the dimensions if needed and select the geometry. Press the 'View 3DPC' button and inspect the geometry. The software will create a .txt file in the directory where the program is running. You can then analyze that figure.
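To see how such an artefact relates to the parameters above, here is a hedged sketch of how the simplest geometry, a noisy plane, could be synthesised. The function and parameter names mirror the GUI boxes (a, b, inc, error) but are illustrative; this is not DSE's generator.

```python
import random

# Illustrative sketch: a grid of points on the plane z = 0 with
# dimensions a x b, point spacing `inc`, and Gaussian noise of
# standard deviation `error` added to z to simulate measurement error.
def make_plane_artefact(a=1.0, b=1.0, inc=0.1, error=0.01, seed=0):
    rng = random.Random(seed)
    pts = []
    nx = int(a / inc) + 1
    ny = int(b / inc) + 1
    for i in range(nx):
        for j in range(ny):
            x, y = i * inc, j * inc
            z = rng.gauss(0.0, error)   # inserted error on the out-of-plane axis
            pts.append((x, y, z))
    return pts
```

With `error = 0` the points are exactly coplanar, which is useful for checking that the downstream steps recover the plane exactly.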
Calculate normal vectors
In this step, the program searches for the knn nearest neighbours of every point. The knn value is entered in the 'knn' box; by default it is set to 30 neighbours.
Two options are available: (1) perform a coplanarity test and, if the test is passed, calculate the normal vector associated with the point, or (2) calculate the normal vector without performing the coplanarity test. The first option filters out non-coplanar points, so outliers may be removed. This test can take a long time, so if the cloud is considered 'clean', the second option can be used to reduce the computation time.
If the coplanarity test is conducted, a parameter named tolerance is required, which is set to 0.2 by default. A value of 0 means that no point will pass the test, and 1 means that all points will pass the test.
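The idea of fitting a local plane to each neighbourhood can be sketched as follows. DSE's published method analyses each neighbourhood with principal component analysis; this simplified stand-in instead fits the plane z = a·x + b·y + c by least squares (so it assumes the local surface is not vertical) and uses the RMS residual as a crude coplanarity measure. Names are illustrative, not DSE's code.

```python
import math

def _det3(m):
    # Determinant of a 3x3 matrix (for Cramer's rule below).
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def plane_normal(neighbours):
    # Least-squares fit of z = a*x + b*y + c over the knn neighbours.
    n = float(len(neighbours))
    Sx = Sy = Sz = Sxx = Syy = Sxy = Sxz = Syz = 0.0
    for x, y, z in neighbours:
        Sx += x; Sy += y; Sz += z
        Sxx += x * x; Syy += y * y; Sxy += x * y
        Sxz += x * z; Syz += y * z
    A = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]]
    rhs = [Sxz, Syz, Sz]
    d = _det3(A)
    def col_replaced(c):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][c] = rhs[i]
        return M
    a, b, c = (_det3(col_replaced(i)) / d for i in range(3))
    norm = math.sqrt(a * a + b * b + 1.0)
    normal = (a / norm, b / norm, -1.0 / norm)   # unit normal of the fitted plane
    rms = math.sqrt(sum((a * x + b * y + c - z) ** 2
                        for x, y, z in neighbours) / n)
    return normal, rms                           # small rms => coplanar patch
```

A coplanarity test could then keep the point only when `rms` falls below a threshold derived from the tolerance parameter.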
Figure from Riquelme et al. (2014).
Once this step is finished, the program assigns the associated normal vector to each point. No result is exported yet; this information is stored in memory. With the normal vectors calculated, the program computes the corresponding pole in a stereonet for every point. The poles can be visualized in the plot frame by pressing the 'Stereo Poles' button.
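The conversion from a normal vector to a plane orientation is a standard geometric one; the sketch below is not DSE's code and assumes the common survey convention x = East, y = North, z = Up (DSE's internal convention may differ).

```python
import math

# Convert a unit normal vector into the plane's dip direction / dip
# (degrees). The upward-pointing normal is used; its horizontal
# projection points in the dip direction.
def normal_to_dipdir_dip(nx, ny, nz):
    if nz < 0:                                        # flip to the upward normal
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.acos(nz))                 # angle from the vertical
    dipdir = math.degrees(math.atan2(nx, ny)) % 360   # clockwise from North
    return dipdir, dip
```

For example, a horizontal plane (normal straight up) gives dip 0, and a plane dipping 45 degrees toward the north gives dip direction 0 and dip 45.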
The statistical analysis calculates a non-parametric density function of the poles in a stereonet using the Kernel Density Estimation method (Botev). The number of bins determines how detailed the estimation will be: too high a value may yield undesired (noisy) results, while too low a value will underestimate the density and smooth out peaks. By default this value is set to 64 (2^6), which usually works fine.
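As a very rough illustration of the role of the bins parameter only (DSE uses Botev's KDE, not this simple binning), poles can be accumulated on a bins x bins grid over dip direction and dip; coarser grids smooth the density, finer grids make it noisier:

```python
from collections import Counter

# Crude stand-in for the density estimation step: count poles per cell
# of a bins x bins grid over (dip direction, dip). Cells with high
# counts hint at where the KDE would place its peaks.
def binned_pole_density(poles_dipdir_dip, bins=64):
    counts = Counter()
    for dipdir, dip in poles_dipdir_dip:
        i = min(int(dipdir / 360.0 * bins), bins - 1)
        j = min(int(dip / 90.0 * bins), bins - 1)
        counts[(i, j)] += 1
    return counts    # (cell -> number of poles)
```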
Two boxes must be filled in before performing the statistical analysis. The density function will present several peaks, or local maxima. The program automatically extracts the n maxima that present the highest density values (i.e. the most significant poles). A good strategy is to set this value to 10 and then inspect and edit the principal poles. The second box sets the minimum angle allowed between two principal poles. This angle is measured as the angle between two unit vectors (the arccosine of their dot product).
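The two parameters work together as in the following hedged sketch (illustrative, not DSE's implementation): the n densest local maxima are kept greedily, skipping any candidate that lies too close to an already accepted pole. Poles are treated as axial here (pole and antipole equivalent), which is an assumption about DSE's behaviour.

```python
import math

# Keep the n densest candidate poles that are at least min_angle_deg
# apart. `candidates` is a list of (density, unit_vector) pairs.
def select_principal_poles(candidates, n, min_angle_deg):
    selected = []
    for dens, v in sorted(candidates, reverse=True):   # densest first
        ok = True
        for _, u in selected:
            dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
            if math.degrees(math.acos(abs(dot))) < min_angle_deg:
                ok = False          # too close to an accepted pole
                break
        if ok:
            selected.append((dens, v))
        if len(selected) == n:
            break
    return selected
```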
Once the statistical analysis is performed, the proposed principal poles are shown in the table and the plot button is enabled.
The user can modify the aforementioned parameters and repeat the process. Additionally, a menu allows the principal poles to be edited manually: the user can change the number or relative position of the principal poles, remove or add a principal pole, and even edit their coordinates.
Once the principal poles are extracted, the next step is to assign a principal pole to each point. A requirement must be satisfied: for each principal pole, only those normal vectors (and thus their associated 3D points) whose angle with respect to the pole is smaller than a specific value are assigned. In other words, it is like a cone whose axis is the principal pole vector: if all the normal vectors are placed at the apex of that cone, only the points whose vectors fall inside the cone are accepted.
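The cone test above can be sketched as follows (an illustrative stand-in, not DSE's code; as before, pole and antipole are treated as equivalent, which is an assumption):

```python
import math

# Assign a point's unit normal to the closest principal pole whose cone
# (half-angle cone_deg) contains it; return None if no cone does.
def assign_pole(normal, principal_poles, cone_deg):
    best, best_ang = None, cone_deg
    for idx, pole in enumerate(principal_poles):
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(normal, pole))))
        ang = math.degrees(math.acos(abs(dot)))   # angle normal <-> pole axis
        if ang < best_ang:                        # inside cone, closest so far
            best, best_ang = idx, ang
    return best
```

Taking the closest pole rather than the first match resolves the case where two cones overlap; whether DSE does the same is not stated here.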
The program allows the classified point cloud to be exported by pressing the 'TXT' button, so the user can visualize it and check that the result is correct.
Save state of calculation
Universidad de Alicante
Carretera San Vicente s/n
03690 San Vicente del Raspeig
Tel: (+34) 96 590 3707
Fax: (+34) 96 590 3464