CGAL 4.12 - Classification
User Manual

This component implements the algorithm described in [4] (section 2), generalized to handle different types of data, multiple features and multiple labels. It classifies a data set into a user-defined set of labels, such as ground, vegetation and buildings. A flexible API is provided so that users can classify any type of data that they can index and for which they can compute relevant features, compute their own local features on the input data set, and define their own labels.

Note
This component depends on the Boost Serialization and IOStreams libraries (the latter compiled with GZIP support).

# Package Organization

Classification of data sets is achieved as follows (see Figure 74.1):

• some analysis is performed on the input data set;
• features are computed based on this analysis;
• a set of labels (for example: ground, building, vegetation) is defined by the user;
• a classifier is defined and trained: from the set of values taken by the features at an input item, it measures the likelihood that this item belongs to one label or another;
• classification is computed itemwise using the classifier;
• additional regularization can be used by smoothing either locally or globally through a Graph Cut [1] approach.

Figure 74.1 Organization of the package.

This package is designed to be easily extended by users: more specifically, features and labels can be defined by users to handle any data they need to classify.

Currently, CGAL only provides data structures to handle classification of point sets.

# Data Structures

## Analysis

Classification is based on the computation of local features. These features can take advantage of shared data structures that are precomputed and stored separately.

CGAL provides the following structures:

• Point_set_neighborhood stores spatial searching structures and provides adapted queries for points;
• Local_eigen_analysis precomputes covariance matrices on local neighborhoods of points and stores the associated eigenvectors and eigenvalues;
• Planimetric_grid is a 2D grid used for digital terrain modeling.

Most of these features depend on a scale parameter. CGAL provides a method to estimate the average spacing based on a number of neighbors (see CGAL::compute_average_spacing()), which usually provides satisfying results in the absence of noise. In the presence of noise, CGAL::estimate_global_range_scale() provides an estimation of the smallest scale such that the point set has the local dimension of a surface (this method is both robust to noise and outliers, see Result).

The eigen analysis can be used to estimate normals. Note however that this analysis (based on Principal Component Analysis) might not be robust to a high level of noise. CGAL also provides more robust normal estimation functions (see for example CGAL::jet_estimate_normals()).

The following code snippet shows how to instantiate such data structures from an input PLY point set (the full example is given at the end of the manual).

int main (int argc, char** argv)
{
std::string filename (argc > 1 ? argv[1] : "data/b9.ply");
std::ifstream in (filename.c_str());
std::vector<Point> pts;
std::cerr << "Reading input" << std::endl;
if (!in
|| !(CGAL::read_ply_points (in, std::back_inserter (pts))))
{
std::cerr << "Error: cannot read " << filename << std::endl;
return EXIT_FAILURE;
}
float grid_resolution = 0.34f;
float radius_neighbors = 1.7f;
float radius_dtm = 15.0f;
std::cerr << "Computing useful structures" << std::endl;
Iso_cuboid_3 bbox = CGAL::bounding_box (pts.begin(), pts.end());
Planimetric_grid grid (pts, Pmap(), bbox, grid_resolution);
Neighborhood neighborhood (pts, Pmap());
Local_eigen_analysis eigen (pts, Pmap(), neighborhood.k_neighbor_query(6));

## Label Set

A label represents how an item should be classified, for example: vegetation, building, road, etc. In CGAL, a label has a name and is simply identified by a Label_handle. Note that names are not used for identification: two labels in the same set can have the same name (but not the same handle).

The following code snippet shows how to add labels to the classification object.

Label_set labels;
Label_handle ground = labels.add ("ground");
Label_handle vegetation = labels.add ("vegetation");
Label_handle roof = labels.add ("roof");

## Feature Set

Features are defined as scalar fields that associate each input item with a specific value. Note that in order to limit memory consumption, we use the type float for these scalar values (as well as for every floating point value in this package). A feature has a name and is identified by a Feature_handle.

CGAL provides some predefined features that are relevant for classification of point sets:

• Distance_to_plane measures how far away a point is from a locally estimated plane;
• Elevation computes the local distance to an estimation of the ground;
• Vertical_dispersion computes how noisy the point set is on a local Z-cylinder;
• Verticality compares the local normal vector to the vertical vector.

For more details about how these different features can help to identify one label or the other, please refer to their associated reference manual pages. In addition, CGAL also provides features solely based on the eigenvalues [6] of the local covariance matrix:
• Linearity;
• Omnivariance;
• Planarity;
• Sphericity;
• Surface_variation (among others; see the reference manual for the full list).

Finally, if the input data set has additional properties, these can also be used as features. For example, CGAL provides the following features:

• Echo_scatter uses the number of returns (echo) provided by most LIDAR scanners if available;
• Hsv uses input color information if available;
• Simple_feature uses any property map applicable to the input range and whose value type is castable to float (useful if an additional property of the input set should be used as is, for example an intensity measurement).

In the following code snippet, a subset of these features is instantiated. Note that all the predefined features can also be generated automatically at multiple scales (see Point Set Feature Generator).

std::cerr << "Computing features" << std::endl;
Feature_set features;
Feature_handle distance_to_plane = features.add<Distance_to_plane> (pts, Pmap(), eigen);
Feature_handle linearity = features.add<Linearity> (pts, eigen);
Feature_handle omnivariance = features.add<Omnivariance> (pts, eigen);
Feature_handle planarity = features.add<Planarity> (pts, eigen);
Feature_handle surface_variation = features.add<Surface_variation> (pts, eigen);
Feature_handle dispersion = features.add<Dispersion> (pts, Pmap(), grid,
radius_neighbors);
Feature_handle elevation = features.add<Elevation> (pts, Pmap(), grid,
radius_dtm);

Users may want to define their own features, especially if the input data set comes with additional properties that were not anticipated by CGAL. A user-defined feature must inherit from Feature_base and provide a method value() that associates a scalar value to each input item.

The following example shows how to define a feature that discriminates points that lie inside a 2D box from the others:

// User-defined feature that identifies a specific area of the 3D
// space. This feature takes value 1 for points that lie inside the
// area and 0 for the others.
class My_feature : public CGAL::Classification::Feature_base
{
const Point_range& range;
double xmin, xmax, ymin, ymax;
public:
My_feature (const Point_range& range,
double xmin, double xmax, double ymin, double ymax)
: range (range), xmin(xmin), xmax(xmax), ymin(ymin), ymax(ymax)
{
this->set_name ("my_feature");
}
float value (std::size_t pt_index)
{
if (xmin < range[pt_index].x() && range[pt_index].x() < xmax &&
ymin < range[pt_index].y() && range[pt_index].y() < ymax)
return 1.f;
else
return 0.f;
}
};

This feature can then be instantiated from the feature set the same way as the others:

std::cerr << "Computing features" << std::endl;
Feature_set features;
// Feature that identifies points whose x coordinate is between -20
// and 20 and whose y coordinate is between -15 and 15
Feature_handle my_feature = features.add<My_feature> (pts, -20., 20., -15., 15.);

## Point Set Feature Generator

In standard classification of point sets, users commonly want to use all predefined features to get the best result possible. CGAL provides a class Point_set_feature_generator that performs the following operations:

• it estimates the smallest relevant scale;
• it generates all needed analysis structures and provides access to them;
• it generates all possible features (among all the CGAL predefined ones) based on which property maps are available (it uses colors if available, etc.).

Multiple scales that are sequentially larger can be used to increase the quality of the results [3]. If CGAL is linked with Intel TBB, features can be computed in parallel to increase the overall computation speed.

Note that using this class in order to generate features is not mandatory, as features and data structures can all be handled by hand. It is mainly provided to make the specific case of point sets simpler to handle. Users can still add their own features within their feature set.

The following snippet shows how to use the feature generator:

Feature_set features;
std::cerr << "Generating features" << std::endl;
CGAL::Real_timer t;
t.start();
Feature_generator generator (features, pts, pts.point_map(),
5); // using 5 scales
t.stop();
std::cerr << features.size() << " feature(s) generated in " << t.time() << " second(s)" << std::endl;

# Classifiers

Classification relies on a classifier: an object that, from the set of values taken by the features at an input item, computes the energy that measures the likelihood of this item to belong to one label or another. A model of the concept CGAL::Classification::Classifier must take the index of an input item and store the energies associated with each label in a vector. If a classifier returns the value 0 for a pair of label and input item, the item belongs to this label with certainty; large values mean that the item is not likely to belong to this label.

CGAL provides three models for this concept, Sum_of_weighted_features_classifier, ETHZ_random_forest_classifier and OpenCV_random_forest_classifier.

To perform classification based on these classifiers, please refer to Classification Functions.

## Sum of Weighted Features

This first classifier defines the following attributes:

• a weight applied to each feature;
• an effect applied to each pair of feature and label.

For each label, the classifier computes the energy as a sum of features normalized with both their weight and the effect they have on this specific label.

This classifier can be set up by hand but also embeds a training algorithm.

### Weights and Effects

Each feature is assigned a weight that measures its strength with respect to the other features.

Each pair of feature and label is assigned an effect that can either be:

• FAVORING: the label is favored by high values of the feature;
• NEUTRAL: the label is not affected by the feature;
• PENALIZING: the label is favored by low values of the feature.

For example, vegetation is expected to have a high distance to plane and have a color close to green (if colors are available); facades have a low distance to plane and a low verticality; etc.

Let $$x=(x_i)_{i=1..N}$$ be a potential classification result with $$N$$ the number of input items and $$x_i$$ the class of the $$i^{th}$$ item (for example: vegetation, ground, etc.). Let $$f_j(i)$$ be the raw value of the $$j^{th}$$ feature at the $$i^{th}$$ item and $$w_j$$ be the weight of this feature. We define the normalized value $$F_j(x_i) \in [0:1]$$ of the $$j^{th}$$ feature at the $$i^{th}$$ item as follows:

$F_j(x_i) = \begin{cases} 1 - \min(\max(0,\frac{f_j(i)}{w_j}), 1) & \mbox{if } f_j \mbox{ favors } x_i \\ 0.5 & \mbox{if } f_j \mbox{ is neutral for } x_i \\ \min(\max(0,\frac{f_j(i)}{w_j}), 1) & \mbox{if } f_j \mbox{ penalizes } x_i \end{cases}$

The itemwise energy measures the coherence of the label $$x_i$$ at the $$i^{th}$$ item and is defined as:

$E_{di}(x_i) = \sum_{j = 1..N_f} F_j(x_i)$

The following code snippet shows how to define the weights and effects of features and labels:

std::cerr << "Setting weights" << std::endl;
Classifier classifier (labels, features);
classifier.set_weight (distance_to_plane, 6.75e-2f);
classifier.set_weight (linearity, 1.19f);
classifier.set_weight (omnivariance, 1.34e-1f);
classifier.set_weight (planarity, 7.32e-1f);
classifier.set_weight (surface_variation, 1.36e-1f);
classifier.set_weight (dispersion, 5.45e-1f);
classifier.set_weight (elevation, 1.47e1f);
std::cerr << "Setting effects" << std::endl;
classifier.set_effect (ground, distance_to_plane, Classifier::NEUTRAL);
classifier.set_effect (ground, linearity, Classifier::PENALIZING);
classifier.set_effect (ground, omnivariance, Classifier::NEUTRAL);
classifier.set_effect (ground, planarity, Classifier::FAVORING);
classifier.set_effect (ground, surface_variation, Classifier::PENALIZING);
classifier.set_effect (ground, dispersion, Classifier::NEUTRAL);
classifier.set_effect (ground, elevation, Classifier::PENALIZING);
classifier.set_effect (vegetation, distance_to_plane, Classifier::FAVORING);
classifier.set_effect (vegetation, linearity, Classifier::NEUTRAL);
classifier.set_effect (vegetation, omnivariance, Classifier::FAVORING);
classifier.set_effect (vegetation, planarity, Classifier::NEUTRAL);
classifier.set_effect (vegetation, surface_variation, Classifier::NEUTRAL);
classifier.set_effect (vegetation, dispersion, Classifier::FAVORING);
classifier.set_effect (vegetation, elevation, Classifier::NEUTRAL);
classifier.set_effect (roof, distance_to_plane, Classifier::NEUTRAL);
classifier.set_effect (roof, linearity, Classifier::PENALIZING);
classifier.set_effect (roof, omnivariance, Classifier::FAVORING);
classifier.set_effect (roof, planarity, Classifier::FAVORING);
classifier.set_effect (roof, surface_variation, Classifier::PENALIZING);
classifier.set_effect (roof, dispersion, Classifier::NEUTRAL);
classifier.set_effect (roof, elevation, Classifier::FAVORING);

### Training

Each feature has a specific weight and each pair of feature-label has a specific effect. This means that the number of parameters to set up can quickly explode: if 6 features are used to classify between 4 labels, 30 parameters have to be set up (6 weights + 6x4 feature-label relationships).

Though it is possible to set them up one by one, CGAL also provides a method train() that requires a small set of ground truth items provided by users. More specifically, users must provide, for each label they want to classify, a set of known inliers among the input data set (for example, selecting one roof, one tree and one section of the ground). The training algorithm works as follows:

• for each feature, a range of weights is tested and the effect the feature has on each label is estimated: for a given weight, if a feature has the same effect on every label, it is irrelevant for classification. The range of weights for which the feature is relevant is estimated;
• for each feature, uniformly picked weight values are tested and their effects estimated;
• each inlier provided by the user is classified using this set of weights and effects;
• the mean intersection-over-union (see Evaluation) is used to evaluate the quality of this set of weights and effects;
• the same mechanism is repeated until the ranges of all features have been tested. Weights are changed one at a time, the others being kept at the values that gave the best score so far.

This usually converges to a satisfying solution (see Figure 74.2). The number of trials is user-defined and set to 300 by default. Using at least 10 times the number of features is advised (for example, at least 300 trials if 30 features are used). If the solution is not satisfying, more inliers can be selected, for example in a region that the user identifies as misclassified with the current configuration. The training algorithm keeps, as initialization, the best weights found at the previous round and carries on trying new weights by taking the new inliers into account.
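Schematically, this trial-and-revert search over weights can be sketched as follows (standalone code with hypothetical names; the actual training algorithm also estimates effects and samples weights within the relevant ranges):

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Schematic trial-and-revert search: each weight is perturbed in turn
// while the others are kept at the values that gave the best score so
// far. `score` stands for the mean intersection-over-union evaluated
// on the user-provided inliers.
std::vector<float> train_weights
(std::vector<float> weights,
 const std::vector<std::vector<float> >& candidates, // trial weights per feature
 const std::function<float(const std::vector<float>&)>& score)
{
  float best = score (weights);
  for (std::size_t j = 0; j < weights.size(); ++ j)
    for (std::size_t t = 0; t < candidates[j].size(); ++ t)
    {
      float saved = weights[j];
      weights[j] = candidates[j][t];
      float s = score (weights);
      if (s > best)
        best = s;           // better score: keep the new weight
      else
        weights[j] = saved; // otherwise revert to the previous best
    }
  return weights;
}
```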

Figure 74.2 Example of evolution of the mean intersection-over-union. The purple curve is the score computed at the current iteration; the green curve is the best score found so far.

### Result

Figure 74.3 shows an example of output on a defect-laden point set. The accuracy on this example is 0.97 with a mean intersection-over-union of 0.85 (see section Evaluation).

Figure 74.3 Example of classification on a point set with medium noise and outliers (left: input, right: output). Ground is orange, roofs are pink, vegetation is green. Outliers are classified with an additional label outlier in black.

## ETHZ Random Forest

CGAL provides ETHZ_random_forest_classifier, a classifier based on the Random Forest Template Library developed by Stefan Walk at ETH Zurich [2]. The library is distributed under the MIT license and is included with the CGAL release, so users do not have to install anything more. This classifier uses a ground truth training set to construct several decision trees that are then used to assign a label to each input item.

This classifier cannot be set up by hand and requires a ground truth training set. The training algorithm is significantly faster but usually requires a higher number of inliers than the previous classifier. The training algorithm uses more memory at runtime and the configuration files are larger than those produced by Sum_of_weighted_features_classifier, but the output quality is usually significantly better, especially in the cases where many labels are used (more than five).

An example shows how to use this classifier. For more details about the algorithm, please refer to the README provided in ETH Zurich's code archive.

## OpenCV Random Forest

This last classifier is OpenCV_random_forest_classifier. It uses the OpenCV library, more specifically the Random Trees package.

Note that this classifier usually produces results with a lower quality than ETHZ_random_forest_classifier.

It is provided for the sake of completeness and for testing purposes, but if you are not sure what to use, we advise using the ETHZ Random Forest instead.

An example shows how to use this classifier. For more details about the algorithm, please refer to the official documentation of OpenCV.

# Classification Functions

Classification is performed by minimizing an energy over the input data set that may include regularization. CGAL provides three different methods for classification, ranging from high speed / low quality to low speed / high quality:

• CGAL::Classification::classify()
• CGAL::Classification::classify_with_local_smoothing()
• CGAL::Classification::classify_with_graphcut()

On a point set of 3 million points, the first method takes about 4 seconds, the second about 40 seconds and the third about 2 minutes.

Figure 74.4 Top-Left: input point set. Top-Right: raw output classification represented by a set of colors (ground is orange, facades are blue, roofs are pink and vegetation is green). Bottom-Left: output classification using local smoothing. Bottom-Right: output classification using graphcut.

Mathematical details are provided hereafter.

## Raw classification

• CGAL::Classification::classify(): this is the fastest method; it provides acceptable but usually noisy results (see Figure 74.4, top-right).

Let $$x=(x_i)_{i=1..N}$$ be a potential classification result with $$N$$ the number of input items and $$x_i$$ the label of the $$i^{th}$$ item (for example: vegetation, ground, etc.). The classification is performed by minimizing the following energy:

$E(x) = \sum_{i = 1..N} E_{di}(x_i)$

This energy is a sum of itemwise energies provided by the classifier and involves no regularization.

The following snippet shows how to classify points based on a label set and a classifier. The result is stored in label_indices, following the same order as the input set and providing, for each point, the index (in the label set) of its assigned label.

std::vector<std::size_t> label_indices;
CGAL::Real_timer t;
t.start();
Classification::classify<Concurrency_tag> (pts, labels, classifier, label_indices);
t.stop();
std::cerr << "Raw classification performed in " << t.time() << " second(s)" << std::endl;
t.reset();

## Local Regularization

• CGAL::Classification::classify_with_local_smoothing(): this method is a tradeoff between quality and efficiency (see Figure 74.4, bottom-left). The minimized energy is defined as follows:

$E(x) = \sum_{i = 1..N} E_{si}(x_i)$

The energy $$E_{si}(x_i)$$ is defined on a small local neighborhood $$Nb(i)$$ of the $$i^{th}$$ item (including itself):

$E_{si}(x_i) = \frac{\sum_{k \in Nb(i)} E_{di}(x_k)}{\left| Nb(i) \right|}$

This makes it possible to eliminate local noisy variations of assigned labels. Increasing the size of the neighborhood increases the noise reduction at the cost of higher computation times.

The following snippet shows how to classify points using local smoothing by providing a model of CGAL::Classification::NeighborQuery.

t.start();
Classification::classify_with_local_smoothing<Concurrency_tag>
(pts, Pmap(), labels, classifier,
neighborhood.sphere_neighbor_query(radius_neighbors),
label_indices);
t.stop();
std::cerr << "Classification with local smoothing performed in " << t.time() << " second(s)" << std::endl;
t.reset();

## Global Regularization (Graph Cut)

• CGAL::Classification::classify_with_graphcut(): this method offers the best quality but requires longer computation times (see Figure 74.4, bottom-right). The total energy that is minimized is the sum of the partial data term $$E_{di}(x_i)$$ and of a pairwise interaction energy defined by the standard Potts model [5]:

$E(x) = \sum_{i = 1..N} E_{di}(x_i) + \gamma \sum_{i \sim j} \mathbf{1}_{x_i \neq x_j}$

where $$\gamma>0$$ is the parameter of the Potts model that quantifies the strength of the regularization, $$i \sim j$$ represents the pairs of neighboring items and $$\mathbf{1}_{\{.\}}$$ the characteristic function.

A Graph Cut based algorithm (Alpha Expansion) is used to quickly reach an approximate solution close to the global optimum of this energy.

This method makes it possible to consistently segment the input data set into piecewise-constant parts and to correct large wrongly classified clusters. Increasing $$\gamma$$ produces a more regular result with a constant computation time.

To speed up computations, the input domain can be subdivided into smaller subsets such that several smaller graph cuts are applied instead of a big one. The computation of these smaller graph cuts can be done in parallel. Increasing the number of subsets allows for faster computation times but can also reduce the quality of the results.

The following snippet shows how to classify points using graph cut regularization by providing a model of CGAL::Classification::NeighborQuery, a strength parameter $$\gamma$$ and a number of subdivisions.

t.start();
Classification::classify_with_graphcut<Concurrency_tag>
(pts, Pmap(), labels, classifier,
neighborhood.k_neighbor_query(12),
0.2f, 4, label_indices);
t.stop();
std::cerr << "Classification with graphcut performed in " << t.time() << " second(s)" << std::endl;

# Evaluation

The class Evaluation allows users to evaluate the reliability of the classification with respect to a provided ground truth. The following measurements are available:

• precision() computes, for one label, the ratio of true positives over the total number of detected positives;
• recall() computes, for one label, the ratio of true positives over the total number of provided inliers of this label;
• f1_score() is the harmonic mean of precision and recall;
• intersection_over_union() computes the ratio of true positives over the union of the detected positives and of the provided inliers;
• accuracy() computes the ratio of all true positives over the total number of provided inliers;
• mean_f1_score();
• mean_intersection_over_union().

All these values range from 0 (poor quality) to 1 (perfect quality).

# Full Examples

## Classification

The following example:

• reads an input file (LIDAR point set in PLY format);
• computes useful structures from this input;
• computes features from the input and the precomputed structures;
• defines 3 labels (vegetation, ground and roof);
• sets up the classification classifier Sum_of_weighted_features_classifier;
• classifies the point set with the 3 different methods (this is for the sake of the example: each method overwrites the previous result, users should only call one of the methods);
• saves the result in a colored PLY format.
#if defined (_MSC_VER) && !defined (_WIN64)
#pragma warning(disable:4244) // boost::number_distance::distance()
// converts 64 to 32 bits integers
#endif
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
#include <CGAL/Simple_cartesian.h>
#include <CGAL/Classification.h>
#include <CGAL/bounding_box.h>
#include <CGAL/IO/read_ply_points.h>
#include <CGAL/Real_timer.h>
typedef CGAL::Simple_cartesian<double> Kernel;
#ifdef CGAL_LINKED_WITH_TBB
typedef CGAL::Parallel_tag Concurrency_tag;
#else
typedef CGAL::Sequential_tag Concurrency_tag;
#endif
typedef Kernel::Point_3 Point;
typedef Kernel::Iso_cuboid_3 Iso_cuboid_3;
typedef std::vector<Point> Point_range;
typedef CGAL::Identity_property_map<Point> Pmap;
namespace Classification = CGAL::Classification;
typedef Classification::Sum_of_weighted_features_classifier Classifier;
typedef Classification::Planimetric_grid<Kernel, Point_range, Pmap> Planimetric_grid;
typedef Classification::Point_set_neighborhood<Kernel, Point_range, Pmap> Neighborhood;
typedef Classification::Local_eigen_analysis Local_eigen_analysis;
typedef Classification::Label_handle Label_handle;
typedef Classification::Feature_handle Feature_handle;
typedef Classification::Label_set Label_set;
typedef Classification::Feature_set Feature_set;
typedef Classification::Feature::Distance_to_plane<Point_range, Pmap> Distance_to_plane;
typedef Classification::Feature::Linearity Linearity;
typedef Classification::Feature::Omnivariance Omnivariance;
typedef Classification::Feature::Planarity Planarity;
typedef Classification::Feature::Surface_variation Surface_variation;
typedef Classification::Feature::Elevation<Kernel, Point_range, Pmap> Elevation;
typedef Classification::Feature::Vertical_dispersion<Kernel, Point_range, Pmap> Dispersion;
int main (int argc, char** argv)
{
std::string filename (argc > 1 ? argv[1] : "data/b9.ply");
std::ifstream in (filename.c_str());
std::vector<Point> pts;
std::cerr << "Reading input" << std::endl;
if (!in
|| !(CGAL::read_ply_points (in, std::back_inserter (pts))))
{
std::cerr << "Error: cannot read " << filename << std::endl;
return EXIT_FAILURE;
}
float grid_resolution = 0.34f;
float radius_neighbors = 1.7f;
float radius_dtm = 15.0f;
std::cerr << "Computing useful structures" << std::endl;
Iso_cuboid_3 bbox = CGAL::bounding_box (pts.begin(), pts.end());
Planimetric_grid grid (pts, Pmap(), bbox, grid_resolution);
Neighborhood neighborhood (pts, Pmap());
Local_eigen_analysis eigen (pts, Pmap(), neighborhood.k_neighbor_query(6));
std::cerr << "Computing features" << std::endl;
Feature_set features;
Feature_handle distance_to_plane = features.add<Distance_to_plane> (pts, Pmap(), eigen);
Feature_handle linearity = features.add<Linearity> (pts, eigen);
Feature_handle omnivariance = features.add<Omnivariance> (pts, eigen);
Feature_handle planarity = features.add<Planarity> (pts, eigen);
Feature_handle surface_variation = features.add<Surface_variation> (pts, eigen);
Feature_handle dispersion = features.add<Dispersion> (pts, Pmap(), grid,
radius_neighbors);
Feature_handle elevation = features.add<Elevation> (pts, Pmap(), grid,
radius_dtm);
Label_set labels;
Label_handle ground = labels.add ("ground");
Label_handle vegetation = labels.add ("vegetation");
Label_handle roof = labels.add ("roof");
std::cerr << "Setting weights" << std::endl;
Classifier classifier (labels, features);
classifier.set_weight (distance_to_plane, 6.75e-2f);
classifier.set_weight (linearity, 1.19f);
classifier.set_weight (omnivariance, 1.34e-1f);
classifier.set_weight (planarity, 7.32e-1f);
classifier.set_weight (surface_variation, 1.36e-1f);
classifier.set_weight (dispersion, 5.45e-1f);
classifier.set_weight (elevation, 1.47e1f);
std::cerr << "Setting effects" << std::endl;
classifier.set_effect (ground, distance_to_plane, Classifier::NEUTRAL);
classifier.set_effect (ground, linearity, Classifier::PENALIZING);
classifier.set_effect (ground, omnivariance, Classifier::NEUTRAL);
classifier.set_effect (ground, planarity, Classifier::FAVORING);
classifier.set_effect (ground, surface_variation, Classifier::PENALIZING);
classifier.set_effect (ground, dispersion, Classifier::NEUTRAL);
classifier.set_effect (ground, elevation, Classifier::PENALIZING);
classifier.set_effect (vegetation, distance_to_plane, Classifier::FAVORING);
classifier.set_effect (vegetation, linearity, Classifier::NEUTRAL);
classifier.set_effect (vegetation, omnivariance, Classifier::FAVORING);
classifier.set_effect (vegetation, planarity, Classifier::NEUTRAL);
classifier.set_effect (vegetation, surface_variation, Classifier::NEUTRAL);
classifier.set_effect (vegetation, dispersion, Classifier::FAVORING);
classifier.set_effect (vegetation, elevation, Classifier::NEUTRAL);
classifier.set_effect (roof, distance_to_plane, Classifier::NEUTRAL);
classifier.set_effect (roof, linearity, Classifier::PENALIZING);
classifier.set_effect (roof, omnivariance, Classifier::FAVORING);
classifier.set_effect (roof, planarity, Classifier::FAVORING);
classifier.set_effect (roof, surface_variation, Classifier::PENALIZING);
classifier.set_effect (roof, dispersion, Classifier::NEUTRAL);
classifier.set_effect (roof, elevation, Classifier::FAVORING);
// Run classification
std::cerr << "Classifying" << std::endl;
std::vector<std::size_t> label_indices;
CGAL::Real_timer t;
t.start();
Classification::classify<Concurrency_tag> (pts, labels, classifier, label_indices);
t.stop();
std::cerr << "Raw classification performed in " << t.time() << " second(s)" << std::endl;
t.reset();
t.start();
Classification::classify_with_local_smoothing<Concurrency_tag>
(pts, Pmap(), labels, classifier,
neighborhood.sphere_neighbor_query(radius_neighbors),
label_indices);
t.stop();
std::cerr << "Classification with local smoothing performed in " << t.time() << " second(s)" << std::endl;
t.reset();
t.start();
Classification::classify_with_graphcut<Concurrency_tag>
(pts, Pmap(), labels, classifier,
neighborhood.k_neighbor_query(12),
0.2f, 4, label_indices);
t.stop();
std::cerr << "Classification with graphcut performed in " << t.time() << " second(s)" << std::endl;
// Save the output in a colored PLY format
std::ofstream f ("classification.ply");
f << "ply" << std::endl
<< "format ascii 1.0" << std::endl
<< "element vertex " << pts.size() << std::endl
<< "property float x" << std::endl
<< "property float y" << std::endl
<< "property float z" << std::endl
<< "property uchar red" << std::endl
<< "property uchar green" << std::endl
<< "property uchar blue" << std::endl
<< "end_header" << std::endl;
for (std::size_t i = 0; i < pts.size(); ++ i)
{
f << pts[i] << " ";
Label_handle label = labels[label_indices[i]];
if (label == ground)
f << "245 180 0" << std::endl;
else if (label == vegetation)
f << "0 255 27" << std::endl;
else if (label == roof)
f << "255 0 170" << std::endl;
else
{
f << "0 0 0" << std::endl;
std::cerr << "Error: unknown classification label" << std::endl;
}
}
std::cerr << "All done" << std::endl;
return EXIT_SUCCESS;
}

## Defining a Custom Feature

The following example shows how to define a custom feature and how to integrate it in the CGAL classification framework.

#if defined (_MSC_VER) && !defined (_WIN64)
#pragma warning(disable:4244) // boost::number_distance::distance()
// converts 64 to 32 bits integers
#endif
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
#include <CGAL/Simple_cartesian.h>
#include <CGAL/Classification.h>
#include <CGAL/IO/read_ply_points.h>
typedef CGAL::Simple_cartesian<double> Kernel;
typedef Kernel::Point_3 Point;
typedef Kernel::Iso_cuboid_3 Iso_cuboid_3;
typedef std::vector<Point> Point_range;
typedef CGAL::Identity_property_map<Point> Pmap;
namespace Classification = CGAL::Classification;
typedef Classification::Sum_of_weighted_features_classifier Classifier;
typedef Classification::Point_set_neighborhood<Kernel, Point_range, Pmap> Neighborhood;
typedef Classification::Local_eigen_analysis Local_eigen_analysis;
typedef Classification::Label_handle Label_handle;
typedef Classification::Feature_handle Feature_handle;
typedef Classification::Label_set Label_set;
typedef Classification::Feature_set Feature_set;
typedef Classification::Feature::Sphericity Sphericity;
// User-defined feature that identifies a specific area of the 3D
// space. This feature takes value 1 for points that lie inside the
// area and 0 for the others.
class My_feature : public CGAL::Classification::Feature_base
{
const Point_range& range;
double xmin, xmax, ymin, ymax;
public:
My_feature (const Point_range& range,
double xmin, double xmax, double ymin, double ymax)
: range (range), xmin(xmin), xmax(xmax), ymin(ymin), ymax(ymax)
{
this->set_name ("my_feature");
}
float value (std::size_t pt_index)
{
if (xmin < range[pt_index].x() && range[pt_index].x() < xmax &&
ymin < range[pt_index].y() && range[pt_index].y() < ymax)
return 1.f;
else
return 0.f;
}
};
int main (int argc, char** argv)
{
std::string filename (argc > 1 ? argv[1] : "data/b9.ply");
std::ifstream in (filename.c_str());
std::vector<Point> pts;
std::cerr << "Reading input" << std::endl;
if (!in
|| !(CGAL::read_ply_points (in, std::back_inserter (pts))))
{
std::cerr << "Error: cannot read " << filename << std::endl;
return EXIT_FAILURE;
}
Neighborhood neighborhood (pts, Pmap());
Local_eigen_analysis eigen (pts, Pmap(), neighborhood.k_neighbor_query(6));
Label_set labels;
Label_handle a = labels.add ("label_A"); // two generic labels used below
Label_handle b = labels.add ("label_B");
std::cerr << "Computing features" << std::endl;
Feature_set features;
// Feature that identifies points whose x coordinate is between -20
// and 20 and whose y coordinate is between -15 and 15
Feature_handle my_feature = features.add<My_feature> (pts, -20., 20., -15., 15.);
Feature_handle sphericity = features.add<Sphericity> (pts, eigen);
Classifier classifier (labels, features);
std::cerr << "Setting weights" << std::endl;
classifier.set_weight(sphericity, 0.5);
classifier.set_weight(my_feature, 0.25);
std::cerr << "Setting effects" << std::endl;
classifier.set_effect (a, sphericity, Classifier::FAVORING);
classifier.set_effect (a, my_feature, Classifier::FAVORING);
classifier.set_effect (b, sphericity, Classifier::PENALIZING);
classifier.set_effect (b, my_feature, Classifier::PENALIZING);
std::cerr << "Classifying" << std::endl;
std::vector<int> label_indices(pts.size(), -1);
Classification::classify_with_graphcut<CGAL::Sequential_tag>
(pts, Pmap(), labels, classifier,
neighborhood.k_neighbor_query(12),
0.5, 1, label_indices);
std::cerr << "All done" << std::endl;
return EXIT_SUCCESS;
}

## Feature Generation and Training

The following example:

• reads a point set with a training set (embedded as a PLY property named label);
• automatically generates features on 5 scales;
• trains the classifier Sum_of_weighted_features_classifier using 800 trials;
• runs the algorithm using the graphcut regularization;
• prints some evaluation measurements;
• saves the configuration of the classifier for further use.
#if defined (_MSC_VER) && !defined (_WIN64)
#pragma warning(disable:4244) // boost::number_distance::distance()
// converts 64 to 32 bits integers
#endif
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
#include <CGAL/Simple_cartesian.h>
#include <CGAL/Classification.h>
#include <CGAL/Point_set_3.h>
#include <CGAL/Point_set_3/IO.h>
#include <CGAL/Real_timer.h>
typedef CGAL::Simple_cartesian<double> Kernel;
typedef Kernel::Point_3 Point;
typedef CGAL::Point_set_3<Point> Point_set;
typedef Kernel::Iso_cuboid_3 Iso_cuboid_3;
typedef Point_set::Point_map Pmap;
typedef Point_set::Property_map<int> Imap;
namespace Classification = CGAL::Classification;
typedef Classification::Label_handle Label_handle;
typedef Classification::Feature_handle Feature_handle;
typedef Classification::Label_set Label_set;
typedef Classification::Feature_set Feature_set;
typedef Classification::Sum_of_weighted_features_classifier Classifier;
typedef Classification::Point_set_feature_generator<Kernel, Point_set, Pmap> Feature_generator;
int main (int argc, char** argv)
{
std::string filename (argc > 1 ? argv[1] : "data/b9_training.ply");
std::ifstream in (filename.c_str(), std::ios::binary);
Point_set pts;
std::cerr << "Reading input" << std::endl;
in >> pts;
Imap label_map;
bool lm_found = false;
boost::tie (label_map, lm_found) = pts.property_map<int> ("label");
if (!lm_found)
{
std::cerr << "Error: \"label\" property not found in input file." << std::endl;
return EXIT_FAILURE;
}
std::vector<int> ground_truth;
ground_truth.reserve (pts.size());
std::copy (pts.range(label_map).begin(), pts.range(label_map).end(),
std::back_inserter (ground_truth));
Feature_set features;
std::cerr << "Generating features" << std::endl;
CGAL::Real_timer t;
t.start();
Feature_generator generator (features, pts, pts.point_map(),
5); // using 5 scales
t.stop();
std::cerr << features.size() << " feature(s) generated in " << t.time() << " second(s)" << std::endl;
Label_set labels;
labels.add ("ground");
labels.add ("vegetation");
labels.add ("roof");
Classifier classifier (labels, features);
std::cerr << "Training" << std::endl;
t.reset();
t.start();
classifier.train<CGAL::Sequential_tag> (ground_truth, 800);
t.stop();
std::cerr << "Done in " << t.time() << " second(s)" << std::endl;
t.reset();
t.start();
std::vector<int> label_indices(pts.size(), -1);
Classification::classify_with_graphcut<CGAL::Sequential_tag>
(pts, pts.point_map(), labels, classifier,
generator.neighborhood().k_neighbor_query(12),
0.2f, 10, label_indices);
t.stop();
std::cerr << "Classification with graphcut done in " << t.time() << " second(s)" << std::endl;
std::cerr << "Precision, recall, F1 scores and IoU:" << std::endl;
Classification::Evaluation evaluation (labels, ground_truth, label_indices);
for (std::size_t i = 0; i < labels.size(); ++ i)
{
std::cerr << " * " << labels[i]->name() << ": "
<< evaluation.precision(labels[i]) << " ; "
<< evaluation.recall(labels[i]) << " ; "
<< evaluation.f1_score(labels[i]) << " ; "
<< evaluation.intersection_over_union(labels[i]) << std::endl;
}
std::cerr << "Accuracy = " << evaluation.accuracy() << std::endl
<< "Mean F1 score = " << evaluation.mean_f1_score() << std::endl
<< "Mean IoU = " << evaluation.mean_intersection_over_union() << std::endl;
std::ofstream fconfig ("config.xml");
classifier.save_configuration (fconfig);
fconfig.close();
std::cerr << "All done" << std::endl;
return EXIT_SUCCESS;
}

## ETHZ Random Forest

The following example shows how to use the classifier ETHZ_random_forest_classifier using an input training set.

#if defined (_MSC_VER) && !defined (_WIN64)
#pragma warning(disable:4244) // boost::number_distance::distance()
// converts 64 to 32 bits integers
#endif
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
#include <CGAL/Simple_cartesian.h>
#include <CGAL/Classification.h>
#include <CGAL/Point_set_3.h>
#include <CGAL/Point_set_3/IO.h>
#include <CGAL/Real_timer.h>
typedef CGAL::Simple_cartesian<double> Kernel;
typedef Kernel::Point_3 Point;
typedef CGAL::Point_set_3<Point> Point_set;
typedef Kernel::Iso_cuboid_3 Iso_cuboid_3;
typedef Point_set::Point_map Pmap;
typedef Point_set::Property_map<int> Imap;
typedef Point_set::Property_map<unsigned char> UCmap;
namespace Classification = CGAL::Classification;
typedef Classification::Label_handle Label_handle;
typedef Classification::Feature_handle Feature_handle;
typedef Classification::Label_set Label_set;
typedef Classification::Feature_set Feature_set;
typedef Classification::Point_set_feature_generator<Kernel, Point_set, Pmap> Feature_generator;
int main (int argc, char** argv)
{
std::string filename = "data/b9_training.ply";
if (argc > 1)
filename = argv[1];
std::ifstream in (filename.c_str(), std::ios::binary);
Point_set pts;
std::cerr << "Reading input" << std::endl;
in >> pts;
Imap label_map;
bool lm_found = false;
boost::tie (label_map, lm_found) = pts.property_map<int> ("label");
if (!lm_found)
{
std::cerr << "Error: \"label\" property not found in input file." << std::endl;
return EXIT_FAILURE;
}
std::vector<int> ground_truth;
ground_truth.reserve (pts.size());
std::copy (pts.range(label_map).begin(), pts.range(label_map).end(),
std::back_inserter (ground_truth));
Feature_set features;
std::cerr << "Generating features" << std::endl;
CGAL::Real_timer t;
t.start();
Feature_generator generator (features, pts, pts.point_map(),
5); // using 5 scales
t.stop();
std::cerr << "Done in " << t.time() << " second(s)" << std::endl;
Label_set labels;
Label_handle ground = labels.add ("ground");
Label_handle vegetation = labels.add ("vegetation");
Label_handle roof = labels.add ("roof");
std::vector<int> label_indices(pts.size(), -1);
std::cerr << "Using ETHZ Random Forest Classifier" << std::endl;
Classification::ETHZ_random_forest_classifier classifier (labels, features);
std::cerr << "Training" << std::endl;
t.reset();
t.start();
classifier.train (ground_truth);
t.stop();
std::cerr << "Done in " << t.time() << " second(s)" << std::endl;
t.reset();
t.start();
Classification::classify_with_graphcut<CGAL::Sequential_tag>
(pts, pts.point_map(), labels, classifier,
generator.neighborhood().k_neighbor_query(12),
0.2f, 1, label_indices);
t.stop();
std::cerr << "Classification with graphcut done in " << t.time() << " second(s)" << std::endl;
std::cerr << "Precision, recall, F1 scores and IoU:" << std::endl;
Classification::Evaluation evaluation (labels, ground_truth, label_indices);
for (std::size_t i = 0; i < labels.size(); ++ i)
{
std::cerr << " * " << labels[i]->name() << ": "
<< evaluation.precision(labels[i]) << " ; "
<< evaluation.recall(labels[i]) << " ; "
<< evaluation.f1_score(labels[i]) << " ; "
<< evaluation.intersection_over_union(labels[i]) << std::endl;
}
std::cerr << "Accuracy = " << evaluation.accuracy() << std::endl
<< "Mean F1 score = " << evaluation.mean_f1_score() << std::endl
<< "Mean IoU = " << evaluation.mean_intersection_over_union() << std::endl;
// Color point set according to class
UCmap red = pts.add_property_map<unsigned char>("red", 0).first;
UCmap green = pts.add_property_map<unsigned char>("green", 0).first;
UCmap blue = pts.add_property_map<unsigned char>("blue", 0).first;
for (std::size_t i = 0; i < label_indices.size(); ++ i)
{
label_map[i] = label_indices[i]; // update label map with computed classification
Label_handle label = labels[label_indices[i]];
if (label == ground)
{
red[i] = 245; green[i] = 180; blue[i] = 0;
}
else if (label == vegetation)
{
red[i] = 0; green[i] = 255; blue[i] = 27;
}
else if (label == roof)
{
red[i] = 255; green[i] = 0; blue[i] = 170;
}
}
// Write result
std::ofstream f ("classification.ply");
f.precision(18);
f << pts;
std::cerr << "All done" << std::endl;
return EXIT_SUCCESS;
}

## OpenCV Random Forest

The following example shows how to use the classifier OpenCV_random_forest_classifier using an input training set.

#if defined (_MSC_VER) && !defined (_WIN64)
#pragma warning(disable:4244) // boost::number_distance::distance()
// converts 64 to 32 bits integers
#endif
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
#include <CGAL/Simple_cartesian.h>
#include <CGAL/Classification.h>
#include <CGAL/Point_set_3.h>
#include <CGAL/Point_set_3/IO.h>
#include <CGAL/Real_timer.h>
typedef CGAL::Simple_cartesian<double> Kernel;
typedef Kernel::Point_3 Point;
typedef CGAL::Point_set_3<Point> Point_set;
typedef Kernel::Iso_cuboid_3 Iso_cuboid_3;
typedef Point_set::Point_map Pmap;
typedef Point_set::Property_map<int> Imap;
typedef Point_set::Property_map<unsigned char> UCmap;
namespace Classification = CGAL::Classification;
typedef Classification::Label_handle Label_handle;
typedef Classification::Feature_handle Feature_handle;
typedef Classification::Label_set Label_set;
typedef Classification::Feature_set Feature_set;
typedef Classification::Point_set_feature_generator<Kernel, Point_set, Pmap> Feature_generator;
int main (int argc, char** argv)
{
std::string filename = "data/b9_training.ply";
if (argc > 1)
filename = argv[1];
std::ifstream in (filename.c_str(), std::ios::binary);
Point_set pts;
std::cerr << "Reading input" << std::endl;
in >> pts;
Imap label_map;
bool lm_found = false;
boost::tie (label_map, lm_found) = pts.property_map<int> ("label");
if (!lm_found)
{
std::cerr << "Error: \"label\" property not found in input file." << std::endl;
return EXIT_FAILURE;
}
std::vector<int> ground_truth;
ground_truth.reserve (pts.size());
std::copy (pts.range(label_map).begin(), pts.range(label_map).end(),
std::back_inserter (ground_truth));
Feature_set features;
std::cerr << "Generating features" << std::endl;
CGAL::Real_timer t;
t.start();
Feature_generator generator (features, pts, pts.point_map(),
5); // using 5 scales
t.stop();
std::cerr << "Done in " << t.time() << " second(s)" << std::endl;
Label_set labels;
Label_handle ground = labels.add ("ground");
Label_handle vegetation = labels.add ("vegetation");
Label_handle roof = labels.add ("roof");
std::vector<int> label_indices(pts.size(), -1);
std::cerr << "Using OpenCV Random Forest Classifier" << std::endl;
Classification::OpenCV_random_forest_classifier classifier (labels, features);
std::cerr << "Training" << std::endl;
t.reset();
t.start();
classifier.train (ground_truth);
t.stop();
std::cerr << "Done in " << t.time() << " second(s)" << std::endl;
t.reset();
t.start();
Classification::classify_with_graphcut<CGAL::Sequential_tag>
(pts, pts.point_map(), labels, classifier,
generator.neighborhood().k_neighbor_query(12),
0.2f, 1, label_indices);
t.stop();
std::cerr << "Classification with graphcut done in " << t.time() << " second(s)" << std::endl;
std::cerr << "Precision, recall, F1 scores and IoU:" << std::endl;
Classification::Evaluation evaluation (labels, ground_truth, label_indices);
for (std::size_t i = 0; i < labels.size(); ++ i)
{
std::cerr << " * " << labels[i]->name() << ": "
<< evaluation.precision(labels[i]) << " ; "
<< evaluation.recall(labels[i]) << " ; "
<< evaluation.f1_score(labels[i]) << " ; "
<< evaluation.intersection_over_union(labels[i]) << std::endl;
}
std::cerr << "Accuracy = " << evaluation.accuracy() << std::endl
<< "Mean F1 score = " << evaluation.mean_f1_score() << std::endl
<< "Mean IoU = " << evaluation.mean_intersection_over_union() << std::endl;
// Color point set according to class
UCmap red = pts.add_property_map<unsigned char>("red", 0).first;
UCmap green = pts.add_property_map<unsigned char>("green", 0).first;
UCmap blue = pts.add_property_map<unsigned char>("blue", 0).first;
for (std::size_t i = 0; i < label_indices.size(); ++ i)
{
label_map[i] = label_indices[i]; // update label map with computed classification
Label_handle label = labels[label_indices[i]];
if (label == ground)
{
red[i] = 245; green[i] = 180; blue[i] = 0;
}
else if (label == vegetation)
{
red[i] = 0; green[i] = 255; blue[i] = 27;
}
else if (label == roof)
{
red[i] = 255; green[i] = 0; blue[i] = 170;
}
}
// Write result
std::ofstream f ("classification.ply");
f.precision(18);
f << pts;
std::cerr << "All done" << std::endl;
return EXIT_SUCCESS;
}

# History

This package is based on research code by Florent Lafarge, generalized, extended, and packaged by Simon Giraudot.