mlpack 2.2.5
Class Hierarchy


This inheritance list is sorted roughly, but not completely, alphabetically:
 AugLagrangian< mlpack::optimization::LRSDPFunction< mlpack::optimization::SDP< arma::sp_mat > > >
 AugLagrangian< mlpack::optimization::LRSDPFunction< SDPType > >
 AugLagrangianFunction< mlpack::optimization::LRSDPFunction< mlpack::optimization::SDP< arma::sp_mat > > >
 AugLagrangianFunction< mlpack::optimization::LRSDPFunction< SDPType > >
 static_visitor
 template AuxiliarySplitInfo< ElemType >
 FastMKS< mlpack::kernel::CosineDistance >
 FastMKS< mlpack::kernel::EpanechnikovKernel >
 FastMKS< mlpack::kernel::GaussianKernel >
 FastMKS< mlpack::kernel::HyperbolicTangentKernel >
 FastMKS< mlpack::kernel::LinearKernel >
 FastMKS< mlpack::kernel::PolynomialKernel >
 FastMKS< mlpack::kernel::TriangularKernel >
 HMM< distribution::RegressionDistribution >
 HRectBound< metric::EuclideanDistance, ElemType >
 HRectBound< MetricType >
 HRectBound< mlpack::metric::LMetric, ElemType >
 IPMetric< mlpack::kernel::CosineDistance >
 IPMetric< mlpack::kernel::EpanechnikovKernel >
 IPMetric< mlpack::kernel::GaussianKernel >
 IPMetric< mlpack::kernel::HyperbolicTangentKernel >
 IPMetric< mlpack::kernel::LinearKernel >
 IPMetric< mlpack::kernel::PolynomialKernel >
 IPMetric< mlpack::kernel::TriangularKernel >
 IsVector< VecType > - If value == true, then VecType is some sort of Armadillo vector or subview
 IsVector< arma::Col< eT > >
 IsVector< arma::Row< eT > >
 IsVector< arma::SpCol< eT > >
 IsVector< arma::SpRow< eT > >
 IsVector< arma::SpSubview< eT > >
 IsVector< arma::subview_col< eT > >
 IsVector< arma::subview_row< eT > >
 L_BFGS< AugLagrangianFunction< LagrangianFunction > >
 L_BFGS< AugLagrangianFunction< mlpack::optimization::LRSDPFunction< mlpack::optimization::SDP< arma::sp_mat > > > >
 L_BFGS< AugLagrangianFunction< mlpack::optimization::LRSDPFunction< SDPType > > >
 LRSDP< mlpack::optimization::SDP< arma::sp_mat > >
 LRSDPFunction< mlpack::optimization::SDP< arma::sp_mat > >
 AdaBoost< WeakLearnerType, MatType > - The AdaBoost class
 AMF< TerminationPolicyType, InitializationRuleType, UpdateRuleType > - This class implements AMF (alternating matrix factorization) on the given matrix V
 AverageInitialization - This initialization rule initializes the matrices W and H to the root of the average of V, perturbed with uniform noise
 CompleteIncrementalTermination< TerminationPolicy > - This class acts as a wrapper for basic termination policies to be used by SVDCompleteIncrementalLearning
 GivenInitialization - This initialization rule for AMF simply fills the W and H matrices with the matrices given to the constructor of this object
 IncompleteIncrementalTermination< TerminationPolicy > - This class acts as a wrapper for basic termination policies to be used by SVDIncompleteIncrementalLearning
 MaxIterationTermination - This termination policy only terminates when the maximum number of iterations has been reached
 NMFALSUpdate - This class implements a method titled 'Alternating Least Squares' described in the following paper:
 NMFMultiplicativeDistanceUpdate - The multiplicative distance update rules for matrices W and H
 NMFMultiplicativeDivergenceUpdate - This follows a method described in the paper 'Algorithms for Non-negative Matrix Factorization'
 RandomAcolInitialization< columnsToAverage > - This class initializes the W matrix of the AMF algorithm by averaging p randomly chosen columns of V
 RandomInitialization - This initialization rule for AMF simply fills the W and H matrices with uniform random noise in [0, 1]
 SimpleResidueTermination - This class implements a simple residue-based termination policy
 SimpleToleranceTermination< MatType > - This class implements residue tolerance termination policy
 SVDBatchLearning - This class implements SVD batch learning with momentum
 SVDCompleteIncrementalLearning< MatType > - This class computes SVD using complete incremental batch learning, as described in the following paper:
 SVDCompleteIncrementalLearning< arma::sp_mat > - TODO: Merge this template specialized function for sparse matrix using common row_col_iterator
 SVDIncompleteIncrementalLearning - This class computes SVD using incomplete incremental batch learning, as described in the following paper:
 ValidationRMSETermination< MatType > - This class implements validation termination policy based on RMSE index
 RandomInitialization - This class is used to randomly initialize the weight matrix
 Backtrace - Provides a backtrace
 BallBound< MetricType, VecType > - Ball bound encloses a set of points at a specific distance (radius) from a specific point (center)
 BoundTraits< BoundType > - A class to obtain compile-time traits about BoundType classes
 BoundTraits< BallBound< MetricType, VecType > > - A specialization of BoundTraits for this bound type
 BoundTraits< CellBound< MetricType, ElemType > >
 BoundTraits< HollowBallBound< MetricType, ElemType > > - A specialization of BoundTraits for this bound type
 BoundTraits< HRectBound< MetricType, ElemType > >
 CellBound< MetricType, ElemType > - The CellBound class describes a bound that consists of a number of hyperrectangles
 HollowBallBound< TMetricType, ElemType > - Hollow ball bound encloses a set of points at a specific distance (radius) from a specific point (center) except points at a specific distance from another point (the center of the hole)
 HRectBound< MetricType, ElemType > - Hyper-rectangle bound for an L-metric
 IsLMetric< MetricType > - Utility struct where Value is true if and only if the argument is of type LMetric
 IsLMetric< metric::LMetric< Power, TakeRoot > > - Specialization for IsLMetric when the argument is of type LMetric
 CF - This class implements Collaborative Filtering (CF)
 DummyClass - This class acts as a dummy class for passing as template parameter
 FactorizerTraits< FactorizerType > - Template class for factorizer traits
 FactorizerTraits< mlpack::svd::RegularizedSVD<> > - Factorizer traits of Regularized SVD
 SVDWrapper< Factorizer > - This class acts as the wrapper for all SVD factorizers which are incompatible with CF module
 CLI - Parses the command line for parameters and holds user-specified parameters
 CustomImputation< T > - A simple custom imputation class
 DatasetMapper< PolicyType > - Auxiliary information for a dataset, including mappings to/from strings and the datatype of each dimension
 FirstArrayShim< T > - A first shim for arrays
 FirstNormalArrayShim< T > - A first shim for arrays without a Serialize() method
 FirstShim< T > - The first shim: simply holds the object and its name
 HasSerialize< T >
 HasSerialize< T >::check< U, V, W >
 HasSerializeFunction< T >
 Imputer< T, MapperType, StrategyType > - Given a dataset of a particular datatype, replace user-specified missing value with a variable dependent on the StrategyType and MapperType
 IncrementPolicy - IncrementPolicy is used as a helper class for DatasetMapper
 ListwiseDeletion< T > - A complete-case analysis to remove the values containing mappedValue
 LoadCSV - Loads the CSV file. This class uses boost::spirit to implement the parser; please refer to http://theboostcpplibraries.com/boost.spirit for a quick overview
 MeanImputation< T > - A simple mean imputation class
 MedianImputation< T > - This class implements simple median imputation
 MissingPolicy - MissingPolicy is used as a helper class for DatasetMapper
 SecondArrayShim< T > - A shim for objects in an array; this is basically like the SecondShim, but for arrays that hold objects that have Serialize() methods instead of serialize() methods
 SecondNormalArrayShim< T > - A shim for objects in an array which do not have a Serialize() function
 SecondShim< T > - The second shim: wrap the call to Serialize() inside of a serialize() function, so that an archive type can call serialize() on a SecondShim object and this gets forwarded correctly to our object's Serialize() function
 DBSCAN< RangeSearchType, PointSelectionPolicy > - DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is a clustering technique described in the following paper:
 RandomPointSelection - This class can be used to randomly select the next point to use for DBSCAN
 DecisionStump< MatType > - This class implements a decision stump
 DTree - A density estimation tree is similar to both a decision tree and a space partitioning tree (like a kd-tree)
 DiscreteDistribution - A discrete distribution where the only observations are discrete observations
 GammaDistribution - This class represents the Gamma distribution
 GaussianDistribution - A single multivariate Gaussian distribution
 LaplaceDistribution - The multivariate Laplace distribution centered at 0 has pdf
 RegressionDistribution - A class that represents a univariate conditionally Gaussian distribution
 DTBRules< MetricType, TreeType >
 DTBStat - A statistic for use with mlpack trees, which stores the upper bound on distance to nearest neighbors and the component which this node belongs to
 DualTreeBoruvka< MetricType, MatType, TreeType > - Performs the MST calculation using the Dual-Tree Boruvka algorithm, using any type of tree
 EdgePair - An edge pair is simply two indices and a distance
 UnionFind - A Union-Find data structure
 FastMKS< KernelType, MatType, TreeType > - An implementation of fast exact max-kernel search
 FastMKSModel - A utility struct to contain all the possible FastMKS models, for use by the mlpack_fastmks program
 FastMKSRules< KernelType, TreeType > - The FastMKSRules class is a template helper class used by FastMKS class when performing exact max-kernel search
 FastMKSStat - The statistic used in trees with FastMKS
 DiagonalConstraint - Force a covariance matrix to be diagonal
 EigenvalueRatioConstraint - Given a vector of eigenvalue ratios, ensure that the covariance matrix always has those eigenvalue ratios
 EMFit< InitialClusteringType, CovarianceConstraintPolicy > - This class contains methods which can fit a GMM to observations using the EM algorithm
 GMM - A Gaussian Mixture Model (GMM)
 NoConstraint - This class enforces no constraint on the covariance matrix
 PositiveDefiniteConstraint - Given a covariance matrix, force the matrix to be positive definite
 HMM< Distribution > - A class that represents a Hidden Markov Model with an arbitrary type of emission distribution
 CosineDistance - The cosine distance (or cosine similarity)
 EpanechnikovKernel - The Epanechnikov kernel, defined as
 ExampleKernel - An example kernel function
 GaussianKernel - The standard Gaussian kernel
 HyperbolicTangentKernel - Hyperbolic tangent kernel
 KernelTraits< KernelType > - This is a template class that can provide information about various kernels
 KernelTraits< CosineDistance > - Kernel traits for the cosine distance
 KernelTraits< EpanechnikovKernel > - Kernel traits for the Epanechnikov kernel
 KernelTraits< GaussianKernel > - Kernel traits for the Gaussian kernel
 KernelTraits< LaplacianKernel > - Kernel traits of the Laplacian kernel
 KernelTraits< SphericalKernel > - Kernel traits for the spherical kernel
 KernelTraits< TriangularKernel > - Kernel traits for the triangular kernel
 KMeansSelection< ClusteringType, maxIterations > - Implementation of the kmeans sampling scheme
 LaplacianKernel - The standard Laplacian kernel
 LinearKernel - The simple linear kernel (dot product)
 NystroemMethod< KernelType, PointSelectionPolicy >
 OrderedSelection
 PolynomialKernel - The simple polynomial kernel
 PSpectrumStringKernel - The p-spectrum string kernel
 RandomSelection
 SphericalKernel - The spherical kernel, which is 1 when the distance between the two argument points is less than or equal to the bandwidth, or 0 otherwise
 TriangularKernel - The trivially simple triangular kernel, defined by
 AllowEmptyClusters - Policy which allows K-Means to create empty clusters without any error being reported
 DualTreeKMeans< MetricType, MatType, TreeType > - An algorithm for an exact Lloyd iteration which simply uses dual-tree nearest-neighbor search to find the nearest centroid for each point in the dataset
 DualTreeKMeansRules< MetricType, TreeType >
 ElkanKMeans< MetricType, MatType >
 HamerlyKMeans< MetricType, MatType >
 KillEmptyClusters - Policy which allows K-Means to "kill" empty clusters without any error being reported
 KMeans< MetricType, InitialPartitionPolicy, EmptyClusterPolicy, LloydStepType, MatType > - This class implements K-Means clustering, using a variety of possible implementations of Lloyd's algorithm (a usage sketch appears after this list)
 MaxVarianceNewCluster - When an empty cluster is detected, this class takes the point furthest from the centroid of the cluster with maximum variance as a new cluster
 NaiveKMeans< MetricType, MatType > - This is an implementation of a single iteration of Lloyd's algorithm for k-means
 PellegMooreKMeans< MetricType, MatType > - An implementation of Pelleg-Moore's 'blacklist' algorithm for k-means clustering
 PellegMooreKMeansRules< MetricType, TreeType > - The rules class for the single-tree Pelleg-Moore kd-tree traversal for k-means clustering
 PellegMooreKMeansStatistic - A statistic for trees which holds the blacklist for Pelleg-Moore k-means clustering (which represents the clusters that cannot possibly own any points in a node)
 RandomPartition - A very simple partitioner which partitions the data randomly into the number of desired clusters
 RefinedStart - A refined approach for choosing initial points for k-means clustering
 SampleInitialization
 KernelPCA< KernelType, KernelRule > - This class performs kernel principal components analysis (Kernel PCA), for a given kernel
 NaiveKernelRule< KernelType >
 NystroemKernelRule< KernelType, PointSelectionPolicy >
 LocalCoordinateCoding - An implementation of Local Coordinate Coding (LCC) that codes data which approximately lives on a manifold using a variation of l1-norm regularized sparse coding; in LCC, the penalty on the absolute value of each point's coefficient for each atom is weighted by the squared distance of that point to that atom
 Log - Provides a convenient way to give formatted output
 ColumnsToBlocks - Transform the columns of the given matrix into a block format
 RangeType< T > - Simple real-valued range
 MatrixCompletion - This class implements the popular nuclear norm minimization heuristic for matrix completion problems
 MeanShift< UseKernel, KernelType, MatType > - This class implements mean shift clustering
 IPMetric< KernelType > - The inner product metric, IPMetric, takes a given Mercer kernel (KernelType), and when Evaluate() is called, returns the distance between the two points in kernel space:
 LMetric< TPower, TTakeRoot > - The L_p metric for arbitrary integer p, with an option to take the root
 MahalanobisDistance< TakeRoot > - The Mahalanobis distance, which is essentially a stretched Euclidean distance
 NaiveBayesClassifier< MatType > - The simple Naive Bayes classifier
 NCA< MetricType, OptimizerType > - An implementation of Neighborhood Components Analysis, both a linear dimensionality reduction technique and a distance learning technique
 SoftmaxErrorFunction< MetricType > - The "softmax" stochastic neighbor assignment probability function
 DrusillaSelect< MatType >
 FurthestNeighborSort - This class implements the necessary methods for the SortPolicy template parameter of the NeighborSearch class
 LSHSearch< SortPolicy > - The LSHSearch class; this class builds a hash on the reference set and uses this hash to compute the distance-approximate nearest-neighbors of the given queries
 NearestNeighborSort - This class implements the necessary methods for the SortPolicy template parameter of the NeighborSearch class
 NeighborSearch< SortPolicy, MetricType, MatType, TreeType, DualTreeTraversalType, SingleTreeTraversalType > - The NeighborSearch class is a template class for performing distance-based neighbor searches
 NeighborSearchRules< SortPolicy, MetricType, TreeType > - The NeighborSearchRules class is a template helper class used by NeighborSearch class when performing distance-based neighbor searches
 NeighborSearchRules< SortPolicy, MetricType, TreeType >::CandidateCmp - Compare two candidates based on the distance
 NeighborSearchStat< SortPolicy > - Extra data for each node in the tree
 NSModel< SortPolicy > - The NSModel class provides an easy way to serialize a model, abstracts away the different types of trees, and also reflects the NeighborSearch API
 NSModelName< SortPolicy >
 NSModelName< FurthestNeighborSort >
 NSModelName< NearestNeighborSort >
 QDAFN< MatType >
 RAModel< SortPolicy > - The RAModel class provides an abstraction for the RASearch class, abstracting away the TreeType parameter and allowing it to be specified at runtime in this class
 RAQueryStat< SortPolicy > - Extra data for each node in the tree
 RASearch< SortPolicy, MetricType, MatType, TreeType > - The RASearch class: This class provides a generic manner to perform rank-approximate search via random-sampling
 RASearchRules< SortPolicy, MetricType, TreeType > - The RASearchRules class is a template helper class used by RASearch class when performing rank-approximate search via random-sampling
 RAUtil
 SparseAutoencoder< OptimizerType > - A sparse autoencoder is a neural network whose aim is to learn compressed representations of the data, typically for dimensionality reduction, with a constraint on the activity of the neurons in the network
 SparseAutoencoderFunction - This is a class for the sparse autoencoder objective function
 AdaDelta< DecomposableFunctionType > - Adadelta is an optimizer that uses two ideas to improve upon the two main drawbacks of the Adagrad method:
 Adam< DecomposableFunctionType > - Adam is an optimizer that computes individual adaptive learning rates for different parameters from estimates of first and second moments of the gradients
 AugLagrangian< LagrangianFunction > - The AugLagrangian class implements the Augmented Lagrangian method of optimization
 AugLagrangianFunction< LagrangianFunction > - This is a utility class used by AugLagrangian, meant to wrap a LagrangianFunction into a function usable by a simple optimizer like L-BFGS
 AugLagrangianTestFunction - This function is taken from "Practical Mathematical Optimization" (Snyman), section 5.3.8 ("Application of the Augmented Lagrangian Method")
 ExponentialSchedule - The exponential cooling schedule cools the temperature T at every step according to the equation
 GockenbachFunction - This function is taken from M
 GradientDescent< FunctionType > - Gradient Descent is a technique to minimize a function
 L_BFGS< FunctionType > - The generic L-BFGS optimizer, which uses a back-tracking line search algorithm to minimize a function
 LovaszThetaSDP - This function is the Lovasz-Theta semidefinite program, as implemented in the following paper:
 LRSDP< SDPType > - LRSDP is the implementation of Monteiro and Burer's formulation of low-rank semidefinite programs (LR-SDP)
 LRSDPFunction< SDPType > - The objective function that LRSDP is trying to optimize
 MiniBatchSGD< DecomposableFunctionType > - Mini-batch Stochastic Gradient Descent is a technique for minimizing a function which can be expressed as a sum of other functions
 PrimalDualSolver< SDPType > - Interface to a primal dual interior point solver
 RMSprop< DecomposableFunctionType > - RMSprop is an optimizer that utilizes the magnitude of recent gradients to normalize the gradients
 SA< FunctionType, CoolingScheduleType > - Simulated Annealing is a stochastic optimization algorithm which is able to deliver near-optimal results quickly without knowing the gradient of the function being optimized
 SDP< ObjectiveMatrixType > - Specify an SDP in primal form
 SGD< DecomposableFunctionType > - Stochastic Gradient Descent is a technique for minimizing a function which can be expressed as a sum of other functions
 GDTestFunction - Very, very simple test function which is the composite of three other functions
 GeneralizedRosenbrockFunction - The Generalized Rosenbrock function in n dimensions, defined by f(x) = sum_{i=1}^{n-1} f_i(x), where f_i(x) = 100 * (x_i^2 - x_{i+1})^2 + (1 - x_i)^2, with x_0 = [-1.2, 1, -1.2, 1, ...]
 RosenbrockFunction - The Rosenbrock function, defined by f(x) = f1(x) + f2(x), where f1(x) = 100 (x2 - x1^2)^2 and f2(x) = (1 - x1)^2, with x_0 = [-1.2, 1]
 RosenbrockWoodFunction - The Generalized Rosenbrock function in 4 dimensions with the Wood Function in four dimensions
 SGDTestFunction - Very, very simple test function which is the composite of three other functions
 WoodFunction - The Wood function, defined by f(x) = f1(x) + f2(x) + f3(x) + f4(x) + f5(x) + f6(x), where f1(x) = 100 (x2 - x1^2)^2, f2(x) = (1 - x1)^2, f3(x) = 90 (x4 - x3^2)^2, f4(x) = (1 - x3)^2, f5(x) = 10 (x2 + x4 - 2)^2, and f6(x) = (1 / 10) (x2 - x4)^2, with x_0 = [-3, -1, -3, -1]
 ParamData - Aids in the extensibility of CLI by focusing potential changes into one structure
 ExactSVDPolicy - Implementation of the exact SVD policy
 PCAType< DecompositionPolicy > - This class implements principal components analysis (PCA)
 QUICSVDPolicy - Implementation of the QUIC-SVD policy
 RandomizedSVDPolicy - Implementation of the randomized SVD policy
 Perceptron< LearnPolicy, WeightInitializationPolicy, MatType > - This class implements a simple perceptron (i.e., a single layer neural network)
 RandomInitialization - This class is used to initialize weights for the weightVectors matrix in a random manner
 SimpleWeightUpdate
 ZeroInitialization - This class is used to initialize the matrix weightVectors to zero
 Radical - An implementation of RADICAL, an algorithm for independent component analysis (ICA)
 RangeSearch< MetricType, MatType, TreeType > - The RangeSearch class is a template class for performing range searches
 RangeSearchRules< MetricType, TreeType > - The RangeSearchRules class is a template helper class used by RangeSearch class when performing range searches
 RangeSearchStat - Statistic class for RangeSearch, to be set to the StatisticType of the tree type that range search is being performed with
 RSModel
 RSModelName
 LARS - An implementation of LARS, a stage-wise homotopy-based algorithm for l1-regularized linear regression (LASSO) and l1+l2 regularized linear regression (Elastic Net)
 LinearRegression - A simple linear regression algorithm using ordinary least squares
 LogisticRegression< MatType > - The LogisticRegression class implements an L2-regularized logistic regression model, and supports training with multiple optimizers and classification
 LogisticRegressionFunction< MatType > - The log-likelihood function for the logistic regression objective function
 SoftmaxRegression< OptimizerType > - Softmax Regression is a classifier which can be used for classification when the data available can take two or more class values
 SoftmaxRegressionFunction
 DataDependentRandomInitializer - A data-dependent random dictionary initializer for SparseCoding
 NothingInitializer - A DictionaryInitializer for SparseCoding which does not initialize anything; it is useful for when the dictionary is already known and will be set with SparseCoding::Dictionary()
 RandomInitializer - A DictionaryInitializer for use with the SparseCoding class
 SparseCoding - An implementation of Sparse Coding with Dictionary Learning that achieves sparsity via an l1-norm regularizer on the codes (LASSO) or an (l1+l2)-norm regularizer on the codes (the Elastic Net)
 QUIC_SVD - QUIC-SVD is a matrix factorization technique, which operates in a subspace such that A's approximation in that subspace has minimum error (A being the data matrix)
 RandomizedSVD - Randomized SVD is a matrix factorization that is based on randomized matrix approximation techniques, developed in "Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions"
 RegularizedSVD< OptimizerType > - Regularized SVD is a matrix factorization technique that seeks to reduce the error on the training set, that is on the examples for which the ratings have been provided by the users
 RegularizedSVDFunction
 Timer - The timer class provides a way for mlpack methods to be timed
 Timers
 AllCategoricalSplit< FitnessFunction > - The AllCategoricalSplit is a splitting function that will split categorical features into many children: one child for each category
 AllCategoricalSplit< FitnessFunction >::AuxiliarySplitInfo< ElemType >
 AxisParallelProjVector - AxisParallelProjVector defines an axis-parallel projection vector
 BestBinaryNumericSplit< FitnessFunction > - The BestBinaryNumericSplit is a splitting function for decision trees that will exhaustively search a numeric dimension for the best binary split
 BestBinaryNumericSplit< FitnessFunction >::AuxiliarySplitInfo< ElemType >
 BinaryNumericSplit< FitnessFunction, ObservationType > - The BinaryNumericSplit class implements the numeric feature splitting strategy devised by Gama, Rocha, and Medas in the following paper:
 BinaryNumericSplitInfo< ObservationType >
 BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, SplitType > - A binary space partitioning tree, such as a KD-tree or a ball tree
 BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, SplitType >::BreadthFirstDualTreeTraverser< RuleType >
 BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, SplitType >::DualTreeTraverser< RuleType > - A dual-tree traverser for binary space trees; see dual_tree_traverser.hpp
 BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, SplitType >::SingleTreeTraverser< RuleType > - A single-tree traverser for binary space trees; see single_tree_traverser.hpp for implementation
 CategoricalSplitInfo
 CompareCosineNode
 CosineTree
 CoverTree< MetricType, StatisticType, MatType, RootPointPolicy > - A cover tree is a tree specifically designed to speed up nearest-neighbor computation in high-dimensional spaces
 CoverTree< MetricType, StatisticType, MatType, RootPointPolicy >::DualTreeTraverser< RuleType > - A dual-tree cover tree traverser; see dual_tree_traverser.hpp
 CoverTree< MetricType, StatisticType, MatType, RootPointPolicy >::SingleTreeTraverser< RuleType > - A single-tree cover tree traverser; see single_tree_traverser.hpp for implementation
 DiscreteHilbertValue< TreeElemType > - The DiscreteHilbertValue class stores Hilbert values for all of the points in a RectangleTree node, and calculates Hilbert values for new points
 EmptyStatistic - Empty statistic if you are not interested in storing statistics in your tree
 ExampleTree< MetricType, StatisticType, MatType > - This is not an actual space tree but instead an example tree that exists to show and document all the functions that mlpack trees must implement
 FirstPointIsRoot - This class is meant to be used as a choice for the policy class RootPointPolicy of the CoverTree class
 GiniGain - The Gini gain, a measure of set purity usable as a fitness function (FitnessFunction) for decision trees
 GiniImpurity
 GreedySingleTreeTraverser< TreeType, RuleType >
 HilbertRTreeAuxiliaryInformation< TreeType, HilbertValueType >
 HilbertRTreeDescentHeuristic - This class chooses the best child of a node in a Hilbert R tree when inserting a new point
 HilbertRTreeSplit< splitOrder > - The splitting procedure for the Hilbert R tree
 HoeffdingCategoricalSplit< FitnessFunction > - This is the standard Hoeffding-bound categorical feature proposed in the paper below:
 HoeffdingNumericSplit< FitnessFunction, ObservationType > - The HoeffdingNumericSplit class implements the numeric feature splitting strategy alluded to by Domingos and Hulten in the following paper:
 HoeffdingTree< FitnessFunction, NumericSplitType, CategoricalSplitType > - The HoeffdingTree object represents all of the necessary information for a Hoeffding-bound-based decision tree
 HyperplaneBase< BoundT, ProjVectorT > - HyperplaneBase defines a splitting hyperplane based on a projection vector and projection value
 InformationGain - The standard information gain criterion, used for calculating gain in decision trees
 IsSpillTree< TreeType >
 IsSpillTree< tree::SpillTree< MetricType, StatisticType, MatType, HyperplaneType, SplitType > >
 MeanSpaceSplit< MetricType, MatType >
 MeanSplit< BoundType, MatType > - A binary space partitioning tree node is split into its left and right child
 MeanSplit< BoundType, MatType >::SplitInfo - Information about the partition
 MidpointSpaceSplit< MetricType, MatType >
 MidpointSplit< BoundType, MatType > - A binary space partitioning tree node is split into its left and right child
 MidpointSplit< BoundType, MatType >::SplitInfo - A struct that contains information about the split
 MinimalCoverageSweep< SplitPolicy > - The MinimalCoverageSweep class finds a partition along which we can split a node according to the coverage of two resulting nodes
 MinimalCoverageSweep< SplitPolicy >::SweepCost< TreeType > - A struct that provides the type of the sweep cost
 MinimalSplitsNumberSweep< SplitPolicy > - The MinimalSplitsNumberSweep class finds a partition along which we can split a node according to the number of required splits of the node
 MinimalSplitsNumberSweep< SplitPolicy >::SweepCost< typename > - A struct that provides the type of the sweep cost
 NoAuxiliaryInformation< TreeType >
 NumericSplitInfo< ObservationType >
 Octree< MetricType, StatisticType, MatType >
 Octree< MetricType, StatisticType, MatType >::DualTreeTraverser< MetricType, StatisticType, MatType > - A dual-tree traverser; see dual_tree_traverser.hpp
 Octree< MetricType, StatisticType, MatType >::SingleTreeTraverser< RuleType > - A single-tree traverser; see single_tree_traverser.hpp
 ProjVector - ProjVector defines a general projection vector (not necessarily axis-parallel)
 QueueFrame< TreeType, TraversalInfoType >
 RectangleTree< MetricType, StatisticType, MatType, SplitType, DescentType, AuxiliaryInformationType > - A rectangle-type tree, such as an R-tree or X-tree
 RectangleTree< MetricType, StatisticType, MatType, SplitType, DescentType, AuxiliaryInformationType >::DualTreeTraverser< MetricType, StatisticType, MatType, SplitType, DescentType, AuxiliaryInformationType > - A dual tree traverser for rectangle type trees
 RectangleTree< MetricType, StatisticType, MatType, SplitType, DescentType, AuxiliaryInformationType >::SingleTreeTraverser< RuleType > - A single traverser for rectangle type trees
 RPlusPlusTreeAuxiliaryInformation< TreeType >
 RPlusPlusTreeDescentHeuristic
 RPlusPlusTreeSplitPolicy - The RPlusPlusTreeSplitPolicy helps to determine the subtree into which we should insert a child of an intermediate node that is being split
 RPlusTreeDescentHeuristic
 RPlusTreeSplit< SplitPolicyType, SweepType > - The RPlusTreeSplit class performs the split process of a node on overflow
 RPlusTreeSplitPolicy - The RPlusTreeSplitPolicy helps to determine the subtree into which we should insert a child of an intermediate node that is being split
 RPTreeMaxSplit< BoundType, MatType > - This class splits a node by a random hyperplane
 RPTreeMaxSplit< BoundType, MatType >::SplitInfo - Information about the partition
 RPTreeMeanSplit< BoundType, MatType > - This class splits a binary space tree
 RPTreeMeanSplit< BoundType, MatType >::SplitInfo - Information about the partition
 RStarTreeDescentHeuristic - When descending a RectangleTree to insert a point, we need to have a way to choose a child node when the point isn't enclosed by any of them
 RStarTreeSplit - A Rectangle Tree has new points inserted at the bottom
 RTreeDescentHeuristic - When descending a RectangleTree to insert a point, we need to have a way to choose a child node when the point isn't enclosed by any of them
 RTreeSplit - A Rectangle Tree has new points inserted at the bottom
 SpaceSplit< MetricType, MatType >
 SpillTree< MetricType, StatisticType, MatType, HyperplaneType, SplitType > - A hybrid spill tree is a variant of binary space trees in which the children of a node can "spill over" each other, and contain shared datapoints
 SpillTree< MetricType, StatisticType, MatType, HyperplaneType, SplitType >::SpillDualTreeTraverser< MetricType, StatisticType, MatType, HyperplaneType, SplitType > - A generic dual-tree traverser for hybrid spill trees; see spill_dual_tree_traverser.hpp for implementation
 SpillTree< MetricType, StatisticType, MatType, HyperplaneType, SplitType >::SpillSingleTreeTraverser< MetricType, StatisticType, MatType, HyperplaneType, SplitType > - A generic single-tree traverser for hybrid spill trees; see spill_single_tree_traverser.hpp for implementation
 TraversalInfo< TreeType > - The TraversalInfo class holds traversal information which is used in dual-tree (and single-tree) traversals
 TreeTraits< TreeType > - The TreeTraits class provides compile-time information on the characteristics of a given tree type
 TreeTraits< BinarySpaceTree< MetricType, StatisticType, MatType, bound::BallBound, SplitType > > - This is a specialization of the TreeTraits class to the BallTree tree type
 TreeTraits< BinarySpaceTree< MetricType, StatisticType, MatType, bound::CellBound, SplitType > > - This is a specialization of the TreeTraits class to the UBTree tree type
 TreeTraits< BinarySpaceTree< MetricType, StatisticType, MatType, bound::HollowBallBound, SplitType > > - This is a specialization of the TreeTraits class to an arbitrary tree with HollowBallBound (currently only the vantage point tree is supported)
 TreeTraits< BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, RPTreeMaxSplit > > - This is a specialization of the TreeTraits class to the max-split random projection tree
 TreeTraits< BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, RPTreeMeanSplit > > - This is a specialization of the TreeTraits class to the mean-split random projection tree
 TreeTraits< BinarySpaceTree< MetricType, StatisticType, MatType, BoundType, SplitType > > - This is a specialization of the TreeTraits class to the BinarySpaceTree tree type
 TreeTraits< CoverTree< MetricType, StatisticType, MatType, RootPointPolicy > > - The specialization of the TreeTraits class for the CoverTree tree type
 TreeTraits< Octree< MetricType, StatisticType, MatType > > - This is a specialization of the TreeTraits class to the Octree tree type
 TreeTraits< RectangleTree< MetricType, StatisticType, MatType, RPlusTreeSplit< SplitPolicyType, SweepType >, DescentType, AuxiliaryInformationType > > - Since the R+/R++ tree cannot have overlapping children, we should define traits for the R+/R++ tree
 TreeTraits< RectangleTree< MetricType, StatisticType, MatType, SplitType, DescentType, AuxiliaryInformationType > > - This is a specialization of the TreeTraits class to the RectangleTree tree type
 TreeTraits< SpillTree< MetricType, StatisticType, MatType, HyperplaneType, SplitType > > - This is a specialization of the TreeTraits class to the SpillTree tree type
 UBTreeSplit< BoundType, MatType > - Split a node into two parts according to the median address of points contained in the node
 VantagePointSplit< BoundType, MatType, MaxNumSamples > - The class splits a binary space partitioning tree node according to the median distance to the vantage point
 VantagePointSplit< BoundType, MatType, MaxNumSamples >::SplitInfo - A struct that contains information about the split
 XTreeAuxiliaryInformation< TreeType > - The XTreeAuxiliaryInformation class provides information specific to X trees for each node in a RectangleTree
 XTreeAuxiliaryInformation< TreeType >::SplitHistoryStruct - The X tree requires that the tree records its "split history"
 XTreeSplit - A Rectangle Tree has new points inserted at the bottom
 CLIDeleter - Extremely simple class whose only job is to delete the existing CLI object at the end of execution
 NullOutStream - Used for Log::Debug when not compiled with debugging symbols
 Option< N > - A static object whose constructor registers a parameter with the CLI class
 PrefixedOutStream - Allows us to output to an ostream with a prefix at the beginning of each line, in the same way we would output to cout or cerr
 ProgramDoc - A static object whose constructor registers program documentation with the CLI class
 NeighborSearchStat< neighbor::NearestNeighborSort >
 template AuxiliarySplitInfo< ElemType >
 RangeType< double >
 RangeType< ElemType >
 RASearch< mlpack::tree::BinarySpaceTree >
 RASearch< mlpack::tree::CoverTree >
 RASearch< mlpack::tree::Octree >
 RASearch< mlpack::tree::RectangleTree >
 SDP< arma::sp_mat >
 T
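
For orientation, below is a minimal sketch of how two of the class templates listed above, KMeans<> and NeighborSearch< NearestNeighborSort >, are typically instantiated with their default template parameters. It assumes the mlpack 2.2.x headers and an Armadillo data matrix whose columns are points; it is only an illustrative example, and the concrete data dimensions and cluster/neighbor counts are arbitrary.

    #include <mlpack/core.hpp>
    #include <mlpack/methods/kmeans/kmeans.hpp>
    #include <mlpack/methods/neighbor_search/neighbor_search.hpp>

    int main()
    {
      // Data matrix: each column is a point (3 dimensions, 100 points).
      arma::mat data(3, 100, arma::fill::randu);

      // KMeans<> with its default policies (several of which appear in the
      // list above, e.g. SampleInitialization, MaxVarianceNewCluster,
      // NaiveKMeans).
      mlpack::kmeans::KMeans<> kmeans;
      arma::Row<size_t> assignments;
      arma::mat centroids;
      kmeans.Cluster(data, 5, assignments, centroids);

      // NeighborSearch with the NearestNeighborSort policy performs
      // k-nearest-neighbor search on the reference set; the remaining
      // template parameters are left at their defaults.
      mlpack::neighbor::NeighborSearch<mlpack::neighbor::NearestNeighborSort> knn(data);
      arma::Mat<size_t> neighbors;
      arma::mat distances;
      knn.Search(3, neighbors, distances);

      return 0;
    }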