Cxgboost::common::AFTLoss< Distribution > | The AFT loss function |
Cxgboost::BatchIterator< T > | |
Cxgboost::BatchIteratorImpl< T > | |
Cxgboost::BatchParam | Parameters for constructing batches |
Cxgboost::BatchSet< T > | |
Cxgboost::BitFieldContainer< VT, Direction, IsConst > | A non-owning type with auxiliary methods defined for manipulating bits |
►Cxgboost::BitFieldContainer< VT, LBitsPolicy< VT, false >, false > | |
Cxgboost::LBitsPolicy< VT, IsConst > | |
►Cxgboost::BitFieldContainer< VT, RBitsPolicy< VT > > | |
Cxgboost::RBitsPolicy< VT > | |
Cxgboost::common::BlockedSpace2d | |
Cxgboost::common::PartitionBuilder< BlockSize >::BlockInfo | |
►Cxgboost::common::Column< BinIdxType > | Column storage, to be used with ApplySplit. Note that each bin id is stored as index[i] + index_base. Using a different column-index type for each column makes it possible to reduce memory usage (see the sketch after this table) |
Cxgboost::common::DenseColumn< BinIdxType > | |
Cxgboost::common::SparseColumn< BinIdxType > | |
Cxgboost::common::ColumnMatrix | Collection of columns, with support for construction from GHistIndexMatrix |
Cxgboost::common::ColumnSampler | Handles selection of columns according to the colsample_bytree, colsample_bylevel and colsample_bynode parameters. Should be initialised before tree construction and reset when tree construction is completed |
Cxgboost::common::CompressedBufferWriter | Writes bit-compressed symbols to a memory buffer. Use CompressedIterator to read symbols back from the buffer. Currently limited to a maximum symbol size of 28 bits |
Cxgboost::common::CompressedIterator< T > | Reads symbols from a bit-compressed memory buffer. Usable on device and host |
Cxgboost::common::ConfigParser | Implementation of config reader |
►Cxgboost::Configurable | |
Cxgboost::GradientBooster | Interface of gradient boosting model |
Cxgboost::Learner | Learner class that does training and prediction. This is the user-facing module of XGBoost training. The Load/Save functions correspond to the model used in Python/R |
Cxgboost::LinearUpdater | Interface of linear updater |
Cxgboost::Metric | Interface of evaluation metric used to evaluate model performance. This has nothing to do with training; it serves only for evaluation |
Cxgboost::ObjFunction | Interface of objective function |
Cxgboost::TreeUpdater | Interface of tree update module that performs updates of a tree |
Cxgboost::DMatrix | Internal data structure used by XGBoost during training |
Cxgboost::common::RowSetCollection::Elem | Data structure to store an instance set, a subset of rows (instances) associated with a particular node in a decision tree |
Cxgboost::EllpackPage | A page stored in ELLPACK format |
Cxgboost::Entry | Element from a sparse vector |
Cxgboost::common::WQSummary< DType, RType >::Entry | Entry in the sketch summary |
Cxgboost::common::ExtremeDistribution | |
►Cfalse_type | |
►Cxgboost::common::detail::IsSpanOracle< std::remove_cv< T >::type > | |
Cxgboost::common::detail::IsSpan< T > | |
Cxgboost::common::detail::IsSpanOracle< T > | |
Cxgboost::FeatureMap | Feature map data structure to help text model dump. TODO(tqchen) consider making it even more lightweight |
Cxgboost::from_chars_result | |
►CFunctionRegEntryBase | |
Cxgboost::GradientBoosterReg | Registry entry for gradient booster |
Cxgboost::LinearUpdaterReg | Registry entry for linear updater |
Cxgboost::MetricReg | Registry entry for Metric factory functions. The additional parameter const char* param gives the value after '@' and can be null; for example, for the metric map@3, param == "3" (see the parsing sketch after this table) |
Cxgboost::ObjFunctionReg | Registry entry for objective factory functions |
Cxgboost::PredictorReg | Registry entry for predictor |
Cxgboost::TreeUpdaterReg | Registry entry for tree updater |
Cxgboost::RegTree::FVec | Dense feature vector that can be taken by RegTree and can be constructed from a sparse feature vector |
Cxgboost::common::GHistBuilder< GradientSumT > | Builder for histograms of gradient statistics |
Cxgboost::common::GHistIndexBlock | |
Cxgboost::common::GHistIndexBlockMatrix | |
Cxgboost::common::GHistIndexMatrix | Preprocessed global index matrix, in CSR format |
Cxgboost::detail::GradientPairInternal< T > | Implementation of gradient statistics pair. Template specialisation may be used to overload different gradient types, e.g. low precision, high precision, integer, floating point |
Cxgboost::common::detail::Greater< T > | |
Cdmlc::serializer::Handler< xgboost::Entry > | |
►Cxgboost::IntrusivePtr< T >::Hash | |
Cstd::hash< xgboost::IntrusivePtr< T > > | |
Cxgboost::common::HistCollection< GradientSumT > | Histogram of gradient statistics for multiple nodes |
Cxgboost::common::HistogramCuts | |
Cxgboost::HostDeviceVector< T > | |
Cxgboost::HostDeviceVector< bst_float > | |
Cxgboost::HostDeviceVector< bst_row_t > | |
Cxgboost::HostDeviceVector< FeatureType > | |
Cxgboost::HostDeviceVector< float > | |
Cxgboost::HostDeviceVector< uint32_t > | |
Cxgboost::HostDeviceVector< xgboost::Entry > | |
Cxgboost::HostDeviceVectorImpl< T > | |
Cxgboost::HostDeviceVectorImpl< bst_float > | |
Cxgboost::HostDeviceVectorImpl< bst_row_t > | |
Cxgboost::HostDeviceVectorImpl< FeatureType > | |
Cxgboost::HostDeviceVectorImpl< float > | |
Cxgboost::HostDeviceVectorImpl< uint32_t > | |
Cxgboost::HostDeviceVectorImpl< xgboost::Entry > | |
Cxgboost::common::HostSketchContainer | |
Cxgboost::HostSparsePageView | |
Cxgboost::common::Index | |
►Cintegral_constant | |
Cxgboost::common::detail::ExtentAsBytesValue< T, Extent > | |
Cxgboost::common::detail::ExtentValue< Extent, Offset, Count > | |
Cxgboost::common::detail::IsAllowedElementTypeConversion< From, To > | |
Cxgboost::common::detail::IsAllowedExtentConversion< From, To > | |
Cxgboost::IntrusivePtr< T > | Implementation of an intrusive pointer: a smart pointer that points to an object with an embedded reference counter. The underlying object must implement a friend function IntrusivePtrRefCount() that returns the ref counter (of type IntrusivePtrCell). The intrusive pointer is faster than std::shared_ptr<>: std::shared_ptr<> makes an extra memory allocation for the ref counter, whereas the intrusive pointer does not (see the client-class sketch after this table) |
Cxgboost::IntrusivePtr< xgboost::Value > | |
Cxgboost::IntrusivePtrCell | Helper class for embedding reference counting into client objects. See https://www.boost.org/doc/libs/1_74_0/doc/html/atomic/usage_examples.html for discussions of memory order |
Cxgboost::common::Range::Iterator | |
Cxgboost::Json | Data structure representing JSON format |
Cxgboost::JsonReader | |
Cxgboost::JsonWriter | |
Cxgboost::LearnerModelParam | |
Cxgboost::common::detail::Less< T > | |
Cxgboost::common::LogisticDistribution | |
Cxgboost::MetaInfo | Meta information about the dataset; always sits in memory |
►Cxgboost::Model | |
Cxgboost::GradientBooster | Interface of gradient boosting model |
Cxgboost::Learner | Learner class that does training and prediction. This is the user-facing module of XGBoost training. The Load/Save functions correspond to the model used in Python/R |
Cxgboost::RegTree | Defines the regression tree, the most common tree model. This is the data structure used in XGBoost's major tree models |
Cxgboost::common::Monitor | Timing utility used to measure total method execution time over the lifetime of the containing object |
Cxgboost::RegTree::Node | Tree node |
Cxgboost::common::NormalDistribution | |
Cxgboost::NumericLimits< T > | |
Cxgboost::NumericLimits< float > | |
Cxgboost::NumericLimits< int64_t > | |
Cxgboost::common::ParallelGHistBuilder< GradientSumT > | Stores temporary histograms so they can be computed in parallel. Supports processing multiple tree nodes for nested parallelism, and can reduce histograms across threads efficiently |
Cxgboost::common::ParallelGroupBuilder< ValueType, SizeType, is_row_major > | Multi-thread version of group builder |
►CParameter | |
Cxgboost::TreeParam | Meta parameters of the tree |
Cxgboost::XGBoostParameter< Type > | |
►Cxgboost::XGBoostParameter< AFTParam > | |
Cxgboost::common::AFTParam | Parameter structure for AFT loss and metric |
►Cxgboost::XGBoostParameter< GenericParameter > | |
Cxgboost::GenericParameter | |
►Cxgboost::XGBoostParameter< GlobalConfiguration > | |
Cxgboost::GlobalConfiguration | |
Cxgboost::common::PartitionBuilder< BlockSize > | |
Cxgboost::BitFieldContainer< VT, Direction, IsConst >::Pos | |
Cxgboost::PredictionCacheEntry | Contains pointer to input matrix and associated cached predictions |
Cxgboost::PredictionContainer | |
Cxgboost::Predictor | Performs prediction on individual training instances or batches of instances for GBTree. Prediction functions all take a GBTreeModel and a DMatrix as input and output a vector of predictions. The predictor does not modify any state of the model itself |
Cxgboost::common::WQSummary< DType, RType >::Queue::QEntry | |
Cxgboost::common::QuantileSketchTemplate< DType, RType, TSummary > | Template for all quantile sketch algorithms that use the merge/prune scheme |
►Cxgboost::common::QuantileSketchTemplate< DType, unsigned, WQSummary< DType, unsigned > > | |
Cxgboost::common::WQuantileSketch< DType, RType > | Quantile sketch using WQSummary |
►Cxgboost::common::QuantileSketchTemplate< DType, unsigned, WXQSummary< DType, unsigned > > | |
Cxgboost::common::WXQuantileSketch< DType, RType > | Quantile sketch using WXQSummary |
Cxgboost::common::WQSummary< DType, RType >::Queue | Input data queue before entering the summary |
Cxgboost::common::Range | |
Cxgboost::common::Range1d | |
Cxgboost::common::RowSetCollection | Collection of row sets |
Cxgboost::RTreeNodeStat | Node statistics used in regression tree |
Cxgboost::RegTree::Segment | |
►CSerializable | |
Cxgboost::Learner | Learner class that does training and prediction. This is the user-facing module of XGBoost training. The Load/Save functions correspond to the model used in Python/R |
Cxgboost::JsonReader::SourceLocation | |
Cxgboost::common::Span< T, Extent > | Span class implementation, based on ISO C++20 span<T>. The interface should be the same |
Cxgboost::common::Span< bst_row_t const > | |
Cxgboost::common::Span< const BinIdxType > | |
Cxgboost::common::Span< const size_t > | |
Cxgboost::common::Span< value_type > | |
Cxgboost::common::Span< xgboost::Entry const > | |
Cxgboost::common::detail::SpanIterator< SpanType, IsConst > | |
►Cxgboost::SparsePage | In-memory storage unit of sparse batch, stored in CSR format |
Cxgboost::CSCPage | |
Cxgboost::SortedCSCPage | |
Cxgboost::common::RowSetCollection::Split | |
►CStream | |
Cxgboost::common::Base64InStream | Stream that reads base64-encoded data; note that it operates on file pointers |
Cxgboost::common::Base64OutStream | Stream that writes base64-encoded data; note that it operates on file pointers |
►Cxgboost::common::PeekableInStream | Input stream that supports an additional PeekRead operation besides Read |
Cxgboost::common::FixedSizeStream | A simple class used to consume dmlc::Stream all at once |
Cxgboost::common::StreamBufferReader | Buffered reader of a stream |
Cxgboost::StringView | |
►CSummary | |
Cxgboost::common::QuantileSketchTemplate< DType, RType, TSummary >::SummaryContainer | Same as Summary, but uses STL containers to back the storage |
Cxgboost::common::Timer | |
Cxgboost::to_chars_result | |
Cxgboost::TrainingObserver | |
Cxgboost::common::Transform< CompiledWithCuda > | Performs transformations on HostDeviceVectors |
►Ctrue_type | |
Cxgboost::common::detail::IsSpanOracle< Span< T, Extent > > | |
►Cxgboost::Value | |
Cxgboost::JsonArray | |
Cxgboost::JsonBoolean | Describes both true and false |
Cxgboost::JsonInteger | |
Cxgboost::JsonNull | |
Cxgboost::JsonNumber | |
Cxgboost::JsonObject | |
Cxgboost::JsonString | |
Cxgboost::Version | |
►Cxgboost::common::WQSummary< DType, RType > | Experimental wsummary |
Cxgboost::common::WXQSummary< DType, RType > | Try to do efficient pruning |
Cxgboost::XGBAPIThreadLocalEntry | Entry to easily hold returned information |
CXGBoostBatchCSR | Mini batch used in XGBoost Data Iteration |
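
As noted in the xgboost::common::Column entry above, each bin id is stored as index[i] + index_base, with a narrow per-column index type to save memory. The following is an illustrative sketch of that storage scheme only; ToyDenseColumn and its members are hypothetical and not the actual xgboost::common::DenseColumn API.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical stand-in for a dense column: per-row local bin ids are kept in
// a narrow integer type, and the column's first global bin id is stored once
// in index_base, which is where the memory saving described above comes from.
struct ToyDenseColumn {
  std::vector<uint8_t> index;  // local bin id of each row (narrow type saves memory)
  uint32_t index_base;         // first global bin id owned by this column

  // Global bin id of row i, as described: index[i] + index_base.
  uint32_t GlobalBinIdx(std::size_t i) const {
    return index_base + static_cast<uint32_t>(index[i]);
  }
};
```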
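The MetricReg entry states that the value after '@' in a metric name such as map@3 is passed to the factory as const char* param (null when absent). The snippet below only illustrates that naming convention using the standard library; it is not XGBoost's actual registry code.

```cpp
#include <iostream>
#include <string>

// Split a metric name such as "map@3" into the registry key ("map") and the
// parameter string ("3"). A name without '@' leaves the parameter empty,
// mirroring the "can be null" case described for MetricReg.
int main() {
  const std::string name = "map@3";
  const auto at = name.find('@');
  const std::string key = (at == std::string::npos) ? name : name.substr(0, at);
  const std::string param = (at == std::string::npos) ? "" : name.substr(at + 1);
  std::cout << key << " -> param = \"" << param << "\"\n";  // map -> param = "3"
}
```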
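The xgboost::IntrusivePtr entry describes the contract for client types: expose a friend IntrusivePtrRefCount() that returns the embedded IntrusivePtrCell. Below is a minimal sketch of such a client class under that contract; the Foo type and the include path are assumptions made for illustration, not code taken from the library.

```cpp
#include <xgboost/intrusive_ptr.h>  // assumed public header for IntrusivePtr/IntrusivePtrCell

// Example client type (hypothetical). It embeds its own reference counter and
// exposes it through the friend function IntrusivePtrRefCount(), which is the
// hook xgboost::IntrusivePtr requires per the entry above.
class Foo {
 public:
  explicit Foo(int v) : value{v} {}
  int value;

 private:
  mutable xgboost::IntrusivePtrCell ref_;
  friend xgboost::IntrusivePtrCell &IntrusivePtrRefCount(Foo const *t) noexcept {
    return t->ref_;
  }
};

int main() {
  // No separate control-block allocation: the counter lives inside Foo itself.
  xgboost::IntrusivePtr<Foo> p{new Foo{42}};
  return p->value == 42 ? 0 : 1;
}
```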