This can be very fast because the cost of sorting and merging is amortized over several insertions. If we keep N centroids in total and the incoming buffer holds k samples, then the amortized cost per insertion is roughly

N/k + log k

These costs even out when N/k = log k. Balancing costs like this is often a good place to start when optimizing an algorithm. For different values of the compression factor, the following table shows the estimated asymptotic value of N and the suggested value of k:
Compression | N   | k
----------- | --- | ---
50          | 78  | 25
100         | 157 | 42
200         | 314 | 73
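To see how these numbers fit together, here is a small illustrative sketch (not the library's actual sizing code) that recovers the suggested k for a given compression by growing the buffer until the merge cost N/k drops to the sort cost log k. The estimate N ≈ π · compression / 2 is an assumption read off the table above:

```java
// Illustrative sketch only: the estimate N ~ PI * compression / 2 and the
// balance condition N/k = log k are assumptions taken from the text and table.
static int suggestedBufferSize(double compression) {
    double n = Math.PI * compression / 2;  // estimated number of centroids N
    int k = 1;
    while (n / k > Math.log(k + 1)) {      // merge cost still exceeds sort cost
        k++;
    }
    return k;  // yields 25, 42, 73 for compression 50, 100, 200
}
```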
The virtues of this kind of t-digest implementation include:
- No allocation is required after initialization
- The data structure automatically compresses existing centroids when possible
- No Java object overhead is incurred for centroids since data is kept in primitive arrays
The current implementation takes the liberty of using ping-pong buffers to implement the merge, which incurs a substantial memory penalty. The added complexity of an in-place merge was not considered worthwhile, however: even with this overhead, the memory cost is less than 40 bytes per centroid, which is much less than half of what the AVLTreeDigest uses, and no dynamic allocation is required at all.
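As a minimal usage sketch based on the constructors and methods documented on this page (results are approximate, as with any t-digest):

```java
import org.elasticsearch.tdigest.MergingDigest;

public class MergingDigestExample {
    public static void main(String[] args) {
        // Build a digest with compression factor 100 and add weighted samples.
        MergingDigest digest = new MergingDigest(100);
        for (double x : new double[] {1.0, 2.5, 3.7, 4.2, 10.0}) {
            digest.add(x, 1);  // sample x with weight 1
        }
        digest.compress();     // fold any buffered samples into the centroids

        System.out.println(digest.quantile(0.5));  // estimated median
        System.out.println(digest.cdf(4.0));       // estimated fraction of points <= 4.0
    }
}
```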
Field Summary

Modifier and Type | Field
----------------- | -----
boolean           | useAlternatingSort
boolean           | useTwoLevelCompression
static boolean    | useWeightLimit
Constructor Summary

Constructor | Description
----------- | -----------
MergingDigest(double compression) | Allocates a buffer merging t-digest.
MergingDigest(double compression, int bufferSize) | If you know the size of the temporary buffer for incoming points, you can use this entry point.
MergingDigest(double compression, int bufferSize, int size) | Fully specified constructor.
Method Summary
Modifier and Type | Method | Description
----------------- | ------ | -----------
void | add(double x, long w) | Adds a sample to a histogram.
int | byteSize() | Returns the number of bytes required to encode this TDigest using asBytes().
double | cdf(double x) | Returns the fraction of all points added which are ≤ x.
int | centroidCount() |
Collection<Centroid> | centroids() | A Collection that lets you go through the centroids in ascending order by mean.
void | compress() | Merges any pending inputs and compresses the data down to the public setting.
double | compression() | Returns the current compression factor.
double | quantile(double q) | Returns an estimate of a cutoff such that a specified fraction of the data added to this TDigest would be less than or equal to the cutoff.
void | setScaleFunction(ScaleFunction scaleFunction) |
long | size() | Returns the number of points that have been added to this TDigest.
String | toString() |
Methods inherited from class org.elasticsearch.tdigest.AbstractTDigest
add
Methods inherited from class org.elasticsearch.tdigest.TDigest
add, createAvlTreeDigest, createHybridDigest, createMergingDigest, createSortingDigest, getMax, getMin, reserve
Field Details

useAlternatingSort
public boolean useAlternatingSort

useTwoLevelCompression
public boolean useTwoLevelCompression

useWeightLimit
public static boolean useWeightLimit
Constructor Details

MergingDigest
public MergingDigest(double compression)
Allocates a buffer merging t-digest. This is the normally used constructor that allocates default sized internal arrays. Other versions are available, but should only be used for special cases.
Parameters:
compression - The compression factor

MergingDigest
public MergingDigest(double compression, int bufferSize)
If you know the size of the temporary buffer for incoming points, you can use this entry point.
Parameters:
compression - Compression factor for t-digest. Same as 1/δ in the paper.
bufferSize - How many samples to retain before merging.

MergingDigest
public MergingDigest(double compression, int bufferSize, int size)
Fully specified constructor. Normally only used for deserializing a buffered t-digest.
Parameters:
compression - Compression factor
bufferSize - Number of temporary centroids
size - Size of main buffer
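To illustrate the trade-off these entry points expose (a sketch, with the bufferSize value chosen arbitrarily): a larger temporary buffer means merges happen less often, at the cost of holding more unmerged samples in memory.

```java
// Default internal buffer sizes: the normal choice.
MergingDigest d1 = new MergingDigest(100);

// Explicit 500-sample buffer: the sort-and-merge cost is amortized over
// more insertions, but more unmerged points are held in memory.
MergingDigest d2 = new MergingDigest(100, 500);
```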
Method Details
add
public void add(double x, long w)
Description copied from class: TDigest
Adds a sample to a histogram.
compress
public void compress()
Merges any pending inputs and compresses the data down to the public setting. Note that this typically loses a bit of precision and thus isn't a thing to be doing all the time. It is best done only when we want to show results to the outside world.
size
public long size()
Description copied from class: TDigest
Returns the number of points that have been added to this TDigest.
cdf
public double cdf(double x)
Description copied from class: TDigest
Returns the fraction of all points added which are ≤ x. Points that are exactly equal get half credit (i.e. we use the mid-point rule).
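As a small worked example of the mid-point rule (a sketch; with this few points every centroid should remain a singleton, so the estimate is close to exact):

```java
MergingDigest d = new MergingDigest(100);
d.add(1.0, 1);
d.add(2.0, 1);
d.add(3.0, 1);
// 1.0 lies strictly below 2.0 (full credit) and 2.0 itself gets half credit,
// so cdf(2.0) should come out near (1 + 0.5) / 3 = 0.5.
double f = d.cdf(2.0);
```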
quantile
public double quantile(double q)
Description copied from class: TDigest
Returns an estimate of a cutoff such that a specified fraction of the data added to this TDigest would be less than or equal to the cutoff.
centroidCount
public int centroidCount()
Specified by:
centroidCount in class TDigest
centroids
public Collection<Centroid> centroids()
Description copied from class: TDigest
A Collection that lets you go through the centroids in ascending order by mean. Centroids returned will not be re-used, but may or may not share storage with this TDigest.
compression
public double compression()
Description copied from class: TDigest
Returns the current compression factor.
Specified by:
compression in class TDigest
Returns:
The compression factor originally used to set up the TDigest.
getScaleFunction
public ScaleFunction getScaleFunction()

setScaleFunction
public void setScaleFunction(ScaleFunction scaleFunction)
Overrides:
setScaleFunction in class TDigest
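A brief sketch of swapping the scale function before adding data; the constant name K_2 is an assumption carried over from the upstream t-digest library's ScaleFunction enum:

```java
import org.elasticsearch.tdigest.ScaleFunction;

MergingDigest d = new MergingDigest(100);
d.setScaleFunction(ScaleFunction.K_2);  // assumed constant; K_2 emphasizes tail accuracy
```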
byteSize
public int byteSize()
Description copied from class: TDigest
Returns the number of bytes required to encode this TDigest using asBytes().
toString
public String toString()