Class MemMlStats
java.lang.Object
co.elastic.clients.elasticsearch.ml.get_memory_stats.MemMlStats
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable public class MemMlStats extends java.lang.Object implements JsonpSerializable
- See Also:
- API specification
-
Nested Class Summary
Nested Classes:
- static class MemMlStats.Builder
  Builder for MemMlStats.
Field Summary
Fields:
- static JsonpDeserializer<MemMlStats> _DESERIALIZER
  Json deserializer for MemMlStats
-
Method Summary
- java.lang.String anomalyDetectors()
  Amount of native memory set aside for anomaly detection jobs.
- int anomalyDetectorsInBytes()
  Required - Amount of native memory, in bytes, set aside for anomaly detection jobs.
- java.lang.String dataFrameAnalytics()
  Amount of native memory set aside for data frame analytics jobs.
- int dataFrameAnalyticsInBytes()
  Required - Amount of native memory, in bytes, set aside for data frame analytics jobs.
- java.lang.String max()
  Maximum amount of native memory (separate from the JVM heap) that may be used by machine learning native processes.
- int maxInBytes()
  Required - Maximum amount of native memory (separate from the JVM heap), in bytes, that may be used by machine learning native processes.
- java.lang.String nativeCodeOverhead()
  Amount of native memory set aside for loading machine learning native code shared libraries.
- int nativeCodeOverheadInBytes()
  Required - Amount of native memory, in bytes, set aside for loading machine learning native code shared libraries.
- java.lang.String nativeInference()
  Amount of native memory set aside for trained models that have a PyTorch model_type.
- int nativeInferenceInBytes()
  Required - Amount of native memory, in bytes, set aside for trained models that have a PyTorch model_type.
- static MemMlStats of(java.util.function.Function<MemMlStats.Builder,ObjectBuilder<MemMlStats>> fn)
- void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
  Serialize this object to JSON.
- protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
- protected static void setupMemMlStatsDeserializer(ObjectDeserializer<MemMlStats.Builder> op)
- java.lang.String toString()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
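Each memory quantity above is exposed twice: as an exact byte count (the required *InBytes() getters) and as an optional human-readable string (e.g. "512mb"). The following self-contained sketch (not part of the client library; class and method names are hypothetical) illustrates the relationship between the two representations, assuming the usual 1024-based units and lowercase suffixes:

```java
// Hypothetical sketch: how an exact byte count relates to the
// human-readable form returned by getters such as max() ("512mb").
public class ByteSizeSketch {
    static String humanReadable(long bytes) {
        String[] units = {"b", "kb", "mb", "gb", "tb"};
        int i = 0;
        double v = bytes;
        // Divide by 1024 until the value fits the next-larger unit.
        while (v >= 1024 && i < units.length - 1) {
            v /= 1024;
            i++;
        }
        // Whole values print without a decimal part, mirroring "512mb" style.
        return (v == Math.floor(v))
                ? (long) v + units[i]
                : String.format("%.1f%s", v, units[i]);
    }

    public static void main(String[] args) {
        System.out.println(humanReadable(536_870_912L)); // 512mb
    }
}
```

In practice, code that does arithmetic or comparisons should rely on the *InBytes() values; the string forms are for display only and may be absent (the string getters are @Nullable).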
-
Field Details
-
_DESERIALIZER
Json deserializer for MemMlStats
-
-
Method Details
-
of
public static MemMlStats of(java.util.function.Function<MemMlStats.Builder,ObjectBuilder<MemMlStats>> fn)
-
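The of method follows the functional-builder convention: the caller's lambda receives a fresh Builder, configures it, and of builds the result. The following self-contained sketch mimics that pattern with simplified hypothetical types (Stats stands in for MemMlStats; only two of its properties are modeled):

```java
import java.util.function.Function;

// Hypothetical, simplified stand-ins for MemMlStats and its Builder,
// illustrating the functional-builder pattern behind of(...).
public class OfPatternSketch {
    interface ObjectBuilder<T> { T build(); }

    static final class Stats {
        final int maxInBytes;
        final int anomalyDetectorsInBytes;

        Stats(int maxInBytes, int anomalyDetectorsInBytes) {
            this.maxInBytes = maxInBytes;
            this.anomalyDetectorsInBytes = anomalyDetectorsInBytes;
        }

        static final class Builder implements ObjectBuilder<Stats> {
            private int maxInBytes;
            private int anomalyDetectorsInBytes;
            Builder maxInBytes(int v) { this.maxInBytes = v; return this; }
            Builder anomalyDetectorsInBytes(int v) { this.anomalyDetectorsInBytes = v; return this; }
            public Stats build() { return new Stats(maxInBytes, anomalyDetectorsInBytes); }
        }

        // Mirrors MemMlStats.of: apply the caller's lambda to a fresh
        // builder, then build the immutable result.
        static Stats of(Function<Builder, ObjectBuilder<Stats>> fn) {
            return fn.apply(new Builder()).build();
        }
    }

    public static void main(String[] args) {
        Stats s = Stats.of(b -> b.maxInBytes(1_073_741_824).anomalyDetectorsInBytes(268_435_456));
        System.out.println(s.maxInBytes + " " + s.anomalyDetectorsInBytes);
    }
}
```

Client code normally does not construct MemMlStats itself (it is produced when deserializing a get_memory_stats response), but the same lambda style is used across the elasticsearch-java client's request builders.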
anomalyDetectors
@Nullable public final java.lang.String anomalyDetectors()
Amount of native memory set aside for anomaly detection jobs.
API name: anomaly_detectors
-
anomalyDetectorsInBytes
public final int anomalyDetectorsInBytes()
Required - Amount of native memory, in bytes, set aside for anomaly detection jobs.
API name: anomaly_detectors_in_bytes
-
dataFrameAnalytics
@Nullable public final java.lang.String dataFrameAnalytics()
Amount of native memory set aside for data frame analytics jobs.
API name: data_frame_analytics
-
dataFrameAnalyticsInBytes
public final int dataFrameAnalyticsInBytes()
Required - Amount of native memory, in bytes, set aside for data frame analytics jobs.
API name: data_frame_analytics_in_bytes
-
max
@Nullable public final java.lang.String max()
Maximum amount of native memory (separate from the JVM heap) that may be used by machine learning native processes.
API name: max
-
maxInBytes
public final int maxInBytes()
Required - Maximum amount of native memory (separate from the JVM heap), in bytes, that may be used by machine learning native processes.
API name: max_in_bytes
-
nativeCodeOverhead
@Nullable public final java.lang.String nativeCodeOverhead()
Amount of native memory set aside for loading machine learning native code shared libraries.
API name: native_code_overhead
-
nativeCodeOverheadInBytes
public final int nativeCodeOverheadInBytes()
Required - Amount of native memory, in bytes, set aside for loading machine learning native code shared libraries.
API name: native_code_overhead_in_bytes
-
nativeInference
@Nullable public final java.lang.String nativeInference()
Amount of native memory set aside for trained models that have a PyTorch model_type.
API name: native_inference
-
nativeInferenceInBytes
public final int nativeInferenceInBytes()
Required - Amount of native memory, in bytes, set aside for trained models that have a PyTorch model_type.
API name: native_inference_in_bytes
-
serialize
public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
Specified by: serialize in interface JsonpSerializable
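When serialized, the Java property names map to the snake_case API names documented above. The real implementation writes through jakarta.json's JsonGenerator via the JsonpMapper; the following stand-alone sketch (hypothetical helper, not the library's code) just shows the shape of the resulting JSON for the required *_in_bytes fields:

```java
// Hypothetical sketch of the JSON that serialize() produces for the
// required byte-count fields, using the documented snake_case API names.
public class SerializeSketch {
    static String toJson(int maxInBytes, int anomalyDetectorsInBytes,
                         int dataFrameAnalyticsInBytes, int nativeCodeOverheadInBytes,
                         int nativeInferenceInBytes) {
        return String.format(
            "{\"max_in_bytes\":%d,\"anomaly_detectors_in_bytes\":%d,"
            + "\"data_frame_analytics_in_bytes\":%d,\"native_code_overhead_in_bytes\":%d,"
            + "\"native_inference_in_bytes\":%d}",
            maxInBytes, anomalyDetectorsInBytes, dataFrameAnalyticsInBytes,
            nativeCodeOverheadInBytes, nativeInferenceInBytes);
    }

    public static void main(String[] args) {
        System.out.println(toJson(1024, 256, 256, 64, 448));
    }
}
```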
-
serializeInternal
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
-
toString
public java.lang.String toString()
Overrides: toString in class java.lang.Object
-
setupMemMlStatsDeserializer
protected static void setupMemMlStatsDeserializer(ObjectDeserializer<MemMlStats.Builder> op)
-