java.lang.Object
co.elastic.clients.elasticsearch.ml.get_memory_stats.MemMlStats
All Implemented Interfaces:
JsonpSerializable

@JsonpDeserializable public class MemMlStats extends Object implements JsonpSerializable

  • Method Details

    • of

      public static MemMlStats of(Function<MemMlStats.Builder,ObjectBuilder<MemMlStats>> fn)

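The of factory follows the client's usual builder-lambda convention. A sketch of constructing an instance (untested; it requires the elasticsearch-java dependency on the classpath, and the byte values are placeholders):

```java
// Assumed builder setters mirror the getter names below; values are hypothetical.
MemMlStats stats = MemMlStats.of(b -> b
    .anomalyDetectorsInBytes(21_000_000)
    .dataFrameAnalyticsInBytes(0)
    .nativeCodeOverheadInBytes(0)
    .nativeInferenceInBytes(0)
    .maxInBytes(1_073_741_824)
);
```

The lambda receives a MemMlStats.Builder; the required *_in_bytes fields must all be set before build.
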
    • anomalyDetectors

      @Nullable public final String anomalyDetectors()
      Amount of native memory set aside for anomaly detection jobs.

      API name: anomaly_detectors

    • anomalyDetectorsInBytes

      public final int anomalyDetectorsInBytes()
      Required - Amount of native memory, in bytes, set aside for anomaly detection jobs.

      API name: anomaly_detectors_in_bytes
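
Each string getter and its _in_bytes counterpart report the same quantity; the string form uses Elasticsearch-style byte-size units (for example "21mb"). A minimal sketch of converting such a string to a raw byte count — this helper is hypothetical and not part of the client API:

```java
// Hypothetical helper: converts an Elasticsearch-style byte-size string
// (the shape returned by anomalyDetectors()) to the byte count that
// anomalyDetectorsInBytes() would report. Assumes binary (1024-based) units.
class ByteSizeParser {
    static long toBytes(String size) {
        String s = size.trim().toLowerCase();
        long factor = 1;
        int cut = s.length();
        if (s.endsWith("kb"))      { factor = 1024L;                cut -= 2; }
        else if (s.endsWith("mb")) { factor = 1024L * 1024;         cut -= 2; }
        else if (s.endsWith("gb")) { factor = 1024L * 1024 * 1024;  cut -= 2; }
        else if (s.endsWith("b"))  {                                cut -= 1; }
        return (long) (Double.parseDouble(s.substring(0, cut)) * factor);
    }
}
```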

    • dataFrameAnalytics

      @Nullable public final String dataFrameAnalytics()
      Amount of native memory set aside for data frame analytics jobs.

      API name: data_frame_analytics

    • dataFrameAnalyticsInBytes

      public final int dataFrameAnalyticsInBytes()
      Required - Amount of native memory, in bytes, set aside for data frame analytics jobs.

      API name: data_frame_analytics_in_bytes

    • max

      @Nullable public final String max()
      Maximum amount of native memory (separate from the JVM heap) that may be used by machine learning native processes.

      API name: max

    • maxInBytes

      public final int maxInBytes()
      Required - Maximum amount of native memory (separate from the JVM heap), in bytes, that may be used by machine learning native processes.

      API name: max_in_bytes

    • nativeCodeOverhead

      @Nullable public final String nativeCodeOverhead()
      Amount of native memory set aside for loading machine learning native code shared libraries.

      API name: native_code_overhead

    • nativeCodeOverheadInBytes

      public final int nativeCodeOverheadInBytes()
      Required - Amount of native memory, in bytes, set aside for loading machine learning native code shared libraries.

      API name: native_code_overhead_in_bytes

    • nativeInference

      @Nullable public final String nativeInference()
      Amount of native memory set aside for trained models that have a PyTorch model_type.

      API name: native_inference

    • nativeInferenceInBytes

      public final int nativeInferenceInBytes()
      Required - Amount of native memory, in bytes, set aside for trained models that have a PyTorch model_type.

      API name: native_inference_in_bytes
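
The four reservations above are carved out of the max_in_bytes ceiling. As an illustration, remaining capacity could be estimated by subtracting their sum from the maximum — treating headroom as exactly that difference is an assumption for this sketch, and the byte counts are hypothetical stand-ins for the getters:

```java
// Illustrative arithmetic only: arguments stand in for maxInBytes(),
// anomalyDetectorsInBytes(), dataFrameAnalyticsInBytes(),
// nativeInferenceInBytes(), and nativeCodeOverheadInBytes().
class MlMemoryHeadroom {
    static long headroom(long max, long anomalyDetectors, long dataFrameAnalytics,
                         long nativeInference, long nativeCodeOverhead) {
        long reserved = anomalyDetectors + dataFrameAnalytics
                + nativeInference + nativeCodeOverhead;
        return max - reserved;
    }
}
```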

    • serialize

      public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
      Serialize this object to JSON.
      Specified by:
      serialize in interface JsonpSerializable
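
A sketch of serializing an instance to a JSON string (untested; requires the elasticsearch-java and jakarta.json dependencies, and assumes JacksonJsonpMapper as the JsonpMapper implementation):

```java
// stats is an existing MemMlStats instance.
StringWriter out = new StringWriter();
JsonGenerator generator = JsonProvider.provider().createGenerator(out);
stats.serialize(generator, new JacksonJsonpMapper());
generator.close();
String json = out.toString();
```
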
    • serializeInternal

      protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
    • toString

      public String toString()
      Overrides:
      toString in class Object
    • setupMemMlStatsDeserializer

      protected static void setupMemMlStatsDeserializer(ObjectDeserializer<MemMlStats.Builder> op)