
[BUG] Errors seen with logstash-oss 9.2.1 #283

@ryn9


Describe the bug

Since upgrading to logstash-oss 9.2.1, we are occasionally seeing the following error:

logstash-1  | [2025-12-01T16:01:04,067][ERROR][org.logstash.execution.QueueReadClientBatchMetrics][output_opensearch_ds] Failed to calculate batch byte size for metrics
logstash-1  | java.lang.IllegalArgumentException: Unsupported type encountered in estimateMemory: org.logstash.Timestamp. Please ensure all objects passed to estimateMemory are of supported types. Refer to the ConvertedMap.estimateMemory method for the list of supported types.
logstash-1  |   at org.logstash.ConvertedMap.estimateMemory(ConvertedMap.java:275) ~[logstash-core.jar:?]
logstash-1  |   at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
logstash-1  |   at java.util.IdentityHashMap$ValueSpliterator.forEachRemaining(IdentityHashMap.java:1565) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
logstash-1  |   at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
logstash-1  |   at java.util.stream.LongPipeline.reduce(LongPipeline.java:498) ~[?:?]
logstash-1  |   at java.util.stream.LongPipeline.sum(LongPipeline.java:456) ~[?:?]
logstash-1  |   at org.logstash.ConvertedMap.estimateMemory(ConvertedMap.java:177) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.ConvertedMap.estimateMemory(ConvertedMap.java:224) ~[logstash-core.jar:?]
logstash-1  |   at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
logstash-1  |   at java.util.IdentityHashMap$ValueSpliterator.forEachRemaining(IdentityHashMap.java:1565) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
logstash-1  |   at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
logstash-1  |   at java.util.stream.LongPipeline.reduce(LongPipeline.java:498) ~[?:?]
logstash-1  |   at java.util.stream.LongPipeline.sum(LongPipeline.java:456) ~[?:?]
logstash-1  |   at org.logstash.ConvertedMap.estimateMemory(ConvertedMap.java:177) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.ConvertedMap.estimateMemory(ConvertedMap.java:224) ~[logstash-core.jar:?]
logstash-1  |   at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
logstash-1  |   at java.util.IdentityHashMap$ValueSpliterator.forEachRemaining(IdentityHashMap.java:1565) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
logstash-1  |   at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
logstash-1  |   at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
logstash-1  |   at java.util.stream.LongPipeline.reduce(LongPipeline.java:498) ~[?:?]
logstash-1  |   at java.util.stream.LongPipeline.sum(LongPipeline.java:456) ~[?:?]
logstash-1  |   at org.logstash.ConvertedMap.estimateMemory(ConvertedMap.java:177) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.Event.estimateMemory(Event.java:579) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.execution.QueueReadClientBatchMetrics.updateBatchSizeMetric(QueueReadClientBatchMetrics.java:72) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.execution.QueueReadClientBatchMetrics.updateBatchMetrics(QueueReadClientBatchMetrics.java:63) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.execution.QueueReadClientBase.startMetrics(QueueReadClientBase.java:211) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.ext.JrubyAckedReadClientExt.readBatch(JrubyAckedReadClientExt.java:89) ~[logstash-core.jar:?]
logstash-1  |   at org.logstash.execution.WorkerLoop.run(WorkerLoop.java:82) ~[logstash-core.jar:?]
logstash-1  |   at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) ~[?:?]
logstash-1  |   at java.lang.reflect.Method.invoke(Method.java:580) ~[?:?]
logstash-1  |   at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:300) ~[jruby.jar:?]
logstash-1  |   at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:164) ~[jruby.jar:?]
logstash-1  |   at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:32) ~[jruby.jar:?]
logstash-1  |   at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:193) ~[jruby.jar:?]
logstash-1  |   at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:346) ~[jruby.jar:?]
logstash-1  |   at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66) ~[jruby.jar:?]
logstash-1  |   at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:118) ~[jruby.jar:?]
logstash-1  |   at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136) ~[jruby.jar:?]
logstash-1  |   at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66) ~[jruby.jar:?]
logstash-1  |   at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58) ~[jruby.jar:?]
logstash-1  |   at org.jruby.runtime.Block.call(Block.java:144) ~[jruby.jar:?]
logstash-1  |   at org.jruby.RubyProc.call(RubyProc.java:354) ~[jruby.jar:?]
logstash-1  |   at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:111) ~[jruby.jar:?]
logstash-1  |   at java.lang.Thread.run(Thread.java:1583) ~[?:?]

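For illustration, the failure pattern visible in the trace — a size estimator that recurses into nested maps and throws as soon as it meets a type outside its allow-list — can be sketched as below. This is a hypothetical Java sketch, not the actual logstash-core implementation; the `estimate` method, the size constants, and the `Timestamp` stand-in class are illustrative names only:

```java
import java.util.Map;

public class EstimateSketch {
    // Stand-in for org.logstash.Timestamp: a type the estimator does not know about.
    static final class Timestamp {}

    // Recursively estimate the size of an event's field map, mirroring the
    // nested estimateMemory frames in the stack trace above. Any value whose
    // type is not on the supported list triggers an IllegalArgumentException,
    // which is the error being reported.
    static long estimate(Object value) {
        if (value instanceof String s) {
            return 40L + 2L * s.length();          // rough object header + chars
        } else if (value instanceof Long || value instanceof Double) {
            return 24L;                            // boxed numeric
        } else if (value instanceof Map<?, ?> m) {
            long sum = 64L;                        // map overhead
            for (Object v : m.values()) {
                sum += estimate(v);                // recurse into nested maps
            }
            return sum;
        }
        throw new IllegalArgumentException(
            "Unsupported type encountered in estimateMemory: "
            + value.getClass().getName());
    }

    public static void main(String[] args) {
        // A Timestamp two map levels deep, matching the three estimateMemory
        // frames in the trace before the exception is thrown.
        Map<String, Object> event = Map.of(
            "message", "hello",
            "nested", Map.of("inner", Map.of("ts", new Timestamp())));
        try {
            estimate(event);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The sketch suggests why the error is intermittent: it only fires when a batch happens to contain an event with a `Timestamp` (or other unlisted type) stored inside a nested field at metrics-collection time.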

Expected behavior
No errors; batch metrics should be calculated without exceptions.

Host/Environment (please complete the following information):

  • OS: linux/docker
  • Version: 2.0.3

Additional context
Configuration using the plugin:

output {
  if [@metadata][_index] and [@metadata][_id] {
    opensearch {
      hosts => ["${ENV_OUTPUT_OPENSEARCH_URL}"]
      user => "${ENV_OUTPUT_OPENSEARCH_USER}"
      password => "${ENV_OUTPUT_OPENSEARCH_PASSWORD}"
      index => "%{[@metadata][_index]}"
      ecs_compatibility => disabled
      manage_template => false
      failure_type_logging_whitelist => ["document_already_exists_exception","version_conflict_engine_exception"]
      http_compression => true
      target_bulk_bytes => 5242880
      action => "create"
      document_id => "%{[@metadata][_id]}"
    }
  }
}

Note: this config is part of a pipeline named 'output_opensearch_ds', matching the pipeline id shown in the error above.
