
Fluentd to Elasticsearch error: Invalid media-type value on headers [Content-Type, Accept] #1061

@Srijitha-code

Description


I am encountering an error when Fluentd tries to ship logs to an Elasticsearch cluster. The error is:

```
2025-04-26 14:04:47 +0000 [warn]: #0 fluent/log.rb:383:warn: failed to flush the buffer. retry_times=3 next_retry_time=2025-04-26 14:04:55 +0000 chunk="633aeee3e06bda3cf586d204d14f472c" error_class=Fluent::Plugin::ElasticsearchOutput::RecoverableRequestFailure error="could not push logs to Elasticsearch cluster ({:host=>\"quickstart-v2-es-http\", :port=>9200, :scheme=>\"https\", :user=>\"elastic\", :password=>\"obfuscated\"}): [400] {\"error\":{\"root_cause\":[{\"type\":\"media_type_header_exception\",\"reason\":\"Invalid media-type value on headers [Content-Type, Accept]\"}],\"type\":\"media_type_header_exception\",\"reason\":\"Invalid media-type value on headers [Content-Type, Accept]\",\"caused_by\":{\"type\":\"status_exception\",\"reason\":\"A compatible version is required on both Content-Type and Accept headers if either one has requested a compatible version. Accept=null, Content-Type=application/vnd.elasticsearch+x-ndjson; compatible-with=9\"}},\"status\":400}"
```

Environment Details:

- Fluentd Version: 1.18.0
- Fluentd Gems:
  - fluent-plugin-elasticsearch 5.4.3
  - elastic-transport 8.4.0
  - elasticsearch 9.0.2
  - elasticsearch-api 9.0.2
- Fluentd Deployment: Kubernetes Deployment
- Elasticsearch Deployment: ECK (Elastic Cloud on Kubernetes) Operator 3.0
- Elasticsearch Version: 9.0
- OS Details: Debian GNU/Linux 12 (Bookworm) [ARM64]

Fluentd Config for Elasticsearch Output Plugin:

```
<match local.test>
  @type elasticsearch
  host ******
  port 9200
  scheme https
  user elastic
  password *****
  verify_es_version_at_startup true
  default_elasticsearch_version 9
  ca_file /fluentd/etc/ca.crt
  index_name test-logs
  content_type application/x-ndjson
  logstash_format false
  include_timestamp true
  time_key @timestamp
  <buffer>
    @type file
    path /fluentd/buffer/local_test
    flush_interval 5s
  </buffer>
</match>
```

Problem Details:

- Fluentd logs were not getting pushed to Elasticsearch.
- The error indicates a mismatch between the Content-Type and Accept headers: Accept was null while Content-Type was `application/vnd.elasticsearch+x-ndjson; compatible-with=9`, but Elasticsearch requires a compatible version on both headers if either one requests one.
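The server-side rule quoted in the error ("A compatible version is required on both Content-Type and Accept headers if either one has requested a compatible version") can be sketched as a small check. The function names below are illustrative only, not Elasticsearch's actual implementation:

```ruby
# Extract the compatible-with version from a media-type header, if any.
# e.g. "application/vnd.elasticsearch+x-ndjson; compatible-with=9" -> 9
def compatible_version(header)
  return nil if header.nil?
  m = header.match(/compatible-with=(\d+)/)
  m && m[1].to_i
end

# The request is acceptable when neither header requests a compatible
# version, or when both request the same one.
def headers_consistent?(content_type, accept)
  ct = compatible_version(content_type)
  ac = compatible_version(accept)
  (ct.nil? && ac.nil?) || ct == ac
end
```

With the headers from the failing request (Content-Type carrying `compatible-with=9`, Accept null), this check fails, which matches the 400 response above; with the workaround's Accept header added, it passes.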

Workaround Found:

After adding a `custom_headers` field to the Fluentd config, the issue was resolved. Here is what I added in fluent.conf:

```
custom_headers {"Accept":"application/vnd.elasticsearch+json; compatible-with=9"}
```

After applying this workaround, logs started shipping to Elasticsearch successfully.
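For reference, this is the full `<match>` section with the workaround in place (host and password redacted exactly as in the original config):

```
<match local.test>
  @type elasticsearch
  host ******
  port 9200
  scheme https
  user elastic
  password *****
  verify_es_version_at_startup true
  default_elasticsearch_version 9
  ca_file /fluentd/etc/ca.crt
  index_name test-logs
  content_type application/x-ndjson
  logstash_format false
  include_timestamp true
  time_key @timestamp
  custom_headers {"Accept":"application/vnd.elasticsearch+json; compatible-with=9"}
  <buffer>
    @type file
    path /fluentd/buffer/local_test
    flush_interval 5s
  </buffer>
</match>
```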

Request for Guidance:

- Is adding `custom_headers` like this the correct or recommended approach for Elasticsearch 9.0 compatibility?
- Should there be a built-in fix or an official update in fluent-plugin-elasticsearch or the elasticsearch-ruby libraries to handle this automatically?
- Are there better ways to fix this without a manual custom header override?

Any suggestions would be greatly appreciated! Thanks in advance.
