
ELK Logstash json filter


Description


This is a JSON parsing filter. It takes an existing field which contains JSON and expands it into an actual data structure within the Logstash event.

By default, it will place the parsed JSON in the root (top level) of the Logstash event, but this filter can be configured to place the JSON into any arbitrary event field, using the target configuration.

This plugin has a few fallback scenarios when something bad happens during the parsing of the event. If the JSON parsing fails on the data, the event will be untouched and it will be tagged with _jsonparsefailure; you can then use conditionals to clean the data. You can configure this tag with the tag_on_failure option.

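For example, a minimal sketch of such a conditional cleanup (the parse_error field name is purely illustrative; adapt it to your own handling):

    filter {
      json {
        source => "message"
      }

      # Only runs when the json filter tagged the event as unparsable
      if "_jsonparsefailure" in [tags] {
        mutate {
          add_field => { "parse_error" => "message did not contain valid JSON" }
        }
      }
    }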

If the parsed data contains a @timestamp field, the plugin will try to use it for the event's @timestamp, and if the parsing fails, the field will be renamed to _@timestamp and the event will be tagged with a _timestampparsefailure.

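One hedged way to react to that fallback, assuming you simply want to discard the value that could not be parsed, is to check for the tag and drop the renamed field:

    filter {
      json {
        source => "message"
      }

      # At this point the plugin has already renamed the bad value to _@timestamp
      if "_timestampparsefailure" in [tags] {
        mutate {
          remove_field => [ "_@timestamp" ]
        }
      }
    }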

 

JSON Filter Configuration Options


This plugin supports the following configuration options plus the Common Options described later.

Setting                Input type   Required
skip_on_invalid_json   boolean      No
source                 string       Yes
tag_on_failure         array        No
target                 string       No

1.source

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

The configuration for the JSON filter:

source => source_field

For example, if you have JSON data in the message field:

    filter {
      json {
        source => "message"
      }
    }

The above would parse the JSON from the message field.
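
A quick way to see the result is a throwaway pipeline that reads JSON lines from stdin and prints the expanded event, assuming the stdin input and stdout output plugins that ship with Logstash; feeding it a line such as {"user":"alice","status":200} would add user and status as top-level fields:

    input {
      stdin { }
    }

    filter {
      json {
        source => "message"
      }
    }

    output {
      # rubydebug prints the full event structure, so the new fields are easy to inspect
      stdout { codec => rubydebug }
    }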

2.skip_on_invalid_json

  • Value type is boolean
  • Default value is false

Allows for skipping the filter on invalid JSON (this allows you to handle JSON and non-JSON data without warnings)
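
For example, a pipeline that receives a mix of JSON and plain-text lines could enable it like this (sketch only; plain-text lines then pass through untouched):

    filter {
      json {
        source => "message"
        skip_on_invalid_json => true
      }
    }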

3.tag_on_failure

  • Value type is array
  • Default value is ["_jsonparsefailure"]

 Append values to the tags field when there has been no successful match
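
If the default tag does not fit your conventions you can override it; the _bad_json name below is just an illustration:

    filter {
      json {
        source => "message"
        tag_on_failure => [ "_bad_json" ]
      }
    }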

4.target

  • Value type is string
  • There is no default value for this setting.

Define the target field for placing the parsed data. If this setting is omitted, the JSON data will be stored at the root (top level) of the event.

For example, if you want the data to be put in the doc field:

    filter {
      json {
        target => "doc"
      }
    }

 JSON in the value of the source field will be expanded into a data structure in the target field.

 

If the target field already exists, it will be overwritten!
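
Putting source and target together, a sketch like the following would turn a message value of {"user":"alice"} into a nested [doc][user] field, replacing any existing doc field:

    filter {
      json {
        source => "message"
        target => "doc"
      }
    }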

 

Common Options


The following configuration options are supported by all filter plugins:

Setting          Input type   Required
add_field        hash         No
add_tag          array        No
enable_metric    boolean      No
id               string       No
periodic_flush   boolean      No
remove_field     array        No
remove_tag       array        No

add_field

  • Value type is hash
  • Default value is {}

If this filter is successful, add any arbitrary fields to this event. Field names can be dynamic and include parts of the event using %{field}.

Example:

    filter {
      json {
        add_field => { "foo_%{somefield}" => "Hello world, from %{host}" }
      }
    }

    # You can also add multiple fields at once:
    filter {
      json {
        add_field => {
          "foo_%{somefield}" => "Hello world, from %{host}"
          "new_field" => "new_static_value"
        }
      }
    }

If the event has field "somefield" == "hello" this filter, on success, would add field foo_hello if it is present, with the value above and the %{host} piece replaced with that value from the event. The second example would also add a hardcoded field. 
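
A common pattern, shown here as a sketch with an illustrative tag name, is to combine the json filter with these options so that the raw JSON string is dropped and the event is marked once parsing succeeds (like add_field, they only apply on success):

    filter {
      json {
        source => "message"
        remove_field => [ "message" ]
        add_tag => [ "json_parsed" ]
      }
    }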
