About the Logstash filter

Logstash is parsing a log file. A sample log line looks like this:

2018-07-19 14:02:12.577|INFO |[164acb702b5-12] - crmlog|1234567890|118|0|!|263|xxxx

Logstash is configured as follows:

input {
  file {
    path => [ "xxxx\log.txt" ]
    start_position => "beginning"
    type => "crm"
    codec => plain { charset => "UTF-8" }
  }
}
filter {
  grok {
    match => {
      "message" => "(?<orderDate>%{YEAR:year}\-%{MONTH:month}\-%{DAY:day}%{TIME:time}\.%{INT:int})\|%{WORD:level}\|\[%{USER:pid}\]\-%{WORD:logType}\|%{USER:num}\|%{INT:operation}\|%{INT:result}\|%{DATA:resultmessage}\|%{INT:time}\|%{USER:operator}"
    }
  }
}
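For reference, here is a sketch of a filter that should match the sample line above. It is an untested assumption based only on the one sample line, but it addresses the likely causes of the non-match: `%{MONTH}` and `%{DAY}` match names like `Jul`/`Thu` rather than digits, the trailing space after `INFO ` and the spaces around ` - ` are not accounted for in the posted pattern, and the field name `time` is used twice (renamed to the hypothetical `elapsed` here):

```
filter {
  grok {
    match => {
      # TIMESTAMP_ISO8601 matches "2018-07-19 14:02:12.577" in one pattern;
      # %{SPACE} absorbs the blank after "INFO", and the literal " - "
      # matches the spaces around the hyphen before "crmlog".
      "message" => "%{TIMESTAMP_ISO8601:orderDate}\|%{WORD:level}%{SPACE}\|\[%{USER:pid}\] - %{WORD:logType}\|%{USER:num}\|%{INT:operation}\|%{INT:result}\|%{DATA:resultmessage}\|%{INT:elapsed}\|%{USER:operator}"
    }
  }
}
```

If the last field can contain `|` or other free text, `%{GREEDYDATA:operator}` may be a safer choice than `%{USER:operator}`.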

The output goes to Elasticsearch. Checking in Kibana, the event contains only:

 "message": """2018-07-19 13:40:02.057|INFO |[164acb5f03b-28] - crmlog|1234567890|105|11100007|\xD3\u{B2EF4}\xE6\xD4\xF2\xD2\xD1\xCF\xFA|8|xxxx\r""",

and none of the grok fields are extracted. The expected result (not listing every field) would be something like:

"orderDate": "2018-07-19 13:40:02.057",
"level": "INFO",

The Chinese text in the message is also garbled (mojibake).
Please help. T ^ T
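The `\xD3...\xCF\xFA` byte escapes in `resultmessage` look like a GBK-encoded Chinese string being read as UTF-8. If the log file is actually written in GBK or GB2312 (a common default for Chinese-language Windows systems), declaring the real encoding in the codec may fix the mojibake. This is an assumption, since the file's actual encoding is unknown:

```
input {
  file {
    path => [ "xxxx\log.txt" ]
    start_position => "beginning"
    type => "crm"
    # assumption: the log file is GBK-encoded; use "GB2312" or "GB18030"
    # instead if that is what the application writes
    codec => plain { charset => "GBK" }
  }
}
```

Logstash will then transcode the lines to UTF-8 before they reach the filter and Elasticsearch.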

Mar. 28, 2021