How to make Logstash read static files only once
I am trying to configure Logstash to read large files only once, without watching them for changes. Each file contains a single "message" that should be streamed to Elasticsearch.
I am using the following configuration:
input {
  file {
    path => "/data/output/**/*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    exit_after_read => true
    discover_interval => 1
    codec => json
    mode => "read"
    file_chunk_size => 32768000
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["elasti:9200"]
    index => "index-%{+YYYY-MM-dd}"
    document_id => "%{id}"
  }
}
However, when I look at the logs, Logstash seems to have trouble reading these files (each containing only one message), and it occasionally restarts on its own:
[2020-09-13T06:34:05,713][INFO ][filewatch.readmode.handlers.readfile][main][71f20dc2f83fae2b22bd5585c661a01ba9b13db3f67db0357343d138aa83324d] buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the `file_chunk_size` setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>682, "last_known_file_size"=>682, "file_path"=>"/data/output/21c6/21c6a448f5660e72ded7549de5fd5060"}
[2020-09-13T06:34:05,713][INFO ][filewatch.readmode.handlers.readfile][main][71f20dc2f83fae2b22bd5585c661a01ba9b13db3f67db0357343d138aa83324d] buffer_extract: a delimiter can't be found in current chunk, "bytes_read_count"=>904, "last_known_file_size"=>904, "file_path"=>"/data/output/c53b/c53b5bae01ab7d262b9355e21937116e"}
[2020-09-13T06:34:05,714][INFO ][filewatch.readmode.handlers.readfile][main][71f20dc2f83fae2b22bd5585c661a01ba9b13db3f67db0357343d138aa83324d] buffer_extract: a delimiter can't be found in current chunk, "bytes_read_count"=>1128, "last_known_file_size"=>1128, "file_path"=>"/data/output/cf4c/cf4c0df7b3218e1f3ac6774f31c13744"}
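Note that `bytes_read_count` equals `last_known_file_size` in each entry, so the whole file fits in one chunk but contains no `\n` delimiter. My guess (not confirmed by the logs) is that these single-message JSON files simply lack a trailing newline. A minimal sketch of a workaround, assuming the files under `/data/output` (the path from the config above) are writable before Logstash reads them:

```shell
# Append a newline delimiter to any file that does not already end with one.
# tail -c 1 prints the last byte; command substitution strips a trailing
# newline, so the result is empty exactly when the file already ends in "\n".
find /data/output -type f | while read -r f; do
  if [ -n "$(tail -c 1 "$f")" ]; then
    printf '\n' >> "$f"
  fi
done
```

Running this a second time is a no-op, since every file then ends with the delimiter.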