Troubleshooting Kafka incompatibility after upgrading Logstash from 2.4 to 5.0
Reference files:
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/CHANGELOG.md
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/DEVELOPER.md
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/README.md
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/lib/logstash/inputs/kafka.rb
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-kafka-5.0.4/lib/logstash/outputs/kafka.rb
Background:
I recently upgraded our ELKB stack from version 2.4 to the latest stable 5.0 (the main upgrade steps are at http://jerrymin.blog.51cto.com/3002256/1870205). Afterwards the Kafka cluster started reporting errors. The troubleshooting process is recorded below for reference.
Environment before the upgrade:
logstash 2.4
logstash-input-kafka-2.0.9
logstash-output-kafka-2.0.5
kafka_2.10-0.8.2.2.tgz
Environment after the upgrade:
logstash 5.0
logstash-input-kafka-5.0.5
logstash-output-kafka-5.0.4
kafka_2.10-0.8.2.2.tgz (unchanged)
Error messages:

```
[2016-11-16T14:35:44,739][ERROR][logstash.inputs.kafka ] Unknown setting 'zk_connect' for kafka
[2016-11-16T14:35:44,741][ERROR][logstash.inputs.kafka ] Unknown setting 'topic_id' for kafka
[2016-11-16T14:35:44,741][ERROR][logstash.inputs.kafka ] Unknown setting 'reset_beginning' for kafka
```
Steps taken:
1. Find where the error is raised, based on the message:
```
grep "Unknown setting" /usr/share/logstash/ -R
/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:            self.logger.error("Unknown setting '#{name}' for #{@plugin_name}")
```
2. Reading the surrounding code shows that the check is driven by each plugin's config definitions:
```ruby
def validate_check_invalid_parameter_names(params)
  invalid_params = params.keys
  # Filter out parameters that match regexp keys.
  # These are defined in plugins like this:
  #   config /foo.*/ => ...
  @config.each_key do |config_key|
    if config_key.is_a?(Regexp)
      invalid_params.reject! { |k| k =~ config_key }
    elsif config_key.is_a?(String)
      invalid_params.reject! { |k| k == config_key }
    end
  end
  if invalid_params.size > 0
    invalid_params.each do |name|
      self.logger.error("Unknown setting '#{name}' for #{@plugin_name}")
    end
    return false
  end # if invalid_params.size > 0
  return true
end # def validate_check_invalid_parameter_names
```
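To see concretely why the old 2.x settings trip this check, the filtering logic can be exercised in isolation. The sketch below is a hypothetical standalone re-implementation (not the actual `LogStash::Config::Mixin` class), using a simplified stand-in for the 5.x config schema:

```ruby
# Standalone re-implementation of the unknown-setting filter above
# (hypothetical sketch; the real schema lives in the plugin's config calls).
def unknown_settings(params, config_keys)
  invalid = params.keys
  config_keys.each do |config_key|
    if config_key.is_a?(Regexp)
      invalid.reject! { |k| k =~ config_key }
    elsif config_key.is_a?(String)
      invalid.reject! { |k| k == config_key }
    end
  end
  invalid
end

# 2.x-era settings checked against a simplified sketch of the 5.x schema:
old_params = { 'zk_connect' => 'zk1:2181', 'topic_id' => 'logstash', 'codec' => 'json' }
new_schema = ['bootstrap_servers', 'topics', 'group_id', 'codec', /ssl.*/]
puts unknown_settings(old_params, new_schema).inspect  # => ["zk_connect", "topic_id"]
```

The removed 2.x options survive the filter and are each reported with the exact error seen in the logs.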
3. Go into the plugin's directory and inspect it:
cd /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5
The following files turn out to deserve a close look:
```
grep config ./* -R | awk '{print $1}' | uniq
./CHANGELOG.md:
./DEVELOPER.md:See
./lib/logstash/inputs/kafka.rb:#
./lib/logstash/inputs/kafka.rb:
./README.md:-
Binary
```
1) CHANGELOG.md first. Starting with logstash-input-kafka 3.0.0.beta1 the plugin is no longer backward compatible and drops jruby-kafka (note the pitfall discussed in 2) below). 4.0.0 then adds support for Kafka 0.9, and 5.0.0 supports Kafka 0.10 and is again not backward compatible; quite a run of breaking changes. So the problem is found: my broker is kafka_2.10-0.8.2.2.tgz, and this Kafka version mismatch causes the errors.
The relevant part of CHANGELOG.md:
```
## 5.0.4
  - Update to Kafka version 0.10.0.1 for bug fixes
## 5.0.0
  - Support for Kafka 0.10 which is not backward compatible with 0.9 broker.
## 4.0.0
  - Republish all the gems under jruby.
  - Update the plugin to the version 2.0 of the plugin api, this change is required for Logstash 5.0 compatibility. See https://github.com/elastic/logstash/issues/5141
  - Support for Kafka 0.9 for LS 5.x
## 3.0.0.beta1
  - Refactor to use new Java based consumer, bypassing jruby-kafka
  - Breaking: Change configuration to match Kafka's configuration. This version is not backward compatible
```
2) When I first read DEVELOPER.md, the configuration syntax all looked correct, so I suspected a missing dependency on the jruby-kafka library, which logstash 2.x still used (incidentally, comparing releases, 5.x ships with noticeably fewer bundled plugins). DEVELOPER.md also still lists Kafka 0.8.1.1, so it apparently was not kept up to date (it disagrees with the kafka.rb file discussed later); whoever maintains it, please update it. A small thing, but it can mislead ordinary users like me. Of course, it may also be that I simply did not read the documentation thoroughly enough.
DEVELOPER.md ends with:
```
Dependencies
====================
* Apache Kafka version 0.8.1.1
* jruby-kafka library
```
3) Next, README.md, paying particular attention to Kafka compatibility. It turns out logstash-input-kafka 5.0.5 and logstash-output-kafka 5.0.4 only work with Kafka 0.10. If you want to stay on Kafka 0.9 while running Logstash 5.0, you have to downgrade logstash-input-kafka and logstash-output-kafka to 4.0.0, but even the docs call that an intermediate release, so it is better to follow the mainstream.
## Kafka Compatibility

Here's a table that describes the compatibility matrix for Kafka Broker support. Please remember that it is good advice to upgrade brokers before consumers/producers since brokers target backwards compatibility. The 0.9 broker will work with both the 0.8 consumer and 0.9 consumer APIs but not the other way around.

| Kafka Broker Version | Logstash Version | Input Plugin | Output Plugin | Why? |
|:---------------:|:------------------:|:--------------:|:---------------:|:------|
| 0.8 | 2.0 - 2.x | < 3.0.0 | < 3.0.0 | Legacy, 0.8 is still popular |
| 0.9 | 2.0 - 2.3.x | 3.0.0 | 3.0.0 | Intermediate release before 0.10 that works with old Ruby Event API `[]` |
| 0.9 | 2.4, 5.0 | 4.0.0 | 4.0.0 | Intermediate release before 0.10 with new get/set API |
| 0.10 | 2.4, 5.0 | 5.0.0 | 5.0.0 | Track latest Kafka release. Not compatible with 0.9 broker |
4) So the only option is to upgrade Kafka. Checking the bundled jar-dependencies confirms the client is kafka-clients-0.10.0.1.jar:
```
ls /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/vendor/jar-dependencies/runtime-jars/
kafka-clients-0.10.0.1.jar  log4j-1.2.17.jar  lz4-1.3.0.jar  slf4j-api-1.7.21.jar  slf4j-log4j12-1.7.21.jar  snappy-java-1.1.2.6.jar
```
5) One file was still unread. Out of curiosity I took a look, and realized all the earlier digging had been unnecessary: this is the most valuable primary reference, a real shortcut, hidden deep enough that I almost missed it:
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/lib/logstash/inputs/kafka.rb
The relevant part of kafka.rb:
```ruby
# This input will read events from a Kafka topic. It uses the newly designed
# 0.10 version of consumer API provided by Kafka to read messages from the broker.
#
# Here's a compatibility matrix that shows the Kafka client versions that are compatible with each combination
# of Logstash and the Kafka input plugin:
#
# [options="header"]
# |==========================================================
# |Kafka Client Version |Logstash Version |Plugin Version |Security Features |Why?
# |0.8 |2.0.0 - 2.x.x |<3.0.0 | |Legacy, 0.8 is still popular
# |0.9 |2.0.0 - 2.3.x | 3.x.x |Basic Auth, SSL |Works with the old Ruby Event API (`event['product']['price'] = 10`)
# |0.9 |2.4.0 - 5.0.x | 4.x.x |Basic Auth, SSL |Works with the new getter/setter APIs (`event.set('[product][price]', 10)`)
# |0.10 |2.4.0 - 5.0.x | 5.x.x |Basic Auth, SSL |Not compatible with the 0.9 broker
# |==========================================================
#
# NOTE: We recommended that you use matching Kafka client and broker versions. During upgrades, you should
# upgrade brokers before clients because brokers target backwards compatibility. For example, the 0.9 broker
# is compatible with both the 0.8 consumer and 0.9 consumer APIs, but not the other way around.
```
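For scripting version checks across several hosts, the matrix above can be condensed into a small lookup. This is a hypothetical helper, not part of the plugin; the series strings are taken from the matrix:

```ruby
# Hypothetical helper: map a Kafka broker version string to the input/output
# plugin series required by the compatibility matrix quoted above.
PLUGIN_SERIES_FOR_BROKER = {
  '0.8'  => '< 3.0.0',
  '0.9'  => '3.x.x / 4.x.x',
  '0.10' => '5.x.x'
}.freeze

def plugin_series(broker_version)
  # Reduce "0.8.2.2" to "0.8", "0.10.0.1" to "0.10", etc.
  key = broker_version.split('.')[0, 2].join('.')
  PLUGIN_SERIES_FOR_BROKER.fetch(key, 'unknown')
end

puts plugin_series('0.8.2.2')   # the old broker in this article
puts plugin_series('0.10.0.1')  # the broker we upgrade to below
```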
6) Upgrade kafka_2.10-0.8.2.2.tgz to kafka_2.11-0.10.0.1.tgz. (The bundled client is kafka-clients-0.10.0.1.jar, so I did not use the newest kafka_2.11-0.10.1.0.tgz.)
Rough steps:
Stop the old Kafka:

```
/usr/local/kafka/bin/kafka-server-stop.sh /usr/local/kafka/config/server.properties
```
Back up the old configuration files (server.properties and zookeeper.properties).
Remove the old Kafka:

```
rm -rf /usr/local/kafka/
rm -rf /data/kafkalogs/*
```
Install and configure the new Kafka:

```
wget http://mirrors.hust.edu.cn/apache/kafka/0.10.0.1/kafka_2.11-0.10.0.1.tgz
tar zxvf kafka_2.11-0.10.0.1.tgz -C /usr/local/
ln -s /usr/local/kafka_2.11-0.10.0.1 /usr/local/kafka
```
A diff of server.properties and zookeeper.properties against the backups showed only minor changes, so the old files can be reused directly.
Start the new Kafka:

```
/usr/local/kafka/bin/kafka-server-start.sh /usr/local/kafka/config/server.properties &
```
7) Note that several key settings must change. The new plugin defaults (from kafka.rb) are:
```
config :bootstrap_servers, :validate => :string, :default => "localhost:9092"
config :group_id, :validate => :string, :default => "logstash"
config :topics, :validate => :array, :default => ["logstash"]
config :consumer_threads, :validate => :number, :default => 1
```
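Put together, a minimal input block before and after the migration might look like this. The broker and ZooKeeper addresses are placeholders, and only the settings discussed in this article are shown:

```
# logstash 2.x + logstash-input-kafka 2.x (rejected by 5.0.5 with "Unknown setting"):
input {
  kafka {
    zk_connect      => "zk1:2181,zk2:2181"
    topic_id        => "logstash"
    reset_beginning => false
  }
}

# logstash 5.0 + logstash-input-kafka 5.0.5, talking to the 0.10 broker directly:
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    topics            => ["logstash"]
    group_id          => "logstash"
    consumer_threads  => 1
  }
}
```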
Besides the key settings above, the Kafka topics (with their partition layout) must be re-created on the new broker; otherwise KafkaMonitor shows no graph under Active Topic Consumer, even though consumption is actually working.
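Re-creating a topic can be done with Kafka's stock admin script. The ZooKeeper address, partition count, and replication factor below are placeholders to adjust for your own cluster:

```
# Placeholder values: adjust the ZK address, partitions, and replication factor.
/usr/local/kafka/bin/kafka-topics.sh --create \
  --zookeeper localhost:2181 \
  --replication-factor 1 \
  --partitions 3 \
  --topic logstash

# Verify the topic exists with the expected partition layout:
/usr/local/kafka/bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic logstash
```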