This post walks through an example of reading data from Kafka with Heka: first dumping the messages to the console, then decoding them, and finally indexing them into Elasticsearch.
Configuration:
[hekad]
maxprocs = 2
[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
[RstEncoder]
[LogOutput]
message_matcher = "TRUE"
encoder = "RstEncoder"
The configuration above does nothing more than read data from Kafka and print it to the console, so some data first has to be written into the Kafka topic.
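If the topic is still empty, one simple way to push a test message into it is Kafka's bundled console producer (an assumption about your environment; the script name and location vary by Kafka distribution, and the broker address and topic match the config above):

echo '{"msg":"Start Request","event":"artemis.web.ensure-running1","userid":"12","extra":{"workspace-id":"cN907xLngi"},"time":"2015-05-06T 20:40:05.509926234Z","severity":1}' | kafka-console-producer.sh --broker-list localhost:9092 --topic test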
Result:
:Timestamp: 2016-07-21 09:39:46.342093657 +0000 UTC
:Type: heka.kafka
:Hostname: master
:Pid: 0
:Uuid: 501b0a0e-63a9-4eee-b9ca-ab572c17d273
:Logger: KafkaInputExample
:Payload: {"msg":"Start Request","event":"artemis.web.ensure-running1","userid":"12","extra":{"workspace-id":"cN907xLngi"},"time":"2015-05-06T 20:40:05.509926234Z","severity":1}
:EnvVersion:
:Severity: 7
:Fields:
| name:"Key" type:bytes value:
| name:"Topic" type:string value:"test"
| name:"Partition" type:integer value:0
| name:"Offset" type:integer value:8
The data that was read ends up in the Payload, while Fields holds some metadata about the read from Kafka (key, topic, partition, offset). Since the payload itself is JSON, a JsonDecoder can be used to parse it.
[hekad]
maxprocs = 2
[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
decoder="JsonDecoder"
[JsonDecoder]
type = "SandboxDecoder"
filename = "lua_decoders/json.lua"
[JsonDecoder.config]
type = "artemis"
payload_keep = true
map_fields = true
Severity = "severity"
[RstEncoder]
[LogOutput]
message_matcher = "TRUE"
encoder = "RstEncoder"
The result is as follows:
:Timestamp: 2016-07-21 09:42:34 +0000 UTC
:Type: artemis
:Hostname: master
:Pid: 0
:Uuid: 3965285c-70ac-4069-a1a3-a9bcf518d3e8
:Logger: KafkaInputExample
:Payload: {"msg":"Start Request","event":"artemis.web.ensure-running2","userid":"11","extra":{"workspace-id":"cN907xLngi"},"time":"2015-05-06T 20:40:05.509926234Z","severity":1}
:EnvVersion:
:Severity: 1
:Fields:
| name:"time" type:string value:"2015-05-06T 20:40:05.509926234Z"
| name:"msg" type:string value:"Start Request"
| name:"userid" type:string value:"11"
| name:"event" type:string value:"artemis.web.ensure-running2"
| name:"extra.workspace-id" type:string value:"cN907xLngi"
After the decoder has parsed the message, Fields has changed, but Logger still shows KafkaInputExample. This tells us the message is produced by the input, not by the decoder; the decoder merely parses it and rewrites the Fields.
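To make that mechanism a bit more concrete, here is a heavily simplified sketch of what a SandboxDecoder script in the spirit of lua_decoders/json.lua does. This is not the json.lua shipped with Heka, just an illustration of the decoder contract (process_message, read_message, inject_message) under the config used above:

local cjson = require "cjson"

local msg_type     = read_config("type")            -- "artemis" in the config above
local payload_keep = read_config("payload_keep")

-- flatten nested JSON objects into dotted field names, e.g. extra.workspace-id
local function flatten(t, prefix, out)
    for k, v in pairs(t) do
        local key = prefix and (prefix .. "." .. k) or k
        if type(v) == "table" then
            flatten(v, key, out)
        else
            out[key] = v
        end
    end
    return out
end

function process_message()
    local raw = read_message("Payload")
    local ok, json = pcall(cjson.decode, raw)
    if not ok then return -1 end                     -- reject messages that are not valid JSON

    local msg = {
        Type     = msg_type,
        Severity = json.severity,                    -- mapped via Severity = "severity"
    }
    json.severity = nil                              -- already mapped to the message Severity
    msg.Fields = flatten(json, nil, {})              -- parsed keys become message fields
    if payload_keep then msg.Payload = raw end       -- keep the original JSON as the Payload

    inject_message(msg)
    return 0
end

The message injected here is still attributed to the input that produced it, which is why Logger keeps the value KafkaInputExample.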
Next, let's get the data into Elasticsearch.
[hekad]
maxprocs = 2
[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
decoder="JsonDecoder"
[JsonDecoder]
type = "SandboxDecoder"
filename = "lua_decoders/json.lua"
[JsonDecoder.config]
type = "artemis"
payload_keep = true
map_fields = true
Severity = "severity"
[ESJsonEncoder]
index = "%{Type}-%{%Y.%m.%d}"
es_index_from_timestamp = true
type_name = "%{Type}"
[ESJsonEncoder.field_mappings]
Timestamp = "@timestamp"
Severity = "level"
[ElasticSearchOutput]
message_matcher = "TRUE"
encoder = "ESJsonEncoder"
flush_interval = 1
Elasticsearch also wants JSON, so ESJsonEncoder is used, and the index name and type are specified. Running this gives the following result: besides Heka's metadata fields, the indexed document also contains the fields generated by the JsonDecoder; in fact they are taken straight from the message's Fields attribute. Note that Payload is not parsed.
:Fields:
| name:"time" type:string value:"2015-05-06T 20:40:05.509926234Z"
| name:"msg" type:string value:"Start Request"
| name:"userid" type:string value:"11"
| name:"event" type:string value:"artemis.web.ensure-running2"
| name:"extra.workspace-id" type:string value:"cN907xLngi"
These fields naturally vary from message to message, so they are called dynamic fields.
When writing to ES you can choose which dynamic fields to extract:
fields=["Timestamp","Uuid","Type","Logger","Pid","Hostname","DynamicFields"]
dynamic_fields=["msg","userid"]
Whenever dynamic_fields is used, DynamicFields must be included in fields.
Without dynamic_fields, fields may only list a few fixed message attributes; see the official documentation for the allowed values.
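With fields and dynamic_fields set like this (plus the Timestamp -> @timestamp mapping from field_mappings), the document sent to Elasticsearch would look roughly like the sketch below. It is pieced together from the output shown earlier for illustration, not captured output:

{
  "@timestamp": "2016-07-21T09:42:34Z",
  "Uuid": "3965285c-70ac-4069-a1a3-a9bcf518d3e8",
  "Type": "artemis",
  "Logger": "KafkaInputExample",
  "Pid": 0,
  "Hostname": "master",
  "msg": "Start Request",
  "userid": "11"
}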
A complete example:
[hekad]
maxprocs = 2
[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
decoder="JsonDecoder"
[JsonDecoder]
type = "SandboxDecoder"
filename = "lua_decoders/json.lua"
[JsonDecoder.config]
type = "artemis"
payload_keep = true
map_fields = true
Severity = "severity"
[ESJsonEncoder]
index = "%{Type}-%{%Y.%m.%d}"
es_index_from_timestamp = true
type_name = "%{Type}"
fields=["Timestamp","Uuid","Type","Logger","Pid","Hostname","DynamicFields"]
dynamic_fields=["msg","userid"]
raw_bytes_fields=["Payload"]
[ESJsonEncoder.field_mappings]
Timestamp = "@timestamp"
Severity = "level"
[ElasticSearchOutput]
message_matcher = "TRUE"
encoder = "ESJsonEncoder"
flush_interval = 1
At this point the data is indexed into Elasticsearch.
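To check what actually landed in the index, you can query Elasticsearch directly. The host and port are an assumption (the config above does not set a server, so this presumes Elasticsearch is reachable on the default localhost:9200), and the index pattern follows index = "%{Type}-%{%Y.%m.%d}" with Type artemis:

curl 'http://localhost:9200/artemis-*/_search?pretty'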