
ClickHouse - data transformation/parsing

We use ClickHouse to store HAProxy and Kong logs and metrics.

The "pipeline" is built around the syslog protocol and rsyslog, as follows: HAProxy/Kong -> local rsyslog -> remote rsyslog (TCP) -> omclickhouse rsyslog module -> ClickHouse.

The format of the syslog messages of course differs between HAProxy and Kong.

HAProxy messages look like this:

1.2.3.4:58629 [06/Jun/2020:14:54:59.932] HTTPS~ HAPBACKEND/HAPSERVER 0/0/1/36/37 200 778 - - ---- 32/32/1/1/0 0/0 "GET /api/map/v2/GetSomeStuff/json?Latitude=47.22960133109915&Longitude=-1.5727845858797176 HTTP/1.1"

As described here: https://cbonte.github.io/haproxy-dconv/1.7/configuration.html#8.2.3

Kong messages are JSON-based and look like this:

{
    "request": {
        "method": "GET",
        "uri": "/get",
        "url": "http://httpbin.org:8000/get",
        "size": "75",
        "querystring": {},
        "headers": {
            "accept": "*/*",
            "host": "httpbin.org",
            "user-agent": "curl/7.37.1"
        },
        "tls": {
            "version": "TLSv1.2",

As described here: https://docs.konghq.com/hub/kong-inc/syslog/

The rsyslog omclickhouse module inserts (by default) all syslog messages into a table named "SystemEvents", which has the following structure:

┌─severity─┬─facility─┬───────────timestamp─┬─hostname─────────────────┬─tag────────────┬─message──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│        6 │       18 │ 2020-06-06 15:01:00 │ reverseproxy.fqdn        │ haproxy[6892]: │  1.2.3.4:57220 [06/Jun/2020:15:00:59.996] HTTPS~ HAPBACKEND/HAPSRV 15/0/1/2/18 500 617 - - ---- 48/42/9/9/0 0/0 "POST /SOAPService HTTP/1.1" │
└──────────┴──────────┴─────────────────────┴──────────────────────────┴────────────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

(We did not want to get into developing a custom rsyslog parsing module in C.)

For reporting purposes, we are interested in the HAProxy (or Kong) details contained in the syslog message field, not in the whole syslog content itself. So, to get "fine-grained" querying capabilities, we created another table, say "HAPROXY_LOGS", with the following structure:

(`CLIENT_IP` String, `CLIENT_PORT` Int32, `REQUEST_DATE` DateTime,
 `FRONTEND_NAME` String, `BACKEND_NAME` String, `SERVER_NAME` String,
 `TREQ` Int32, `TWAIT` Int32, `TCONNECTION` Int32, `TRESPONSE` Int32, `TACTIVE` Int32,
 `STATUS_CODE` Int32, `BYTES_READ` Int32,
 `CAPTURED_REQUEST_COOKIE` String, `CAPTURED_RESPONSE_COOKIE` String, `TERMINATION_STATE` String,
 `ACTCONN` Int32, `FECONN` Int32, `BECONN` Int32, `SRV_CONN` Int32, `RETRIES` Int32,
 `SRV_QUEUE` Int32, `BACKEND_QUEUE` Int32,
 `METHOD` String, `REQUEST` String, `PARAMETERS` String, `PROTOCOL` String)
ENGINE = MergeTree()
PARTITION BY toYYYYMM(REQUEST_DATE)
ORDER BY (REQUEST_DATE, TRESPONSE, STATUS_CODE, PARAMETERS)
SETTINGS index_granularity = 8192

This is where things start to get weirder... ClickHouse itself seems to provide neither some kind of scheduler, à la MSSQL, nor a way to embed a programming language into the engine (PL/pgSQL- or PL/Python-like), nor triggers (we have not looked into materialized views yet). So, to transform and move the data from one table to another, a shell script is launched by cron every minute: it uses clickhouse-client to fetch the input data, pipes it to a Python script, and the script's output is in turn piped to clickhouse-client for insertion:

* * * * * { /usr/bin/clickhouse-client < /path/clkh/extract-system-events.sql | /path/clkh/latestmessages-to-TSV-pipe.py 2>/path/clkh/errors-haproxy.log ; } |/usr/bin/clickhouse-client --query="INSERT INTO HAPROXY_LOGS FORMAT TSV" >> /var/log/latestmessages-to-TSV-pipe.log
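
For illustration, such an extract query boils down to something like the following; the tag filter and the one-minute window here are simplifications, the actual bookkeeping is more careful:

/* Hypothetical sketch: pull the raw HAProxy messages received during the last minute. */
SELECT message
FROM SystemEvents
WHERE tag LIKE 'haproxy%'
  AND timestamp >= now() - INTERVAL 1 MINUTE
FORMAT TSVRaw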

The Python scripts for HAProxy and Kong parsing are different.
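
For illustration only, here is a minimal sketch of what the HAProxy-side script could look like; the real script is more involved, and the split of the URI into REQUEST/PARAMETERS below is an assumption based on the HAPROXY_LOGS columns above:

#!/usr/bin/env python3
# Hypothetical sketch: read raw HAProxy HTTP log messages from stdin
# (one per line, as piped in by clickhouse-client) and emit one TSV row
# per message, matching the column order of the HAPROXY_LOGS table.
import sys
from datetime import datetime

for line in sys.stdin:
    try:
        p = line.split()
        client_ip, client_port = p[0].rsplit(':', 1)
        request_date = datetime.strptime(p[1].strip('[]'), '%d/%b/%Y:%H:%M:%S.%f')
        frontend = p[2]
        backend, server = p[3].split('/', 1)
        treq, twait, tconn, tresp, tactive = p[4].split('/')
        status_code, bytes_read = p[5], p[6]
        req_cookie, resp_cookie = p[7], p[8]
        termination_state = p[9]
        actconn, feconn, beconn, srv_conn, retries = p[10].split('/')
        srv_queue, backend_queue = p[11].split('/')
        method = p[12].lstrip('"')
        uri = p[13]
        protocol = p[14].rstrip('"')
        # Assumption: REQUEST is the path part, PARAMETERS the query string.
        request, _, parameters = uri.partition('?')
        print('\t'.join([
            client_ip, client_port,
            request_date.strftime('%Y-%m-%d %H:%M:%S'),
            frontend, backend, server,
            treq, twait, tconn, tresp, tactive,
            status_code, bytes_read,
            req_cookie, resp_cookie, termination_state,
            actconn, feconn, beconn, srv_conn, retries,
            srv_queue, backend_queue,
            method, request, parameters, protocol,
        ]))
    except (ValueError, IndexError) as err:
        # Unparseable lines end up in errors-haproxy.log via stderr.
        print('cannot parse: %s (%s)' % (line.rstrip(), err), file=sys.stderr)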

It sounds like a dirty hack...

Is there a better way to accomplish the same thing?

(Despite this hack, the whole thing works well: report build times have dropped considerably, and ClickHouse stores 600M+ rows without any problem.)

Thanks

I think that transforming the data outside of ClickHouse is the right way to go.

Nevertheless, CH can handle it on its own. Let's consider the example of the JSON logs and use a materialized view together with the rich set of JSON-related functions:

/* Table that store JSON-logs from several sources. */
CREATE TABLE Raw_Json_Logs (
  time DateTime DEFAULT now(),
  json String,
  log_type LowCardinality(String)
) ENGINE = MergeTree()
ORDER BY time;

/* Table for Kong-logs. */
CREATE MATERIALIZED VIEW Kong_Logs (
  time DateTime DEFAULT now(),
  raw_json String,
  /* define the required log-attributes that should be stored in separate columns */
  method LowCardinality(String),
  host LowCardinality(String),
  /* .. */
  raw_response_headers String
  /* .. */
) ENGINE = MergeTree()
ORDER BY (time, method, host /* .. */)
AS 
SELECT 
  time,
  json AS raw_json,
  JSONExtractString(json, 'request', 'method') AS method,
  JSONExtractString(json, 'request', 'headers', 'host') AS host,
  JSONExtractRaw(json, 'response', 'headers') AS raw_response_headers
  /* .. */
FROM Raw_Json_Logs
/* Takes only Kong-specific logs. */
WHERE log_type = 'kong';

Test dataset:

INSERT INTO Raw_Json_Logs(json, log_type)
VALUES ('{"request":{"method":"GET","uri":"/get","url":"http://httpbin.org:8000/get","size":"75","querystring":{},"headers":{"accept":"*/*","host":"httpbin.org","user-agent":"curl/7.37.1"},"tls":{"version":"TLSv1.2","cipher":"ECDHE-RSA-AES256-GCM-SHA384","supported_client_ciphers":"ECDHE-RSA-AES256-GCM-SHA384","client_verify":"NONE"}},"upstream_uri":"/","response":{"status":200,"size":"434","headers":{"Content-Length":"197","via":"kong/0.3.0","Connection":"close","access-control-allow-credentials":"true","Content-Type":"application/json","server":"nginx","access-control-allow-origin":"*"}},"tries":[{"state":"next","code":502,"ip":"127.0.0.1","port":8000},{"ip":"127.0.0.1","port":8000}],"authenticated_entity":{"consumer_id":"80f74eef-31b8-45d5-c525-ae532297ea8e","id":"eaa330c0-4cff-47f5-c79e-b2e4f355207e"},"route":{"created_at":1521555129,"hosts":null,"id":"75818c5f-202d-4b82-a553-6a46e7c9a19e","methods":null,"paths":["/example-path"],"preserve_host":false,"protocols":["http","https"],"regex_priority":0,"service":{"id":"0590139e-7481-466c-bcdf-929adcaaf804"},"strip_path":true,"updated_at":1521555129},"service":{"connect_timeout":60000,"created_at":1521554518,"host":"example.com","id":"0590139e-7481-466c-bcdf-929adcaaf804","name":"myservice","path":"/","port":80,"protocol":"http","read_timeout":60000,"retries":5,"updated_at":1521554518,"write_timeout":60000},"workspaces":[{"id":"b7cac81a-05dc-41f5-b6dc-b87e29b6c3a3","name":"default"}],"consumer":{"username":"demo","created_at":1491847011000,"id":"35b03bfc-7a5b-4a23-a594-aa350c585fa8"},"latencies":{"proxy":1430,"kong":9,"request":1921},"client_ip":"127.0.0.1","started_at":1433209822425}', 'kong');

INSERT INTO Raw_Json_Logs(json, log_type)
VALUES ('{}', 'other_type');
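
For a quick check, something like the following can be run against the view; it should return a single row, since the 'other_type' record stays only in Raw_Json_Logs (the view filters on log_type = 'kong'):

SELECT time, method, host, raw_response_headers
FROM Kong_Logs
FORMAT Vertical;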

