Loki competes with EFK/ELK and has become popular thanks to its lightweight design. Unlike EFK/ELK, Loki does not index the raw log content; it only indexes the labels attached to each log stream, and the logs themselves are stored compressed, usually on a plain filesystem. This makes it much cheaper to operate and far more efficient.
Because Loki's storage is filesystem-based, searching works on the log content itself, i.e. the text of each log line. Queries are written in LogQL: in the search window you first narrow the logs down by filtering on labels, then match against the line content.
The stack has two parts: Loki is the log engine and Promtail is the agent that collects and ships the logs; Grafana is used for display.
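As a hedged example, this is roughly what such queries look like in Grafana's Explore view once the stack below is running (the job="varlogs" label is the one defined in the Promtail configuration later in this article):
{job="varlogs"}              selects every log line carrying the label job="varlogs"
{job="varlogs"} |= "error"   narrows that stream down to lines containing the string "error"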
1. Install Grafana
wget https://dl.grafana.com/oss/release/grafana-8.2.5.linux-amd64.tar.gz
tar -zxvf grafana-8.2.5.linux-amd64.tar.gz
mv grafana-8.2.5 /usr/local/grafana
Create a systemd service
cat>/usr/lib/systemd/system/grafana-server.service<<EOF
[Unit]
Description=Grafana Server
After=network.target
[Service]
Type=simple
User=root
WorkingDirectory=/usr/local/grafana
ExecStart=/usr/local/grafana/bin/grafana-server
Restart=on-failure
LimitNOFILE=65536
[Install]
WantedBy=multi-user.target
EOF
Enable the service at boot and start it
systemctl daemon-reload
systemctl enable grafana-server.service && systemctl start grafana-server.service
/usr/local/grafana/conf/defaults.ini is the default configuration file.
The HTTP port defaults to 3000.
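A quick way to confirm Grafana is up (a sketch; /api/health is Grafana's built-in health endpoint and 3000 is the default port):
systemctl is-active grafana-server.service
curl -s http://127.0.0.1:3000/api/health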
Loki official documentation:
https://grafana.com/docs/loki/latest/installation
Install Loki
According to the official documentation, Loki can be installed from source, with Docker, Helm, Local (pre-built binaries), or Tanka.
I chose Local, i.e. the pre-compiled binary executable.
Installation steps:
Mirror site:
https://hub.fastgit.org/grafana
1. Download the binary executable
https://hub.fastgit.org/grafana/loki/releases/download/v2.4.1/loki-linux-amd64.zip
unzip loki-linux-amd64.zip && mv loki-linux-amd64 /usr/bin/loki
chmod +x /usr/bin/loki
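Optionally confirm the binary runs; -version prints the build information:
loki -version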
Create the working directory and download the configuration file
mkdir -p /data/loki
Reference configuration file:
https://hub.fastgit.org/grafana/loki/blob/main/cmd/loki/loki-local-config.yaml
The final loki.yml looks like this:
vim /data/loki/loki.yml
auth_enabled: false

server:
  http_listen_port: 3100
  grpc_listen_port: 9096

common:
  path_prefix: /data/loki
  storage:
    filesystem:
      chunks_directory: /data/loki/chunks
      rules_directory: /data/loki/rules
  replication_factor: 1
  ring:
    instance_addr: 127.0.0.1
    kvstore:
      store: inmemory

schema_config:
  configs:
    - from: 2020-10-24
      store: boltdb-shipper
      object_store: filesystem
      schema: v11
      index:
        prefix: index_
        period: 24h

## Alertmanager address for the ruler
ruler:
  alertmanager_url: http://localhost:9093
To enable Redis caching, use this configuration instead:
auth_enabled: false

server:
  http_listen_port: 3100
  grpc_listen_port: 9096

common:
  path_prefix: /data/loki
  storage:
    filesystem:
      chunks_directory: /data/loki/chunks
      rules_directory: /data/loki/rules
  replication_factor: 1
  ring:
    instance_addr: 127.0.0.1
    kvstore:
      store: inmemory

schema_config:
  configs:
    - from: 2020-10-24
      store: boltdb-shipper
      object_store: filesystem
      schema: v11
      index:
        prefix: index_
        period: 24h

frontend:
  compress_responses: true

query_range:
  split_queries_by_interval: 24h
  results_cache:
    cache:
      redis:
        endpoint: 192.168.1.6:6379
        expiration: 10s
        db: 1
  cache_results: true

chunk_store_config:
  chunk_cache_config:
    redis:
      endpoint: 192.168.1.6:6379
      expiration: 10s
      db: 1
  write_dedupe_cache_config:
    redis:
      endpoint: 192.168.1.6:6379
      expiration: 1h
      db: 2

ruler:
  alertmanager_url: http://localhost:9093
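The Redis endpoint above (192.168.1.6:6379) is assumed to exist already; if redis-cli is installed, a quick reachability check:
redis-cli -h 192.168.1.6 -p 6379 ping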
Create the related directories
mkdir -p /data/loki/{chunks,rules,logs}
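Since the ruler stores its rules under /data/loki/rules, alerting rules can be dropped there as well. A minimal sketch, assuming the usual one-directory-per-tenant layout (with auth_enabled: false the tenant directory is normally "fake"); the rule name and threshold are made up, and depending on the Loki version the ruler may need extra settings to pick this up:
mkdir -p /data/loki/rules/fake
cat >/data/loki/rules/fake/alerts.yml<<EOF
groups:
  - name: demo
    rules:
      - alert: TooManyErrors
        # fires when the varlogs stream averages more than 10 "error" lines per second over 5 minutes
        expr: sum(rate({job="varlogs"} |= "error" [5m])) > 10
        for: 5m
        labels:
          severity: warning
EOF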
Start Loki
nohup loki --config.file=/data/loki/loki.yml > /data/loki/logs/loki.log 2>&1 &
Startup script
#!/bin/bash
# Start Loki in the background if it is not already running, and record its PID.
CONFILE='/data/loki/loki.yml'
PIDFILE='/data/loki/loki.pid'
LOGFILE='/data/loki/loki.log'
OPTS='/usr/bin/loki --config.file='
PID=$(ps -ef | grep "${CONFILE}" | grep 'config.file' | awk '{print $2}')
if [ -z "${PID}" ]; then
    nohup ${OPTS}${CONFILE} >"${LOGFILE}" 2>&1 &
    echo $! >"${PIDFILE}"
    sleep 3
    PID=$(ps -ef | grep "${CONFILE}" | grep 'config.file' | awk '{print $2}')
    echo "loki PID:${PID}"
    if [ -z "${PID}" ]; then
        echo "loki config error, tail -f ${LOGFILE}!"
    fi
else
    echo "loki is running, PID:$(cat ${PIDFILE})"
fi
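Alternatively, Loki can be run under systemd just like Grafana above instead of nohup (a sketch; paths match this article):
cat >/usr/lib/systemd/system/loki.service<<EOF
[Unit]
Description=Loki Log Aggregation
After=network.target
[Service]
Type=simple
User=root
ExecStart=/usr/bin/loki --config.file=/data/loki/loki.yml
Restart=on-failure
LimitNOFILE=65536
[Install]
WantedBy=multi-user.target
EOF
systemctl daemon-reload && systemctl enable loki.service && systemctl start loki.service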
Install Promtail to collect logs
Promtail is the log collection agent.
Download and install
wget https://hub.fastgit.org/grafana/loki/releases/download/v2.4.1/promtail-linux-amd64.zip
unzip promtail-linux-amd64.zip && mv promtail-linux-amd64 /usr/bin/promtail && chmod +x /usr/bin/promtail
Reference configuration file (from the official documentation):
https://hub.fastgit.org/grafana/loki/blob/main/clients/cmd/promtail/promtail-local-config.yaml
server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://localhost:3100/loki/api/v1/push

scrape_configs:
  - job_name: system
    static_configs:
      - targets:
          - localhost
        labels:
          job: varlogs
          __path__: /var/log/*log
Parameter description
The server section defines the listening ports, positions defines where the read offsets of the tailed files are stored, clients defines the Loki push API address, and the last section, scrape_configs, is the important part.
Promtail configures what to collect through scrape_configs. Taking the test configuration above as an example:
job_name distinguishes log groups
static_configs is the static configuration for log collection
targets lists the nodes to collect logs from; this parameter is really meant for service discovery
labels defines a log file to collect together with an optional set of extra labels
job is a label name, used as an index label when searching in Grafana
__path__ defines the file or path to collect logs from; glob patterns (wildcards) are supported
Once the configuration file is ready, Promtail can be started the same way as Loki, by pointing --config.file at the configuration file.
Promtail works like tail: it only follows newly appended log lines and, unlike Filebeat, does not read a file's entire existing content; this is one difference from Filebeat.
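Those offsets live in the positions file configured above; checking it shows how far Promtail has read (each entry maps a log file path to a byte offset):
cat /tmp/positions.yaml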
Final configuration, enabling both static targets and file-based discovery
mkdir -p /data/promtail
vim /data/promtail/promtail.yml
server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://192.168.1.6:3100/loki/api/v1/push

scrape_configs:
  - job_name: system
    static_configs:
      - targets:
          - localhost
        labels:
          job: varlogs
          __path__: /var/log/*log
    file_sd_configs:
      - files:
          - /data/promtail/log_file/*.json
        refresh_interval: 1m
Startup
Create the related files
mkdir -p /data/promtail/log_file/
File discovery configuration
vim /data/promtail/log_file/mail.json
[
  {
    "targets": [ "localhost" ],
    "labels": {
      "__path__": "/var/log/maillog",
      "job": "mailserver"
    }
  }
]
Start in the background
nohup promtail --config.file=/data/promtail/promtail.yml >/data/promtail/promtail.log 2>&1 &
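nohup alone will not survive a reboot; for an actual start-at-boot setup, a systemd unit analogous to the Grafana one above could be used (a sketch; paths match this article):
cat >/usr/lib/systemd/system/promtail.service<<EOF
[Unit]
Description=Promtail Log Collector
After=network.target
[Service]
Type=simple
User=root
ExecStart=/usr/bin/promtail --config.file=/data/promtail/promtail.yml
Restart=on-failure
[Install]
WantedBy=multi-user.target
EOF
systemctl daemon-reload && systemctl enable promtail.service && systemctl start promtail.service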
Startup script
vim /data/promtail/start.sh
#!/bin/bash
# Start Promtail in the background if it is not already running, and record its PID.
CONFILE='/data/promtail/promtail.yml'
PIDFILE='/data/promtail/promtail.pid'
LOGFILE='/data/promtail/promtail.log'
OPTS='/usr/bin/promtail --config.file='
PID=$(ps -ef | grep "${CONFILE}" | grep 'config.file' | awk '{print $2}')
if [ -z "${PID}" ]; then
    nohup ${OPTS}${CONFILE} >"${LOGFILE}" 2>&1 &
    echo $! >"${PIDFILE}"
    sleep 3
    PID=$(ps -ef | grep "${CONFILE}" | grep 'config.file' | awk '{print $2}')
    echo "promtail PID:${PID}"
    if [ -z "${PID}" ]; then
        echo "promtail config error, tail -f ${LOGFILE}!"
    fi
else
    echo "promtail is running, PID:$(cat ${PIDFILE})"
fi
chmod +x /data/promtail/start.sh
Check that everything is running
[root@openrestry promtail]# netstat -ntpl | grep loki
tcp6 0 0 :::9096 :::* LISTEN 1721/loki
tcp6 0 0 :::3100 :::* LISTEN 1721/loki
[root@openrestry promtail]# netstat -ntpl | grep promtail
tcp6 0 0 :::40210 :::* LISTEN 1833/promtail
tcp6 0 0 :::9080 :::* LISTEN 1833/promtail
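Beyond the listening ports, the services can be probed directly (a sketch; the ports and the job label match the configuration above):
curl -s http://127.0.0.1:3100/ready
curl -s http://127.0.0.1:9080/metrics | grep -c '^promtail_'
curl -G -s http://127.0.0.1:3100/loki/api/v1/query_range --data-urlencode 'query={job="varlogs"}' --data-urlencode 'limit=5'
The first command hits Loki's readiness endpoint, the second counts Promtail's own metrics, and the last runs a LogQL query through Loki's HTTP API.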
Add the display output in Grafana
Log in to http://Grafana:3000 and add a Loki data source.
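The data source can also be added without the UI, through Grafana's HTTP API (a sketch; admin:admin is the default first-login account and the Loki URL matches this setup):
curl -s -X POST http://admin:admin@127.0.0.1:3000/api/datasources -H 'Content-Type: application/json' -d '{"name":"Loki","type":"loki","url":"http://192.168.1.6:3100","access":"proxy"}'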