I have set up a cluster on AWS using kops. I have installed Elasticsearch, Kibana, and Fluentd in a namespace called logging. I also created an ingress routing for the same. I can access the Kibana dashboard, but when I try to create an index, it asks for adding an integration like below. How can I connect it with ..
I have multiple microservices hosted on AWS EKS. Each service is multi-tenant, i.e. it caters to multiple tenants. I have integrated CloudWatch Container Insights using Fluentd. Fluentd creates one log stream for each active service pod. Now I want to create a separate log stream for each tenant, containing only that tenant's logs. So, ideally ..
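One way to approach per-tenant routing like this is to re-tag records by tenant and let the CloudWatch output use the tag as the stream name. A sketch only, assuming each log record carries a tenant_id field and that the fluent-plugin-rewrite-tag-filter and fluent-plugin-cloudwatch-logs plugins are installed; the match patterns and log group name are placeholders:

```
# Re-tag each record as tenant.<tenant_id>
<match containers.**>
  @type rewrite_tag_filter
  <rule>
    key tenant_id
    pattern /^(.+)$/
    tag tenant.$1
  </rule>
</match>

# One CloudWatch log stream per tag, i.e. per tenant
<match tenant.**>
  @type cloudwatch_logs
  log_group_name my-app-logs   # placeholder
  use_tag_as_stream true
  auto_create_stream true
</match>
```

Records without a tenant_id field will not match the rewrite rule, so a fallback match for untagged records is worth adding.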
I am using Docker for Windows on Windows 10 with WSL. I want to collect Docker logs using Fluentd. I figured out that Docker logs on my machine are stored here: \\wsl$\docker-desktop-data\version-pack-data\community\docker\containers. I need to bind-mount this folder into the Fluentd container. Any ideas for that? Source: Docker..
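Since bind mounts are resolved by the Docker daemon inside its own WSL VM, it is usually simpler to mount the daemon-side path /var/lib/docker/containers rather than the \\wsl$ path seen from Windows. A hedged docker-compose sketch (the Fluentd image tag and mount target are assumptions):

```yaml
services:
  fluentd:
    image: fluent/fluentd:v1.16-1
    volumes:
      # Resolved inside the Docker Desktop VM, where the JSON log files live
      - /var/lib/docker/containers:/fluentd/log/containers:ro
      - ./fluent.conf:/fluentd/etc/fluent.conf:ro
```

The Fluentd tail source can then read /fluentd/log/containers/*/*-json.log inside the container.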
I am trying to write files to S3 using the Fluentd S3 output plugin, but it gives me an Access Denied error. Here are my code and error logs. Code:

<match foo>
  @type s3
  aws_key_id *************
  aws_sec_key **************
  s3_bucket ************
  s3_region us-east-2
  path logs/
  <buffer tag,time>
    @type file
    path /var/log/fluent/s3
    timekey 3600 # 1 hour partition
    timekey_wait 10m ..
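Access Denied from fluent-plugin-s3 frequently means the credentials lack one of the permissions the plugin uses: besides s3:PutObject, it checks the bucket and object by default, which needs list/get rights. A minimal IAM policy sketch (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-log-bucket",
        "arn:aws:s3:::my-log-bucket/*"
      ]
    }
  ]
}
```

Alternatively, setting check_bucket false and check_object false in the match block restricts the plugin to pure uploads, so s3:PutObject alone suffices.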
I have multiple containers running under docker-compose, and Fluentd also runs in Docker (v1.12-1). I need to add the version from the tag of each image into its resulting Fluentd log, so that if I upgrade some containers to a new version I can see which version is running. I don't want to embed any variables, etc. ..
How do I pass the Docker image tag (docker-compose) to logs? I want to see the app version in logs and I use semver tags for that. I'm using the fluentd driver and the latest Fluentd 1.12. I was looking at the log tag option, but I don't see any way to pass the image tag there: https://docs.docker.com/config/containers/logging/log_tags/ Source: Docker..
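The log tag option linked above does support a {{.ImageName}} template, which expands to the image reference the container was started from, including its tag. A docker-compose sketch (service and image names are placeholders; whether the tag portion appears depends on how the image was referenced at run time):

```yaml
services:
  app:
    image: myapp:1.2.3   # placeholder image
    logging:
      driver: fluentd
      options:
        fluentd-address: localhost:24224
        tag: "docker.{{.ImageName}}"
```

The version then arrives as part of the Fluentd tag (e.g. docker.myapp:1.2.3), where a record_transformer filter could copy it into the record body if needed.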
Need suggestions: how can I capture container logs using stdout or stderr within a pod, for the following use case? My pod contains 3 containers, and I want the third container to capture logs using any of these logging options: Filebeat, Logstash, or Fluentd. I don't want to save logs to files within the containers. ..
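A sidecar cannot read a sibling container's stdout directly; the kubelet writes each container's stdout/stderr to files under /var/log/pods on the node, which a sidecar can tail via a hostPath mount. A hedged pod sketch (the app images are hypothetical placeholders, and hostPath access may be restricted by cluster policy):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: app-with-log-sidecar
spec:
  containers:
  - name: app
    image: myapp:latest        # hypothetical
  - name: worker
    image: myworker:latest     # hypothetical
  - name: fluentd
    image: fluent/fluentd:v1.16-1
    volumeMounts:
    - name: pod-logs
      mountPath: /var/log/pods
      readOnly: true
  volumes:
  - name: pod-logs
    hostPath:
      path: /var/log/pods      # kubelet-managed stdout/stderr files
```

Since these files exist for every pod on the node, a node-level DaemonSet is often preferred over a per-pod sidecar for this pattern.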
Problem: I have a complicated setup where I use Elasticsearch and Fluentd as part of my logging stack. I would like to add a metric and test the Fluentd config for that. How can I do that? Idea: use docker-compose to start an EFK stack and create a Docker container that writes to stdout and simulates ..
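The idea above can be sketched as a compose file pairing a Fluentd container with a throwaway log generator that ships its stdout through the fluentd logging driver (image tags and the emitted JSON shape are assumptions; Elasticsearch and Kibana services would be added alongside):

```yaml
services:
  fluentd:
    image: fluent/fluentd:v1.16-1
    ports:
      - "24224:24224"
    volumes:
      - ./fluent.conf:/fluentd/etc/fluent.conf:ro

  log-writer:
    image: busybox
    command: sh -c 'while true; do echo "{\"metric\":\"test\",\"value\":1}"; sleep 1; done'
    depends_on:
      - fluentd
    logging:
      driver: fluentd
      options:
        fluentd-address: localhost:24224
```

Swapping the echoed JSON line lets the same harness exercise different filter and metric rules in fluent.conf.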
My configuration of fluent-plugin-concat is causing my long logs to disappear instead of being concatenated and sent to a Kinesis stream. I use Fluentd to send logs from containers deployed on AWS ECS to a Kinesis stream (and then to an ES cluster somewhere). On rare occasions, some of the logs are very big. Most of ..
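A common cause of logs vanishing with fluent-plugin-concat is its flush timeout: buffered partial lines are emitted as errors (and effectively dropped) unless a timeout_label routes them onward. A hedged sketch (the start regexp, tags, stream name, and region are placeholder assumptions):

```
<filter app.**>
  @type concat
  key log
  multiline_start_regexp /^\d{4}-\d{2}-\d{2}/   # placeholder line-start pattern
  flush_interval 5
  timeout_label @OUTPUT   # route timed-out partials instead of dropping them
</filter>

<match app.**>
  @type relabel
  @label @OUTPUT
</match>

<label @OUTPUT>
  <match app.**>
    @type kinesis_streams
    stream_name my-stream   # placeholder
    region us-east-1        # placeholder
  </match>
</label>
```

With this shape, both completed concatenations and timeout flushes pass through the same @OUTPUT label to Kinesis.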
I am trying to collect my Docker container logs with Fluentd. Both the application and the Fluentd process start through supervisord, and both are in the same container, but Fluentd only picks up half of the application logs. I need to fetch the logs from the beginning. Adding the Fluentd conf below:

<source>
  type tail
  path /var/log/*
  path_key path ..
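By default the tail input starts reading new files from the end, so anything written before Fluentd attaches is skipped. Setting read_from_head addresses the "logs from the beginning" requirement; a minimal sketch (tag and pos_file path are assumptions):

```
<source>
  @type tail
  path /var/log/*
  path_key path
  pos_file /var/log/fluentd/app.pos  # remembers read position across restarts
  read_from_head true                # start from the top of each file
  tag app.logs                       # placeholder tag
  <parse>
    @type none
  </parse>
</source>
```

A pos_file is worth keeping even with read_from_head, so restarts resume where they left off instead of re-reading everything.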