
How can I push text log files into Cloud Logging?

I have an application (Automation Anywhere A360) that, whenever I want to log something, writes the entry to a txt/csv file. I run a process in Automation Anywhere concurrently on 10 bot runners (Windows VMs), so each bot runner logs what is going on locally.

My intention is that instead of having separate log files for each bot runner, I'd like to have a centralized place where I store all the logs (i.e. Cloud Logging).

I know that this can be accomplished using Python, Java, etc. However, if I invoke a Python script every time I need to log something into Cloud Logging, it does the job, but it takes around 2-3 seconds per call, with most of that time spent creating the GCP client and authenticating. I think this is a bit slow.
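One way to avoid paying the client/auth setup cost on every call is to keep a single long-running forwarder per bot runner that tails the local log file and pushes new lines through one persistent Cloud Logging client. Below is a minimal sketch of that idea; the log name `bot-runner`, the file path, and the severity-guessing heuristic are my own assumptions, not part of the question.

```python
# Sketch: tail a bot runner's local log file and forward each new line to
# Cloud Logging through ONE persistent client, so the 2-3 s setup cost is
# paid once at startup instead of once per message.
import time

SEVERITIES = ("DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL")

def parse_severity(line: str, default: str = "INFO") -> str:
    """Guess a Cloud Logging severity from the text of a log line (heuristic)."""
    upper = line.upper()
    for sev in SEVERITIES:
        if sev in upper:
            return sev
    if "WARN" in upper:  # common short form
        return "WARNING"
    return default

def tail_and_forward(path: str, logger) -> None:
    """Follow `path` like `tail -f` and send each new non-empty line onward.

    `logger` is anything with a log_text(text, severity=...) method, e.g. a
    google-cloud-logging Logger.
    """
    with open(path, "r", encoding="utf-8") as f:
        f.seek(0, 2)  # start at end of file; only forward lines written from now on
        while True:
            line = f.readline()
            if not line:
                time.sleep(1.0)  # no new data yet; poll again shortly
                continue
            text = line.rstrip("\n")
            if text:
                logger.log_text(text, severity=parse_severity(text))
```

A driver for it (assuming the `google-cloud-logging` package is installed and Application Default Credentials are configured on the VM) would create the client once and reuse it: `client = google.cloud.logging.Client()`, `logger = client.logger("bot-runner")`, then `tail_and_forward(r"C:\bots\run.log", logger)` — both the log name and path here are placeholders.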

How would you guys tackle this?

To whom it may (still) concern: you could use fluentd to forward the logs to Pub/Sub, and from there to a Cloud Logging bucket.
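As a rough illustration of the fluentd approach, each bot runner could tail its local log files with the built-in `in_tail` input and hand the records to an output plugin. The config below is a sketch only: it assumes the `fluent-plugin-google-cloud` output plugin (which writes straight to Cloud Logging; for the Pub/Sub route in the linked article you would swap in a Pub/Sub output plugin instead), and the paths and tag are placeholders.

```
# Per-VM fluentd sketch (assumes fluent-plugin-google-cloud is installed
# and the VM has credentials for Cloud Logging).
<source>
  @type tail
  path C:/bots/logs/*.txt        # placeholder: local bot runner log files
  pos_file C:/fluentd/bot.pos    # remembers how far each file has been read
  tag bot.runner
  <parse>
    @type none                   # forward each line as raw text
  </parse>
</source>

<match bot.**>
  @type google_cloud             # ships records to Cloud Logging
</match>
```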

https://flugel.it/infrastructure-as-code/how-to-setup-fluentd-to-retrieve-logs-send-them-to-gcp-pub-sub-to-finally-push-them-to-elasticsearch/

The solution that I am looking for is something like this. It is named BindPlane, and it can collect log data from on-premises and hybrid infrastructure and send it to the GCP monitoring/logging stack.

