
How to get the logs of a pod in OpenShift to a local file

I have my Spring Boot application running on OpenShift as a container built from a Docker image. I have enabled logging in my application and it prints all the logs. Now I want to examine the log files of the pod to check for errors, since one of my requests is failing. I know about the command-line option oc logs -f <podname>, but that just prints the log to the command prompt. I want the whole log copied from the server to a local file so that I can search for particular lines or errors. Is this possible?

You can copy files in and out of pods using the rsync command.

Or use the logs command as you are doing and just redirect the output to a file so you can search it locally:

oc logs <podname> &> /path/to/file
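Note that &> redirects both standard output and standard error (in bash). Once the log is in a local file you can search it with your usual tools; a minimal sketch, assuming you are looking for error lines (the ERROR pattern and the file path are placeholders):

grep -n "ERROR" /path/to/file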

oc rsync

Try oc rsync as suggested above:

oc rsync <pod>:/path/to/file localfilename

However, in my case I got:

WARNING: cannot use rsync: rsync not available in container <pod>

oc cp

So I tried oc cp, and it worked:

oc cp <namespace>/<pod>:/path/to/file local_filename

Without specifying the namespace, the copy command will not work (and displays no error message), so I had to find out which project the pod belongs to.

Identify a pod's project/namespace

  • <pod> is the pod name
  • <namespace> is actually the project the <pod> belongs to.
  • use oc project to show the current project, or
  • oc projects to list all projects
  • or search for the pod name in all projects: oc get pods --all-namespaces | egrep <pod> (a combined example follows below)
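Putting the lookup and the copy together, a minimal sketch (the project name my-project, pod name my-app-1-abcde, and log path /deployments/logs/app.log are hypothetical and will differ in your cluster):

oc get pods --all-namespaces | egrep my-app
# suppose this shows the pod running in project my-project
oc cp my-project/my-app-1-abcde:/deployments/logs/app.log ./app.log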

Important Note

 # !!!Important Note!!!
 # Requires that the 'tar' binary is present in your container
 # image.  If 'tar' is not present, 'kubectl cp' will fail.
 # about my environment
 # oc version
 # oc 3.6, openshift 3.7, kubernetes 1.7
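To check in advance whether tar is available in the container, you can probe it with oc exec; a minimal sketch (assumes the pod has a single container, otherwise add -c <container>):

oc exec <podname> -- tar --version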

Docs:

OpenShift 4.3 Official Documentation - copying files

From your oc CLI tool execute:

oc logs pod_name -n project_name > filename.log

That just prints the log to the command prompt, but I want the whole log copied from the server to a local file so that I can find particular lines or errors. Is this possible?

What about checking /var/log/containers on the node where the pods are running? All the container logs are there as symbolic links named in the <pod name>_<namespace>_<container name>-<hash> format. Basically, oc logs also reads the same container logs from there.

For example:

node ~# ls -1 /var/log/containers
alertmanager-main-0_openshift-monitoring_alertmanager-123...789.log
alertmanager-main-0_openshift-monitoring_alertmanager-456...123.log
alertmanager-main-0_openshift-monitoring_alertmanager-proxy-789...456.log
...
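If you are already on the node, you can search those files directly; a sketch (the ERROR pattern and the log file name are placeholders, and since the entries are symlinks, reading them may require root):

node ~# grep -n "ERROR" /var/log/containers/<podname>_<namespace>_<container>-<hash>.log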

Log in with the oc CLI tool, then switch to the project and execute:

oc logs <podname> >> <podname>.log

To download the file from the fabric pod to your local machine:

  • Connect to the fabric instance with the oc login command

    oc login <url> --token=<token>

  • Check that you can connect to the pod using a terminal

    oc rsh <podname>

    Just check that it connects to the pod and run something like ls -lh (it should give some response)

  • Copy the file from the remote pod to your local machine:

    oc rsync <podname>:<path>/logs.txt localfilename

  1. Get the oc login command.
  2. Log in to the cluster using that login command on the CLI client.
  3. Navigate to the namespace where the pod is hosted -> oc project project-name
  4. Run -> oc logs podname > pods_logs.txt

The log file is generated in the current directory.
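Putting those steps together, a minimal sketch (the cluster URL, token, project, and pod names are placeholders):

oc login <cluster-url> --token=<token>
oc project project-name
oc logs podname > pods_logs.txt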

oc logs pod_name -n project_name > filename.log

This works for me.

For me, the way it worked was to copy the folder containing the logs I wanted into the default path where the pod drops you when you log in to it, and then run:

oc cp <namespace>/<pod>:<myFolder> .
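For example, with hypothetical names (project my-project, pod my-app-1-abcde, folder logs):

oc cp my-project/my-app-1-abcde:logs ./logs
ls ./logs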
