
How do I save ALL the cookies from a specific site using curl or wget?

Basically I am trying to do specific things on a certain website. The first thing I want to do is log in and save the cookies, which I have done successfully with something like this:

    curl "http://site/login" --data "email=email" --data "password=password" --cookie "cookies.txt" --cookie-jar "cookies.txt" --location > login.html

However, this method of storing the cookies only gives me fields such as kayla and djcs_route. The site has other fields that I need, such as djcs_session and s_vnum, but the cookie jar saved by the command above doesn't contain them. When I look at the site's cookies in Google Chrome, I can see all of these fields. So my question is: is there a way to obtain all the fields?

I've attached a picture of the cookie fields that I want. Is there a way to obtain the contents of the cookies and save them to a text file?

I am NOT trying to extract cookies from the Chrome browser; I am trying to emulate what it does and save all the session cookies to cookies.txt using curl.
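For reference, curl's cookie jar uses the Netscape cookie-file format, in which a session cookie is simply one whose expiry column is 0 — a quick way to check whether session cookies were saved at all. A minimal sketch (cookie names and values are made up; the real file is tab-separated, but awk's default whitespace splitting also accepts the spaces used here for readability):

```shell
# Sketch of curl's Netscape cookie-jar format (made-up names/values).
cat > cookies.txt <<'EOF'
# Netscape HTTP Cookie File
.site.example TRUE / FALSE 0 djcs_session abc123
.site.example TRUE / FALSE 1893456000 djcs_route xyz789
EOF
# Session cookies have 0 in the expiry column (field 5):
awk '!/^#/ && $5 == 0 {print $6}' cookies.txt
```

As long as the same jar is passed back with `--cookie` on the next request, curl resends these session cookies just as a browser would within one browsing session.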

This is not much of a programming question, so it might be more appropriate on another site such as Unix & Linux or Super User. That said, the reason your cookies.txt does not contain all the cookies is that Chrome does not store temporary session cookies on disk, and those make up a large part of the 18 cookies you see.

You can use any available third-party cookie-manager extension for Chrome to export those session cookies to a text file.
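Alternatively, stay inside curl: dump the response headers with `-D` and check which cookies the server actually sets over HTTP. Anything visible in Chrome but absent from these headers (often analytics fields such as s_vnum) is typically created by page JavaScript, which curl never executes, so no cookie jar curl writes will ever contain it. A sketch using a canned response in place of a live server (the real invocation, with a placeholder URL, is shown commented out):

```shell
# Real invocation would be (placeholder URL):
#   curl -sD headers.txt -o /dev/null -c cookies.txt -L "http://site/login"
# Canned response standing in for the server here:
cat > headers.txt <<'EOF'
HTTP/1.1 200 OK
Set-Cookie: djcs_session=abc123; Path=/; HttpOnly
Set-Cookie: djcs_route=xyz789; Path=/; Expires=Wed, 01 Jan 2031 00:00:00 GMT
EOF
# Extract the names of cookies set over HTTP:
sed -n 's/^Set-Cookie: \([^=]*\)=.*/\1/p' headers.txt
```

If a cookie name shows up here but not in cookies.txt, that points to a curl usage problem; if it never shows up here at all, it is being set client-side and exporting it from the browser is the only option.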

