
Web Scraping a Forum Post in Python Using Beautiful Soup and lxml — Cannot Get All Posts

I'm having an issue that is driving me absolutely crazy. I am new to web scraping, and I am practicing by trying to scrape the contents of a forum thread, namely the actual posts people made. I have isolated each post to what I think is the element containing its text, which is div id="post_message_2793649" (see Screenshot_1 for a better view of the HTML).

The example above is just one of many posts. Each post has its own unique identifier number, but every id consistently starts with div id="post_message_.

Here is where I am currently stuck:

import requests
from bs4 import BeautifulSoup
import lxml

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')

soup = BeautifulSoup(r.content)

data = soup.find_all("td", {"class": "alt1"})

for link in data:
    print(link.find_all('div', {'id': 'post_message'}))

The above code just prints a bunch of empty lists down the page, which is so frustrating (see Screenshot_2 for the code I ran alongside its output). What am I missing?

The end result I am looking for is all the post contents in one long string, without any of the HTML clutter.

I am using Beautiful Soup 4 with the lxml parser.

You have a couple of issues. The first is that you have multiple spaces in the URL, so you are not requesting the page you think you are — the site redirects you to the forum index instead:

In [50]: import requests

In [51]: r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')

In [52]: r.url # with spaces
Out[52]: 'http://www.catforum.com/forum/'

In [53]: r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')

In [54]: r.url # without spaces
Out[54]: 'http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html'

The next issue is that the ids start with post_message; none is equal to post_message exactly. You can use a CSS attribute selector that matches ids starting with post_message to pull all the divs you want, then just extract the text:

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')

soup = BeautifulSoup(r.text, 'lxml')

# [id^=post_message] matches any element whose id starts with "post_message"
for div in soup.select('[id^=post_message]'):
    print(div.get_text("\n", strip=True))

Which will give you:

11311301
Did you get the cortisone shots? Will they have to remove it?
My Dad and stepmom got a new Jack Russell! Her name's Daisy. She's 2 years old, and she's a rescue(d) dog. She was rescued from an abusive situation. She can't stand noise, and WILL NOT allow herself  to be picked up. They're working on that. Add to that the high-strung, hyper nature of a Jack Russell... But they love her. When I called last night, Pat was trying to teach her 'sit'!
11302
Well, I tidied, cleaned, and shopped. Rest of the list isn't done and I'm too tired and way too hot to care right now.
Miss Luna is howling outside the Space Kitten's room because I let her out and gave them their noms. SHE likes to gobble their food.....little oink.
11303
Daisy sounds like she has found a perfect new home and will realize it once she feels safe.
11304
No, Kurt, I haven't gotten the cortisone shot yet.  They want me to rest it for three weeks first to see if that helps.  Then they would try a shot and remove it if the shot doesn't work.  It might feel a smidge better today but not much.
So have you met Daisy in person yet?  She sounds like a sweetie.
And Carrie, Amelia is a piggie too.  She eats the dog food if I don't watch her carefully!
11305
I had a sore neck yesterday morning after turning it too quickly. Applied heat....took an anti-inflammatory last night. Thought I'd wake up feeling better....nope....still hurts. Grrrrrrrr.
11306
MM- Thanks for your welcome to the COUNTING thread. Would have been better if I remembered to COUNT. I've been a long time lurker on the thread but happy now to get involved in the chat.
Hope your neck is feeling better. Lily and Lola are reminding me to say 'hello' from them too.
11307
Welcome back anniegirl and Lily and Lola! We didn't scare you away! Yeah!
Nightmare afternoon. My SIL was in a car accident and he car pools with my daughter. So, in rush hour, I have to drive an hour into Vancouver to get them (I hate rush hour traffic....really hate it). Then an hour back to their place.....then another half hour to get home. Not good for the neck or the nerves (I really hate toll bridges and driving in Vancouver and did I mention rush hour traffic). At least he is unharmed. Things we do for love of our children!
11308. Hi annegirl! None of us can count either - you'll fit right in.
MM, yikes how scary. Glad he's ok, but that can't have been fun having to do all that driving, especially with an achy neck.
I note that it's the teachers on this thread whose bodies promptly went down...coincidentally once the school year was over...
DebS, how on earth are you supposed to rest your foot for 3 weeks, short of lying in bed and not moving?
MM, how is your shoulder doing? And I missed the whole goodbye to Pyro.
Gah, I hope it slowly gets easier over time as you remember that they're going to families who will love them.
I'm finally not constantly hungry, just nearly constantly.
My weight had gone under 100 lbs
so I have quite a bit of catching up to do. Because of the partial obstruction I had after the surgery, the doctor told me to try to stay on a full liquid diet for a week. I actually told him no, that I was hungry, lol. So he told me to just be careful. I have been, mostly (bacon has entered the picture 3 times in the last 3 days
) and the week expired today, so I'm off to the races.
11309
Welcome to you, annegirl, along with Lily and Lola!  We always love having new friends on our counting thread.
And Spirite, good to hear from you and I'm glad you are onto solid foods.
11310
DebS and Spirite thank you too for the Welcome. Oh MM what an ordeal with your daughter but glad everyone us on.
DevS - hope your foot is improving Its so horrible to be in pain.
Spirite - go wild on the  bacon and whatever else you fancy. I'm making a chocolate orange cheese cake to bring to a dinner party this afternoon. It has so much marscapone in it you put on weight just looking at it.

If you wanted to use find_all, you would need to use a regex:

import re

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')
soup = BeautifulSoup(r.text, 'lxml')

# match any id beginning with "post_message"
for div in soup.find_all(id=re.compile("^post_message")):
    print(div.get_text("\n", strip=True))

The result will be the same.
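Either approach can then feed the question's end goal of one long string of post text. A minimal, self-contained sketch (using made-up snippet HTML and the built-in html.parser in place of the live page, so the ids here are illustrative):

```python
import re
from bs4 import BeautifulSoup

# Stand-in for r.text from the real thread page.
html = """
<div id="post_message_2793649">First post text.</div>
<div id="post_message_2793650">Second post text.</div>
<div id="sidebar">navigation clutter</div>
"""

soup = BeautifulSoup(html, "html.parser")

# Grab the text of every post div, skipping non-post elements,
# then join everything into one long string.
posts = [div.get_text(" ", strip=True)
         for div in soup.find_all(id=re.compile(r"^post_message_\d+"))]
all_text = "\n".join(posts)
print(all_text)
```

The join step is what turns the per-div results into the single clutter-free string the question asks for.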

There's nothing with the id post_message, so link.find_all returns an empty list. You'll first want to grab all of the ids within all the divs, then filter that list of ids with a regex (for example) to keep only those that start with post_message_ followed by a number. Then you can do:

for message_id in message_ids:
    print(link.find_all('div', {'id': message_id}))
