
Web Scraping a Forum Post in Python Using Beautiful Soup and lxml: Cannot Get All Posts

I have a problem that is driving me absolutely crazy. I am new to web scraping, and I am practicing by trying to scrape the contents of a forum post, i.e. the actual posts people have written. I have isolated the posts to what I believe is the div containing the text, div id="post_message_2793649" (see Screenshot_1 for a better representation of the html).

The example above is just one of many posts. Each post has its own unique identifier number, but the rest is consistently div id="post_message_.
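Since the screenshots are not reproduced here, the markup being described presumably looks something like this (a reconstruction from the description above, not the actual page source):

<td class="alt1">
    <div id="post_message_2793649">
        ...the text of the post...
    </div>
</td>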

Here is what I have so far:

import requests
from bs4 import BeautifulSoup
import lxml

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')

soup = BeautifulSoup(r.content)

data = soup.find_all("td", {"class": "alt1"})

for link in data:
    print(link.find_all('div', {'id': 'post_message'}))

The above code just produces a bunch of empty lists going down the page, which is frustrating. (See Screenshot_2 for the code I ran with its output next to it.) What am I missing?

The end result I'm looking for is simply everything that people said, contained in one long string, without any of the html clutter.

I am running Beautiful Soup 4 with the lxml parser.

You have a couple of issues. The first is that your url has multiple spaces in it, so you are not getting the page you think you are:

In [49]: import requests

In [50]: r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')

In [51]: r.url # with spaces
Out[51]: 'http://www.catforum.com/forum/'

In [52]: r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')

In [53]: r.url # without spaces
Out[53]: 'http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html'
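If you are ever unsure whether this has happened, requests keeps the redirect history, so a quick check along these lines will show it (a sketch; the exact responses depend on the server):

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-    billion-2016-a-120.html')
print(r.history)  # any redirects followed, e.g. [<Response [301]>]
print(r.url)      # the final url that was actually fetched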

The next issue is that the ids begin with post_message; not one of them is exactly equal to post_message. You can use a css selector that matches ids starting with post_message to pull all the divs you want, then just extract the text:

import requests
from bs4 import BeautifulSoup

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')

soup = BeautifulSoup(r.text, "lxml")

# select every div whose id starts with "post_message"
for div in soup.select('[id^=post_message]'):
    print(div.get_text("\n", strip=True))

Which gives you:

11301
Did you get the cortisone shots? Will they have to remove it?
My Dad and stepmom got a new Jack Russell! Her name's Daisy. She's 2 years old, and she's a rescue(d) dog. She was rescued from an abusive situation. She can't stand noise, and WILL NOT allow herself  to be picked up. They're working on that. Add to that the high-strung, hyper nature of a Jack Russell... But they love her. When I called last night, Pat was trying to teach her 'sit'!
11302
Well, I tidied, cleaned, and shopped. Rest of the list isn't done and I'm too tired and way too hot to care right now.
Miss Luna is howling outside the Space Kitten's room because I let her out and gave them their noms. SHE likes to gobble their food.....little oink.
11303
Daisy sounds like she has found a perfect new home and will realize it once she feels safe.
11304
No, Kurt, I haven't gotten the cortisone shot yet.  They want me to rest it for three weeks first to see if that helps.  Then they would try a shot and remove it if the shot doesn't work.  It might feel a smidge better today but not much.
So have you met Daisy in person yet?  She sounds like a sweetie.
And Carrie, Amelia is a piggie too.  She eats the dog food if I don't watch her carefully!
11305
I had a sore neck yesterday morning after turning it too quickly. Applied heat....took an anti-inflammatory last night. Thought I'd wake up feeling better....nope....still hurts. Grrrrrrrr.
11306
MM- Thanks for your welcome to the COUNTING thread. Would have been better if I remembered to COUNT. I've been a long time lurker on the thread but happy now to get involved in the chat.
Hope your neck is feeling better. Lily and Lola are reminding me to say 'hello' from them too.
11307
Welcome back anniegirl and Lily and Lola! We didn't scare you away! Yeah!
Nightmare afternoon. My SIL was in a car accident and he car pools with my daughter. So, in rush hour, I have to drive an hour into Vancouver to get them (I hate rush hour traffic....really hate it). Then an hour back to their place.....then another half hour to get home. Not good for the neck or the nerves (I really hate toll bridges and driving in Vancouver and did I mention rush hour traffic). At least he is unharmed. Things we do for love of our children!
11308. Hi annegirl! None of us can count either - you'll fit right in.
MM, yikes how scary. Glad he's ok, but that can't have been fun having to do all that driving, especially with an achy neck.
I note that it's the teachers on this thread whose bodies promptly went down...coincidentally once the school year was over...
DebS, how on earth are you supposed to rest your foot for 3 weeks, short of lying in bed and not moving?
MM, how is your shoulder doing? And I missed the whole goodbye to Pyro.
Gah, I hope it slowly gets easier over time as you remember that they're going to families who will love them.
I'm finally not constantly hungry, just nearly constantly.
My weight had gone under 100 lbs
so I have quite a bit of catching up to do. Because of the partial obstruction I had after the surgery, the doctor told me to try to stay on a full liquid diet for a week. I actually told him no, that I was hungry, lol. So he told me to just be careful. I have been, mostly (bacon has entered the picture 3 times in the last 3 days
) and the week expired today, so I'm off to the races.
11309
Welcome to you, annegirl, along with Lily and Lola!  We always love having new friends on our counting thread.
And Spirite, good to hear from you and I'm glad you are onto solid foods.
11310
DebS and Spirite thank you too for the Welcome. Oh MM what an ordeal with your daughter but glad everyone us on.
DevS - hope your foot is improving Its so horrible to be in pain.
Spirite - go wild on the  bacon and whatever else you fancy. I'm making a chocolate orange cheese cake to bring to a dinner party this afternoon. It has so much marscapone in it you put on weight just looking at it.
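And if you want everything in one long string, as the question asks, the pieces can simply be joined; a minimal sketch reusing the same soup:

all_text = "\n".join(
    div.get_text("\n", strip=True)
    for div in soup.select('[id^=post_message]')
)
print(all_text)  # all of the posts' text in a single string, no html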

If you wanted to use find_all, you would need to use a regular expression:

import re

import requests
from bs4 import BeautifulSoup

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')
soup = BeautifulSoup(r.text, "lxml")

# match any id that begins with "post_message"
for div in soup.find_all(id=re.compile("^post_message")):
    print(div.get_text("\n", strip=True))

The result will be the same.
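find_all also accepts a plain function as a filter, so the same match can be written without a regular expression; a sketch:

def is_post_message(tag):
    # keep only divs whose id starts with "post_message"
    return tag.name == 'div' and tag.get('id', '').startswith('post_message')

for div in soup.find_all(is_post_message):
    print(div.get_text("\n", strip=True))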

None of the ids is exactly post_message, which is why link.find_all returns an empty list. You first want to collect the ids of all the divs, then filter that list of ids, with a regular expression for example, keeping only the ones that start with post_message_ followed by a number. Then you can do the following (a fuller, self-contained sketch is given after the snippet):

for message_id in message_ids:
    print(link.find_all('div', {'id': message_id}))
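Putting that together, a minimal self-contained sketch of this second approach; the regex pattern and the variable names are illustrative assumptions, not from the original answer:

import re

import requests
from bs4 import BeautifulSoup

r = requests.get('http://www.catforum.com/forum/43-forum-fun/350938-count-one-billion-2016-a-120.html')
soup = BeautifulSoup(r.text, "lxml")

# collect the id of every div that has one, keeping only ids
# of the form post_message_<number>
pattern = re.compile(r'^post_message_\d+$')
message_ids = [div['id'] for div in soup.find_all('div', id=True)
               if pattern.match(div['id'])]

for message_id in message_ids:
    print(soup.find_all('div', {'id': message_id}))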
