
Combining Beautiful Soup find_all list into one element

I have created a script that parses through emails on a weekly basis looking for tables within specific emails. I know that I want the content inside table tags with a specific class name. The goal is then to take those tables, essentially concatenate them with a <br/> tag in between, and put the result into another email that is sent automatically every week.

What I have so far is the actual email scraping and the sending of the email at the end, but I just don't know how to combine the results of a find_all into one element. I'm obviously open to different approaches, which is why I phrased the question this way.

What I have for code is this:

from bs4 import BeautifulSoup

def parse_messages(enhance_str):
    # Parse the raw email HTML and collect every table with the target class
    soup = BeautifulSoup(enhance_str, 'html.parser')
    table = soup.find_all('table', {'class': 'MsoNormalTable'})
    return table

which gives me a list-like object (I know find_all returns a subclass of list), but none of the list methods I know work for this. I thought I could just do something like

'<br/>'.join(table)

but this throws an AttributeError.

I'm sure there is a simple answer, but I can't see it. Any help is greatly appreciated.

EDIT: As a clarification, I was just trying to preserve the HTML structure of these tables so I can pop them into a new email and send them as is. The solution below works for me, so I'm marking it as the accepted answer.

Thanks for the help!

The elements in the list returned by soup.find_all are bs4.element.Tag objects, not strings, so you can't join them together as-is to make a string.

I'm not sure exactly what you're after, but if you want to turn them all into a single str, you can iterate over the Tag objects, call str on each to get its string representation, and then join them:

'<br/>'.join([str(tag) for tag in table])
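
If it helps, here is a minimal end-to-end sketch of how the joined string could be dropped into an HTML email body. The sample HTML, the subject line, and the MIMEText wrapping are placeholders for your existing scraping and sending code, so adjust them to fit:

from bs4 import BeautifulSoup
from email.mime.text import MIMEText

def parse_messages(enhance_str):
    # Collect every table styled with Outlook's MsoNormalTable class
    soup = BeautifulSoup(enhance_str, 'html.parser')
    return soup.find_all('table', {'class': 'MsoNormalTable'})

def build_email_body(enhance_str):
    # str(tag) keeps each table's original markup intact, so the tables
    # render as-is in the outgoing email, separated by <br/> tags
    tables = parse_messages(enhance_str)
    return '<br/>'.join(str(tag) for tag in tables)

# Stand-in for the HTML scraped from the source emails
raw_email_html = '''
<html><body>
  <table class="MsoNormalTable"><tr><td>First table</td></tr></table>
  <p>Some text between the tables</p>
  <table class="MsoNormalTable"><tr><td>Second table</td></tr></table>
</body></html>
'''

body = build_email_body(raw_email_html)

# Wrap the combined HTML in a MIME part and hand it to your existing
# sending code (e.g. smtplib); the subject here is just a placeholder
msg = MIMEText(body, 'html')
msg['Subject'] = 'Weekly table digest'
print(body)

Since str(tag) preserves the full HTML of each table, the new email keeps the original table structure, which is what you said you wanted in your edit.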
