
Scrapy pipeline.py not inserting items to MYSQL from spider

I am using Scrapy to scrape news headlines, and I am new to scraping and Scrapy in general. For a few days now I have had huge problems piping my scraped data into my SQL database. There are two classes in my pipelines.py file: one for inserting items into the database, and one for backing up the scraped data to a JSON file for front-end web development reasons.

This is the code for my spider. It extracts the news headlines from the start_urls, pulls that data out as strings using extract(), and then loops over all of it, using strip() to remove whitespace for better formatting.

from scrapy.spider import Spider
from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor
from scrapy.selector import Selector
from scrapy.item import Item
from Aljazeera.items import AljazeeraItem
from datetime import date, datetime


class AljazeeraSpider(Spider):
    name = "aljazeera"
    allowed_domains = ["aljazeera.com"]
    start_urls = [
        "http://www.aljazeera.com/news/europe/",
        "http://www.aljazeera.com/news/middleeast/",
        "http://www.aljazeera.com/news/asia/",
        "http://www.aljazeera.com/news/asia-pacific/",
        "http://www.aljazeera.com/news/americas/",
        "http://www.aljazeera.com/news/africa/",
        "http://blogs.aljazeera.com/"

    ]

    def parse(self, response):
        sel = Selector(response)
        sites = sel.xpath('//td[@valign="bottom"]')
        contents = sel.xpath('//div[@class="indexSummaryText"]')
        items = []

        for site,content in zip(sites, contents):
            item = AljazeeraItem()
            item['headline'] = site.xpath('div[3]/text()').extract()
            item['content'] = site.xpath('div/a/text()').extract()
            item['date'] = str(date.today())
            for headline, content in zip(item['content'], item['headline']):
              item['headline'] = headline.strip()
              item['content'] = content.strip()
              items.append(item)
        return items
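The zip-and-strip step described above can be exercised in isolation, independent of Scrapy. The sample strings below are hypothetical stand-ins for what Selector.extract() would return:

```python
# Hypothetical extracted strings, as Selector.extract() would return them
# (with surrounding whitespace from the page markup).
headlines = ["  Turkey court says Twitter ban violates rights \n"]
contents = ["  Although ruling by Turkey's highest court is binding, it is unclear whether the government will overturn the ban. \n"]

# Pair each headline with its summary and strip surrounding whitespace.
cleaned = [(h.strip(), c.strip()) for h, c in zip(headlines, contents)]
print(cleaned[0][0])  # -> Turkey court says Twitter ban violates rights
```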

The code for my pipeline.py is as follows:

import sys
import MySQLdb
import hashlib
from scrapy.exceptions import DropItem
from scrapy.http import Request
import json
import os.path

class SQLStore(object):
  def __init__(self):
    self.conn = MySQLdb.connect(user='root', passwd='', db='aj_db', host='localhost', charset="utf8", use_unicode=True)
    self.cursor = self.conn.cursor()
    #log data to json file


def process_item(self, item, spider): 

    try:
        self.cursor.execute("""INSERT INTO scraped_data(headlines, contents, dates) VALUES (%s, %s, %s)""", (item['headline'].encode('utf-8'), item['content'].encode('utf-8'), item['date'].encode('utf-8')))
        self.conn.commit()

    except MySQLdb.Error, e:
        print "Error %d: %s" % (e.args[0], e.args[1])

        return item



#log runs into back file 
class JsonWriterPipeline(object):

    def __init__(self):
        self.file = open('backDataOfScrapes.json', "w")

    def process_item(self, item, spider):
        line = json.dumps(dict(item)) + "\n"
        self.file.write("item === " + line)
        return item

And settings.py looks like this:

BOT_NAME = 'Aljazeera'

SPIDER_MODULES = ['Aljazeera.spiders']
NEWSPIDER_MODULE = 'Aljazeera.spiders'

# Crawl responsibly by identifying yourself (and your website) on the user-agent
#USER_AGENT = 'Aljazeera (+http://www.yourdomain.com)'

ITEM_PIPELINES = {
    'Aljazeera.pipelines.JsonWriterPipeline': 300,
    'Aljazeera.pipelines.SQLStore': 300,
}
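As a side note, Scrapy runs item pipelines in ascending order of the priority value, and both pipelines above share the value 300. Giving them distinct values makes the execution order explicit (the value 400 here is an arbitrary choice, not something the question requires):

```python
ITEM_PIPELINES = {
    'Aljazeera.pipelines.JsonWriterPipeline': 300,  # runs first (lower value)
    'Aljazeera.pipelines.SQLStore': 400,            # runs second
}
```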

My SQL setup is all fine. After running scrapy crawl aljazeera it even outputs the items in JSON format, like this:

item === {"headline": "Turkey court says Twitter ban violates rights", "content": "Although ruling by Turkey's highest court is binding, it is unclear whether the government will overturn the ban.", "date": "2014-04-02"}

I really don't know or can't see what I am missing here. I would really appreciate it if you guys could help me out.

Thanks for your time,

Your indentation is wrong in the SQLStore pipeline. I have tested it with the correct indentation and it works fine. Copy the code below and it should work perfectly.

class SQLStore(object):
    def __init__(self):
        self.conn = MySQLdb.connect(user='root', passwd='', db='aj_db', host='localhost', charset="utf8", use_unicode=True)
        self.cursor = self.conn.cursor()

    def process_item(self, item, spider):
        try:
            self.cursor.execute("""INSERT INTO scraped_data(headlines, contents, dates) VALUES (%s, %s, %s)""", (item['headline'].encode('utf-8'), item['content'].encode('utf-8'), item['date'].encode('utf-8')))
            self.conn.commit()
        except MySQLdb.Error, e:
            print "Error %d: %s" % (e.args[0], e.args[1])
        return item
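The insert-and-commit flow above can be sanity-checked without a running MySQL server by swapping in the standard-library sqlite3 module. This is only a testing sketch of the same pipeline shape, not the answer's code: the table and column names follow the question, and the sqlite3 stand-in (with ? placeholders instead of MySQLdb's %s) is my own assumption:

```python
import sqlite3

class SQLStoreSketch(object):
    """Same shape as the SQLStore pipeline, backed by in-memory sqlite3."""
    def __init__(self):
        self.conn = sqlite3.connect(":memory:")
        self.cursor = self.conn.cursor()
        # Hypothetical schema mirroring the question's scraped_data table.
        self.cursor.execute(
            "CREATE TABLE scraped_data (headlines TEXT, contents TEXT, dates TEXT)")

    def process_item(self, item, spider):
        # sqlite3 uses ? placeholders where MySQLdb uses %s.
        self.cursor.execute(
            "INSERT INTO scraped_data (headlines, contents, dates) VALUES (?, ?, ?)",
            (item['headline'], item['content'], item['date']))
        self.conn.commit()
        # Returning the item (outside the try/except) lets later pipelines see it.
        return item

pipe = SQLStoreSketch()
item = {"headline": "Test headline", "content": "Test content", "date": "2014-04-02"}
pipe.process_item(item, spider=None)
rows = pipe.cursor.execute("SELECT headlines FROM scraped_data").fetchall()
print(rows)  # -> [('Test headline',)]
```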
