
Error when running a scrapyd project


I deployed my project (sogou) successfully, but when I run it with:

curl http://localhost:6800/schedule.json -d project=sogou -d spider=sogou

it fails:

2017-02-13 10:44:51 [scrapy] INFO: Scrapy 1.2.1 started (bot: sogou)

2017-02-13 10:44:51 [scrapy] INFO: Overridden settings: 
{'NEWSPIDER_MODULE': 'sogou.spiders', 'CONCURRENT_REQUESTS': 5, 
'SPIDER_MODULES': ['sogou.spiders'], 'RETRY_HTTP_CODES': [500, 502,
503, 504, 400, 403, 408], 'BOT_NAME': 'sogou', 'DOWNLOAD_TIMEOUT': 10,
'RETRY_TIMES': 10, 'LOG_FILE': 
'logs/sogou/sogou/63a0bbacf19611e69eea240a644f1626.log'}

2017-02-13 10:44:51 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']

2017-02-13 10:44:51 [twisted] CRITICAL: Unhandled error in Deferred:
2017-02-13 10:44:51 [twisted] CRITICAL: Traceback (most recent call last):

  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 90, in crawl
    six.reraise(*exc_info)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 71, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 94, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiders/__init__.py", line 50, in from_crawler
    spider = cls(*args, **kwargs)

TypeError: __init__() got an unexpected keyword argument '_job'

It is hard to pinpoint the problem without the source code, but most likely you overrode the spider's __init__ with a signature that no longer accepts arbitrary **kwargs, while scrapyd passes the job identifier to the spider as the _job keyword argument.
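
For illustration, a constructor like the following (a hypothetical sketch; SogouSpider, param1 and param2 are placeholder names) reproduces the error, because scrapyd effectively calls the spider class with an extra _job keyword argument that the signature rejects:

import scrapy

class SogouSpider(scrapy.Spider):
    name = 'sogou'

    # Broken: only fixed parameters are accepted, so scrapyd's
    # extra _job keyword argument raises TypeError at startup.
    def __init__(self, param1=None, param2=None):
        super(SogouSpider, self).__init__()
        self.param1 = param1
        self.param2 = param2

In that case you should add **kwargs to the spider's constructor and forward it to the base class, like this: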

import scrapy

class Spider(scrapy.Spider):
    name = 'spider'

    def __init__(self, param1=None, param2=None, *args, **kwargs):
        # Accept and forward any extra keyword arguments (including
        # scrapyd's _job identifier) instead of rejecting them.
        super(Spider, self).__init__(*args, **kwargs)
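
Forwarding the extra arguments is safe: in Scrapy 1.x the base Spider.__init__ stores leftover keyword arguments as instance attributes, so the job id simply ends up as self._job. Any additional -d fields posted to schedule.json are passed to the spider the same way; a hypothetical invocation for the constructor above (the param1/param2 values are placeholders):

curl http://localhost:6800/schedule.json -d project=sogou -d spider=sogou -d param1=foo -d param2=bar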