
Unable to run "scrapy crawl quotes"

Can't get the Scrapy tutorial to work.


I'm trying to learn Scrapy, but I can't even get the tutorial to run. I've tried it with Python 3.7 and 3.5.5 and get the same result.


import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"

    def start_requests(self):
        urls = [
            'http://quotes.toscrape.com/page/1/',
            'http://quotes.toscrape.com/page/2/',
        ]
        for url in urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        # Save the raw HTML of each page to quotes-<page number>.html
        page = response.url.split("/")[-2]
        filename = 'quotes-%s.html' % page
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)

Setting this up seems to go fine; at least it doesn't throw any errors.
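For what it's worth, the spider class can also be exercised outside the generated project by driving it from a plain script with Scrapy's CrawlerProcess. Below is a minimal sketch of that, assuming the class above is copied into a standalone file (the name standalone_quotes.py is just an example, not part of the tutorial):

# standalone_quotes.py -- hypothetical file name, run with: python standalone_quotes.py
# Runs the same QuotesSpider without "scrapy crawl", using Scrapy's CrawlerProcess.
# Useful as a sanity check that the spider class itself is importable and well formed.
import scrapy
from scrapy.crawler import CrawlerProcess


class QuotesSpider(scrapy.Spider):
    name = "quotes"

    def start_requests(self):
        urls = [
            'http://quotes.toscrape.com/page/1/',
            'http://quotes.toscrape.com/page/2/',
        ]
        for url in urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        page = response.url.split("/")[-2]
        filename = 'quotes-%s.html' % page
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)


if __name__ == '__main__':
    process = CrawlerProcess()   # default settings, no project scaffolding needed
    process.crawl(QuotesSpider)  # register the spider class directly
    process.start()              # blocks until the crawl finishes

Running that script downloads both pages without going through the scrapy command at all, so it only tests the spider class, not the project setup that "scrapy crawl" relies on.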


When I run "scrapy crawl quotes" in an Anaconda prompt window, I get this:


"hed) C:\Users\userOne\python script files\scrapy\tutorial>scrapy crawl 

 quotes

 2019-01-23 18:34:27 [scrapy.utils.log] INFO: Scrapy 1.5.1 started (bot: 

 tutorial)

 2019-01-23 18:34:27 [scrapy.utils.log] INFO: Versions: lxml 4.2.3.0, libxml2 

 2.9.5, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 

 3.5.5 | packaged by conda-forge | (default, Jul 24 2018, 01:52:17) [MSC 

 v.1900 64 bit (AMD64)], pyOpenSSL 18.0.0 (OpenSSL 1.0.2p  14 Aug 2018), 

 cryptography 2.3.1, Platform Windows-10-10.0.17134-SP0

 Traceback (most recent call last):

   File "C:\Users\userOne\Anaconda3\envs\hed\lib\site- packages\scrapy\spiderloader.py", line 69, in load

     return self._spiders[spider_name]

 KeyError: 'quotes'


 During handling of the above exception, another exception occurred:


 Traceback (most recent call last):

   File "C:\Users\userOne\Anaconda3\envs\hed\Scripts\scrapy-script.py", line 

 10, in <module>

     sys.exit(execute())

   File "C:\Users\userOne\Anaconda3\envs\hed\lib\site- packages\scrapy\cmdline.py", line 150, in execute

     _run_print_help(parser, _run_command, cmd, args, opts)

   File "C:\Users\userOne\Anaconda3\envs\hed\lib\site- packages\scrapy\cmdline.py", line 90, in _run_print_help

     func(*a, **kw)

   File "C:\Users\userOne\Anaconda3\envs\hed\lib\site- packages\scrapy\cmdline.py", line 157, in _run_command

