Scrapy 1.2 Documentation
… Chapter 1. Getting help … Chapter 2. First steps: Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions … is used instead to create the Requests. This method is also called only once from Scrapy, so it's safe to implement it as a generator. The default implementation uses make_requests_from_url() to generate …
0 points | 266 pages | 1.10 MB | 1 year ago

Scrapy 1.3 Documentation
… Chapter 1. Getting help … Chapter 2. First steps: Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions … is used instead to create the Requests. This method is also called only once from Scrapy, so it's safe to implement it as a generator. The default implementation uses make_requests_from_url() to generate …
0 points | 272 pages | 1.11 MB | 1 year ago

Scrapy 1.5 Documentation
… Chapter 1. Getting help … Chapter 2. First steps: 2.1 Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions … It is called by Scrapy when the spider is opened for scraping. Scrapy calls it only once, so it is safe to implement start_requests() as a generator. The default implementation generates Request(url, dont_filter=True) …
0 points | 285 pages | 1.17 MB | 1 year ago

Scrapy 1.6 Documentation
… Chapter 1. Getting help … Chapter 2. First steps: 2.1 Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions … It is called by Scrapy when the spider is opened for scraping. Scrapy calls it only once, so it is safe to implement start_requests() as a generator. The default implementation generates Request(url, dont_filter=True) …
0 points | 295 pages | 1.18 MB | 1 year ago

Scrapy 1.2 Documentation
Understand Scrapy versioning and API stability. … Scrapy at a glance: Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … with URLs and web page encodings … twisted [https://twistedmatrix.com/], an asynchronous networking framework … cryptography [https://cryptography.io/] and pyOpenSSL [https://pypi.python.org/pypi/pyOpenSSL] … is used instead to create the Requests. This method is also called only once from Scrapy, so it's safe to implement it as a generator. The default implementation uses make_requests_from_url() to generate …
0 points | 330 pages | 548.25 KB | 1 year ago

Scrapy 1.3 Documentation
Understand Scrapy versioning and API stability. … Scrapy at a glance: Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … with URLs and web page encodings … twisted [https://twistedmatrix.com/], an asynchronous networking framework … cryptography [https://cryptography.io/] and pyOpenSSL [https://pypi.python.org/pypi/pyOpenSSL] … is used instead to create the Requests. This method is also called only once from Scrapy, so it's safe to implement it as a generator. The default implementation uses make_requests_from_url() to generate …
0 points | 339 pages | 555.56 KB | 1 year ago

Scrapy 1.4 Documentation
… Chapter 1. Getting help … Chapter 2. First steps: Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions … It is called by Scrapy when the spider is opened for scraping. Scrapy calls it only once, so it is safe to implement start_requests() as a generator. The default implementation generates Request(url, dont_filter=True) …
0 points | 281 pages | 1.15 MB | 1 year ago

Scrapy 1.7 Documentation
Scrapy Documentation, Release 1.7.4. Scrapy is a fast high-level web crawling and web scraping framework, used to crawl websites and extract structured data from their pages. It can be used for a wide … Chapter 1. Getting help … Chapter 2. First steps: 2.1 Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions …
0 points | 306 pages | 1.23 MB | 1 year ago

Scrapy 1.8 Documentation
Scrapy Documentation, Release 1.8.4. Scrapy is a fast high-level web crawling and web scraping framework, used to crawl websites and extract structured data from their pages. It can be used for a wide … Chapter 1. Getting help … Chapter 2. First steps: 2.1 Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … multi-purpose helper for dealing with URLs and web page encodings • twisted, an asynchronous networking framework • cryptography and pyOpenSSL, to deal with various network-level security needs … The minimal versions …
0 points | 335 pages | 1.44 MB | 1 year ago

Scrapy 0.24 Documentation
… Chapter 1. Getting help … Chapter 2. First steps: 2.1 Scrapy at a glance. Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful … is used instead to create the Requests. This method is also called only once from Scrapy, so it's safe to implement it as a generator. The default implementation uses make_requests_from_url() to generate … it's intended to perform any last time processing required before returning the results to the framework core, for example setting the item IDs. It receives a list of results and the response which originated …
0 points | 222 pages | 988.92 KB | 1 year ago

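The last fragment in the entry above matches the documentation of XMLFeedSpider's process_results() hook: it receives the response and the list of results produced for it, and can do last-minute processing (such as setting item IDs) before the results are handed back to the framework core. A minimal sketch, assuming a recent Scrapy release; the spider name, feed URL, and field names are placeholders, not taken from the listing:

    from scrapy.spiders import XMLFeedSpider


    class ExampleFeedSpider(XMLFeedSpider):
        # Hypothetical spider: name, start URL and itertag are illustrative only.
        name = "example_feed"
        start_urls = ["http://www.example.com/feed.xml"]
        itertag = "item"

        def parse_node(self, response, node):
            # Build one item per <item> node in the feed.
            yield {"title": node.xpath("title/text()").get()}

        def process_results(self, response, results):
            # Called with the response and the list of results that parse_node()
            # produced for it; must return a list of items/requests. Here the
            # "last time processing" is tagging each item with its source URL.
            processed = []
            for result in results:
                if isinstance(result, dict):
                    result["source_url"] = response.url
                processed.append(result)
            return processed
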
共 62 条
- 1
- 2
- 3
- 4
- 5
- 6
- 7













