Tornado 6.2 Documentation
…doesn't look like a normal call, since # we pass the function object to be called by the IOLoop … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … async def get_links_from_url(url): """Download the page at `url` and parse it…
260 pages | 1.06 MB | 1 year ago
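The excerpt above comes from the guide's queue-based web spider example. As a rough sketch of the worker/queue coordination it describes (not the full example from the docs; the fetch-and-parse step is stubbed out and get_links_from_url here is only a placeholder):

    # Minimal sketch of the worker/queue pattern the excerpt describes.
    from tornado.ioloop import IOLoop
    from tornado.queues import Queue

    base_url = "http://www.tornadoweb.org/en/stable/"
    q = Queue()
    seen = set()


    async def get_links_from_url(url):
        # Assumption: would return the absolute URLs of links found on the page.
        return []


    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                for link in await get_links_from_url(url):
                    if link.startswith(base_url) and link not in seen:
                        seen.add(link)
                        await q.put(link)  # each put() adds one unfinished task
            finally:
                # task_done() decrements the unfinished-task counter so that
                # q.join() completes once every discovered page is processed.
                q.task_done()


    async def main():
        seen.add(base_url)
        await q.put(base_url)
        # Passing the function object to the IOLoop starts the worker in the
        # background rather than calling it directly.
        IOLoop.current().spawn_callback(worker)
        await q.join()     # wait until all fetched pages have been marked done
        await q.put(None)  # signal the worker to exit


    if __name__ == "__main__":
        IOLoop.current().run_sync(main)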
Tornado 6.5 Documentation
…doesn't look like a normal call, since # we pass the function object to be called by the IOLoop … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … async def get_links_from_url(url): """Download the page at `url` and parse it…
272 pages | 1.12 MB | 3 months ago
Tornado 5.1 Documentation
…ensure_future() (both work in Tornado). … fetch_future = convert_yielded(self.fetch_next_chunk()) … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … 'http://www.tornadoweb.org/en/stable/' concurrency = 10 async def get_links_from_url(url): """Download the page at `url` and parse it for links. Returned links have had the fragment after `#` removed, and have…
243 pages | 895.80 KB | 1 year ago
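The convert_yielded / ensure_future fragments refer to starting a coroutine in the background and keeping its Future so the result can be awaited later. A hedged sketch of that pattern, where ChunkReader and fetch_next_chunk are made-up placeholders rather than the guide's own handler:

    import asyncio

    from tornado.gen import convert_yielded
    from tornado.ioloop import IOLoop


    class ChunkReader:
        async def fetch_next_chunk(self):
            # Placeholder: pretend to fetch one chunk of data.
            await asyncio.sleep(0.01)
            return b"chunk"

        async def run(self):
            # convert_yielded (or asyncio.ensure_future; both work in Tornado)
            # starts the coroutine immediately and returns a Future for it, so
            # other work can proceed before the result is awaited.
            fetch_future = convert_yielded(self.fetch_next_chunk())
            # ... do other work here ...
            return await fetch_future


    if __name__ == "__main__":
        print(IOLoop.current().run_sync(ChunkReader().run))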
Tornado 6.1 Documentation
…ensure_future() (both work in Tornado). … fetch_future = convert_yielded(self.fetch_next_chunk()) … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … "http://www.tornadoweb.org/en/stable/" concurrency = 10 async def get_links_from_url(url): """Download the page at `url` and parse it for links. Returned links have had the fragment after `#` removed, and have…
245 pages | 904.24 KB | 1 year ago
Tornado 6.3 Documentation
…doesn't look like a normal call, since # we pass the function object to be called by the IOLoop … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … async def get_links_from_url(url): """Download the page at `url` and parse it…
264 pages | 1.06 MB | 1 year ago
Tornado 6.0 Documentation
…base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … get_links_from_url(url): """Download the page at `url` and parse it for links. Returned … url in q: if url is None: return try: await fetch_url(url) except Exception as e:…
245 pages | 885.76 KB | 1 year ago
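The docstring fragment ("Download the page at `url` and parse it for links. Returned links have had the fragment after `#` removed…") describes the example's get_links_from_url helper. A condensed sketch of such a helper, assuming a simple HTMLParser-based link scraper rather than the exact parser used in the docs:

    from html.parser import HTMLParser
    from urllib.parse import urldefrag, urljoin

    from tornado.httpclient import AsyncHTTPClient


    class _LinkSeeker(HTMLParser):
        """Collect href attributes from <a> tags."""

        def __init__(self):
            super().__init__()
            self.urls = []

        def handle_starttag(self, tag, attrs):
            href = dict(attrs).get("href")
            if tag == "a" and href:
                self.urls.append(href)


    async def get_links_from_url(url):
        """Download the page at `url` and parse it for links.

        Returned links have the fragment after `#` removed and are made
        absolute against the page's own URL.
        """
        response = await AsyncHTTPClient().fetch(url)
        seeker = _LinkSeeker()
        seeker.feed(response.body.decode(errors="ignore"))
        # urldefrag() strips '#fragment'; urljoin() resolves relative links.
        return [urljoin(url, urldefrag(link)[0]) for link in seeker.urls]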
Tornado 6.4 Documentation
…doesn't look like a normal call, since # we pass the function object to be called by the IOLoop … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … async def get_links_from_url(url): """Download the page at `url` and parse it…
268 pages | 1.09 MB | 1 year ago
Tornado 4.5 Documentation
…new in Tornado 1.1, What’s new in Tornado 1.0.1, What’s new in Tornado 1.0, Index, Module Index, Search Page, Discussion and support: You can discuss Tornado on the Tornado developer mailing list [http://groups… … base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen … org/en/stable/' concurrency = 10 @gen.coroutine def get_links_from_url(url): """Download the page at `url` and parse it for links. Returned links have had the fragment after `#` removed, and have…
333 pages | 322.34 KB | 1 year ago
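The 4.5 entry shows the same spider example in the older decorator style: before native coroutines, Tornado coroutines were written with @gen.coroutine and yield instead of async/await. A minimal sketch of that style, with the link parsing elided:

    from tornado import gen
    from tornado.httpclient import AsyncHTTPClient


    @gen.coroutine
    def get_links_from_url(url):
        """Download the page at `url` (decorator-style coroutine, Tornado 4.x)."""
        response = yield AsyncHTTPClient().fetch(url)
        # Link parsing elided; decorator-style coroutines return values with
        # raise gen.Return(...) (or a plain `return value` on Python 3.3+).
        raise gen.Return(response.body)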
20 results in total (pages: 1, 2)