Tornado 5.1 Documentation
… A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. …
    except Exception as e:
        print('Exception: %s %s' % (e, url))
    finally:
        q.task_done()
    …
    await q.put(base_url)
    # Start workers, then wait for the work queue to be empty.
359 pages | 347.32 KB | 1 year ago
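The excerpt above (and the near-identical ones below) describes one counting contract: put raises the unfinished-task count, task_done lowers it, and join waits for it to reach zero. As a minimal, self-contained sketch of that contract, not taken from any of the listed documents and assuming a current Tornado where the Queue methods are awaitable:

    from tornado.ioloop import IOLoop
    from tornado.queues import Queue

    async def main():
        q = Queue()
        await q.put("a")       # unfinished-task count: 1
        await q.put("b")       # unfinished-task count: 2

        first = await q.get()  # get() hands out an item but does not
        q.task_done()          # lower the count; task_done() does: back to 1

        second = await q.get()
        q.task_done()          # count: 0
        print(first, second)

        await q.join()         # nothing outstanding, so this returns at once

    if __name__ == "__main__":
        IOLoop.current().run_sync(main)

run_sync drives the coroutine to completion on the current IOLoop; if the two task_done calls were removed, the final join would wait forever.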
Tornado 4.5 Documentation
… a decorator like tornado.gen.coroutine or asyncio.coroutine [https://docs.python.org/3.5/library/asyncio-task.html#asyncio.coroutine]), not all coroutines are compatible with each other. … To interact with asynchronous code that uses callbacks instead of Future, wrap the call in a Task. This will add the callback argument for you and return a Future which you can yield:
    @gen.coroutine
    def call_task():
        # Note that there are no parens on some_function.
        # This will be translated by Task into
        # some_function(other_args, callback=callback)
        yield gen.Task(some_function, other_args)
Calling blocking functions: The simplest way …
333 pages | 322.34 KB | 1 year ago
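The excerpt breaks off at the "Calling blocking functions" heading. As a rough sketch of the pattern that section covers in the decorated-coroutine style, hand the blocking call to a thread pool and yield the resulting Future; the blocking_read helper and the file-reading use case here are made up for illustration, not the docs' own example:

    from concurrent.futures import ThreadPoolExecutor

    from tornado import gen
    from tornado.ioloop import IOLoop

    thread_pool = ThreadPoolExecutor(4)

    def blocking_read(path):
        # Stand-in for any synchronous, blocking call.
        with open(path, "rb") as f:
            return f.read()

    @gen.coroutine
    def read_file(path):
        # submit() returns a concurrent.futures.Future, which Tornado's
        # coroutine runner accepts directly as a yieldable.
        data = yield thread_pool.submit(blocking_read, path)
        raise gen.Return(data)

    if __name__ == "__main__":
        data = IOLoop.current().run_sync(lambda: read_file(__file__))
        print("read %d bytes" % len(data))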
Tornado 6.0 Documentation
… A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. …
    except Exception as e:
        print("Exception: %s %s" % (e, url))
    finally:
        q.task_done()
    …
    await q.put(base_url)
    # Start workers, then wait for the work queue to be empty.
869 pages | 692.83 KB | 1 year ago
Tornado 6.1 Documentation
… A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. …
    except Exception as e:
        print("Exception: %s %s" % (e, url))
        dead.add(url)
    finally:
        q.task_done()
    …
    await q.put(base_url)
    # Start workers, then wait for the work queue to be empty.
931 pages | 708.03 KB | 1 year ago
Tornado 4.5 Documentation
… wrap the call in a Task. This will add the callback argument for you and return a Future which you can yield:
    @gen.coroutine
    def call_task():
        # Note there are no parens on some_function.
        # This will be translated by Task into
        # some_function(other_args, callback=callback)
        yield gen.Task(some_function, other_args)
Calling blocking functions: The simplest … A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. …
222 pages | 833.04 KB | 1 year ago
Tornado 6.2 Documentation
… A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. …
    except Exception as e:
        print("Exception: %s %s" % (e, url))
        dead.add(url)
    finally:
        q.task_done()
    …
    await q.put(base_url)
    # Start workers, then wait for the work queue to be empty.
407 pages | 385.03 KB | 1 year ago
Tornado 5.1 Documentation
… A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. …
    try:
        await fetch_url(url)
    except Exception as e:
        print('Exception: %s %s' % (e, url))
    finally:
        q.task_done()
    …
    await q.put(base_url)
    # Start workers, then wait for the work queue to be empty.
    workers …
243 pages | 895.80 KB | 1 year ago
Tornado 6.4 Documentation
… A Queue maintains a count of unfinished tasks, which begins at zero. put increments the count; task_done decrements it. In the web-spider example here, the queue begins containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. …
    except Exception as e:
        print("Exception: %s %s" % (e, url))
        dead.add(url)
    finally:
        q.task_done()
    …
    await q.put(base_url)
    # Start workers, then wait for the work queue to be empty.
432 pages | 402.58 KB | 1 year ago
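The repeated Queue excerpts all quote the same worker coroutine from the web-spider example. Below is a condensed sketch of that shape, not a verbatim copy of the example: crawl is a made-up wrapper name, fetch_url is assumed to be a coroutine that downloads one page and puts any newly discovered links back on the queue, and concurrency is an arbitrary setting.

    from tornado import gen
    from tornado.queues import Queue

    concurrency = 10

    async def crawl(base_url, fetch_url):
        q = Queue()
        dead = set()

        async def worker():
            async for url in q:
                if url is None:
                    return
                try:
                    await fetch_url(url, q)  # assumed to q.put() any new links
                except Exception as e:
                    print("Exception: %s %s" % (e, url))
                    dead.add(url)
                finally:
                    q.task_done()

        await q.put(base_url)

        # Start workers, then wait for the work queue to be empty.
        workers = gen.multi([worker() for _ in range(concurrency)])
        await q.join()

        # Signal the workers to exit, then wait for them to finish.
        for _ in range(concurrency):
            await q.put(None)
        await workers
        return dead

The sentinel None puts at the end are deliberately never matched by task_done calls; that is fine because join has already returned by then.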
20 results in total