1 file changed, 10 insertions(+), 9 deletions(-)

@@ -303,24 +303,25 @@ Or you can provide ``asyncio`` coroutine that yields bytes objects::
 ``aiohttp`` internally handles such coroutines.

 Also it is possible to use a ``StreamReader`` object. Let's say we want to upload
-file and calculate file sha1 hash::
+a file from another request and calculate its sha1 hash::

-   >>> def feed_stream(stream):
+   >>> def feed_stream(resp, stream):
    ...    h = hashlib.sha1()
    ...
-   ...    with open('some.iso') as f:
-   ...        chunk = f.read(8192)
-   ...        while chunk:
-   ...            h.update(chunk)
-   ...            s.feed_data(chunk)
-   ...            chunk = f.read(8192)
+   ...    while True:
+   ...        chunk = yield from resp.content.readany()
+   ...        if not chunk:
+   ...            break
+   ...        h.update(chunk)
+   ...        stream.feed_data(chunk)
    ...
    ...    return h.hexdigest()

+   >>> resp = yield from aiohttp.request('get', 'http://httpbin.org/post')
    >>> stream = StreamReader()
    >>> asyncio.async(aiohttp.request('post', 'http://httpbin.org/post', data=stream))

-   >>> file_hash = yield from feed_stream(stream)
+   >>> file_hash = yield from feed_stream(resp, stream)

Because response's content attribute is a StreamReader, you can chain get and
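For reference, the pattern the new example relies on (read a chunk, update the hash, forward the chunk to another stream) can be shown as a self-contained, runnable sketch. It uses asyncio's own ``StreamReader`` and modern ``async``/``await`` instead of aiohttp's response object and ``yield from``; the names ``source``, ``sink`` and ``main`` are illustrative and not part of the diffed documentation::

   import asyncio
   import hashlib

   async def feed_stream(source, sink):
       # Hash data chunk-by-chunk while forwarding it to another stream,
       # mirroring the feed_stream() example above (which reads from
       # resp.content instead of a plain asyncio.StreamReader).
       h = hashlib.sha1()
       while True:
           chunk = await source.read(8192)  # b'' means end of stream
           if not chunk:
               break
           h.update(chunk)
           sink.feed_data(chunk)            # pass the chunk downstream
       sink.feed_eof()
       return h.hexdigest()

   async def main():
       source = asyncio.StreamReader()
       sink = asyncio.StreamReader()
       # Simulate an upstream producer; in the docs this role is played
       # by the body of a GET response.
       source.feed_data(b'some payload ' * 100)
       source.feed_eof()
       print(await feed_stream(source, sink))

   asyncio.run(main())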