Commit f71b6af

update docs
1 parent: 9a0f947


docs/client.rst

Lines changed: 10 additions & 9 deletions
@@ -303,24 +303,25 @@ Or you can provide ``asyncio`` coroutine that yields bytes objects::
 ``aiohttp`` internally handles such coroutines.

 Also it is possible to use a ``StreamReader`` object. Let's say we want to upload a
-file and calculate file sha1 hash::
+file from another request and calculate its SHA1 hash::

-   >>> def feed_stream(stream):
+   >>> def feed_stream(resp, stream):
    ...    h = hashlib.sha1()
    ...
-   ...    with open('some.iso') as f:
-   ...       chunk = f.read(8192)
-   ...       while chunk:
-   ...          h.update(chunk)
-   ...          s.feed_data(chunk)
-   ...          chunk = f.read(8192)
+   ...    while True:
+   ...       chunk = yield from resp.content.readany()
+   ...       if not chunk:
+   ...          break
+   ...       h.update(chunk)
+   ...       stream.feed_data(chunk)
    ...
    ...    return h.hexdigest()

+   >>> resp = yield from aiohttp.request('get', 'http://httpbin.org/post')
    >>> stream = StreamReader()
    >>> asyncio.async(aiohttp.request('post', 'http://httpbin.org/post', data=stream))

-   >>> file_hash = yield from feed_stream(stream)
+   >>> file_hash = yield from feed_stream(resp, stream)


 Because response's content attribute is a StreamReader, you can chain get and
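
A rough end-to-end sketch of the pattern the updated example describes: the body of one response is read chunk by chunk, hashed, and fed into a ``StreamReader`` that a concurrent POST request consumes as its upload body. It targets the old yield-from era API used on this page (``aiohttp.request()``, ``resp.content.readany()``, ``asyncio.async()``) and will not run on current aiohttp or Python; the ``from aiohttp.streams import StreamReader`` import path, the ``feed_eof()`` call that terminates the upload, the ``relay()`` wrapper and the httpbin URLs are illustrative assumptions rather than part of the documented example::

    import asyncio
    import hashlib

    import aiohttp
    from aiohttp.streams import StreamReader  # assumed import path for the old API


    @asyncio.coroutine
    def feed_stream(resp, stream):
        # Read the source response in chunks, hash each chunk, and feed it
        # into the StreamReader that backs the upload request.
        h = hashlib.sha1()
        while True:
            chunk = yield from resp.content.readany()
            if not chunk:
                break
            h.update(chunk)
            stream.feed_data(chunk)
        stream.feed_eof()  # signal end of body to the upload side (not shown in the hunk)
        return h.hexdigest()


    @asyncio.coroutine
    def relay():
        # Download from one URL and re-upload the same bytes to another,
        # computing the SHA1 digest on the way through.
        resp = yield from aiohttp.request('get', 'http://httpbin.org/get')
        stream = StreamReader()
        # Start the upload concurrently; it consumes whatever we feed into `stream`.
        upload = asyncio.async(
            aiohttp.request('post', 'http://httpbin.org/post', data=stream))
        digest = yield from feed_stream(resp, stream)
        post_resp = yield from upload
        post_resp.close()
        resp.close()
        return digest


    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        print(loop.run_until_complete(relay()))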
