Use msgpack.Unpacker.feed() to allow fetching large data from the stream (#118)

This prevents timeout errors when the response is large. It works
with 3+ GiB of msgpack data.
chezou authored Sep 6, 2024
1 parent a8c2b29 commit 5ce228d
Showing 2 changed files with 9 additions and 3 deletions.
8 changes: 5 additions & 3 deletions tdclient/job_api.py
@@ -248,9 +248,11 @@ def job_result_format_each(self, job_id, format, header=False):
         if code != 200:
             self.raise_error("Get job result failed", res, "")
         if format == "msgpack":
-            unpacker = msgpack.Unpacker(res, raw=False)
-            for row in unpacker:
-                yield row
+            unpacker = msgpack.Unpacker(raw=False, max_buffer_size=1000 * 1024 ** 2)
+            for chunk in res.stream(1024 ** 2):
+                unpacker.feed(chunk)
+                for row in unpacker:
+                    yield row
         elif format == "json":
             for row in codecs.getreader("utf-8")(res):
                 yield json.loads(row)
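The pattern introduced by this change can be sketched outside the client. Below is a minimal, self-contained example of feeding a byte stream into `msgpack.Unpacker` chunk by chunk; the `unpack_stream` helper and the tiny chunk size are illustrative only, not part of tdclient:

```python
import io
import msgpack

def unpack_stream(stream, chunk_size=1024 ** 2):
    # Feed the stream into the unpacker chunk by chunk; complete msgpack
    # objects are yielded as soon as they arrive, while partial objects
    # stay buffered inside the unpacker until the next feed().
    unpacker = msgpack.Unpacker(raw=False, max_buffer_size=1000 * 1024 ** 2)
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        unpacker.feed(chunk)
        for row in unpacker:
            yield row

# A deliberately tiny chunk size forces objects to be split across feeds.
buf = io.BytesIO(b"".join(msgpack.packb(r) for r in [{"id": 1}, {"id": 2}]))
rows = list(unpack_stream(buf, chunk_size=3))
```

Because `feed()` buffers incomplete objects, the caller never has to hold the whole response in memory, which is what avoids the timeout on multi-GiB results.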
4 changes: 4 additions & 0 deletions tdclient/test/test_helper.py
@@ -46,7 +46,11 @@ def read(size=None):
         else:
             return b""
 
+    def stream(size=None):
+        yield read(size)
+
     response.read.side_effect = read
+    response.stream.side_effect = stream
     return response


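The test-helper change can be exercised standalone. A minimal sketch of a mocked response whose `read()`/`stream()` drain a byte buffer once, in the style of the helper above; `make_response` is a hypothetical name for illustration:

```python
from unittest import mock

def make_response(body=b""):
    # Build a mock HTTP response: read() returns the whole buffer once,
    # and stream() yields it as a single chunk, then both return empty.
    response = mock.MagicMock()
    buf = {"data": body}

    def read(size=None):
        data, buf["data"] = buf["data"], b""
        return data

    def stream(size=None):
        yield read(size)

    response.read.side_effect = read
    response.stream.side_effect = stream
    return response

res = make_response(b"hello")
chunks = b"".join(res.stream(1024))
```

Wiring the generator function through `side_effect` means each call to `response.stream(...)` returns a fresh generator, matching how `res.stream(1024 ** 2)` is consumed in `job_result_format_each`.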
