
Conversation

@anoadragon453 (Member) commented Oct 14, 2025

Otherwise the exception will be raised outside of the error handling code. Fixes #19061.

Missed in #17365.
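
For context, the failure mode being fixed follows a general pattern: if a deferred that can time out is awaited outside of the try block, the resulting exception escapes the error handling and surfaces as an unhandled deferred traceback in the logs. Below is a minimal sketch of that pattern, not the actual Synapse code; the helper names and the error translation are hypothetical.

```python
from twisted.internet import defer


async def fetch_with_timeout(make_request, reactor, timeout_seconds=60):
    """Sketch: await the deferred *inside* the try so timeouts are handled."""
    try:
        d = make_request()                      # returns a Deferred
        d.addTimeout(timeout_seconds, reactor)  # may fail with defer.TimeoutError
        # Awaiting here, inside the try, means a timeout or connection error
        # is caught below rather than propagating past the error handling.
        return await d
    except defer.TimeoutError as e:
        # Hypothetical translation into whatever error type callers expect.
        raise RuntimeError("request timed out") from e
```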

Pull Request Checklist

  • Pull request is based on the develop branch
  • Pull request includes a changelog file. The entry should:
    • Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from EventStore to EventWorkerStore.".
    • Use markdown where necessary, mostly for code blocks.
    • End with either a period (.) or an exclamation mark (!).
    • Start with a capital letter.
    • Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
  • Code style is correct (run the linters)

Otherwise the exception will be raised outside of the error handling code.
@anoadragon453 marked this pull request as ready for review October 15, 2025 07:38
@anoadragon453 requested a review from a team as a code owner October 15, 2025 07:38
response, output_stream, boundary, expected_size + 1
)
deferred.addTimeout(self.default_timeout_seconds, self.reactor)
multipart_response = await make_deferred_yieldable(deferred)
@MadLittleMods (Contributor) commented Oct 15, 2025

Should this be inside the async with self.remote_download_linearizer.queue(ip_address):?

Maybe. We do it in get_file, which is probably what federation_get_file was based on:

async with self.remote_download_linearizer.queue(ip_address):
    # add a byte of headroom to max size as function errs at >=
    d = read_body_with_max_size(response, output_stream, expected_size + 1)
    d.addTimeout(self.default_timeout_seconds, self.reactor)
    length = await make_deferred_yieldable(d)

And although this isn't explained anywhere, I guess the point of self.remote_download_linearizer is to ensure we only download up to 6 pieces of remote media at a time, presumably to avoid resource exhaustion (that's just an assumption). So it makes sense for the download part to be under the lock.
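
To illustrate the idea with a self-contained example (a hypothetical asyncio sketch, not Synapse's actual Linearizer): the point is simply that at most N downloads per key, e.g. per remote IP, run concurrently, and that the body read happens while holding the slot.

```python
import asyncio
from collections import defaultdict


class PerKeyLimiter:
    """Allow at most `max_count` concurrent entries per key (e.g. per IP)."""

    def __init__(self, max_count: int = 6) -> None:
        self._semaphores: dict[str, asyncio.Semaphore] = defaultdict(
            lambda: asyncio.Semaphore(max_count)
        )

    def queue(self, key: str) -> asyncio.Semaphore:
        # asyncio.Semaphore is itself an async context manager, so callers
        # can write `async with limiter.queue(ip_address): ...`
        return self._semaphores[key]


async def download(limiter: PerKeyLimiter, ip_address: str) -> None:
    async with limiter.queue(ip_address):
        # Both the request *and* the body read happen under the limit,
        # mirroring the suggestion that the download itself be under the lock.
        ...
```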

@@ -1758,6 +1719,7 @@
response, output_stream, boundary, expected_size + 1

Do we have tests for this kind of timeout? (just wondering)

It looks like this was previously functional and the only problem was the unexpected logs/error handling, so a high-level integration/end-to-end test wouldn't have really prevented this.

It looks like we already have other tests in tests/http/test_matrixfederationclient.py that ensure we raise a RequestSendFailed in certain scenarios. We could add one of those 🤷
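
A deterministic way to exercise this kind of timeout without real network I/O is to drive the deferred's timeout with a fake clock. This is a minimal sketch using plain Twisted trial, not Synapse's actual test harness; the 60-second value is arbitrary.

```python
from twisted.internet import defer
from twisted.internet.task import Clock
from twisted.trial import unittest


class TimeoutTestCase(unittest.TestCase):
    def test_timeout_fails_the_deferred(self) -> None:
        clock = Clock()
        d: defer.Deferred = defer.Deferred()
        d.addTimeout(60, clock)  # arbitrary timeout for the sketch

        # Advance the fake clock past the timeout; the deferred now fails
        # with defer.TimeoutError, which the calling code is expected to
        # catch and convert into something like a RequestSendFailed.
        clock.advance(61)
        self.failureResultOf(d, defer.TimeoutError)
```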



Development

Successfully merging this pull request may close these issues.

Don't print deferred backtraces into the logs
