Reach functional backend using jupyter_collaboration frontend
#29
Conversation
Looks good to me!
jupyter_rtc_core/rooms/yroom.py (outdated)
I spent an hour trying to figure out why the client was auto-closing every WS right after we write a message to it. Turns out, you need to pass binary=True to write byte streams. Otherwise, Tornado will automatically encode byte streams as UTF-8, which the client will reject with an opaque error log. 🤦
From the documentation:
def write_message(
self, message: Union[bytes, str, Dict[str, Any]], binary: bool = False
) -> "Future[None]":
"""Sends the given message to the client of this Web Socket.
The message may be either a string or a dict (which will be
encoded as json). If the ``binary`` argument is false, the
message will be sent as utf8; in binary mode any byte string
is allowed.
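For reference, a minimal sketch of how this looks in a Tornado handler (the handler and method names are illustrative, not this PR's actual code):

```python
import tornado.websocket


class ExampleYRoomWebsocket(tornado.websocket.WebSocketHandler):
    """Illustrative handler that relays raw Yjs update bytes to the client."""

    def send_update(self, update: bytes) -> None:
        # binary=True is required for byte payloads; otherwise Tornado sends
        # the message as UTF-8 text, which the client rejects before closing
        # the connection.
        self.write_message(update, binary=True)
```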
Force-pushed from fdd0329 to a0851a6
@ellisonbg @jzhang20133 Hey team! I've:
Check out how fast the new documents load: Screen.Recording.2025-05-13.at.4.30.10.PM.mov
Feel free to approve and merge at your convenience!
jzhang20133 left a comment
Approve to unblock! Awesome job!
Description
- `rtccore-jcollab-frontend` to allow reviewers to independently verify this branch.
- `YRoomWebsocket` handler.
- `YRoomSessionHandler` to allow usage with the `jupyter_collaboration` frontend.
- UPDATE: Also got auto-saving working. `save()` calls concurrently. We can increase the throttling, but it works quite well as-is (see the throttling sketch after the demo).
- UPDATE: Cleaned up the logs & rebased onto latest `main`.
- UPDATE: Also fixed the long latency when opening a document. Jialin was right, this was fixed by closing `JupyterLab:globalAwareness` websockets more gracefully.

Demo
Screen.Recording.2025-05-13.at.2.20.47.PM.mov
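As a rough illustration of the throttling idea mentioned in the description, here is a sketch that coalesces bursts of save requests into at most one write per interval. The class name and the 0.5 s interval are assumptions, not taken from this branch:

```python
import asyncio
import time


class ThrottledSaver:
    """Illustrative throttle: coalesces bursts of save requests so the
    document is persisted at most once per `interval` seconds."""

    def __init__(self, save_coro, interval: float = 0.5):
        self._save_coro = save_coro   # async callable that persists the document
        self._interval = interval
        self._last_save = 0.0
        self._pending: asyncio.Task | None = None

    def request_save(self) -> None:
        # If a save is already scheduled, the pending task will pick up the
        # latest document state; no need to schedule another one.
        if self._pending is None or self._pending.done():
            self._pending = asyncio.create_task(self._save_later())

    async def _save_later(self) -> None:
        # Wait until at least `interval` seconds have passed since the last save.
        delay = max(0.0, self._interval - (time.monotonic() - self._last_save))
        await asyncio.sleep(delay)
        await self._save_coro()
        self._last_save = time.monotonic()
```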
How to test this branch
- `jupyter_collaboration` frontend by running:

IMO, the latency is either because we haven't implemented the global awareness websocket yet, or it is because the `jupyter_collaboration` frontend is inherently slow. It doesn't seem like it's caused by our server extension, since we don't get the initial client SS1 message until a few seconds before the spinner resolves. I'll spend some time to check on this.
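One way to narrow down where that latency comes from is to log how long after the websocket opens the first client message (the y-protocol sync step 1) arrives. A hedged sketch, assuming a Tornado-based handler like the one in this PR; the mixin name is hypothetical:

```python
import time

import tornado.websocket


class FirstMessageTimingMixin:
    """Illustrative mixin for a Tornado WebSocketHandler: logs how long after
    open() the first client message (e.g. y-protocol sync step 1) arrives."""

    def open(self, *args, **kwargs):
        self._opened_at = time.monotonic()
        self._saw_first_message = False
        return super().open(*args, **kwargs)

    def on_message(self, message):
        if not self._saw_first_message:
            self._saw_first_message = True
            elapsed = time.monotonic() - self._opened_at
            print(f"first client message arrived {elapsed:.2f}s after open")
        return super().on_message(message)


# Usage (hypothetical handler name):
#   class TimedYRoomWebsocket(FirstMessageTimingMixin, YRoomWebsocket): ...
```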