Add Error Animation in Code Cell for Jupyter-AI Errors #1197

Open · wants to merge 7 commits into main from error-animation-in-code-cell

Conversation

@Darshan808 (Member) commented Jan 13, 2025

References #1144

This PR adds error animation to code cells for jupyter-ai, aligning jupyter-ai's error reporting with JupyterLab's inline completer error animation feature.

  • Added an error field to responses from jupyter-ai when an error occurs during generation (a minimal sketch follows below).
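
For illustration, here is a minimal sketch of where the new field could sit, assuming Pydantic models along the lines of jupyter-ai's completion types; the exact fields of CompletionError are an assumption (the review below only confirms that a CompletionError model already exists):

from typing import List, Optional
from pydantic import BaseModel


class CompletionError(BaseModel):
    # Assumed shape; the real model may carry more detail (type, traceback, ...)
    title: str


class InlineCompletionItem(BaseModel):
    insertText: str


class InlineCompletionList(BaseModel):
    items: List[InlineCompletionItem]


class InlineCompletionReply(BaseModel):
    list: InlineCompletionList
    reply_to: int
    # New in this PR: set only when completion generation fails
    error: Optional[CompletionError] = None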

Testing

  • A test is added to validate the error field emission.
  • Currently seeking guidance on how to simulate error scenarios for testing.

Feedback on the testing approach is appreciated, particularly on methods to programmatically raise errors during tests.

Screencast

Simulated by raising a custom exception from the server side.

ai-error.mp4

Co-authored-by: Andrew Fulton <andrewfulton9@gmail.com>
@Darshan808 force-pushed the error-animation-in-code-cell branch from ce9bb53 to 8784c56 on January 13, 2025 17:39
@krassowski (Member)

I think you could override handle_stream_request in the Mock class to conditionally raise an exception (conditionally on arguments passed in to the mock?)
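
One way to realize that, sketched here as a subclass of the MockCompletionHandler used in the tests below; the trigger condition and the exception are purely illustrative:

class ErroringMockCompletionHandler(MockCompletionHandler):
    async def handle_stream_request(self, request):
        # Illustrative trigger: raise only for requests the test marks as "bad"
        if getattr(request, "prefix", "") == "raise-error":
            raise Exception("Simulated completion error")
        return await super().handle_stream_request(request)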

@andrewfulton9 (Contributor) commented Jan 14, 2025

I think you could override handle_stream_request in the Mock class to conditionally raise an exception (conditionally on arguments passed in to the mock?)

I think you will also need to bump the version of @jupyterlab/completer to 4.3.0 if the main branch of jupyter-ai isn't on that version yet. It wasn't when I looked at this issue.

@dlqqq (Member) left a comment

@Darshan808 Welcome to Jupyter AI! I love seeing new contributors. Thank you for getting started on this feature request.

I've left a few suggestions below. In addition, I have a couple of questions:

  1. Could you add a screenshot to this PR?
  2. The original issue (Show error animation in code cell when error is triggered  #1144) shows an icon when the completer fails. How is the icon being added?

Comment on lines 32 to 35:

class InlineCompletionError(BaseModel):
    message: str

@dlqqq (Member) commented Jan 14, 2025

With this PR, it seems like whether an inline completion resulted in an error can be determined in 2 different ways:

  1. An error occurred if only one InlineCompletionItem is returned and InlineCompletionItem.error exists.

  2. An error occurred if InlineCompletionReply.error exists.

    • This attribute takes the type CompletionError, which is very similar to the InlineCompletionError model that you added; CompletionError is already defined on line 49.

I suggest that we:

  1. Have a single source of truth and drop either InlineCompletionItem.error or InlineCompletionReply.error. That way, there's only one way to determine if a completion reply has an error.

  2. Drop either InlineCompletionError or CompletionError, as the models are very similar and can probably be unified (see the sketch after this list).
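
As a minimal sketch of what the single source of truth buys (the model names follow this discussion; the helper itself is hypothetical):

def completion_failed(reply: InlineCompletionReply) -> bool:
    # With only InlineCompletionReply.error, this is the one check needed anywhere
    return reply.error is not None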

Comment on lines 213 to 220:

async def test_handle_request_with_error(inline_handler):
    inline_handler = MockCompletionHandler(
        lm_provider=MockProvider,
        lm_provider_params={
            "model_id": "model",
            "responses": ["test"],
        },
    )

@dlqqq (Member)

Currently seeking guidance on how to simulate error scenarios for testing.

You can modify MockProvider to take a raise_exc param. If "raise_exc": True is set in lm_provider_params, then MockProvider should raise an exception in its _call() method. For example:

def _call(self, *args, **kwargs):
    # Simulate a provider failure when the mock is configured with raise_exc=True
    if self.raise_exc:
        raise Exception("Test exception")
    else:
        return super()._call(*args, **kwargs)

In LangChain, LLM._call() is (usually) the method which is called when a non-streaming reply is requested. If this doesn't work, you can try following the above steps for the _acall() method instead.
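
Putting the two pieces together, the test being discussed might end up looking roughly like this; the request fixture and the way the mock handler records its replies are assumptions, not the actual test API:

async def test_handle_request_with_error(inline_handler):
    handler = MockCompletionHandler(
        lm_provider=MockProvider,
        lm_provider_params={
            "model_id": "model",
            "responses": ["test"],
            "raise_exc": True,  # makes MockProvider._call() raise
        },
    )
    # `dummy_request` stands in for however the existing tests build an
    # InlineCompletionRequest; the attribute used to read back the reply
    # (`messages` here) is likewise assumed.
    await handler.handle_request(dummy_request)
    reply = handler.messages[-1]
    assert reply.error is not None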

@Darshan808 (Member, Author)

Thank you for your feedback and suggestions, @dlqqq

Have a single source of truth and drop either InlineCompletionItem.error or InlineCompletionReply.error. That way, there's only one way to determine if a completion reply has an error.

Yes, this was redundant, so I dropped InlineCompletionItem.error since InlineCompletionReply.error already provides more context about the error.

The original issue (Show error animation in code cell when error is triggered #1144) shows an icon when the completer fails. How is the icon being added?

Whenever we return an item with the error field containing a message, JupyterLab's completer has implemented the necessary code to display that animation. So, I don't think we need to reimplement it here. Simply returning a completion with the error field does the job.
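
For reference, the server-side shape of that could look roughly like the sketch below; the helper name, request attributes, and reply mechanism are assumptions about the handler API, not the actual implementation:

try:
    reply = await self._generate_inline_completion(request)  # hypothetical helper
except Exception as e:
    # Returning a reply whose `error` field is set is enough for JupyterLab's
    # completer to show the error animation on the frontend.
    reply = InlineCompletionReply(
        list=InlineCompletionList(items=[]),
        reply_to=request.number,  # assumed request attribute
        error=CompletionError(title=type(e).__name__),
    )
self.reply(reply)  # assumed reply mechanism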

Could you add a screenshot to this PR?

Sure, I am adding a clip.

In LangChain, LLM._call() is (usually) the method which is called when a non-streaming reply is requested. If this doesn't work, you can try following the above steps for the _acall() method instead.

This worked perfectly.

@Darshan808 (Member, Author)

The newly added test passes locally but fails on CI. Looking into it.

@Darshan808 requested a review from dlqqq on January 23, 2025 15:09
Labels: enhancement (New feature or request)
4 participants