
Response stops midway #143

Closed
nsolsen85 opened this issue Sep 24, 2023 · 14 comments

Comments

@nsolsen85

Hi!

I tried out the R package but keep getting incomplete responses (i.e., ChatGPT starts responding with a few lines and then halts without completing the response). The attached screenshot shows an error message that I have noticed repeatedly in 'background jobs':

[screenshot: gptstudio error message from the background job]

Thanks in advance!
Niels

@HilianoMop

Are you using Windows? If yes, it seems there's a problem with that.
You may try a previous version of GPTstudio to make it run normally.
Try the code: remotes::install_version("gptstudio", version = "0.2.0")
Also make sure you have credits to use your API key.
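
In case it helps, here is the same workaround as a copy-pasteable chunk (the packageVersion() call is only added here to confirm what you currently have installed):

```r
# Check the installed version, then pin the last release reported to work on Windows
packageVersion("gptstudio")
remotes::install_version("gptstudio", version = "0.2.0")
```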

@nsolsen85
Author

Thanks!

Yes, I am using Windows. It seems that installing the 0.2.0 version did help.
(The API key did have credits.)

@HilianoMop

OK great!

@JamesHWade
Collaborator

Duplicate of #142

JamesHWade marked this as a duplicate of #142 on Sep 24, 2023
@JamesHWade
Collaborator

Can someone on a Windows machine please give it a try to see if it works now?

JamesHWade reopened this on Sep 24, 2023
@HilianoMop

Thanks, James.
I just tried on my Windows machine at home, which still had version 0.3.0, and it did not work (incomplete answers from the chatbot and error messages).
However, after switching to version 0.2.0 it works normally (full answers, no errors).

@JamesHWade
Collaborator

Would you mind trying 0.3.1? I don't expect 0.3.1 to work. You can install it with pak::pak("michelnivard/gptstudio").

Thanks for the help!

@HilianoMop

OK, I just tried version 0.3.1.
The 1st prompt gave a full answer.
The 2nd prompt gave an incomplete answer and this error message:
Warning in readRDS(path) : invalid or incomplete compressed data
Warning: Error in readRDS: error reading from connection
121: readRDS
120:
118: valueFunc
117:
101: reactive_stream
100: renderUI
99: func
86: renderFunc
85: output$app-chat-streaming
4: shiny::runApp
3: eval
2: eval
1: .rs.sourceWithProgress
The 3rd and 4th prompts also gave incomplete answers, with error messages:
Background session status: 200, done callr-rs-result-26084fd12b6b, FALSE, ,
, and NULL: FALSE
Background session status: 200, done callr-rs-result-26085bed48b2, NULL, , ,
and list(message = "in callr subprocess.", srcref = NULL, status = 0, stdout =
"", stderr = "", parent_trace = list(parent = c(12, 13, 15, 15, 17, 18, 19, 19,
20, 22, 22, 22, 24, 25, 26, 0, 28), visible = c(TRUE, TRUE, TRUE, TRUE, TRUE,
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE),
namespace = c(NA, NA, "gptstudio", "base", "gptstudio", "gptstudio",
"gptstudio", "gptstudio", NA, NA, "base", NA, "gptstudio", "base", "base",
"base", NA), scope = c("", "", ":::", "::", "::", "::", ":::", "::", "", "",
"::", "local", ":::", "::", "::", "::", "global"), srcref = list(NULL, NULL,
NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL,
NULL, NULL), procsrcref = list(NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL,
NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL), pid = c(412, 412, 412,
412, 412, 412, 412, 412, 412, 412, 412, 412, 412, 412, 412, 412, 412), call =
list("gptstudio::gptstudio_job(skeleton, skill, style, task, custom_prompt)",
c("gptstudio_skeleton_build(skeleton, skill, style, task, custom_prompt) %>% ",
" gptstudio_request_perform() %>% gptstudio_response_process() %>% "),
"save_skeleton(.)", "saveRDS(skeleton, skeleton_file())",
"gptstudio_response_process(.)", "gptstudio_request_perform(.)",
"gptstudio_request_perform.gptstudio_request_openai(.)",
"stream_chat_completion(prompt = messages, model = model)",
c("httr2::request(base_url) %>%
httr2::req_url_path_append("chat/completions") %>% ", "
httr2::req_body_json(body) %>% httr2::req_auth_bearer_token(openai_api_key) %>%
"), c("httr2::req_stream(., callback = function(x) {", "
openai_stream_parse(x)"), "isTRUE(callback(buf))", "callback(buf)",
"openai_stream_parse(x)", "saveRDS(gptstudio_env$stream$parsed, file =
streaming_file())", "gzfile(file, mode)", c(".handleSimpleError(function (e) ",
"{"), "h(simpleError(msg, call))")), call = NULL, procsrcref = NULL, parent =
list(message = "cannot open the connection", call = "gzfile(file, mode)",
srcref = NULL, procsrcref = NULL))

@JamesHWade
Collaborator

My best guess is that this is a problem with saveRDS() taking too long to write the file. I've increased the wait time to hopefully help with that. It may be fixed in the latest 0.3.1.

As a reminder for others, you can install the latest version with pak::pak("michelnivard/gptstudio").
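
For anyone curious about the race: judging from the traceback above, the parsed stream is written with saveRDS() in the background job and read back with readRDS() in the app, so a read that lands mid-write produces the "invalid or incomplete compressed data" warning. A retrying read is the kind of defensive fix a fixed wait is standing in for; this is only a hypothetical sketch, not the package's code, and the helper name is made up:

```r
# Hypothetical helper (not gptstudio's actual code; the name is made up):
# retry readRDS() briefly instead of relying on one hard-coded wait, in case
# the background job is still in the middle of its saveRDS().
read_stream_file_safely <- function(path, attempts = 10, wait = 0.05) {
  for (i in seq_len(attempts)) {
    parsed <- tryCatch(
      readRDS(path),
      error = function(e) NULL,
      warning = function(w) NULL  # "invalid or incomplete compressed data"
    )
    if (!is.null(parsed)) return(parsed)
    Sys.sleep(wait)  # the writer may not have finished yet; back off and retry
  }
  NULL  # give up; the caller keeps the previously rendered chunk
}
```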

@calderonsamuel
Collaborator

Hey @JamesHWade, do you think re-implementing the streaming flow with R6 could help with this? I could give it a try, but it will take time.

Pros:

  • We would stop using on-disk saves, which would improve the user experience (no more blinking response). It looks like saveRDS() is the problem here, and hard-coded wait times are always fragile, suboptimal solutions.
  • No need to use gptstudio_env and its methods.
  • Hopefully it will have unit testing.

Cons:

  • It will probably mean dropping httr2::req_stream() and going back to curl::curl_fetch_stream() + session$sendCustomMessage(), which is difficult to document (see the sketch after this list). I honestly look at the code with req_stream() and don't understand how it relays the streaming, because that doesn't happen when I do "print testing" on my machine.
  • In the short term, everything executes 100% as a background job, which will make it extra difficult to feed code from "source" or "environment" in the near future.
  • Probably some other internal changes.
  • Probably a few weeks to get it done; in the meantime, the bug reported here keeps happening.
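
Rough sketch of what I mean, purely illustrative: an R6 object that parses each SSE chunk inside the curl::curl_fetch_stream() callback and pushes it straight to the browser with session$sendCustomMessage(), with no on-disk hand-off. The class name, the "chat-delta" message type, and the chunk parsing are all assumptions for this sketch, not gptstudio code.

```r
library(R6)
library(curl)
library(jsonlite)

# Illustrative names only: "StreamHandler" and "chat-delta" are made up here.
StreamHandler <- R6Class(
  "StreamHandler",
  public = list(
    session = NULL,

    initialize = function(session) {
      self$session <- session
    },

    # curl calls this with each raw chunk. Split the SSE "data: {...}" lines,
    # parse them, and push each delta straight to the UI; no file on disk.
    # (Buffering of lines split across chunks is omitted for brevity.)
    on_chunk = function(raw_chunk) {
      lines <- strsplit(rawToChar(raw_chunk), "\n", fixed = TRUE)[[1]]
      for (line in lines) {
        if (startsWith(line, "data: ") && !identical(line, "data: [DONE]")) {
          delta <- fromJSON(sub("^data: ", "", line))
          self$session$sendCustomMessage("chat-delta", delta)
        }
      }
      TRUE
    },

    # Kick off the request; `handle` carries the headers/body set up by the caller.
    stream = function(url, handle) {
      curl_fetch_stream(url, fun = function(x) self$on_chunk(x), handle = handle)
    }
  )
)
```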

@JamesHWade
Collaborator

I like the idea of re-implementing this with R6. I suggest we open an issue on httr2 to ask for help with req_stream(); you may have discovered a bug in req_stream(). Let me know how I can help with a re-implementation.
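
For reference when filing that httr2 issue, the call chain in the traceback above boils down to roughly this. It is reconstructed from the trace, not copied from the package source; the placeholder values for base_url, body, openai_api_key, and the stand-in openai_stream_parse() are assumptions added here so the snippet runs on its own.

```r
library(httr2)
library(magrittr)  # for %>%, as used in the trace

# Placeholders standing in for objects built earlier in gptstudio:
base_url <- "https://api.openai.com/v1"
openai_api_key <- Sys.getenv("OPENAI_API_KEY")
body <- list(
  model = "gpt-3.5-turbo",
  messages = list(list(role = "user", content = "Hello")),
  stream = TRUE
)
# Stand-in for the internal parser; returning TRUE keeps the stream going.
openai_stream_parse <- function(x) { cat(rawToChar(x)); TRUE }

request(base_url) %>%
  req_url_path_append("chat/completions") %>%
  req_body_json(body) %>%
  req_auth_bearer_token(openai_api_key) %>%
  req_stream(callback = function(x) openai_stream_parse(x))
```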

@AlexanderCasper

Same issue here: the newer versions are buggy on Windows. The older version works.

@andreifoldes

andreifoldes commented Nov 10, 2023

Hello, and thanks for this plugin. I just downloaded it as a Windows user and am unfortunately experiencing the cutoff on 0.3.0:

[screenshot: truncated chat response on 0.3.0]

But the dev version (0.3.1.9000) seems to work!

@JamesHWade
Collaborator

Closing since the dev version seems to work.
