
Bump quickjs to 2024 #133

Merged: 6 commits into main, Feb 1, 2024

Conversation

L-jasmine (Collaborator)

No description provided.

juntao (Member) commented Jan 28, 2024

Hello, I am a code review bot on flows.network. Here are my reviews of code commits in this PR.


Overall Summary:
In the GitHub Pull Request titled "Bump quickjs to 2024", there are several important findings and potential issues that need to be addressed.

First, there are dependency updates, including a changed checksum for the base64 crate. This may introduce compatibility issues or bugs, so it is crucial to review the changes in the base64 crate to ensure it won't cause any problems. Additionally, the updates to several dependencies may introduce compatibility issues or breaking changes, so it's important to review the changelogs or release notes for each dependency.

Next, there are changes related to the development environment setup. The removal of the .devcontainer/devcontainer.json file may have unintended consequences, so it's necessary to understand the reason behind this change and ensure that it does not impact the project. Furthermore, the use of sudo in the command for installing WasmEdge should be reviewed to ensure it is necessary and doesn't introduce security risks. Additionally, the URL for downloading the WasmEdge-plugin-wasmedge_rustls tarball is not updated to reflect the new version.

There are also changes related to the addition of new files and directories, such as ggml_chat.js and ggml/mod.rs. These changes don't seem to have any potential problems.

Furthermore, there are changes made to specific code files, such as src/internal_module/httpx/js_module.rs and a fix for the promise_loop_poll function in the Context struct. These changes require further testing and clarification to ensure their correctness and understand their purpose.

Finally, there is a fix for an issue with the process.argv variable, but the introduction of globalThis.args without sufficient context raises concerns about potential problems it may cause.

In summary, this pull request introduces several important changes and potential issues. Reviewers should check the base64 crate changes, the other dependency updates, the development environment changes, the specific code modifications, and the introduction of globalThis.args. Clarification and further testing are needed to confirm the correctness and impact of the proposed changes.

Details

Commit d8e70af5bc103e448da01d4608d14332a4183a62

The key changes in this patch are:

  • Bumping the version of the quickjs library to 2024.
  • Updating the versions of several dependencies, including base64, bitflags, bytemuck, conv, crossbeam-deque, crossbeam-epoch, crossbeam-utils, env_logger, getrandom, is-terminal, libc, linux-raw-sys, lock_api, memchr, miniz_oxide, num-integer, num-traits, openssl, parse_display, pico-args, proc-macro2, quote, rand, rayon, rayon-core, serde, serde_derive, serde_json, thread_local, time, unicode-xid, wasm-bindgen, wasi, winapi, winapi-build, winapi-i686-pc-windows-gnu, winapi-x86_64-pc-windows-gnu, windows-sys, and winreg.

Potential problems to note:

  • The checksum for the base64 crate has changed, indicating a modification to the crate. This may introduce compatibility issues or bugs. Review the changes in the base64 crate to ensure they won't cause any problems.
  • Several dependencies have changed versions. This may introduce compatibility issues or breaking changes. Review the changelogs or release notes for each dependency to identify any potential problems.
  • The patch does not include any details or explanation of the changes in quickjs or the updated dependencies. This might make it difficult to understand the rationale behind the changes or evaluate their impact. Request additional information from the author.

Commit 7ca0db86c34c09304243836ebaaeee1f8b9d91d8

Key changes in the patch:

  1. The file .devcontainer/devcontainer.json is deleted.
  2. In the file .github/workflows/examples.yml, the WasmEdge version is bumped from 0.13.4 to 0.13.5. The command for installing WasmEdge is also modified to include the plugins wasi_nn-tensorflowlite and wasi_crypto.

Potential problems:

  1. Removing the .devcontainer/devcontainer.json file may have unintended consequences for the development environment setup. It would be good to understand the reason behind this change and ensure that it does not impact the project.
  2. The use of sudo in the command for installing WasmEdge (curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | sudo bash -s -- -e all --version=$VERSION --plugins wasi_nn-tensorflowlite wasi_crypto -p /usr/local) should be reviewed to ensure it is necessary and does not introduce security risks.
  3. The URL for downloading the WasmEdge-plugin-wasmedge_rustls tarball is not updated to reflect the new version ($VERSION should be replaced with 0.13.5).

These are the most important findings and potential issues in the patch.

Commit 6d0be6d33d978ca83d037754a4e45a9380c9b082

The key changes in this patch include:

  • A new file ggml_chat.js is added to the example_js directory.
  • A new directory ggml is added to the src/internal_module directory, containing a new mod.rs file.

In ggml_chat.js:

  • A new chat completion request is imported from the _wasi_nn_ggml_template module.
  • The build_graph_from_cache function is imported from the _wasi_nn_ggml module.
  • The main function is defined, which creates a chat prompt template, initializes an execution context, and pushes messages to the chat prompt template.
  • The main function is called at the bottom of the file.
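The flow described above can be sketched as follows. Note that the real script imports GGMLChatCompletionRequest and GGMLChatPrompt from '_wasi_nn_ggml_template' and build_graph_from_cache from '_wasi_nn_ggml', which exist only inside the WasmEdge QuickJS runtime; the minimal stub classes below are assumptions, included only so the control flow can be read (and run) in isolation:

```javascript
// Stub stand-ins for the WasmEdge-only modules (names and behavior guessed).
class GGMLChatPrompt {
  constructor(name) { this.name = name }
  // Render the accumulated messages into a single prompt string.
  build(req) { return req.messages.map(m => `[${m.role}] ${m.text}`).join('\n') }
}
class GGMLChatCompletionRequest {
  constructor() { this.messages = [] }
  push_message(role, text) { this.messages.push({ role, text }) }
}

function main() {
  // Create a chat prompt template and a request, as the summary describes.
  let template = new GGMLChatPrompt('llama-2-chat')
  let req = new GGMLChatCompletionRequest()
  let messages = ['hello', 'who are you?']
  for (let m of messages) {
    req.push_message('user', m)   // push each message to the template's request
  }
  // In the real script this prompt would be fed to the execution context.
  return template.build(req)
}

console.log(main())
```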

In ggml/mod.rs:

  • A new module ggml is defined.
  • The WasiNNGraph struct is defined, wrapping the wasi_nn::Graph type.
  • Implementation of JsClassDef for WasiNNGraph, defining the class methods and fields.

There don't appear to be any potential problems with these changes.

Commit cb444fef94d90737619b293f89427776d3d0a096

Key changes:

  • In the file src/internal_module/httpx/js_module.rs, the js_get_method method of the HttpRequest struct has been modified.
  • The line ctx.new_string(&format!("{:?}", self.method)).into() has been changed to ctx.new_string(self.method.to_string().as_str()).into().

Potential problems:

  • The change converts self.method to a string with to_string() and then passes it as a &str via as_str().
  • This relies on self.method implementing Display (which supplies to_string() via the blanket ToString impl); if the type only implements Debug, the change will not compile. The output also switches from the Debug representation to the Display one, which may differ, for example in casing.

Overall, the change appears to be a fix for the js_get_method method, ensuring that the self.method value is properly converted to a string. However, further testing may be needed to ensure that the conversion works as expected in all scenarios.

Commit c39f8e3b6503e7d1c0e76f534fcf5aa826265641

Key Changes:

  • The function promise_loop_poll in the Context struct has been modified to fix an issue.
  • The function now contains unsafe code that executes pending jobs using JS_ExecutePendingJob.

Potential Problems:

  • The #[deprecated] attribute has been removed from promise_loop_poll, suggesting the function is supported again. It is unclear whether this is intentional or a mistake.
  • The variable pctx is initialized with a null pointer and then used as an argument to js_std_dump_error without any validation. This could cause a null pointer dereference and crash the program if err is less than or equal to 0.
  • It is unclear whether the loop should continue executing pending jobs until err is less than or equal to 0. The purpose of this loop needs to be clarified.

Overall, it is important to address the potential problems and clarify the intentions behind the changes made in this patch.
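For reference, the pending-job loop in question typically has the following shape. In QuickJS's C API, JS_ExecutePendingJob returns 1 when a job was executed, 0 when the queue is empty, and a negative value on error; it is mocked below so the control flow can be run in isolation (the mock and the error message are illustrative, not from the patch):

```javascript
// Schematic of the promise_loop_poll control flow with a mocked job queue.
const queue = [() => 'job-a', () => 'job-b'];
function executePendingJob() {
  if (queue.length === 0) return 0;   // no pending jobs left
  queue.shift()();                    // run one queued job
  return 1;                           // a job was executed
}

let err;
// Drain the job queue: keep polling until empty (0) or an error (< 0).
while ((err = executePendingJob()) > 0) { /* keep draining */ }
if (err < 0) {
  // This is the branch where the real code calls js_std_dump_error; the
  // review's concern is that the context pointer must be valid here.
  console.error('pending job failed');
}
```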

Commit 17128409cda50459b4559f559c98a736233dfb25

Key changes:

  • The patch fixes an issue with the process.argv variable.

Potential problems:

  • The patch adds a fallback for globalThis.argv using globalThis.args, but it's unclear what globalThis.args is and whether it provides the same functionality as globalThis.argv. Further clarification is needed.

Overall, the patch seems to address a specific issue and makes a minor change to the process.js file. However, the introduction of globalThis.args without sufficient context raises doubts about the potential problems it may cause.
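The fallback described might look like the sketch below. This is a guess at the shape of the change, since the patch body is not shown here; in the real process.js, args would be populated by the WasmEdge host, whereas here it is set by hand so the snippet runs standalone:

```javascript
// Hypothetical sketch: prefer globalThis.argv, fall back to globalThis.args,
// and default to an empty array. Names follow the review text only.
globalThis.args = ['wasmedge', 'script.js'];   // pretend the host set this

const argv = globalThis.argv !== undefined
  ? globalThis.argv
  : (globalThis.args !== undefined ? globalThis.args : []);

console.log(argv.length);
```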


@juntao (Member) left a comment


Can you add a CI example with TinyLlama? Thanks.

https://huggingface.co/second-state/TinyLlama-1.1B-Chat-v1.0-GGUF

@@ -0,0 +1,57 @@
import { GGMLChatCompletionRequest, GGMLChatPrompt } from '_wasi_nn_ggml_template'

Is this package private or internal? Why do we have _ in the package name?

"repeat-penalty": 1.1
}

let graph = build_graph_from_cache(3, JSON.stringify(opt), "default")

What does the first argument 3 mean?


let req = new GGMLChatCompletionRequest()

let messages = ['hello', 'who are you?']

I would like to show an interactive example here -- like llama-chat in LlamaEdge.


while (1) {
try {
context.compute_single()

Can we use streaming responses? In fact, we could have two examples. One uses the simple response and the other uses streaming responses.

print("[YOU]:", messages[i])
req.push_message("user", messages[i])
let p = template.build(req)
context.set_input(0, p, [1], 3)

What do those numbers in the call parameters mean?


let template = new GGMLChatPrompt('llama-2-chat')

let req = new GGMLChatCompletionRequest()

Can I set an optional system prompt here?


for (var i in messages) {
print("[YOU]:", messages[i])
req.push_message("user", messages[i])

I think the user here is a special string? Maybe we should turn it into a const to avoid mis-spelling etc.

}
}
}
req.push_message("assistant", ss)

Same as the user comment above, but for assistant
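The reviewer's suggestion, applied to both role strings, might look like the following. The constant names and the validating helper are illustrative, not from the PR:

```javascript
// Illustrative constants for the special role strings, so a typo such as
// 'asistant' fails loudly instead of silently producing a bad prompt.
const ROLE_USER = 'user';
const ROLE_ASSISTANT = 'assistant';

// Hypothetical helper that rejects unknown roles at the call site.
function pushMessage(req, role, text) {
  if (role !== ROLE_USER && role !== ROLE_ASSISTANT) {
    throw new Error(`unknown role: ${role}`);
  }
  req.messages.push({ role, text });
}

const req = { messages: [] };
pushMessage(req, ROLE_USER, 'hello');
pushMessage(req, ROLE_ASSISTANT, 'Hi, how can I help?');
console.log(req.messages.length);   // → 2
```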


@juntao (Member) left a comment


We decided to postpone the GGML examples to a later PR.

@juntao merged commit d76ccf4 into main on Feb 1, 2024
1 check passed