Breaking down large files into smaller chunks based on context window size #2
@0xpayne This is a highly important fix. When will it be available, please?
IMO, this is quite a dangerous thing. In some experiments with the regular GPT web interface, I found that "carelessly" splitting a larger file can produce very poor results when code relies on earlier functions, definitions, or variables.
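A minimal sketch of the risk being described (the splitter, the sample code, and the chunk size here are all hypothetical, not from this project): a naive fixed-size splitter can cut a function in half, so a model shown only one chunk never sees the definitions that chunk depends on.

```python
# Naive chunking: split source into fixed-size line blocks,
# ignoring syntactic boundaries entirely.
def naive_chunks(source: str, max_lines: int):
    lines = source.splitlines()
    return ["\n".join(lines[i:i + max_lines])
            for i in range(0, len(lines), max_lines)]

code = """\
RATE = 0.07

def tax(amount):
    return amount * RATE

def total(amount):
    return amount + tax(amount)
"""

chunks = naive_chunks(code, 3)
# The signature of `tax` lands in the first chunk, while its body
# (which references RATE) lands in the second -- a model translating
# the second chunk sees neither RATE's value nor the function header.
```

With a chunk size of 3 lines, `tax` is cut mid-definition and `total` is separated from both `RATE` and `tax`, which is exactly the failure mode described above.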
@Sineos I totally agree. GPT can't create or handle logic well, even less so when the code is broken into chunks. Code quality correlates with the dependencies between variables, functions, libraries, classes, etc.
This problem can be partially solved with an AST.
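One way such an AST-based split could look, as a sketch only: using Python's standard `ast` module (tree-sitter, mentioned below, generalizes the same idea to other languages), split at top-level definition boundaries so no function or class is ever cut in half, and prepend module-level statements so constants and imports travel with every chunk. The function name and sample code are illustrative, not from any project in this thread.

```python
import ast

def ast_chunks(source: str):
    """Split Python source at top-level definition boundaries,
    so each chunk is a complete function or class."""
    tree = ast.parse(source)
    # Module-level statements (imports, constants) form a shared
    # preamble prepended to every chunk, so definitions like RATE
    # are visible in each piece.
    preamble, defs = [], []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            defs.append(ast.get_source_segment(source, node))
        else:
            preamble.append(ast.get_source_segment(source, node))
    head = "\n".join(preamble)
    return [head + "\n\n" + d if head else d for d in defs]

code = """\
import math
RATE = 0.07

def tax(amount):
    return amount * RATE

def total(amount):
    return amount + tax(amount)
"""

chunks = ast_chunks(code)
```

Note that this only handles module-level dependencies; cross-function dependencies (here, `total` calling `tax`) would still need a dependency graph over the AST to decide which definitions to bundle together.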
I've been (slowly) working on a solution for this where I've abstracted away the separately "compilable" parts. My initial aim was to use it in a similar project that I was going to write, but it seems more worthwhile to add it to this project.
Our library does this using tree-sitter: https://github.com/janus-llm/janus-llm/blob/public/janus/language/treesitter/treesitter.py
So does mine, but yours looks much more advanced. It looks like you're trying to do the same thing as this project?
It has some differences. We're trying to focus on other aspects of modernization beyond direct translation of source files, although we still have that functionality. And we don't have the loop where we run the code, get an error, and update the code based on the output.
@doyled-it Looks like your project could use that too.