Motivation
For now, the controller just sends the full text file to workers, and periodically kills workers if they are outdated.
This is very error-prone: the controller parses the file for location information and attrpath construction, and the workers parse these files again.
Previously we did not have a memory-safe Nix parser, and the parser is coupled with the evaluator in upstream C++ Nix. Thus the only option was to send raw text contents to the workers and let them evaluate the file, with the controller just roughly redirecting LSP requests to workers. And since the controller does not have a good parser, we have experienced very high overhead.
Related issues:
Previous Work
These two PRs contain a parser generated by bison and an incomplete semantic analysis module. For now they are outdated, because we have a new handwritten parser.
The iteration plan
Let the controller do all the parsing, and serialize ASTs to workers.
As noted above, currently both the controller process and the worker processes do parsing, and I think a better architecture is to offload all parsing work to the controller: serialize ASTs into shared memory and let the workers fetch the AST and perform evaluation.
The controller will then ask a worker for the values & envs of specific nodes, instead of sending full text contents. The serialization format & implementation:
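Purely as an illustration of the idea (not the actual serialization format; every name below, e.g. `ShmASTHeader` and `NodeValueRequest`, is hypothetical), a node-addressed exchange between the controller and a worker might look like this:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Header of an AST blob the controller places in shared memory.
// Nodes are stored in a flat, pre-order layout, so both sides can refer to
// "node #k" of a file without ever re-parsing its text.
struct ShmASTHeader {
  uint64_t version;   // bumped whenever the serialization format changes
  uint64_t numNodes;  // number of serialized AST nodes
  uint64_t bytes;     // size of the node payload that follows the header
};

// Instead of shipping the whole file, the controller asks a worker for the
// value / env of one specific node.
struct NodeValueRequest {
  std::string file;    // URI of the document whose AST is in shared memory
  uint64_t nodeIndex;  // index of the node in the serialized AST
  bool wantEnv;        // also return the names in scope at this node
};

// What the worker answers after evaluating up to that node.
struct NodeValueResponse {
  std::string valueDescription;       // printed form of the nix value
  std::vector<std::string> envNames;  // variables visible at this node
};
```

The point of such a scheme is that, once the controller has written the AST into shared memory, both processes can address a node by its index, so no request ever has to carry the file's text again.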
Use a new language frontend just for parsing, but keep the official Nix implementation for evaluation.
This is a technical decision: the evaluator is not easy to maintain, and the current implementation "just works" for language servers. Here are the PRs related to the new frontend:
Misc Refactor

- Token: `getXxx()` -> `xxx()` #293

Parser

- `( expr )` #308
- `inherit` support #314
- `inherit` binding #321

Semantic Lowering

- `AttrSet`s #323 (see the sketch below)
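To make it concrete what lowering `AttrSet`s means here, below is a toy sketch (not nixd's code; all names are made up), assuming the lowering merges attrpath bindings such as `a.b = 1; a.c = 2;` into one nested attribute-set structure before further analysis:

```cpp
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

// One lowered attribute: either a leaf value or a nested attribute set.
struct Attr {
  std::string value;                                 // leaf value, printed form
  std::map<std::string, std::shared_ptr<Attr>> set;  // nested attributes
};

// Insert one binding, e.g. path {"a", "b"} = "1", into the lowered set,
// creating intermediate attribute sets on demand.
void lowerBinding(Attr &root, const std::vector<std::string> &path,
                  const std::string &value) {
  Attr *cur = &root;
  for (const auto &name : path) {
    auto &slot = cur->set[name];
    if (!slot)
      slot = std::make_shared<Attr>();
    cur = slot.get();
  }
  cur->value = value;  // a real checker would diagnose duplicate definitions here
}

int main() {
  Attr root;
  lowerBinding(root, {"a", "b"}, "1");
  lowerBinding(root, {"a", "c"}, "2");
  // After lowering, `a` is a single nested set containing both `b` and `c`.
  std::cout << root.set["a"]->set["b"]->value << " "
            << root.set["a"]->set["c"]->value << "\n";  // prints: 1 2
}
```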
Semantic Checking
The new worker, and new controller
Controller
nix-node-eval