Configurable workspace indexing limit? #2608
12 comments · 4 replies
-
Workspace indexing is enabled only in pylance, so I'm going to transfer this issue to the pylance repo.
-
For the latter idea, see #1150. We'd appreciate your vote on that if it interests you.
-
Is the hard limit in place to protect against memory issues? We have a huge mono-repo (over 4,000 Python files in the workspace) and the lack of indexing is very noticeable.
-
I can confirm this is a large issue with mono-repositories.
-
Is there a workaround for this issue?
-
In some cases you might be able to exclude files that you don't need. For example, I'm excluding our Django migrations files in
Would be nice to configure this in
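A minimal sketch of what that kind of exclusion can look like, assuming pyright/Pylance reads a [tool.pyright] section from pyproject.toml and that the migrations live in **/migrations directories (the globs are assumptions about the repo layout, not something confirmed in this thread):

[tool.pyright]
# Hypothetical globs -- adjust to your own repo layout
exclude = [
    "**/migrations",
    "**/__pycache__",
]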
-
This issue has been around for a long while, so I figured I'd give it another bump with some data from my system. Go To Symbol in Workspace doesn't work at all in my project. The logs indicate we have approximately 35,000 source files, which is about right.
-
So this issue seems like a hard wall for moving to VSCode. Is there a way to build VSCode with a higher limit than 2000?
-
It's hard-coded to avoid out-of-memory issues.
-
Throwing my hat in the ring, we also have a large monorepo with over 2000 files, and this limit seems to cause inconsistent behavior with symbol renames and autocomplete. Please make it configurable.
-
This is now available through the
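If the setting being referred to is Pylance's user file indexing limit, raising it in settings.json might look like the snippet below. The exact setting name, and whether -1 disables the cap entirely, are assumptions on my part; verify them against the current Pylance settings reference.

// settings.json -- setting name is an assumption, check the Pylance docs
{
    // raise the indexing cap; reportedly -1 removes it, but verify
    "python.analysis.userFileIndexingLimit": 10000
}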
-
I seem to have a similar issue, where pylance is indexing 6k files. It is running out of memory from reading the virtual environment, so I reckon part of the environment goes without indexing. Is this normal behavior? I tried specifying the venv folder and excluding it from linting, but then I get missing import errors on standard Python packages. Here is my pyproject.toml:

[tool.mypy]
mypy_path = ".stubs"
strict = true
pretty = true
disallow_untyped_defs = true
disable_error_code = ["import-untyped"]
plugins = ["strawberry.ext.mypy_plugin"]

[tool.ruff]
src = [".", "src", "tests", "visory"]
extend-exclude = [".github", "**/__pycache__", "**/data"]
output-format = "full"
line-length = 90
indent-width = 4

[tool.ruff.format]
line-ending = "auto"
quote-style = "single"
indent-style = "space"
docstring-code-format = true
skip-magic-trailing-comma = true
docstring-code-line-length = "dynamic"

[tool.ruff.lint.pydocstyle]
convention = "pep257"

[tool.pyright]
exclude = [".github", "**/__pycache__", "**/data", "**/migrations"]
stubPath = "./.stubs"
venvPath = "./.venv"
venv = "./.venv/bin/python"

I am currently running uv as the package manager. Ruff does not say anything about the imports, so I am wondering why they are an issue here. P.S. If I delete the pyright config, the import error goes away, but then I have the memory problem, and vice versa.
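One thing worth checking, as a guess based on the config above rather than anything confirmed in this thread: as far as I understand pyright's documentation, venvPath should point at the directory that contains the environment, and venv should name the environment directory itself rather than the interpreter binary. With that in place, the .venv directory can also be added to exclude so it isn't indexed as workspace code, which should not affect import resolution. A sketch of that adjustment:

[tool.pyright]
# Assumption: venvPath is the directory *containing* the environment,
# and venv is the environment directory's name (not a path to python).
venvPath = "."
venv = ".venv"
stubPath = "./.stubs"
# Keep the virtual environment out of workspace indexing/analysis.
exclude = [".github", "**/__pycache__", "**/data", "**/migrations", ".venv"]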
-
VS Code tells me:

[Warn - 1:02:37 PM] Workspace indexing has hit its upper limit: 2000 files

So I guess I'm hitting https://github.com/microsoft/pyright/blob/2477c9030f42d30ade1562f0a01ff20e5641df5c/packages/pyright-internal/src/analyzer/program.ts#L475

Can this limit be made a configurable pyrightconfig.json option? This is in a workspace with a couple thousand Python files, and the config JSON is checked in, so excluding some dirs is not an option.

An alternative solution would be to have VS Code provide an exclude setting to pass through to Pylance/pyright.
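On the alternative idea: Pylance exposes analysis settings at the VS Code level, so a user-level exclude that doesn't touch the checked-in config might look something like the snippet below. The setting name and globs here are my example, not something from this thread, so verify them against the Pylance settings reference.

// settings.json (user scope) -- hypothetical example, verify the setting name
{
    "python.analysis.exclude": [
        "**/node_modules",
        "**/migrations",
        "**/.venv"
    ]
}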