Auto-import is limited by packageIndexDepth #4682
Thanks for the issue. Using requirements.txt or pyproject.toml sounds like a good way to default index depths higher. Persisted indices also make the index depth less of a problem. Maybe our default should be more than 2. |
Yes, I was surprised to find "persist to disk" as an option that defaulted to off. My setup was taking 10 minutes to index on every start (on an i7 12700K with fast SSDs). It took hours to go and find all the directories that I needed to manually exclude to make start-up time reasonable. But this was only noticeable with WSL, if I recall correctly. Is there a reason the default wouldn't just be no depth limit? The assumption seems to be that more deeply nested packages are less likely to be imported. In my opinion (as someone not very knowledgeable with VS Code!), sensible defaults would be:
At least that's what PyCharm does and it 'just works' without having to go hunting through settings to coerce the IDE into making everything available. |
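One way to cut down that manual exclusion hunting is Pylance's `python.analysis.exclude` setting, which takes glob patterns for directories the analyzer should skip. A sketch only; the paths below are made-up examples, not a recommendation:

```json
{
    "python.analysis.exclude": [
        "**/node_modules",
        "**/.mypy_cache",
        "/mnt/c/**"
    ]
}
```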
Persist to disk will likely be the default soon. It's still in the experimental stages to make sure it doesn't break anything. No depth limit sounds unlikely unless we can determine the actual requirements. Some people have 1000s of packages installed. Indexing all of that to an infinite depth would not be tractable in my opinion. Not unless we significantly change the architecture of Pylance. UI feedback seems like a good thing too. I'm surprised we don't do this already. How is a user supposed to know when Pylance is 'ready'? |
It might be interesting to get some numbers. For a project with lots of packages, how many symbols are found at depth limits of 1, 2, 3, Infinity, etc.? Comparing no limit vs a limit of, say, 3 might only be 10% more symbols, but it means a user goes from having 90% of their symbols indexed to 100%, which is much better. I've no idea what the actual values are, of course. Question: is there a way for the user to set the depth for all packages? If not, perhaps one could be added. Another question: is the primary concern index time, or lookup time? |
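As a rough way to get those numbers, one could count how many module files a package contains at or below each depth. A minimal stand-alone sketch (this is not anything Pylance ships; it just walks an installed package directory and tallies `.py` files):

```python
from pathlib import Path


def modules_by_depth(package_root: str, max_depth: int = 10) -> dict[int, int]:
    """For each depth limit d, count the .py files an index limited
    to depth d would cover (depth 1 = files directly in the package)."""
    root = Path(package_root)
    per_depth: dict[int, int] = {}
    for py in root.rglob("*.py"):
        depth = len(py.relative_to(root).parts)
        per_depth[depth] = per_depth.get(depth, 0) + 1
    # cumulative counts: everything at depth <= d
    return {
        d: sum(n for dd, n in per_depth.items() if dd <= d)
        for d in range(1, max_depth + 1)
    }
```

Running this over, say, a site-packages copy of a large library would show how quickly the cumulative count flattens out, which is exactly the 90%-vs-100% comparison above.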
Setting all of the index depths can be done like so:

```json
"python.analysis.packageIndexDepths": [
    {
        "name": "",
        "depth": 20
    }
]
```

There's also a limit to how many files we will index, so changing the depth probably means that value needs to be set too:

```json
"python.analysis.userFileIndexingLimit": 2000
```

The depth is there because indexing can be really slow. We have to open every file, parse it, look for all public symbols, and then write that to disk (or a table in memory). The index is there to make lookup time a lot faster later. The other problem is that the index (in the current design) just gets loaded into memory (either by creating it or reading it from disk). If the depth is too deep, this can blow out most of the memory for the server process. Maybe it will get more complicated in the future with an indexed database or something. |
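The open-parse-collect-store loop described above can be sketched in a few lines. This is an assumed illustration of the general technique, not Pylance's real indexer: it parses each module with Python's `ast` module and records the public top-level symbols in an in-memory table.

```python
import ast
from pathlib import Path


def index_public_symbols(package_root: str) -> dict[str, list[str]]:
    """Map each module file to its public top-level function/class names."""
    index: dict[str, list[str]] = {}
    for py in Path(package_root).rglob("*.py"):
        try:
            tree = ast.parse(py.read_text(encoding="utf-8"))
        except (SyntaxError, UnicodeDecodeError):
            continue  # skip files that fail to parse
        index[str(py)] = [
            node.name
            for node in tree.body  # top-level statements only
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef,
                                 ast.ClassDef))
            and not node.name.startswith("_")  # public symbols only
        ]
    return index
```

Every file must be opened and parsed, and every entry lives in memory, which is why both a deep depth limit and a high file limit multiply the cost the comment describes.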
For VS Code version 1.84.2 on macOS Sonoma, this problem still persists with Django despite listing everything in packageIndexDepths. My settings look like this:

I still cannot auto-import TestCase and other classes from Django. @davidgilbertson you said that you fixed this issue with this option. What settings did you use to fix it? It would be really helpful if you could share them. Thanks. |
I wouldn't set … All my settings:

```json
"python.analysis.completeFunctionParens": true,
"python.analysis.autoImportCompletions": true,
"python.analysis.autoFormatStrings": true,
"python.analysis.packageIndexDepths": [
    {
        "name": "sktime",
        "depth": 99,
        "includeAllSymbols": true
    },
    {
        "name": "rich",
        "depth": 99
    },
    {
        "name": "statsmodels",
        "depth": 99
    },
    {
        "name": "sklearn",
        "depth": 99
    },
    {
        "name": "matplotlib",
        "depth": 99
    },
    {
        "name": "scipy",
        "depth": 2
    }
],
"python.analysis.persistAllIndices": true,
```

I don't recall if this continued to work; I gave up on VS Code and went back to PyCharm, which is much better at indexing and auto importing. |
We're attempting to make auto imports work better this month, and this issue is on the short list of enhancements. I'm going to attempt to outline a mini spec of what it would look like:

Mini Spec

Hopefully all of that will be transparent, and auto imports for packages in your requirements.txt will just work as expected. |
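The requirements.txt idea could look something like the sketch below: read the file and derive `packageIndexDepths`-style entries for each dependency. This is a hypothetical illustration, not the planned implementation; the function name and the default depth of 4 are made up.

```python
import re


def depths_from_requirements(path: str, depth: int = 4) -> list[dict]:
    """Turn requirements.txt lines into packageIndexDepths-style entries."""
    entries = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.split("#")[0].strip()  # drop comments and blanks
            if not line or line.startswith("-"):
                continue  # skip pip options such as -r or -e
            # the package name ends at the first extras/version specifier
            name = re.split(r"[\s\[<>=!~;]", line, maxsplit=1)[0]
            if name:
                entries.append({"name": name, "depth": depth})
    return entries
```

Real requirements lines have more shapes than this (URLs, editable installs, environment markers), which is part of why doing it transparently inside the language server is the attractive version of the idea.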
We have related issues: #5434 |
Okay, so after discussing this, we believe the idea I outlined above would be too slow for most people. We're instead going to try the idea outlined in the issue Heejae just linked, #5434, where the idea of using requirements.txt to set index depth would be part of the 'full' mode. |
Edit: I typed out this whole issue, then found the `python.analysis.packageIndexDepths` setting that fixes it. I'm going to create the issue anyway for future travellers, and because I think the UX can be improved here for newcomers to VS Code who haven't read through every single setting and just want auto-import to work. (In my case, coming from PyCharm, I almost gave up on VS Code because I thought it was buggy, since auto-import only sometimes worked, with no clue that it was actually designed to not index everything!)
Environment data
Repro Steps

- Create a `venv` environment
- `pip install numpy rich` (one of these works with auto-import, the other doesn't)
- `x = ndarray<ctrl+space>`: auto import works
- `t = Table<ctrl+space>`: auto import does not work

For `numpy` auto import works. But for `rich` it doesn't; yet if I type it out, it can see that `rich` has a `Table`.

So, does it only work for packages with stubs? It seems to be the less common packages that don't work with auto import. E.g. `sklearn` works fine, `sktime` doesn't.

Expected behavior
Should find items from any package. Or, if performance is a concern, any package that I import in my project, or any package in `requirements.txt` or `pyproject.toml`, or something like that. Note that for the above, I both imported `rich` and created a `requirements.txt` with `rich` in it. And in another project I have the dependency listed in `pyproject.toml`, but still no luck.

Actual behavior
Doesn't work for lots of packages.
Logs
You can search the logs below for `table.py`: it finds the file and says it parses it. So it finds and parses the file but then chucks away the info because of the package depths setting?