Improve performance of infer primary key #10782

Merged: 2 commits into main from infer_primary_keys_only_for_changed_models on Sep 25, 2024

Conversation

@gshank (Contributor) commented on Sep 25, 2024

Resolves #10781

Problem

Poor performance when inferring primary keys.

Solution

Do not process primary keys for models that are unchanged in the current parse. When primary keys do need to be inferred, start by building a map of models to their generic tests.
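
A minimal, self-contained sketch of that approach, using stand-in classes rather than dbt-core's actual Manifest/ModelNode API (names such as GenericTest, attached_node, and process_inferred_primary_keys are illustrative):

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class GenericTest:
    attached_node: str  # unique_id of the model this test is attached to
    column_name: str

@dataclass
class Model:
    unique_id: str
    created_at: float  # timestamp set when the node was parsed
    primary_key: List[str] = field(default_factory=list)

    def infer_primary_key(self, tests: List[GenericTest]) -> List[str]:
        # Stand-in for the real inference logic: use the tested columns.
        return sorted({t.column_name for t in tests})

def process_inferred_primary_keys(
    models: List[Model], tests: List[GenericTest], started_at: float
) -> None:
    model_to_tests: Dict[str, List[GenericTest]] = {}
    for model in models:
        # Models carried over by partial parsing (created_at predates this
        # parse) are unchanged, so their primary_key is already populated.
        if model.created_at < started_at:
            continue
        # Build the model -> generic tests map lazily, at most once,
        # instead of collecting tests separately for every model.
        if not model_to_tests:
            for t in tests:
                model_to_tests.setdefault(t.attached_node, []).append(t)
        model.primary_key = model.infer_primary_key(
            model_to_tests.get(model.unique_id, [])
        )

With the lazy map build, a parse where no models changed does no extra work at all, and a parse with changes walks the test list once rather than once per model.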

Checklist

  • I have read the contributing guide and understand what's expected of me.
  • I have run this code in development, and it appears to resolve the stated issue.
  • This PR includes tests, or tests are not required or relevant for this PR.
  • This PR has no interface changes (e.g., macros, CLI, logs, JSON artifacts, config files, adapter interface, etc.) or this PR has already received feedback and approval from Product or DX.
  • This PR includes type annotations for new and modified functions.

@gshank requested a review from a team as a code owner on September 25, 2024 at 20:08
@cla-bot added the cla:yes label on Sep 25, 2024

codecov bot commented Sep 25, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 88.98%. Comparing base (359a2c0) to head (50c0a27).
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main   #10782      +/-   ##
==========================================
- Coverage   89.04%   88.98%   -0.06%     
==========================================
  Files         181      181              
  Lines       23115    23126      +11     
==========================================
- Hits        20583    20579       -4     
- Misses       2532     2547      +15     

Flag         Coverage Δ
integration  86.18% <100.00%> (-0.14%) ⬇️
unit         62.18% <5.26%> (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.

Components         Coverage Δ
Unit Tests         62.18% <5.26%> (-0.03%) ⬇️
Integration Tests  86.18% <100.00%> (-0.14%) ⬇️

@ChenyuLInx (Contributor) left a comment

LGTM! A small nit that is not blocking:

for node in self.manifest.nodes.values():
    if not isinstance(node, ModelNode):
        continue
    generic_tests = self._get_generic_tests_for_model(node)
    if node.created_at < self.started_at:
ChenyuLInx (Contributor) commented on the last line above:

Why can we continue here? Because this node comes from partial parsing, which means it didn't change? Maybe add a short comment?

gshank (Contributor, Author) replied:

Yes, because it hasn't changed on this parse. An equivalent line occurs in 19 other places, so I'm not going to bother documenting it here :)
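
Purely as an illustration (not part of this PR), the kind of short comment being discussed might look like:

# This node was reused by partial parsing: it was created before this parse
# started, so it is unchanged and its inferred primary key does not need to
# be recomputed.
if node.created_at < self.started_at:
    continue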

@gshank merged commit ac66f91 into main on Sep 25, 2024
64 of 66 checks passed
@gshank deleted the infer_primary_keys_only_for_changed_models branch on September 25, 2024 at 22:16

Successfully merging this pull request may close these issues.

Improve performance of "process_model_inferred_primary_keys"
2 participants