Complexity Analysis Benchmark #4

Open
ishimwed opened this issue Nov 29, 2022 · 0 comments

We want to perform a systematic survey of complexity analysis papers. From each paper, we collect all the programs used to evaluate the resulting tool. We will convert those programs into SV-COMP format and discard duplicates.
We can start with the papers used in Didier's comprehensive exam report. Note that some tools only work on cost models or intermediate languages rather than on working implementations. If the cost models aren't automatically generated from working implementations by the tool, we will ignore their benchmark programs. We want to focus on papers that explicitly aim at inferring complexity bounds, as opposed to numerical invariants in general.
Preliminary list of complexity analysis papers

  1. Chora
  2. KoAT
  3. https://link.springer.com/chapter/10.1007/978-3-540-45069-6_39
  4. SPEED
  5. RAML
  6. Dynaplex
  7. ICRA
  8. Badger
  9. CAMPY
  10. TiML

As an example, the program in Figure 4.4 of SPEED was transformed into this example for the 2019 Complexity Analysis competition in SV-COMP. More examples can be found in the same SV-COMP benchmark collection.
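For reference, here is a minimal sketch of what a converted benchmark in SV-COMP format can look like. The nested-loop body is illustrative only (it is not the actual program from SPEED's Figure 4.4); the one SV-COMP-specific piece is the `__VERIFIER_nondet_int()` primitive used to model nondeterministic inputs.

```c
// Illustrative SV-COMP-style benchmark (hypothetical loop body).
extern int __VERIFIER_nondet_int(void);

int main(void) {
    int n = __VERIFIER_nondet_int();
    int m = __VERIFIER_nondet_int();
    if (n < 0 || m < 0) return 0;   // restrict to the intended input range

    // Loop whose running time a complexity analyzer should bound:
    // j increases at most m times in total and i at most n times,
    // so the loop executes at most n + m iterations overall.
    int i = 0;
    int j = 0;
    while (i < n) {
        if (j < m) {
            j = j + 1;   // inner progress on j, never reset
        } else {
            i = i + 1;   // outer progress on i once j is exhausted
        }
    }
    return 0;
}
```

A tool from the list above would be expected to infer a bound such as O(n + m) on this program's running time.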
