
As a Singly Developer, I want to check in new code and have integration tests run automatically

Analysis

Goal

The goal of this story is, first and foremost, to set up a server that runs our full test suite on every check-in and complains loudly when things are broken. The secondary goal is to clean up the test suite so that it is actually reliable.
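
As a rough sketch only, the per-check-in build step could be as small as the script below: install whatever the checked-out revision declares, run the suite, and let a non-zero exit code be the thing the server complains loudly about. The `npm install` / `npm test` commands here are an assumption about how the suite is driven, not the actual job configuration.

```python
#!/usr/bin/env python
# Hypothetical per-check-in build step; the npm commands are assumptions
# about how the suite is driven, not the real job configuration.
import subprocess
import sys

def run(cmd):
    """Run a command, echo it, and return its exit status."""
    print('>> ' + ' '.join(cmd))
    return subprocess.call(cmd)

def main():
    # Install the dependencies the checked-out revision declares,
    # then run the full suite; any failure fails the whole build.
    for cmd in (['npm', 'install'], ['npm', 'test']):
        status = run(cmd)
        if status != 0:
            print('BUILD FAILED: %s exited with %d' % (' '.join(cmd), status))
            return status
    print('BUILD OK')
    return 0

if __name__ == '__main__':
    sys.exit(main())
```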

Scoping

  • Includes: All of our tests checked into master are run on every check-in.
  • Includes: These tests should rarely fail when there is no actual breakage.
  • Includes: Some sort of web UI for the team to see the state of the build.
  • Includes: Establishing a test environment that mirrors production in terms of installed packages.
  • Includes: Notifications [email + IRC] for failures and for when the build is fixed again.
  • Excludes: Setting up Selenium tests. This should be created as a separate story, as getting selenium-rc running happily is a decidedly non-trivial task.
  • BONUS: Fix all of the race conditions.
  • BONUS BONUS: Enable it to run the suite on every check-in to every branch as well.
  • BONUS BONUS BONUS BONUS: http://www.thinkgeek.com/geektoys/warfare/8a0f/

Acceptance criteria

  • When I check in code that breaks the build, the entire team is notified [email + IRC] about the breakage.
  • When I check in code that fixes that breakage, the entire team is notified [email + IRC] that the build is fixed again. (A sketch of this notification step follows below.)
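
One possible shape for that notification step, run after every build, is sketched below: it only speaks up when the build state actually changes (pass to fail, or fail to pass). The SMTP host, IRC server, channel, and addresses are placeholders rather than our real infrastructure, and a real IRC client would wait for the server's welcome before joining.

```python
# Hypothetical post-build notifier; all hosts, channels, and addresses are placeholders.
import smtplib
import socket
from email.mime.text import MIMEText

SMTP_HOST = 'smtp.example.com'                   # placeholder
TEAM_EMAIL = 'team@example.com'                  # placeholder
IRC_SERVER, IRC_PORT = 'irc.example.com', 6667   # placeholder
IRC_CHANNEL = '#builds'                          # placeholder

def send_email(subject, body):
    msg = MIMEText(body)
    msg['Subject'] = subject
    msg['From'] = 'ci@example.com'
    msg['To'] = TEAM_EMAIL
    smtp = smtplib.SMTP(SMTP_HOST)
    smtp.sendmail('ci@example.com', [TEAM_EMAIL], msg.as_string())
    smtp.quit()

def send_irc(message):
    # Fire-and-forget: a real client would wait for the server's welcome
    # (the 001 reply) before joining and speaking.
    sock = socket.create_connection((IRC_SERVER, IRC_PORT))
    for line in ('NICK ci-bot', 'USER ci-bot 0 * :CI bot',
                 'JOIN ' + IRC_CHANNEL,
                 'PRIVMSG %s :%s' % (IRC_CHANNEL, message),
                 'QUIT'):
        sock.sendall((line + '\r\n').encode('utf-8'))
    sock.close()

def notify(previous_status, current_status, committer):
    # Announce only transitions: a fresh breakage or a fix.
    if previous_status == 'pass' and current_status == 'fail':
        text = 'Build BROKEN by %s' % committer
    elif previous_status == 'fail' and current_status == 'pass':
        text = 'Build FIXED by %s' % committer
    else:
        return
    send_email(text, text)
    send_irc(text)
```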

Testing

  • There are no specific tests that need to be created, but we will need to set up a new environment in which to run the CI tests.

Dependencies

  • N/A

Wireframes

  • N/A

Mockups

  • N/A

List of known tasks

  • Set up a new environment in which our tests can be run manually and pass.
  • Set up Jenkins CI to run our test suite (see the trigger sketch after this list).
  • Set up notifications from Jenkins.
  • Potentially fix the tests.
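
For the "run on every check-in" part, Jenkins can queue builds from its remote-trigger URL, so a hook on the repository could poke it after each push. The sketch below assumes that approach; the Jenkins host, job name, and token are placeholders.

```python
# Hypothetical check-in hook that asks Jenkins to queue a build;
# the URL, job name, and token are placeholders.
from urllib.request import urlopen
from urllib.error import URLError

JENKINS = 'http://ci.example.com:8080'    # placeholder CI host
JOB = 'locker-master'                     # placeholder job name
TOKEN = 'secret-trigger-token'            # placeholder remote-trigger token

def trigger_build():
    """Hit the job's remote-trigger URL so Jenkins queues a build."""
    url = '%s/job/%s/build?token=%s' % (JENKINS, JOB, TOKEN)
    try:
        urlopen(url)
        print('Queued a CI build for %s' % JOB)
    except URLError as err:
        # An unreachable CI server should not block the push itself.
        print('Could not reach Jenkins: %s' % err)

if __name__ == '__main__':
    trigger_build()
```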

Analysis review

Signed off by:

  • ...

Known issues

  • Some tests still fail on Forrest's machine. I don't really know how to fix them, but until they start failing on CI I'm not going to worry about them. I'm a bit reluctant to comment out the occasionally failing test, because locker-client-test.js seems to be duct-taped together: commenting out that one test causes a landslide of other failures, so I'd have to comment out the entire file.
  • Do we want to move the bot into the main channel now or leave it by itself for a bit until we feel super happy about it?
  • Branches - Right now I've added searchresults and master to the CI tool, but it's super trivial for other people to add more branches if we decide not to just run tests against everything.

Pre-acceptance checklist

  • Ask someone who did not work on the project to get it working on their laptop, to chase out issues you may have missed
  • Make appropriate preparations so that remote developers can follow along with the demo

Acceptance checklist

  • Agreement from the team on how we will deal with failures.
  • Demo
  • Have the acceptance criteria been met?
  • Are the tests sufficient?
  • Review known issues
  • Does the implementation meet expectations for code quality and clarity?

Merge checklist

  • Merge into master
  • Check that tests pass