Jake Scruggs

writes ruby/wears crazy shirts

Jan 25, 2007

Testing vs. Speed

So I tend to be on the more extreme side of the testing debate. On my current project I helped institute extensive use of RSpec and Selenium Remote Control. RSpec allows you to test the models, views, and controllers in isolation from each other through the use of a built-in mocking framework (to mock out ActiveRecord objects and stub out calls to finders) and some ideas borrowed from ZenTest. Selenium Remote Control allows you to drive a browser (your choice) from Ruby (or most other languages), which is cool because we were going to have lots of JavaScript in this project to control the Google Maps features. The RSpec tests went well: easy to write once you got a handle on their syntax, and they run real damn fast because they don't hit the database for the view and controller tests (the RSpec opinion is that model tests should still hit the db, because models are so tied to the db it would be silly to test them without it). For the Selenium RC tests we used RSpec as the harness and created page objects to query the page. Here's a typical example:



context "Having not logged in" do

  specify "going to the create_event page redirects to the login page
and upon logging in redirects back to the create_event page" do
    navigate_to.home_page
    home_page = get_home_page
    home_page.go_to_create_event_page
    login_page = get_login_page
    login_page.login "Username", "Test"
    get_create_event_page
  end

end

The context specifies a given condition (having not logged in), but its text is just a description and is not executed. The specify is a "specification," or what most people think of as a test. We have a navigator object that knows how to get around the app, a page_mother object that can get page objects, and page objects that know how to do and get things on a given page. So, to take the spec line by line:


  navigate_to.home_page

Sends the browser to the right URL.


  home_page = get_home_page


Gets the home_page object, which runs some assertions to check that the browser really is on the home_page.


  home_page.go_to_create_event_page

The home_page object knows how to click a link that takes the browser to the create_event_page.


  login_page = get_login_page

The get_login_page method gets the login_page object and makes sure we are indeed on the login_page (we've been redirected there, as only logged-in users can create events).


  login_page.login "Username", "Test"

The login_page knows how to log in with a given username and password.
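The shape of that page object was roughly as follows. This is a sketch, not the project's actual code: the element locators ("username", "password", "login_button") are invented, while type, click, and wait_for_page_to_load are stock Selenium RC client calls.

```ruby
# Hypothetical login page object. Locators are made up for illustration.
class LoginPage
  def initialize(selenium)
    @selenium = selenium # a Selenium RC session
  end

  # Fill in the form, submit it, and wait for the next page.
  def login(username, password)
    @selenium.type "username", username
    @selenium.type "password", password
    @selenium.click "login_button"
    @selenium.wait_for_page_to_load 30_000
  end
end
```

The point of the pattern is that the spec never touches locators directly; it just says login_page.login "Username", "Test" and the page object knows the rest.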


  get_create_event_page

And finally the get_create_event_page method makes sure we are on the right page.

All tests are run in an instance_eval of a page_mother instance, which holds an instance of navigator returned by navigate_to.
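In outline, the wiring looked something like this. All the class internals here are invented (and the browser is a stub), but it shows why bare calls like navigate_to and get_home_page work inside a spec:

```ruby
# Sketch of the page_mother / navigator pattern, with a stub browser
# standing in for a real Selenium RC session.
class StubBrowser
  attr_reader :current_url
  def open(url)
    @current_url = url
  end
end

class Navigator
  def initialize(browser)
    @browser = browser
  end

  # Knows how to get around the app.
  def home_page
    @browser.open "/home"
  end
end

class HomePage
  def initialize(browser)
    # A real page object would assert on actual page content here.
    raise "not on the home page" unless browser.current_url == "/home"
  end
end

class PageMother
  def initialize(browser)
    @browser = browser
    @navigator = Navigator.new(browser)
  end

  def navigate_to
    @navigator
  end

  def get_home_page
    HomePage.new(@browser)
  end
end

# The spec body runs inside instance_eval on a PageMother, so bare
# calls resolve to its methods:
PageMother.new(StubBrowser.new).instance_eval do
  navigate_to.home_page
  get_home_page
end
```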

Sound complex? Well, it was. This was one of the most well-tested apps I've ever worked on and, as it turned out, that was totally not what the client wanted.

Crap.

We spent a lot of time writing and running these tests because:

Google Maps is hard to test – you have to write some fun JavaScript on your page to keep track of any markers you place, because the GMap object won't give them back to you once they've been set on it. And other stuff like that.

Google Maps takes a bunch of time to render. We had to set our timeouts at 30 seconds, and tests still failed because the map wasn't back by the time we made an assertion. This, of course, made the tests run way slow, as lots of pages had GMaps.

The page_mother, navigator, and page object framework we invented was very elegant but pretty heavyweight. When you created a new page you had a lot of stuff to write just to test that page.
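The first two pain points can be sketched like so. window.pageMarkers is a made-up name for the bookkeeping array our pages had to maintain (since GMap won't hand markers back), get_eval is Selenium RC's "evaluate this JavaScript in the browser" call, and the polling helper is illustrative, not the actual project code:

```ruby
# Reading back markers the page's own JavaScript has been recording.
class MapPage
  def initialize(selenium)
    @selenium = selenium # a Selenium RC session
  end

  def marker_count
    @selenium.get_eval("window.pageMarkers.length").to_i
  end
end

# A generic polling helper for slow map renders: re-check a condition
# until it returns true or the timeout expires.
def wait_until(timeout: 30, interval: 0.5)
  deadline = Time.now + timeout
  until yield
    raise "condition not met after #{timeout} seconds" if Time.now > deadline
    sleep interval
  end
end

# e.g. wait_until(timeout: 30) { map_page.marker_count == 2 }
```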

So our test suite, while impressive, ran slow and took a bunch of time to write. Which would have been fine if the client was interested in a bug-free site at all costs. Everybody on the ThoughtWorks team had just come from projects where we worked on apps that handled money and sensitive information. You really do not want to screw up that stuff. But this project was a little social network site where nobody would die if some data got lost. What we found out after six weeks is that the client just wanted something out there, in a hurry, that looked good, so they could see if anyone was interested in the site. Luckily, when Obie and Zed joined the team they were able to see how much this pursuit of reliability was slowing us down, and to discover that the client didn't much care about the value of rigorous testing. So we went in and ripped out all the SeleniumRC tests.

It was a sad day for me.


I was really proud of that test suite and it had taken a lot of hard work. Much more important was the fact that I had failed as a consultant. I had let an unspoken assumption slip through which stopped the team from delivering what the client wanted.

After we stopped writing and running Selenium tests (we still wrote unit tests with RSpec) we could go way faster and knock out a bunch of functionality in the last two weeks. Which is awesome: The site is looking good. Obie and Zed plan to go back and put in some Selenium tests once the site becomes more stable.


Now before I get beat up for saying testing makes you go slow, I should say this: I know that in the long term testing makes you faster. The speed you get now by not testing comes at the price of fragility, bugginess, and trouble refactoring later. This is true. But for now the client is willing to accept a few rough edges if it means speed, and we, as consultants, have to respect that.

1 comment:

Aaron Erickson said...

Indeed - you are experiencing what I call taking out a technical mortgage. Personally, I don't think it is anything to be ashamed of. At some point, agile has to include the option to not do as rigorous testing in cases that demand it.

While some might consider the concept heresy, it happens, whether we want it to or not. Where we need to spend our energy is in distinguishing cases where you really should NOT have technical debt (mission critical, handling money, people might die if it fails) versus those where it might be ok (first-to-market prototype, non-mission-critical, etc.).