
Agile Testing with Lisa Crispin – Part 2

Here is Part 2 of my notes on Lisa Crispin’s talk on Agile Testing. If you haven’t already, go catch up on Part One.

Lisa noted that her team stopped committing at the sprint level. They just work hard, avoid wasting time, and focus on delivering the software. This works because the team is transparent with the customer, so the customer can see that the work is getting done.

Teams need slack: time to learn and experiment. Give the team time to innovate, to catch up on the latest technology, and to actually move to it.

Automated tests need as much care and feeding as the code.

She noted that learning the business helped cut down the time spent on production support. They found scenarios where they could automate support tasks, or where they were even solving the wrong business problem. Lisa gave an example where a user kept requesting a report, and it kept being delivered as the team understood it, but it took sitting down with the end user to understand the report the user was actually asking for.

Lisa recommended looking at: Daniel Pink’s work on intrinsic motivators, The Agile Samurai by Jonathan Rasmusson, the research by Jim Highsmith and Israel Gat into measuring technical debt, and her article Selling Agile to the CFO.

The quote of the evening seemed to be: “If it doesn’t have to work, you don’t have to test it.”

She emphasized that QA shouldn’t be treated as separate from development; QA time is part of development time.

Lisa pointed out that the most value was not in the actual integration tests, but was in the communication between the developers and testers that resulted from the interaction.

If you have too many things going on at the same time, you task-switch too much, and the result is that you have a hard time predicting when you will be done.

She encouraged us to get away from labels and just try to deliver the best value and highest quality software we can.

She encouraged cross-pollination across different teams in the area; you never know where new ideas will come from. She talked about how she brought back the idea of an impediment backlog from a visit to the UK, and that when she took the idea to her team, just making the impediments visible helped the team address those issues. This reminded me of the Craftsman Swap that both Obtiva and 8th Light encourage, as well as Corey Haines’ journeyman tour.

Agile Testing with Lisa Crispin – Part 1

This past Wednesday, April 20, 2011, I attended the DFW Scrum meetup with guest Lisa Crispin, @lisacrispin, presenting over Skype, and I managed to take a wonderful seven pages of notes in my composition book on her presentation. Because of this I will be breaking it up into a number of posts to make it more digestible. I hope I didn’t butcher her talk too much, as I was busy trying to keep up with all of the gems she was throwing out to us. Apologies to Lisa if I did.

The big thing she started with was: before a team tries to go off and make any decisions, or do anything, they need to answer the question, “What does a commitment to quality mean?” Only once that is answered can they proceed to improve the quality of their product.

On Reducing Show-Stoppers

These are the steps Lisa’s team took to reduce the number of show-stoppers in their product.

  • First, they set up the basics: continuous integration and a dedicated test environment.
  • Once they had those in place, they set up a police light for show-stoppers. Anytime someone reported a show-stopper, that person had to turn on the light. This had a twofold effect: it made the business person look silly if the bug was really trivial, and it got annoying for the team if the light was constantly on.
  • Development started TDDing their code. She made a quick side note that TDD is hard to learn, and really, any test automation is hard to learn. She pointed out that it took the developers 8 months to get over the hump of TDD.
  • In the meantime, they wrote manual test scripts over the critical parts of the application. It was painful, and a great motivation for automating tests.
  • They got UI-based automated tests running.
  • They then worked to get automated functional tests in place instead of relying on UI tests. Lisa mentioned that her team used FitNesse.
  • They started with a happy path case; once they had that going, they would add tests around the boundary and error condition cases (a small sketch of this progression follows the list).
  • She noted that it took lots of baby steps over 8 years with a commitment to testing.
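
To make the FitNesse step above a little more concrete, here is a minimal sketch of what a column fixture can look like. The domain, the class name, and the shipping rule are my own invention for illustration; this is not from Lisa’s team, just an assumption of how such a fixture is typically shaped.

```java
import fit.ColumnFixture;

// Hypothetical fixture for illustration only. A FitNesse wiki table
// names this class, fills the public fields from its input columns,
// and compares the value returned by shippingCost() against the
// expected output column.
public class ShippingCostFixture extends ColumnFixture {
    public double orderTotal; // input column
    public int itemCount;     // input column

    // Output column, using a made-up rule so the example is self-contained:
    // orders of $100 or more ship free, otherwise it is $2 per item.
    public double shippingCost() {
        if (itemCount <= 0) {
            return 0.0; // boundary case: an empty order costs nothing to ship
        }
        return orderTotal >= 100.0 ? 0.0 : 2.0 * itemCount;
    }
}
```

Following the progression Lisa described, the wiki table would start with a couple of obvious happy path rows (say, a $50 order with two items) and only later gain rows for the boundary and error conditions (an empty order, an order of exactly $100, and so on).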

Testing is Not a Phase

The goal is a short feedback loop, as it is easier to recall the code an hour later as opposed to a month or two later. She noted that testers may be against this at first since it means testing the same thing multiple times, but that is important to shortening the feedback loop and improving the quality. I would also personally venture that it would help emphasize the importance of getting tests automated against a baseline set of expected functionality.

Lisa advised against calling a story done until all of the exploratory testing has been done.

She then pointed out some things to watch out for when planning. Watch out for overcommitting, since it usually doesn’t take into account the testing activities and anything they uncover. Also watch out for testing estimates that are not in line with the development effort; she gave the example that if the testing effort is twice the development effort, development might be missing something.

Continued…

I will be posting Part Two soon, as this covered only two-and-a-half of the seven pages of notes.