Wednesday, May 26, 2010

Upcoming Speaking Engagements

Saturday June 5th, 2010 
SQL Saturday! Pensacola Code Camp 2010
Zen Coding


Wednesday, May 19, 2010

Lessons from the Road

I have spent the past two weeks on-site with a client. This is a greenfield project: essentially, making the client's data accessible via the web. The iterations are one week long.

Although the assignment is not yet complete, there are some important lessons I want to share about my experience so far.

TDD Works
I am an avid “test first” developer, but it is often on an existing codebase. This project gave me the perfect opportunity to go TDD all the way. The number one rule was no domain or model code gets written without a corresponding unit test, and that test must be written first. Period, no exceptions.

Being disciplined and following this one simple rule allowed me to breeze through the code. It also allowed me to easily and constantly refactor the code without fear or hesitation. I let the design unfold organically and only implemented code that solved the problem at hand. I was unconcerned with what might be needed for a future iteration or feature. When the requirements changed or it was time to add a new feature, it was a painless process.
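The "test first" rule above can be sketched in miniature. This is an illustrative example only (the actual project was .NET; the Invoice class and its members here are hypothetical, not from the client's codebase): the test is written before the domain code exists, and the domain code contains only enough to make that test pass.

```python
# A minimal sketch of the "test first" rule, assuming a hypothetical
# Invoice domain class -- names are illustrative, not from the project.

# Step 1: the test is written first. At this point it fails, because
# no Invoice class exists yet.
def test_total_sums_line_amounts():
    invoice = Invoice()
    invoice.add_line("widgets", 40.0)
    invoice.add_line("shipping", 2.5)
    assert invoice.total == 42.5

# Step 2: only enough domain code is written to make the test pass --
# no speculative features for future iterations.
class Invoice:
    def __init__(self):
        self.lines = []

    def add_line(self, description, amount):
        self.lines.append((description, amount))

    @property
    def total(self):
        # Sum the amount of every line item on the invoice.
        return sum(amount for _, amount in self.lines)

# Step 3: run the test; once it is green, refactoring is safe.
test_total_sums_line_amounts()
```

With the test in place first, any later refactoring of Invoice can be done without fear: if the behavior breaks, the test says so immediately.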

Access to the Client Works
I realize this one is not always practical. In this case, the office I was working in was two doors down from my main user. If there were any questions about how something should be implemented, feedback was immediate. Which leads me to…

“Specless”, Simple Design Works
There was no predefined technical specification, functional specification, or DSD (Design Specification Document), and no “big up front design” (BUFD). We simply had a set of user stories, written after two meetings with the client to discuss their needs. Our user stories formed the basis for all of the unit tests which guided the development of the code, architecture, and application design.

In fact, using this approach, along with the client’s accessibility, ensured the design was flexible enough to survive any of their changes.

We had a feature that was implemented incorrectly. At our next demo, the client noticed it right away. Two hours later, it had been refactored to meet their needs precisely and was in production. If we had been using a BUFD approach, we wouldn’t have known there was an issue until after the release date.

In a budget-sensitive project (and they all are), there is never enough time to go back and fix those critical imperfections. The client is left wanting, the users are disappointed, the check-signers are angry, the developer looks like an amateur, and the company looks unreliable. It is a fail for everyone.

Following the techniques described above has kept the code lean and clean, the client happy, and the project on schedule and under budget, so far…

Wednesday, May 12, 2010

The Cost of Quality

Last week I read this story about the Census Bureau and their “Paper Based Operations Control System” (the emphasis is mine):

A computer system that the Census Bureau needs to manage its door-to-door count of the U.S. population remained buggy and prone to crash a day before enumerators were set to begin their work, government officials said Friday.

Great! A computer-based software system for counting results from the Census – in 2010 – imagine that.

The bureau's Paper Based Operations Control System did not function reliably in tests and, despite hardware and software upgrades, "may not be able to perform as needed under full operational loads," the U.S. Government Accountability Office said in a report.

Translation – the system was slow and unstable and rather than apply quality software practices, we decided to get a bigger box – brilliant!

The paper-based system's hasty design began in early 2008, after the census bureau scrapped plans to use a handheld-computer method that ended up costing more than $700 million but did not operate adequately.

I’m sorry, could you please repeat that? I don’t think I heard you correctly. Did you say two years of design and $700 million (and let us not forget, that is 700 million tax dollars) and the system still doesn’t work!?!? Who was coding this (a very important detail missing from the story)? I wonder how much of that budget was spent on QA. At what point in the process did the project manager decide to scrap unit testing to “save costs” and “meet the schedule”?

Returning to paper-based method boosted the cost of the census by about $3 billion that using the handheld computers was supposed to have saved.

So not only was the system a complete failure and the development effort a $700 million loss, but the total cost American taxpayers will be stuck with is $3.7 billion.

"We will get the census done with this system," he said after the hearing. "The question is, will everyone be smiling when it's done."

Well, at least the users have found a way around the system. And by the way, the developers are not only smiling, but laughing all the way to the bank with a $700 million check in tow.

Wednesday, May 5, 2010

Upcoming Speaking Engagements

Thursday May 27th, 2010 
Acadiana .NET User Group
Zen Coding
211 East Devalcourt
Lafayette, La 70506