Test Driven
PRACTICAL TDD AND ACCEPTANCE TDD
FOR JAVA DEVELOPERS
LASSE KOSKELA
MANNING
Greenwich (74° w. long.)
For online information and ordering of this and other Manning books, please visit
www.manning.com. The publisher offers discounts on this book when ordered in quantity.
For more information, please contact:
Special Sales Department
Manning Publications Co.
Sound View Court 3B
Greenwich, CT 06830
fax: (609) 877-8256
email:
©2008 by Manning Publications Co. All rights reserved.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in
any form or by means electronic, mechanical, photocopying, or otherwise, without prior written
permission of the publisher.
Many of the designations used by manufacturers and sellers to distinguish their products are
claimed as trademarks. Where those designations appear in the book, and Manning
Publications was aware of a trademark claim, the designations have been printed in initial caps
or all caps.
Recognizing the importance of preserving what has been written, it is Manning’s policy to have
the books we publish printed on acid-free paper, and we exert our best efforts to that end.
Manning Publications Co.
Sound View Court 3B
Greenwich, CT 06830
Copyeditor: Laura Merrill
Typesetter: Gordan Salinovic
Cover designer: Leslie Haimes
ISBN 1-932394-85-0
Printed in the United States of America
1 2 3 4 5 6 7 8 9 10 – MAL – 13 12 11 10 09 08 07
To my colleagues,
for bugging me to finish this project.
And to my love Lotta,
who gave me the energy to do it.
brief contents
PART 1 A TDD PRIMER .............................................. 1
  1 ■ The big picture 3
  2 ■ Beginning TDD 43
  3 ■ Refactoring in small steps 75
  4 ■ Concepts and patterns for TDD 99

PART 2 APPLYING TDD TO SPECIFIC TECHNOLOGIES ........................ 151
  5 ■ Test-driving web components 153
  6 ■ Test-driving data access 195
  7 ■ Test-driving the unpredictable 249
  8 ■ Test-driving Swing 279

PART 3 BUILDING PRODUCTS WITH ACCEPTANCE TDD .............................. 321
  9 ■ Acceptance TDD explained 323
 10 ■ Creating acceptance tests with Fit 364
 11 ■ Strategies for implementing acceptance tests 396
 12 ■ Adopting TDD 435

appendix A ■ Brief JUnit 4 tutorial 467
appendix B ■ Brief JUnit 3.8 tutorial 470
appendix C ■ Brief EasyMock tutorial 473
appendix D ■ Running tests with Ant 475
contents

preface xvii
acknowledgments xix
about this book xxi
about the cover illustration xxvii
PART 1 A TDD PRIMER .............................................. 1

1 The big picture 3
    1.1 The challenge: solving the right problem right 5
        Creating poorly written code 5 ■ Failing to meet actual needs 6
    1.2 Solution: being test-driven 7
        High quality with TDD 8 ■ Meeting needs with acceptance TDD 10 ■ What’s in it for me? 11
    1.3 Build it right: TDD 14
        Test-code-refactor: the heartbeat 15 ■ Developing in small increments 19 ■ Keeping code healthy with refactoring 24 ■ Making sure the software still works 28
    1.4 Build the right thing: acceptance TDD 31
        What’s in a name? 31 ■ Close collaboration 32 ■ Tests as a shared language 33
    1.5 Tools for test-driven development 36
        Unit-testing with xUnit 36 ■ Test frameworks for acceptance TDD 37 ■ Continuous integration and builds 37 ■ Code coverage 39
    1.6 Summary 41

2 Beginning TDD 43
    2.1 From requirements to tests 45
        Decomposing requirements 45 ■ What are good tests made of? 47 ■ Working from a test list 47 ■ Programming by intention 48
    2.2 Choosing the first test 48
        Creating a list of tests 49 ■ Writing the first failing test 50 ■ Making the first test pass 54 ■ Writing another test 56
    2.3 Breadth-first, depth-first 58
        Faking details a little longer 59 ■ Squeezing out the fake stuff 60
    2.4 Let’s not forget to refactor 63
        Potential refactorings in test code 64 ■ Removing a redundant test 65
    2.5 Adding a bit of error handling 66
        Expecting an exception 66 ■ Refactoring toward smaller methods 68 ■ Keeping methods in balance 69 ■ Expecting details from an exception 70
    2.6 Loose ends on the test list 71
        Testing for performance 72 ■ A looming design dead-end 73
    2.7 Summary 73

3 Refactoring in small steps 75
    3.1 Exploring a potential solution 76
        Prototyping with spikes 77 ■ Learning by writing tests 77 ■ Example spike for learning an API 78
    3.2 Changing design in a controlled manner 80
        Creating an alternative implementation 81 ■ Switching over safely 86
    3.3 Taking the new design further 90
        Keeping things compatible 90 ■ Making the switchover 95
    3.4 Summary 98

4 Concepts and patterns for TDD 99
    4.1 How to write tests and make them pass 100
        Test-selection strategies 101 ■ Implementation strategies 104 ■ Prime guidelines for test-driving 106
    4.2 Essential testing concepts 108
        Fixtures are the context for tests 108 ■ Test doubles stand in for dependencies 110 ■ State and interaction-based testing 110
    4.3 Closer look into test doubles 113
        Example of a test double 113 ■ Stubs, fakes, and mocks 115 ■ Mock objects in action 116
    4.4 Guidelines for testable designs 118
        Choose composition over inheritance 119 ■ Avoid static and the Singleton 120 ■ Isolate dependencies 122 ■ Inject dependencies 124
    4.5 Unit-testing patterns 127
        Assertion patterns 128 ■ Fixture patterns 132 ■ Test patterns 137
    4.6 Working with legacy code 144
        Test-driven legacy development 145 ■ Analyzing the change 146 ■ Preparing for the change 147 ■ Test-driving the change 148
    4.7 Summary 148

PART 2 APPLYING TDD TO SPECIFIC TECHNOLOGIES ........................ 151

5 Test-driving web components 153
    5.1 MVC in web applications in 60 seconds 154
    5.2 Taming the controller 156
        Test-driving Java Servlets 156 ■ Test-driving Spring controllers 168
    5.3 Creating the view test-first 173
        Test-driving JSPs with JspTest 174 ■ Test-driving Velocity templates 179
    5.4 TDD with component-based web frameworks 184
        Anatomy of a typical framework 185 ■ Fleshing out Wicket pages test-first 186
    5.5 Summary 193

6 Test-driving data access 195
    6.1 Exploring the problem domain 196
        Data access crosses boundaries 197 ■ Separating layers with the DAO pattern 198
    6.2 Driving data access with unit tests 199
        Witnessing the tyranny of the JDBC API 200 ■ Reducing pain with Spring’s JdbcTemplate 205 ■ Closer to test-driven nirvana with Hibernate 211
    6.3 Writing integration tests before the code 219
        What is an integration test? 220 ■ Selecting the database 222
    6.4 Integration tests in action 225
        Writing our first Hibernate integration test 226 ■ Creating the database schema 230 ■ Implementing the production code 233 ■ Staying clean with transactional fixtures 234
    6.5 Populating data for integration tests 235
        Populating objects with Hibernate 236 ■ Populating data with DbUnit 237
    6.6 Should I drive with unit or integration tests? 243
        TDD cycle with integration tests 243 ■ Best of both worlds 244
    6.7 File-system access 245
        A tale from the trenches 245 ■ Practices for testable file access 246
    6.8 Summary 247

7 Test-driving the unpredictable 249
    7.1 Test-driving time-based functionality 250
        Example: logs and timestamps 250 ■ Abstracting system time 252 ■ Testing log output with faked system time 256
    7.2 Test-driving multithreaded code 259
        What are we testing for? 260 ■ Thread-safety 261 ■ Blocking operations 266 ■ Starting and stopping threads 268 ■ Asynchronous execution 271 ■ Synchronization between threads 274
    7.3 Standard synchronization objects 275
        Semaphores 275 ■ Latches 276 ■ Barriers 276 ■ Futures 277
    7.4 Summary 277

8 Test-driving Swing 279
    8.1 What to test in a Swing UI 280
        Internal plumbing and utilities 281 ■ Rendering and layout 281 ■ Interaction 282
    8.2 Patterns for testable UI code 283
        Classic Model-View-Presenter 284 ■ Supervising Controller 284 ■ Passive View 287
    8.3 Tools for testing view components 290
        Why do we need tools? 290 ■ TDD-friendly tools 292
    8.4 Test-driving a view component 297
        Laying out the design 298 ■ Adding and operating standard widgets 300 ■ Drawing custom graphics 304 ■ Associating gestures with coordinates 314
    8.5 Summary 319

PART 3 BUILDING PRODUCTS WITH ACCEPTANCE TDD .............................. 321

9 Acceptance TDD explained 323
    9.1 Introduction to user stories 325
        Format of a story 325 ■ Power of storytelling 325 ■ Examples of user stories 326
    9.2 Acceptance tests 327
        Example tests for a story 327 ■ Properties of acceptance tests 328 ■ Implementing acceptance tests 333
    9.3 Understanding the process 334
        The acceptance TDD cycle 334 ■ Acceptance TDD inside an iteration 343
    9.4 Acceptance TDD as a team activity 348
        Defining the customer role 348 ■ Who writes tests with the customer? 350 ■ How many testers do we need? 350
    9.5 Benefits of acceptance TDD 351
        Definition of “done” 351 ■ Cooperative work 353 ■ Trust and commitment 354 ■ Specification by example 354 ■ Filling the gap 354
    9.6 What are we testing, exactly? 355
        Should we test against the UI? 355 ■ Should we stub parts of our system? 357 ■ Should we test business logic directly? 358
    9.7 Brief overview of available tools 359
        Table-based frameworks 359 ■ Text-based frameworks 361 ■ Scripting language-based frameworks 361 ■ Homegrown tools 362
    9.8 Summary 362

10 Creating acceptance tests with Fit 364
    10.1 What’s Fit? 365
        Fit for acceptance TDD 366 ■ Test documents contain fixture tables 369 ■ Fixtures: combinations of tables and classes 371
    10.2 Three built-in fixtures 372
        ColumnFixture 373 ■ RowFixture 376 ■ ActionFixture 379 ■ Extending the built-in fixtures 382
    10.3 Beyond the built-ins with FitLibrary 384
        DoFixture 384 ■ SetUpFixture 388 ■ There’s more 390
    10.4 Executing Fit tests 390
        Using a single test document 391 ■ Placing all tests in a folder structure 391 ■ Testing as part of an automated build 392
    10.5 Summary 394

11 Strategies for implementing acceptance tests 396
    11.1 What should acceptance tests test? 397
        Focus on what’s essential 398 ■ Avoid turbulent interfaces 399 ■ Cross the fence where it is lowest 400
    11.2 Implementation approaches 401
        Going end-to-end 401 ■ Crawling under the skin 404 ■ Exercising the internals 407 ■ Stubbing out the irrelevant 409 ■ Testing backdoors 411
    11.3 Technology-specific considerations 411
        Programming libraries 412 ■ Faceless, distributed systems 413 ■ Console applications 415 ■ GUI applications 417 ■ Web applications 421
    11.4 Tips for common problems 425
        Accelerating test execution 426 ■ Reducing complexity of test cases 431 ■ Managing test data 432
    11.5 Summary 434

12 Adopting TDD 435
    12.1 What it takes to adopt TDD 436
        Getting it 436 ■ Sense of urgency 437 ■ Sense of achievement 438 ■ Exhibiting integrity 438 ■ Time for change 439
    12.2 Getting others aboard 440
        Roles and ability to lead change 441 ■ Change takes time 443
    12.3 How to fight resistance 444
        Recognizing resistance 444 ■ Three standard responses to resistance 448 ■ Techniques for overcoming resistance 449 ■ Picking our battles 453
    12.4 How to facilitate adoption 454
        Evangelize 454 ■ Lower the bar 457 ■ Train and educate 458 ■ Share and infect 459 ■ Coach and facilitate 461 ■ Involve others by giving them roles 463 ■ Destabilize 464 ■ Delayed rewards 465
    12.5 Summary 465

appendix A Brief JUnit 4 tutorial 467
appendix B Brief JUnit 3.8 tutorial 470
appendix C Brief EasyMock tutorial 473
appendix D Running tests with Ant 475
resources 481
index 487
preface
Seven years ago, in the midst of a global IT boom, programming shops of all
shapes and sizes were racing like mad toward the next IPO, and the job market
was hotter than ever. I had been pulled into the booming new media industry and
was just starting my programming career, spending long days and nights hacking
away at random pieces of code, configuring servers, uploading PHP scripts to a
live production system, and generally acting like I knew my stuff.
On a rainy September evening, working late again, my heart suddenly skipped
a beat: What did I just do? Did I drop all the data from the production database?
That’s what it looked like, and I was going to get canned. How could I get the data
back? I had thought it was the test database. This couldn’t be happening to me!
But it was.
I didn’t get fired the next morning, largely because it turned out the customer
didn’t care about the data I’d squashed. And it seemed everyone else was doing
the same thing—it could have been any one of us, they said. I had learned a lesson, however, and that evening marked the beginning of my journey toward a
more responsible, reliable way of developing software.
A couple of years later, I was working for a large multinational consulting company, developing applications and backend systems for other large corporations.
I’d learned a lot during my short career, thanks to all those late nights at the computer, and working on these kinds of systems was a good chance to sharpen my
skills in practice. Again, I thought I knew my stuff well when I joined the ranks.
And again, it turned out I didn’t know as much as I thought. I continued to learn
something important almost every day.
The most important discovery I made changed the way I thought about software development: Extreme Programming (XP) gave me a new perspective on the
right way to develop software. What I saw in XP was a combination of the high productivity of my past hack-a-thons and a systematic, disciplined way to work. In
addition to the fact that XP projects bring the development team closer to the customer, the single biggest idea that struck a chord with me was test-driven development (TDD). The simple idea of writing tests before the code demolished my
concept of programming and unit-testing as separate activities.
TDD wasn’t a walk in the park. Every now and then, I’d decide to write tests
first. For a while, it would work; but after half an hour I’d find myself editing production code without a failing test. Over time, my ability to stick with the test-first
programming improved, and I was able to go a whole day without falling back on
my old habits. But then I stumbled across a piece of code that didn’t bend enough
to my skills. I was coming to grips with how it should be done but didn’t yet have
all the tricks up my sleeve. I didn’t know how to do it the smart way, and frequently I wasn’t determined enough to do it the hard way. It took several years to
master all the tricks, learn all the tools, and get where I am now.
I wrote this book so you don’t have to crawl over the same obstacles I did; you
can use the book to guide your way more easily through these lessons. For me,
catching the test-first bug has been the single most important influence on how I
approach my work and see programming—just as getting into agile methods
changed the way I think about software development.
I hope you’ll catch the bug, too.
acknowledgments
Taking an idea and turning it into a book is no small feat, and I couldn’t have
done it without the help of the legion of hard-core professionals and kind souls
who contributed their time and effort to this project.
First, thanks to Mike Curwen from JavaRanch, who started it all by connecting
me with Jackie Carter at Manning in early 2005. Jackie became my first development editor; she taught me how to write and encouraged me to keep going. Looking back at my first drafts, Jackie, I can see that what you did was a heroic act!
I’d also like to thank the rest of the team at Manning, especially publisher Marjan Bace, my second development editor Cynthia Kane, technical editor Ernest
Friedman-Hill, review editor Karen Tegtmeyer, copy editor Laura Merrill, proofreader Tiffany Taylor, and project editor Mary Piergies. It was a true pleasure
working with all of you.
I didn’t write this book behind closed doors. I had the pleasure of getting valuable feedback early on and throughout the development process from an excellent cast of reviewers, including J. B. Rainsberger, Ron Jeffries, Laurent Bossavit,
Dave Nicolette, Michael Feathers, Christopher Haupt, Johannes Link, Duncan
Pierce, Simon Baker, Sam Newman, David Saff, Boris Gloger, Cédric Beust, Nat
Pryce, Derek Lakin, Bill Fly, Stuart Caborn, Pekka Enberg, Hannu Terävä, Jukka
Lindström, Jason Rogers, Dave Corun, Doug Warren, Mark Monster, Jon Skeet,
Ilja Preuss, William Wake, and Bas Vodde. Your feedback not only made this a better book but also gave me confidence and encouragement.
My gratitude also goes to the MEAP readers of the early manuscript for their
valuable feedback and comments. You did a great job pointing out remaining discrepancies and suggesting improvements, picking up where the reviewers left off.
I wouldn’t be writing this today if not for my past and present colleagues, from
whom I’ve learned this trade. I owe a lot to Allan Halme and Joonas Lyytinen for
showing me the ropes. You continue to be my mentors, even if we no longer work
together on a day-to-day basis. I’d like to thank my fellow moderators at JavaRanch for keeping the saloon running. I’ve learned a lot through the thousands
of conversations I’ve had at the ranch. And speaking of conversations, I’d especially like to thank Bas Vodde for all the far-out conversations we’ve had on trains
and in hotel lobbies.
Special thanks to my colleagues at Reaktor Innovations for their encouragement, support, enthusiasm, and feedback. You’ve taught me a lot and continue to
amaze me with your energy and talent. It’s an honor to be working with you.
I’d also like to thank my clients: the ones I’ve worked with and the ones who
have attended my training sessions. You’ve given me the practical perspective for
my work, and I appreciate it. I wouldn’t know what I was talking about if it weren’t
for the concrete problems you gave me to solve!
My life as a software developer has become easier every year due to the tools
that open source developers around the world are creating free of charge for all
of us. Parts 2 and 3 of this book are full of things that wouldn’t be possible without
your philanthropic efforts. Thank you, and keep up the good work. I hope to
return the favor one day.
Finally, I’d like to thank my family and loved ones, who have endured this
project with me. I appreciate your patience and unfailing support—even when I
haven’t been there for you as much as I should have. And, most important, I love
you guys!
about this book
Test-driven development was born in the hands and minds of software developers
looking for a way to develop software better and faster. This book was written by
one such software developer who wishes to make learning TDD easier. Because
most of the problems encountered by developers new to TDD relate to overcoming technical hindrances, we’ve taken an extremely hands-on approach. Not only
do we explain TDD through an extended hands-on example, but we also devote
several chapters to showing you how to write unit tests for technology that’s generally considered difficult to test. First-hand experiences will be the biggest learning
opportunities you’ll encounter, but this book can act as the catalyst that gets you
past the steepest learning curve.
Audience
This book is aimed at Java programmers of all experience levels who are looking
to improve their productivity and the quality of the code they develop. Test-driven
development lets you unleash your potential by offering a solid framework for
building software reliably in small increments. Regardless of whether you’re creating a missile-control system or putting together the next YouTube, you can benefit
from adopting TDD.
Our second intended audience includes Java programmers who aren’t necessarily interested in TDD but who are looking for help in putting their code under
test. Test-driven development is primarily a design and development technique; but
writing unit tests is such an essential activity in TDD that this book will lend you a
hand during pure test-writing, too—we cover a lot of (so-called) difficult-to-test technologies such as data-access code, concurrent programs, and user-interface code.
Whether you’re simply looking to get the job done or have a larger goal of personal improvement in mind, we hope you’ll find this book helpful.
Roadmap
You’re reading a book that covers a lot of ground. In order to structure the material, we’ve divided the book into three parts with distinct focuses. Part 1 introduces the book’s main topics—test-driven development and acceptance test-driven development—starting with the very basics.
Chapter 1 begins with a problem statement—the challenges we need to overcome—and explains how TDD and acceptance TDD provide an effective solution
in the form of test-first programming, evolutionary design, test automation, and
merciless refactoring.
Chapter 2 gets our hands dirty, extending our understanding of TDD through
an in-depth example: a homegrown template engine we test-drive from scratch.
Along the way, we discuss how to manage the tests we want to write in a test list
and how to select the next test from that list.
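The test-code-refactor rhythm that drives the template-engine example can be sketched in miniature. The fragment below is our own invention, not the book's code: it uses plain assertions instead of JUnit, and the class and method names are made up for illustration.

```java
// A miniature of the TDD heartbeat: the assertion below was (conceptually)
// written first and failed, then the simplest implementation made it pass.
class TemplateEngine {
    // Simplest thing that could possibly work; refactor later while green.
    String evaluate(String template, String name) {
        return template.replace("${name}", name);
    }
}

public class TddHeartbeat {
    public static void main(String[] args) {
        TemplateEngine engine = new TemplateEngine();
        // 1. Red: this check existed before evaluate() did.
        String result = engine.evaluate("Hello, ${name}", "Reader");
        // 2. Green: the minimal implementation above satisfies it.
        if (!result.equals("Hello, Reader")) throw new AssertionError(result);
        // 3. Refactor: clean up duplication while the test stays green.
        System.out.println("ok");
    }
}
```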
Chapter 3 finishes what chapter 2 started, continuing the development of the
template engine through an extensive design change, starting with a spike—a
learning experiment—and then proceeding to make the change to the template
engine in a controlled, disciplined manner.
Chapter 4 brings our perspective back to a higher level to explain the strategies in our toolkit, from selecting tests to making them pass. We also talk about
essential testing concepts such as fixtures, test doubles, and the differences
between state- and interaction-based testing. After giving some guidelines for creating testable designs, chapter 4 ends with an overview of a number of key test patterns and a section on working in a test-first manner with legacy code.
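As a taste of the test-double idea chapter 4 develops, here is a minimal hand-rolled spy standing in for a dependency, with the dependency injected through the constructor. All names in this sketch are invented for illustration; they are not the book's examples.

```java
// The dependency's interface: in production this would talk to a mail server.
interface MailServer {
    void deliver(String recipient, String message);
}

// A hand-rolled test double (a "spy"): it records the interaction
// instead of touching the network, keeping the code under test isolated.
class SpyMailServer implements MailServer {
    String lastRecipient;
    String lastMessage;
    public void deliver(String recipient, String message) {
        lastRecipient = recipient;
        lastMessage = message;
    }
}

// The code under test receives its collaborator via constructor injection.
class AlarmService {
    private final MailServer mailServer;
    AlarmService(MailServer mailServer) { this.mailServer = mailServer; }
    void raiseAlarm(String operator) {
        mailServer.deliver(operator, "Disk full");
    }
}

public class TestDoubleExample {
    public static void main(String[] args) {
        SpyMailServer spy = new SpyMailServer();
        new AlarmService(spy).raiseAlarm("ops@example.com");
        // Interaction-based verification: did we talk to the dependency correctly?
        if (!"ops@example.com".equals(spy.lastRecipient)) throw new AssertionError();
        System.out.println("alarm delivered to " + spy.lastRecipient);
    }
}
```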
Part 2 is about getting dirty again, demonstrating through working examples
how we can apply TDD when working with a variety of technologies that are sometimes referred to as being “difficult to test-drive.” After part 2, you’ll know that
folks who say that don’t know what they’re talking about!
Chapter 5 starts our journey through the trenches of web development. We
learn to test-drive request/response-style web layers using plain old Java Servlets
and Spring Controllers, and we learn to test-drive the presentation layer built with
JavaServer Pages and Apache Velocity templates. The chapter also contrasts these
request/response examples with test-driving web applications using a component-based framework, Apache Wicket.
Chapter 6 explains how to test-drive the data-access layer behind our web components. We’ll see examples of test-driving data-access objects based on raw JDBC
code, the Spring Framework’s JdbcTemplate API, and the de facto object-relational
mapping (ORM) tool, Hibernate. We’ll also discuss how to deal with the database
in our unit tests and how to fill in the gaps with integration tests. Finally, we share
a few tricks for dealing with the file system.
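The core of the DAO pattern that chapter 6 builds on can be sketched in a few lines: hide data access behind an interface so that unit tests can substitute an in-memory fake for a real database. The names below are ours, not the book's, and the real chapter works with JDBC, JdbcTemplate, and Hibernate rather than this toy.

```java
import java.util.HashMap;
import java.util.Map;

// The data-access interface the rest of the application depends on.
interface PersonDao {
    void save(int id, String name);
    String findName(int id);  // returns null when no row exists
}

// An in-memory stand-in for the database, good enough for fast unit tests.
class InMemoryPersonDao implements PersonDao {
    private final Map<Integer, String> rows = new HashMap<>();
    public void save(int id, String name) { rows.put(id, name); }
    public String findName(int id) { return rows.get(id); }
}

public class DaoExample {
    public static void main(String[] args) {
        PersonDao dao = new InMemoryPersonDao();
        dao.save(1, "Lasse");
        if (!"Lasse".equals(dao.findName(1))) throw new AssertionError();
        System.out.println(dao.findName(1));
    }
}
```

A production implementation of the same interface would hold the JDBC or Hibernate code, and integration tests (not unit tests) would exercise it against a real schema.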
Chapter 7 takes us to the land of the unknown: nondeterministic behavior.
After first examining our options for faking time, we turn our attention to multithreading. We begin with a discussion of what we can and should test for, exploring topics such as thread safety, blocking operations, starting and stopping
threads, and asynchronous execution. Our trip to the world of the unpredictable
ends with a tour of the new synchronization objects from java.util.concurrent
that were introduced in Java 5.
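The "abstracting system time" move is simple enough to preview here: instead of calling System.currentTimeMillis() directly, the code under test asks a time source it was given, so a test can substitute a fixed, deterministic instant. This is our own minimal sketch with invented names, not the chapter's listing.

```java
// The seam: production code wires in the real clock, tests wire in a fake.
interface TimeSource {
    long millis();
}

// Code under test: timestamps come from the injected TimeSource,
// never from a direct System.currentTimeMillis() call.
class Logger {
    private final TimeSource time;
    Logger(TimeSource time) { this.time = time; }
    String format(String message) {
        return "[" + time.millis() + "] " + message;
    }
}

public class FakedTimeExample {
    public static void main(String[] args) {
        TimeSource fixed = () -> 1000000L;  // faked clock: always the same instant
        Logger logger = new Logger(fixed);
        String line = logger.format("disk full");
        // Deterministic time makes the log output exactly assertable.
        if (!line.equals("[1000000] disk full")) throw new AssertionError(line);
        System.out.println(line);
    }
}
```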
Chapter 8 is about face—the face of Java Swing applications, that is. Again, we
begin by figuring out what we should test for when test-driving UI code. Then, we
look at three design patterns that make our test-driven lives easier, and we briefly
introduce two open source tools—Jemmy and Abbot—for unit-testing Swing components. We finish chapter 8 (and part 2) with an extended example, test-driving
the face and behavior for a custom Swing component.
Part 3 is a change of tempo. We move from the concrete world of test-driving
objects and classes into the fuzzier world of building whole systems in a test-first
manner with acceptance TDD.
Chapter 9 gets us going with an introduction to user stories for managing
requirements, and to the essence of acceptance tests. Once we’re up to speed with
the what, we focus on the how—the process of acceptance TDD and what it
requires from the team. We also crystallize the benefits of and the reasons for
developing software with acceptance TDD. The chapter ends with a discussion of
what kinds of aspects our acceptance tests should specify about the system we’re
building and an overview of some of the tools at our disposal.
Chapter 10 makes acceptance TDD more concrete by taking a closer look at Fit,
a popular acceptance-testing tool. Our Fit tutorial begins with a description of how
the developer can use Fit to collaborate with the customer, first sketching acceptance tests in a tabular format and then touching them up into syntax recognized
by Fit. We then see how to implement the backing code that glues our tabular tests
into interaction with the system, first going through the three standard fixtures
built into Fit and then looking at additional utilities provided by the FitLibrary, an
extension to Fit. Finally, we learn to run our precious Fit tests from the command
line and as part of an Apache Ant build.
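For a flavor of what those tabular tests look like, here is the classic division example from the Fit documentation (not one of the book's own tests): an HTML table whose first row names a fixture class, whose second row names the inputs and the expected output, and whose remaining rows hold data for Fit to check cell by cell.

```html
<table>
  <tr><td colspan="3">eg.Division</td></tr>
  <tr><td>numerator</td><td>denominator</td><td>quotient()</td></tr>
  <tr><td>10</td><td>2</td><td>5</td></tr>
  <tr><td>12.6</td><td>3</td><td>4.2</td></tr>
</table>
```

When run, Fit sets the input columns as fields on the fixture class, invokes quotient(), and colors each expected-value cell green or red.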
Chapter 11 expands our perspective by looking at a number of strategies for
implementing our acceptance tests independent of the tools in use. After going
through our options for connecting tests to the system we’re developing, we discuss the kinds of limitations and opportunities that technology puts in our way.
We also share some tips for speeding up acceptance tests and keeping complexity
in check.
Chapter 12 ends part 3 as a black sheep of sorts—a chapter on ensuring the
success of TDD adoption. We begin by exploring what ingredients should be in
place for us to achieve lasting change, both for ourselves and for our peers. We
then focus on resistance: how to recognize it and how to deal with it. Finally, we go
through a long list of things in our toolbox that can facilitate the successful adoption we’re seeking.
Because writing unit tests is so central to test-driven development, we’ve also
provided three brief tutorials on some of the essential tools; you can use them as
cheat sheets. Appendices A and B are for the JUnit unit-testing framework, illustrating the syntax for versions 4.3 and 3.8, respectively. Appendix C does the same
for EasyMock, a dynamic mock-object framework we can use to generate smart
test doubles.
Test-driving code in the comfort of our favorite IDE is cool, but we need to
make those tests part of our automated build. That’s why we’ve included appendix D: a brief tutorial for running JUnit tests with Apache Ant, the standard build
tool for Java developers.
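As a hint of what appendix D covers, a JUnit target in an Ant build file looks roughly like the sketch below; the target, directory, and classpath reference names are invented for illustration, so expect the appendix (and your own project layout) to differ in the details.

```xml
<target name="test" depends="compile-tests">
  <junit haltonfailure="true">
    <classpath refid="test.classpath"/>
    <formatter type="plain" usefile="false"/>
    <batchtest>
      <fileset dir="build/test-classes" includes="**/*Test.class"/>
    </batchtest>
  </junit>
</target>
```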
Code conventions
The code examples presented in this book consist of Java source code as well as a
host of markup languages and output listings. We present the longer pieces of
code as listings with their own headers. Smaller bits of code are run inline with
the text. In all cases, we present the code using a monospaced font, to differentiate
it from the rest of the text. In part 2, we frequently refer from the text to elements
in code listings. Such references are also presented using a monospaced font, to
make them stand out from plain English. Many longer listings also have numbered annotations that we refer to in the text.
Code downloads
The complete example code for the book can be downloaded from the Manning
website page for this book. This includes