Testing And Me: A Love/Hate Relationship
A talk primarily aimed at those who haven't had the motivation to/been given the opportunity to/don't think it's worth their time to test their applications
As a developer, I have always found it hard to be passionate about writing tests. Yes, yes, I know, how can a project survive without a good test suite?
The short answer is: it can't.
So, let's get passionate about testing. Not testing everything all the time, but identifying opportunities for implementing good tests, selecting when they are required for ensuring code quality, and balancing time spent writing tests with time spent actually making cool things. How to test pragmatically.
This talk will provide some opinionated answers to:
- Why should I care about testing?
- When in the development cycle should I write tests?
- How much should I be testing?
- What kinds of tests are really important?
- While TDD/BDD is a great ideal to shoot for, what alternatives are there?
- Can we implement enough coverage without having to go all the way?
- Argh! You've convinced me! How do I get started with adding tests to a legacy project?
We will explore some of the available testing methodologies and introduce a great way of keeping acceptance/integration tests maintainable by using the Window Driver pattern, along with my personal favourite test stack.
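To give a flavour of the Window Driver pattern: the test talks to a small driver class that is the only thing that knows about the page's markup, so when the UI changes, only the driver changes. A minimal Ruby sketch of the idea (the class and method names are my own; a real suite would wrap something like a Capybara session rather than the illustrative `FakeBrowser` used here):

```ruby
# Window driver: the only place that knows the page's field names.
# Tests read at the level of user intent, not markup.
class LoginPage
  def initialize(browser)
    @browser = browser
  end

  def log_in(user, password)
    @browser.fill_in("username", user)
    @browser.fill_in("password", password)
    @browser.click("Log in")
  end

  def greeting
    @browser.text_of("greeting")
  end
end

# Purely illustrative stand-in for a real browser session,
# so this sketch runs on its own.
class FakeBrowser
  def initialize
    @fields = {}
    @texts  = {}
  end

  def fill_in(name, value)
    @fields[name] = value
  end

  def click(_label)
    @texts["greeting"] = "Hello, #{@fields['username']}!"
  end

  def text_of(id)
    @texts[id]
  end
end

page = LoginPage.new(FakeBrowser.new)
page.log_in("alice", "secret")
puts page.greeting  # => Hello, alice!
```

If the login form's field names change, `LoginPage` is updated once and every test that logs in keeps working untouched; that is the maintainability win the talk will demonstrate.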
The less test-happy amongst us will walk away with new ideas about where tests can be particularly effective (i.e. how best to target your effort), ways of achieving maintainable full-stack integration tests, and ways that testing can be applied -- even after the horse has already bolted.
I'd definitely like to hear about what to test (and what not to test).
Coming from a background where the focus was never really on testing until recently, I found the way the Ruby (and Rails) community focuses on it eye-opening. As a newcomer to this way of working, it would be good to have some idea of which parts of my code should be tested, and how much, without falling into the trap of writing too many tests.
Some tips on test optimization would be good too. I've come across many blog posts and comments about slow test suites on big apps; knowing how to avoid, or at least mitigate, that would be great.
Most of us have already heard hundreds of talks about why testing is important. I think it would be much more interesting to hear how to recognise that you have too many tests, and what not to test.
Ok, I've folded some of that in.
I'd be interested to hear suggestions from attendees about anything they want to hear about in the realm of testing.
Is there anything in particular that people would like to hear about test workflows or tools that are available?
Thanks for the clarification - that's really useful (you may also wish to consider folding it in to the main body of the proposal if possible).
@lazyatom: Yes, this is primarily aimed at those who haven't had the motivation to/been given the opportunity to/don't think it's worth their time to test their applications.
While I think that it is generally accepted that TDD/BDD is a great ideal to shoot for, I'm not convinced that the Ruby community should consider it the only way of testing or that it's always the right way of testing.
I'd be encouraging people to take away some new ideas about where tests can be particularly effective (i.e. target your effort), ways of achieving maintainable full-stack integration tests, and ways that testing can be applied -- even after the horse has already bolted.
It sounds like this proposal is aimed mostly at those who haven't drunk the TDD/BDD Kool-Aid. Is that accurate?
My experience of testing has been one of evolutionary thinking. From Test First to Test-Driven Development to ultimately Test-Driven Design. I'm interested to know what insights someone who has taken a different approach to their test education can offer.