I was happy to receive a very well-thought-out response (thank you, Martin) to my recent post on the lack of ROI for developing GUI automation for web applications. Instead of burying the response in the comment section, I have posted his comments here and interspersed my thoughts.
Here are Martin’s comments with my responses in quotes throughout.
Unfortunately, your blog entry does not mention how automation was done in either of the mentioned cases. It also does not seem to consider that test automation often is or should be aimed at regression testing (finding new issues in existing functionality that was changed). But I would like to take a step back for a moment.
John’s first thoughts:
Automation always helps to reduce the time to complete regression testing, and portions of your automation should also run as part of your regular build process. The important point, though, is that you can build solid automation focused at other levels (web services, unit tests, etc.) that will be far more reliable to run and maintain, providing huge value without the maintenance requirements you will find with GUI-based automation.
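To make "automation at other levels" concrete, here is a minimal sketch of a unit-level test. The `price_with_discount` function and its discount rule are invented for illustration; the point is that tests like these run headless in milliseconds and never break when a button moves:

```python
def price_with_discount(price: float, customer_years: int) -> float:
    """Hypothetical business rule: 5% off per loyalty year, capped at 25%."""
    discount = min(0.05 * customer_years, 0.25)
    return round(price * (1 - discount), 2)


# pytest-style tests: fast, headless, and immune to UI changes.
def test_no_loyalty_no_discount():
    assert price_with_discount(100.0, 0) == 100.0

def test_discount_grows_per_year():
    assert price_with_discount(100.0, 2) == 90.0

def test_discount_is_capped():
    assert price_with_discount(100.0, 10) == 75.0
```

The same rule exercised through the GUI would need a browser, a selector for every field, and maintenance every time the page changes.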
I have been building web applications for years and know that the above types of automation find the majority of the bugs. You will, of course, find a large number of bugs in cross-platform, cross-browser test cases. For example, you will often find bugs on the Mac with Safari that you do not see on Windows XP with IE 7. The problem, however, is finding GUI automation tools that are capable of running across the platforms you must support. If you can find or build one, you must often build additional complexity into your test automation framework, leading to larger maintenance headaches. I still contend that these types of issues can be found more cheaply by manual resources.
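That "additional complexity" is easy to underestimate. A minimal sketch of just the support-matrix bookkeeping such a framework carries (the platform and browser lists, and the exclusion rules, are invented for illustration):

```python
from itertools import product

# Hypothetical support matrix; a real framework would load this from config.
PLATFORMS = ["Windows XP", "Windows Vista", "Mac OS X"]
BROWSERS = ["IE 7", "Firefox 2", "Safari 3"]

# Not every pairing exists (e.g. no IE on the Mac), so the framework
# also needs exclusion rules -- one more thing to maintain.
EXCLUDED = {
    ("Mac OS X", "IE 7"),
    ("Windows XP", "Safari 3"),
    ("Windows Vista", "Safari 3"),
}

def test_matrix():
    """Return the (platform, browser) pairs every GUI test must run on."""
    return [(p, b) for p, b in product(PLATFORMS, BROWSERS)
            if (p, b) not in EXCLUDED]

# 3 x 3 pairings minus 3 exclusions: 6 configurations per test case.
print(f"{len(test_matrix())} configurations per test case")
```

Every GUI test case multiplies by that number, and every matrix change ripples through the suite; a manual tester just opens the browser in question.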
Let us consider what factors affect the ROI of a test automation project, for functional testing. Actually, the kind of interface (web, GUI, messaging, whatever) is not the essence, so the below would seem to apply to all test automation.
First of all, how do you define ROI? In terms of money only? That would depend on why you are automating in the first place. What business objectives is it meant to support? Lower cost? A common objective, certainly. But shortening time to market, higher quality, a better grip on the test process, and more effective use of scarce resources can also play their roles. Often a combination of these is desired, and sometimes cost is considered rather less important than others. But to keep it simple, I will focus on cost here.
“Businesses must always focus on ROI in terms of money, both short-term expenses and long-term investments, which include everything from maintenance costs to turnover of employees who are not happy with their career growth. While I will not go into a thorough multi-page analysis of the ROI, consider the following:
- Ignore the cost of the tools. Ten years ago these tools were expensive; they no longer are.
- The yearly salary of automation engineers is significantly higher than that of manual testers. My preference would be a smaller number of automation engineers focused on unit testing and web services testing (more expensive, I agree, than GUI automation engineers) alongside a few more manual testers. In simple terms, I can replace every GUI automation engineer with two manual testers, a good trade-off for test coverage and for our economy, with more people working.
- From a career growth perspective, QA engineers have a better career path because those doing automation are working with languages like Java and C# as opposed to tool-specific languages (such as SilkTest’s 4Test).
- The cost for each business will vary from company to company. However, analyze the bugs in your product for the last year and ask yourself:
- What % of bugs would have been found with unit or web services testing?
- What % of your application bugs were platform specific? In finding those, would a manual tester have found the other UI bugs as well? You need to invest in testing the other platforms anyway, and you cannot do it with most functional automation tools.
- How many hours per week do you spend in maintaining your automation?
- When you run your entire automation suite, how many hours of manual testing does it replace?
- For the location you are hiring in, do the skills exist to hire or replace people when needed?
There are clearly more questions but those listed above are the most critical.”
Also, it is not so much the extent to which the testing is automated that matters. Dorothy Graham says it could make sense even to automate just 2% of your testing, and I would agree. The question is whether the investment pays off, not how much you invest or how much you automate. And for a pilot project, trying out automation on 2% could also be a good idea.
I think what determines ROI is (mainly) the following:
- The setup cost (licenses, perhaps servers, training),
- the cost of creating the testware (automated test + supporting framework / software),
- the cost of maintaining the testware, and
- the cost reduction in manual testing.
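Those four factors fold into a simple break-even calculation. A sketch of the shape of that math (every number in the example is hypothetical, purely illustrative):

```python
def breakeven_releases(setup_cost, create_cost,
                       maintain_per_release, manual_saved_per_release):
    """Releases needed before automation pays for itself; None if it never does."""
    net_per_release = manual_saved_per_release - maintain_per_release
    if net_per_release <= 0:
        return None  # maintenance eats the savings: no payback, ever
    upfront = setup_cost + create_cost
    # ceiling division without floats
    return -(-upfront // net_per_release)

# Hypothetical figures, in hours of effort:
# 480 hours up front, each release saves 120 but costs 60 to maintain.
print(breakeven_releases(80, 400, 60, 120))    # pays back after 8 releases
print(breakeven_releases(80, 400, 130, 120))   # maintenance > savings: None
```

The second call is the failure mode both of us keep circling: when per-release maintenance exceeds the manual hours saved, no amount of up-front investment ever pays back.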
“You are right, of course, regarding the fact that you do not need to automate 100% of the test cases. In fact, you would fail if you tried to automate all test cases, but that’s another discussion entirely.”
The setup cost depends on whether you buy or build, whether you use open source or commercial tools, etc. More and more people are finding out that free software can offer a lot of bang for no bucks (with no vendor lock-in …).
The time and effort spent on creating the testware is interesting, but usually much less than the time and effort spent on maintaining it, as with all software. So let us consider the total effort. It depends to a large extent on:
- The development process (e.g.: how much time between releases),
- The product planning (e.g.: how much change per release),
- The test automation approach (an advanced approach using a DSTL, a Domain-Specific Test Language, requires much less maintenance than record and playback; also: is automation complete when the product is ready for testing, or is that only when automation can start?), and
- The skill of the automator (the quality of the DSTL, the maintenance sensitivity and maintainability of the testware).
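The DSTL point is easiest to see in code. A recorded script encodes execution details (selectors, clicks) that break on every UI change; a DSTL test states intent and pushes all details into one shared layer. A minimal keyword-runner sketch, with an invented keyword set and a fake application standing in for the system under test:

```python
# A DSTL test: pure intent, no selectors, no tool details.
TEST = [
    ("open account", "alice"),
    ("deposit", 100),
    ("withdraw", 30),
    ("check balance", 70),
]

class FakeBank:
    """Stands in for the system under test."""
    def __init__(self):
        self.accounts = {}

def run(test):
    """One implementation per keyword: this is the ONLY layer that must
    change when the interface changes, instead of every recorded script."""
    bank = FakeBank()
    current = None
    for keyword, arg in test:
        if keyword == "open account":
            bank.accounts[arg] = 0
            current = arg
        elif keyword == "deposit":
            bank.accounts[current] += arg
        elif keyword == "withdraw":
            bank.accounts[current] -= arg
        elif keyword == "check balance":
            assert bank.accounts[current] == arg, bank.accounts[current]
    return bank

run(TEST)  # raises AssertionError if a check fails
```

Tests written at this level survive interface churn; only the keyword layer needs rework, which is exactly where the maintenance-cost difference comes from.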
The cost reduction I will leave for you to determine.
The amount of time between releases does not have to be an issue if the amount of manual maintenance is low enough. A good DSTL (high level, with no irrelevant execution, tooling, or interfacing details in the test) helps a lot by reducing the maintenance of the tests. A good automator helps a lot by reducing the maintenance of the rest of the testware. The page object concept, for example, can be a great help here, both for web and GUI apps.
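For readers who have not met it, a sketch of the page object idea, with a stubbed driver standing in for Selenium or any GUI tool (the class and method names are invented, not from any specific library): the test reads at the level of intent, and selector changes are absorbed in one place.

```python
class StubDriver:
    """Minimal stand-in for a real browser driver."""
    def __init__(self):
        self.fields = {}
        self.submitted = False

    def type_into(self, selector, text):
        self.fields[selector] = text

    def click(self, selector):
        self.submitted = True


class LoginPage:
    """Page object: the only place that knows this page's selectors."""
    USER_FIELD = "#username"     # if the UI changes, update these here...
    PASS_FIELD = "#password"     # ...and every test stays untouched
    SUBMIT_BTN = "#login-button"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type_into(self.USER_FIELD, user)
        self.driver.type_into(self.PASS_FIELD, password)
        self.driver.click(self.SUBMIT_BTN)


# The test itself never mentions a selector:
driver = StubDriver()
LoginPage(driver).login("alice", "s3cret")
assert driver.submitted
```

A renamed field then costs one line in `LoginPage` rather than an edit in every test that logs in.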
“Again, good points. However, the ratio of great automators to average ones ensures you will rarely achieve the ideal situation above. Even if you do, investing in other forms of automation combined with manual testing against the web front end is almost always cheaper.”
One guy’s research indicated that about two out of every three test automation projects fails sooner or later. A major factor is that many are not aware that test automation is software engineering, and people with the wrong skill set do it. As a result, they may work hard, but not very smart. It is a bit like the guy who is asked what he would do to empty a bathtub when given a teaspoon, a regular spoon, and a really big spoon. When he chooses the big spoon so he can work faster, he is asked why he does not simply pull the plug …
Of course, there are limits even to what a skilled automator can do. But in my experience of 10+ years of creating solutions for automated testing of all sorts of systems, maintenance was never a problem.
“While I don’t know who this ‘one guy’ is, the numbers make sense. Software projects fail at a high rate, a rate that is sometimes overblown due to the lack of a solid definition of project success versus project failure. Automation development, if you ignore simple record/playback, is software development.
I am fine with agreeing to disagree, but wanted to thank you for the detailed and passionate response. Passion is what makes life great; don’t lose it.”