Testing teams often raise this question: what should we automate, and should we automate everything?

I recommend that teams find the answer themselves by asking what value the automated tests will provide versus the cost and time that must be invested to build the automation suite. We automate to get faster feedback and shorten the testing time, which yields a huge saving in time and money for the team.

The automation requirements define what needs to be automated, looking at various aspects. The specific requirements vary by product, timeline and situation, but here are a few generic tips.

Test cases to be automated
  • Tests that need to be run with every build of the application (sanity check, regression)
  • Tests that use multiple data values for the same actions (data driven tests)
  • Complex and time-consuming tests
  • Tests requiring a great deal of precision
  • Tests involving many simple, repetitive steps
  • Tests that must run on multiple combinations of OSes, DBMSs and browsers
  • Creation of test data and test beds
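Data-driven tests (the second bullet above) are among the best automation candidates because one routine covers many data values. Here is a minimal sketch in Python; the `validate_username` function and the rules it enforces are illustrative assumptions, not taken from any particular product:

```python
def validate_username(name):
    """Hypothetical rule for illustration: 3-12 alphanumeric characters."""
    return name.isalnum() and 3 <= len(name) <= 12

# Each tuple is (input, expected result); adding a new case costs one line,
# which is exactly why data-driven tests pay off under automation.
CASES = [
    ("alice", True),
    ("ab", False),          # too short
    ("a" * 13, False),      # too long
    ("bob_99", False),      # underscore is not alphanumeric
    ("carol99", True),
]

def run_cases():
    """Return the list of failing (input, expected) pairs."""
    return [(data, expected)
            for data, expected in CASES
            if validate_username(data) is not expected]

if __name__ == "__main__":
    print(run_cases())  # an empty list means every case passed
```

In a real suite the same idea is usually expressed with a framework feature such as pytest's parametrized tests, with the data table kept in a file so testers can extend it without touching code.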
Test cases not to be automated
  • Usability testing – “How easy is the application to use?”
  • One-time testing
  • “ASAP” testing – “We need to test NOW!”
  • Ad hoc/random testing – based on intuition and knowledge of application
  • Device interface testing
  • Back-end testing
Prioritize the tests that are to be automated: give each a weighting for risk and for ease of automation. A simple matrix can then be drawn up showing which tests will take the most time to automate and their relative risk and importance. You can then place them in priority order and plan accordingly.
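The matrix above can be sketched in a few lines of Python. The candidate tests, the 1-5 scoring scale, and the choice to rank by risk multiplied by ease are all illustrative assumptions; substitute your team's own scale and weighting:

```python
# Each entry: (test name, risk if it breaks, ease of automation), both 1-5.
# The names and scores below are made up for illustration.
CANDIDATES = [
    ("login regression", 5, 4),
    ("checkout flow", 5, 2),
    ("report export", 2, 5),
    ("profile page layout", 1, 3),
]

def prioritize(candidates):
    """Rank tests by risk * ease: high-risk, easy-to-automate tests first."""
    return sorted(candidates, key=lambda t: t[1] * t[2], reverse=True)

if __name__ == "__main__":
    for name, risk, ease in prioritize(CANDIDATES):
        print(f"{name}: score {risk * ease}")
```

A spreadsheet works just as well; the point is that the ordering comes from an explicit, agreed scoring rather than gut feel.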