Going Beyond Regression - What Other Benefits Could End-to-End Testing Provide?

In last week's post, I gave a few examples of when Polymorphic End-to-End Testing makes sense. In this post, I would like to take a step back and list a few additional benefits that can be derived from end-to-end testing in general, regardless of the style in which the tests were written.

Note: some of these points also apply to non-end-to-end tests, provided they were written in a black-box style with proper test abstractions. While it could be argued that in that case the system under test is simply that one component, I would rather just focus on the benefits we can derive on top of the regression protection / specification of the system.

Disclaimer: this post is filed under untested ideas - they sound good, but I haven't gotten around to implementing all of them.

Big Refactorings and Rewrites

Change is inevitable, and changes often violate previous assumptions. Sometimes whole components (or systems) have to be rewritten. Having a set of tests that operate at a much higher abstraction level (e.g.: HTTP GET/POST requests) can provide the safety net required to avoid regressions and to make sure all relevant scenarios are addressed.

Some changes where this Page (Application) Object abstraction has helped us:

  • when converting a single-page checkout process to a multi-step, wizard-style checkout
  • when the article numbers used in our system changed
  • when we had to synchronize data into a new system - we could just expand our assertions in the end-to-end tests to make sure that every known scenario is written correctly into the new system

It's easy to reconstruct a system from its tests, but much harder to do it the other way around. It has been great to adjust only a few driver API methods and get the same functional coverage as before, without having to rewrite the test suite.
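To make this concrete, here is a minimal sketch of such a driver API in Python, using the requests library. All URLs, field names, and method names are made up for illustration - this is not the actual code from our system:

```python
import requests

BASE_URL = "http://localhost:8000"  # hypothetical application under test


class ShopDriver:
    """Application (Page) Object: tests call this API, never raw URLs."""

    def __init__(self):
        self.session = requests.Session()

    def add_to_cart(self, article_number):
        # article number handling lives in one place, so when the numbering
        # scheme changes, only this method has to be touched
        self.session.post(f"{BASE_URL}/cart/", data={"article": article_number})

    def checkout(self, address, payment):
        # originally a single POST to /checkout/; after a move to a
        # multi-step wizard, only this method body changes - the tests don't
        self.session.post(f"{BASE_URL}/checkout/address/", data=address)
        self.session.post(f"{BASE_URL}/checkout/payment/", data=payment)
        return self.session.post(f"{BASE_URL}/checkout/confirm/")
```

A test that only ever calls add_to_cart() and checkout() survives both the article number change and the wizard conversion untouched.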

Forces the team to think about the user interface (and experience)

While it is not required, end-to-end testing has often made us reduce the complexity of the UI. When we find that a certain step is exercised from the tests by a method with a single parameter, yet that method then derives a bunch of additional parameters to POST to the page, it suggests one of two things (see the sketch after this list):

  1. we are missing some test cases for these extra parameters
  2. maybe these parameters don't need to be provided by the end user at all - we could derive them in the application too.
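As an illustration, extending the hypothetical ShopDriver sketch above, a driver method exhibiting this smell might look like this (the lookup tables and field names are invented):

```python
# invented lookup tables - in a real suite these would mirror business rules
CURRENCY_BY_COUNTRY = {"DE": "EUR", "UK": "GBP"}
VAT_MODE_BY_COUNTRY = {"DE": "standard", "UK": "standard"}


class ShippingDriver(ShopDriver):
    def submit_shipping(self, country):
        # the test supplies a single parameter, yet the POST needs more -
        # either we are missing test cases for the extra fields, or the
        # application could derive them server-side, just like we do here
        data = {
            "country": country,
            "currency": CURRENCY_BY_COUNTRY[country],  # derived
            "vat_mode": VAT_MODE_BY_COUNTRY[country],  # derived
        }
        return self.session.post(f"{BASE_URL}/shipping/", data=data)
```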

Often the helper methods we create expose the need for additional support interfaces that probably won't come up during the specification phase, only after go-live.

Last, but not least, end-to-end testing gives us tests for the UI, yet the tests remain maintainable - usually a single test API method is all that needs to be fixed after template changes (and designers are even harder to persuade to write tests than developers :)). So don't be afraid when many tests fail at once - the fix is usually in one place!

Correlate tests with other business metrics

While I recall people suggesting we run applications with a coverage profiler attached, the performance penalty is usually prohibitive.

However, I haven't yet seen a web application without a ton of external metrics related to the URLs in the app. If our tests are written against URLs too, then after some data munging (primary keys and actual form values surely won't match test values exactly, but both can be normalized to the form "GET to view 1, POST to view 2") we can correlate our tests with these metrics (see the sketch after the lists below).

Some such metrics:

  • application (webserver) access logs
  • Google Analytics or equivalent
  • ...

What can we learn from these correlations/comparisons?

  • are we concentrating our tests on the least visited areas?
  • are we testing what our users are doing? Sure, it's nice that in our tests people finish their checkout sequentially, without wandering off the known path, but is this how they behave in production?
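Here is a minimal sketch of such normalization and correlation in Python - the log-parsing part is omitted, and all names are made up:

```python
import re
from collections import Counter


def normalize(method, path):
    """Map a concrete request to an abstract view, so that test traffic and
    production traffic become comparable despite differing primary keys."""
    path = re.sub(r"\?.*$", "", path)      # drop query strings / form values
    path = re.sub(r"/\d+", "/<id>", path)  # replace primary keys
    return f"{method} {path}"


# both a test request and a production hit map to "GET /orders/<id>"
assert normalize("GET", "/orders/42") == normalize("GET", "/orders/987654?utm=x")

production_hits = Counter()  # filled by parsing access logs / analytics exports
tested_views = set()         # filled by recording requests during a test run

# heavily used views that no test exercises - candidates for new tests
untested_hot_spots = [view for view, _ in production_hits.most_common(20)
                      if view not in tested_views]
```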

TestCase similarity analysis

Some test scenarios touch multiple aspects of the system. Placing an order will trigger a bunch of actions in other modules - fulfillment, customer profile updates, marketing classification, invoice rendering, notification emails, etc.

Sometimes these features are added with big time gaps in between; maybe even the team members have changed in the meantime. Imagine being able to compare the requests different TestCases make, and to report that these two (three, four, etc.) TestCases execute the same kind of requests as the TestCase being added up to a point, but that they also all execute some extra paths the new TestCase doesn't... causing the developer to realize - of course, there are special rules for orders from educational institutions! A sketch of such a comparison follows.
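As a rough sketch, assuming each TestCase's requests have been recorded and normalized as in the previous section (all names are invented):

```python
def common_prefix_len(trace_a, trace_b):
    """Length of the shared leading request sequence of two recorded traces."""
    n = 0
    for a, b in zip(trace_a, trace_b):
        if a != b:
            break
        n += 1
    return n


def similar_tests(new_trace, recorded_traces, min_shared=3):
    """Yield existing TestCases whose traces start like the new one,
    together with the requests they make that the new test doesn't."""
    for name, trace in recorded_traces.items():
        shared = common_prefix_len(new_trace, trace)
        if shared >= min_shared:
            extra = [r for r in trace[shared:] if r not in new_trace]
            yield name, shared, extra
```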

Reducing the gap between end user error reports and tests

Probably this is the least unexpected idea on the list, but it is worth stating nonetheless.

In-place help (for trusted users)

Sure, this might require careful consideration, but giving users the ability to browse, in a searchable fashion, the test cases/methods that match their workflow up to the current page (e.g.: if I place an order like this now, when will the X email be sent?) could greatly reduce the support workload of the development team (a sketch follows below). Note: the purpose of this is not to isolate the developers from the users!
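Reusing the recorded, normalized request traces from the previous sections, matching a trusted user's current session against the test suite could be as simple as this sketch (names invented):

```python
def help_candidates(session_trail, recorded_traces):
    """Test methods whose recorded trace begins with the user's trail so far.

    Both the trail and the traces are lists of normalized requests,
    e.g. ["GET /cart", "POST /checkout/address"].
    """
    return [name for name, trace in recorded_traces.items()
            if trace[:len(session_trail)] == session_trail]
```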

Always up-to-date screenshots and videos

Those manuals that have screenshots from many releases ago... Marking some test cases as linked to documentation sections, and having the tests actually take the screenshots or record them as video, sounds like a pretty useful idea to me (a sketch follows below). One could go further, for example by highlighting the values/input fields where the test sends input, and the parts of the page that are asserted against...
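A minimal sketch of the marking part, assuming a browser-driven suite where each test has a self.browser exposing save_screenshot() (as Selenium WebDriver does); the decorator name and paths are made up:

```python
import functools
import unittest


def documents(section, name):
    """Mark a test as the source of an always-up-to-date screenshot
    for the given documentation section."""
    def decorator(test_method):
        @functools.wraps(test_method)
        def wrapper(self, *args, **kwargs):
            result = test_method(self, *args, **kwargs)
            # regenerate the documentation screenshot on every test run
            self.browser.save_screenshot(f"docs/{section}/{name}.png")
            return result
        return wrapper
    return decorator


class CheckoutTest(unittest.TestCase):
    def setUp(self):
        from selenium import webdriver  # assumes Selenium is available
        self.browser = webdriver.Firefox()

    def tearDown(self):
        self.browser.quit()

    @documents("ordering", "confirmation-page")
    def test_successful_checkout(self):
        ...  # drive the checkout via the application object, then assert
```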

And if the support team has access to the same API, they could even create these screenshots/videos for the customer as they are answering their question (Here, take a look at this video, this is how the thing you asked for is done).


There must be many more ideas out there - let me know about them, and I'm happy to add them to this list (or link to wherever you published them)!

What do you think? I would love it if you left a comment - drop me an email at hello@zsoldosp.eu, or tell me on Twitter!
