Unit Tests vs. UI Tests

I recently submitted an audition video to Pluralsight that focuses on UI testing and some of the differences between UI tests and unit tests.  In the video I made the statement that “UI testing is not as important as unit testing.”  To play devil’s advocate, I’m trying to think of an example where this would not be the case.  Please keep in mind that I made this statement given that I work on LOB applications with the following qualities:

  • Often Legacy
    • Old: we’re talking Java-1.6-back-to-the-mainframe old
    • “Legacy code,” i.e., code without unit tests
  • Technologies
    • C#.NET Web/Desktop Applications
    • Enterprise Java applications

So you can see why, from this perspective, I consider unit tests to be more important.  The problem isn’t that something in the UI is buggy, broken, or exhibiting unexpected behavior; the problem is that you can’t touch any part of the application without modifying a lot of code.  That makes getting a requested feature out the door on time (I’m a consultant) quite difficult and often risky.
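
For code like that, the first unit tests are usually characterization tests: before changing anything, you pin down what the code does today so a later change can’t silently alter it.  Here’s a minimal sketch in JUnit 4, where InvoiceCalculator and its totalWithDiscount method are hypothetical stand-ins for real legacy business logic:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Characterization test: records what the legacy code does *today*,
// not what a spec says it should do. InvoiceCalculator is a
// hypothetical stand-in for real legacy business logic.
public class InvoiceCalculatorCharacterizationTest {

    @Test
    public void largeOrderDiscountMatchesCurrentBehavior() {
        InvoiceCalculator calculator = new InvoiceCalculator();

        // The expected value was captured by running the existing
        // code, not derived from a spec -- that's the point of a
        // characterization test.
        double total = calculator.totalWithDiscount(1000.00);
        assertEquals(925.00, total, 0.001);
    }
}

With a handful of tests like this in place, you can refactor with some confidence that the behavior the business depends on hasn’t shifted.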

In the audition video I have a diagram quite similar to this one:

[UI Testing Diagram]

I also made the point that UI tests can be designed to include the full round trip to the database in a particular test, so it’s really full-stack integration testing.  Already I’m backtracking, and I can see a use case for doing only UI tests; it’s obvious: if you’re just testing the UI!  An example might be writing tests across different screen resolutions to see whether it’s possible to render all the content; a sketch of such a test follows below.
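
Here’s a minimal sketch of that kind of UI-only test, using Selenium WebDriver with JUnit 4.  The URL, the summary-panel element id, and the particular resolutions are all assumptions for illustration:

import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.Dimension;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// UI-only test: does the page render its key content at several
// window sizes? No business logic is exercised here at all.
public class ResolutionRenderingTest {

    private static final Dimension[] SIZES = {
        new Dimension(1920, 1080),
        new Dimension(1366, 768),
        new Dimension(375, 667)   // phone-sized viewport
    };

    @Test
    public void summaryPanelIsVisibleAtEachResolution() {
        WebDriver driver = new ChromeDriver();
        try {
            for (Dimension size : SIZES) {
                driver.manage().window().setSize(size);
                driver.get("https://example.com/dashboard");
                assertTrue("summary panel hidden at " + size,
                           driver.findElement(By.id("summary-panel")).isDisplayed());
            }
        } finally {
            driver.quit();
        }
    }
}

Notice there are no assertions about data here; the test only cares whether the content can be rendered, which is precisely the case where UI tests stand on their own.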

So in general I’d like to amend my statement to: “In large, in-production applications with a lot of code and complex business logic, unit tests should take priority over UI testing.”  Ideally, they’ve been developed since the beginning of your application’s lifecycle.  After all, there isn’t much point in rendering business information to a screen if you can’t even be sure that it’s accurate.
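
And verifying that accuracy is exactly what a plain unit test does cheaply: no browser, no database, just the business rule.  A minimal sketch, where OrderTotals and the 8.25% tax rate are hypothetical:

import static org.junit.Assert.assertEquals;

import java.math.BigDecimal;

import org.junit.Test;

// Verifies the business rule the UI will eventually display.
// OrderTotals is a hypothetical class; 8.25% is an assumed tax rate.
public class OrderTotalsTest {

    @Test
    public void salesTaxIsAppliedAtTheConfiguredRate() {
        OrderTotals totals = new OrderTotals(new BigDecimal("0.0825"));

        BigDecimal result = totals.withTax(new BigDecimal("200.00"));

        // compareTo ignores scale differences (216.50 vs. 216.5000)
        assertEquals(0, new BigDecimal("216.50").compareTo(result));
    }
}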