
How does your development team handle testing? Is it a well-documented, routine process, or is it simply up to the developer working on the website? When do you begin testing? What tools do you use? Do you feel confident after launching a site? These questions came to the forefront as our team grew and new members inevitably entered the enigmatic testing phase of web projects. As a team, we recognized the need to set some ground rules, but we were unsure exactly what we wanted to test. Luckily, the process of creating the rules itself produced the answer.

When I first started writing these testing rules, I read everything I could about what other development teams were doing. What was successful? What had failed? What might we adapt to our process? To be honest, there’s not much out there, which is why I’m sharing our process with you. Amidst this research, one article that really stood out was on agile quality assurance by Huge, Inc. My mind was blown: we already build in an agile environment, so why not test in one? We quickly adapted this idea to our process by setting the expectation with project managers that a fully tested website is delivered at the end of each sprint (I talk more about this here). As the website is being built, developers test over and over again. They also frequently switch which browser they build in, to avoid being blindsided by browser inconsistencies when formal testing begins.

The next big step in our process was to determine what to actually test: which browsers, versions, operating systems, and devices? For some clients, we are told what to test (and retro-test) during the initial project scope, but what about the clients who rely on our expertise? How can we make them feel confident in their website launch? To answer this question, I worked backwards. I analyzed the data we had already collected from the client via Google Analytics, looking for patterns and consistencies in traffic across browsers, devices, and screen sizes. Then I compared those results against other clients’ website analytics to make sure I wasn’t missing anything. Finally, I used statistics and trends from sites like builtwith.com to round out the research.
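To give a feel for the kind of tally this research boils down to, here is a minimal sketch of computing browser share from an exported analytics report. This is hypothetical, not our actual tooling; the row shape and function name are made up for illustration.

```typescript
// Hypothetical sketch: tallying browser share from an exported
// analytics report to decide which browsers earn a spot on the
// testing list. The SessionRow shape is assumed, not a real API.
interface SessionRow {
  browser: string;   // e.g. "Chrome", "IE 11"
  sessions: number;  // session count for that browser
}

function browserShare(rows: SessionRow[]): Map<string, number> {
  const total = rows.reduce((sum, row) => sum + row.sessions, 0);
  const share = new Map<string, number>();
  for (const row of rows) {
    const current = share.get(row.browser) ?? 0;
    share.set(row.browser, current + row.sessions / total);
  }
  return share;
}

// Anything above a chosen threshold (say, 2% of sessions) makes the list.
```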

From these results, the development team created a testing list. As of June 2016, we test the latest browser versions of Chrome, Safari, and Firefox, as well as IE 11+. We also test Windows 7+, iOS 8+, and Android 4.4+. These picks aren’t always the same from project to project; depending on the research we’ve done on the client, the testing rules sometimes change to better fit them. We feel really confident in what we have selected to test because we back up our picks with data collected from the client’s current website users.
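To make the per-project customization concrete, here is a minimal sketch of the idea expressed as data. The names and shape are illustrative only; our actual list lives in the templates described below, not in code.

```typescript
// Illustrative only: the June 2016 defaults expressed as data,
// with a per-project override. All names here are hypothetical.
interface TestingMatrix {
  browsers: string[];
  operatingSystems: string[];
}

const defaults: TestingMatrix = {
  browsers: ["Chrome (latest)", "Safari (latest)", "Firefox (latest)", "IE 11+"],
  operatingSystems: ["Windows 7+", "iOS 8+", "Android 4.4+"],
};

// A client whose analytics show heavy legacy traffic might get a
// wider list than the defaults:
const legacyHeavyClient: TestingMatrix = {
  ...defaults,
  browsers: [...defaults.browsers, "IE 10"],
};
```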


To document and track the testing process, we originally created a simple Google spreadsheet for each website project. With each project that used it, the spreadsheet grew more refined and more useful, until eventually it became our browser testing template. The template lists which browsers, operating systems, and screen sizes to test against each of the site’s unique modules (it is the developer’s job to document the list of modules at the beginning of each project). While testing, the developer answers a list of questions for each module: Does it visually pass inspection? When did the test occur? If it did not pass, why? At this point, we are only testing for visual accuracy. Next, we run unit tests to check functionality.

[Figure: browser testing example]
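For readers who want to build something similar, the fields each row of the template captures can be sketched as a type. This is an assumption-laden outline of our spreadsheet columns, not generated code; the field names are illustrative.

```typescript
// A rough sketch of what one row of the browser testing template
// records. Field names are illustrative, not the actual column headers.
interface BrowserTestEntry {
  module: string;          // e.g. "header", "hero carousel"
  browser: string;         // e.g. "Chrome (latest)", "IE 11"
  operatingSystem: string; // e.g. "Windows 7", "iOS 8"
  screenSize: string;      // e.g. "320px", "1440px"
  passesVisually: boolean; // does it visually pass inspection?
  testedOn: string;        // when did the test occur?
  failureNotes?: string;   // if it did not pass, why?
}
```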

Unit testing begins with an in-depth look at all of the modules in the website. The developer writes questions to test the functionality of each module, along with the expected result for each question. For example, when creating tests for the header module, a developer might write: Question: Does the logo link to the homepage? Expected result: Yes, the logo should link to the homepage. When the website enters testing, the tester runs through the questions, documents the actual results, and compares them to the expected ones. The tester records whether the test passed or failed, the date, the person responsible for fixing a failed test, and any extra comments. Eventually these questions turn into automated unit tests that can be run frequently throughout the site’s build.

[Figure: unit testing example]
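As a sketch of what the automated version of the header question might look like, here is a minimal test. It assumes Jest running in a jsdom environment; the markup and the `.site-header .logo` selector are hypothetical stand-ins, not our actual codebase.

```typescript
// Minimal sketch: the "does the logo link to the homepage?" question
// turned into an automated test. Assumes Jest with a jsdom
// environment; markup and selectors are hypothetical.
describe("header module", () => {
  beforeEach(() => {
    // Stand-in for the real header markup.
    document.body.innerHTML = `
      <header class="site-header">
        <a class="logo" href="/">Example Co.</a>
      </header>`;
  });

  test("logo links to the homepage", () => {
    const logo = document.querySelector<HTMLAnchorElement>(".site-header .logo");
    // Expected result: yes, the logo should link to the homepage.
    expect(logo).not.toBeNull();
    expect(logo!.getAttribute("href")).toBe("/");
  });
});
```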

While the initial set-up of the browser and unit test templates might seem tedious, the payoff is increased efficiency and accuracy in the testing phase. And while a developer is responsible for setting up the tests, they are not solely responsible for running them; any team member can document their results.

Creating applicable browser and unit testing templates for website projects has made our development team more efficient and uniform when it comes to testing. Combined with testing at the end of each sprint, building in different browsers, and pairing client analytics research with recent web trends and statistics, these steps have given us a solid list of testable items our team can agree upon and a process for how we test websites. By launch time, we feel confident in our work, and getting through the testing phase is no sweat.

Tina Castillo
Senior Designer & Front End Developer

A web developer and designer who loves to merge form with function, Tina specializes in user-interface design and front-end development. She has previously worked as a web developer at the University of Pittsburgh and a web design instructor at CCAC, and she has built and launched many award-winning websites for the university’s schools and departments. At Smith Brothers Agency, she works on almost every digital account, working with the team to create and maintain our websites. She holds a B.A. in Digital Media Arts and an M.S. in Media Technology from Duquesne University.