Quality Assurance

Part Deux

What kind of cases are there?


There are three types of cases that Quality Assurance needs to be concerned with:

  • Roles
  • User Stories
  • Test Cases

Roles

A role is a type of end user who will be using the site. Usually roles are based on who has access to what, but they can also be defined by demographic or by some other means. For example, roles might include "administrator", "user", "CMO", or "premium subscriber".

User Stories

User stories are one or two sentences describing an action a user wants to perform. Usually these user stories are tied to a ticket, and the test cases derive from the user story. An example of a user story would be "As a <role> I want to be able to <task> so that <reasoning>."
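One way to keep the template consistent is to store each story as a small structure and render the sentence from it. This Python sketch is purely illustrative; the class name, fields, and example story are made up and not part of any existing tooling:

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    """One user story following the "As a <role> I want to be able to
    <task> so that <reasoning>" template. Hypothetical helper."""
    role: str
    task: str
    reasoning: str

    def render(self) -> str:
        # Fill the template with this story's fields.
        return (f"As a {self.role} I want to be able to {self.task} "
                f"so that {self.reasoning}.")

# Invented example story for illustration:
story = UserStory("premium subscriber", "export my reports",
                  "I can share them outside the site")
print(story.render())
```

Keeping role, task, and reasoning as separate fields also makes it easy to group stories by role when deriving test cases.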


In general, these stories are written by a customer representative about the needs of the stakeholder.


In the future, these tickets should be tied to milestones.

Test cases

There are two types of test cases:


  • Formal Test Cases are cases with a given input and an expected output defined before they are tested. Each case has two tests associated with it: a positive test and a negative test.

  • Informal Test Cases are those cases that don't have specified inputs and outputs, but instead follow a role through the user stories associated with a ticket and evaluate that user's experience through the system.
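The formal-case structure above (known input, expected output, one positive and one negative test) can be sketched as a pair of tests in Python. This is a hedged illustration; `is_valid_username` and its 3-20 alphanumeric rule are invented for the example, not taken from our codebase:

```python
import unittest

def is_valid_username(name: str) -> bool:
    """Hypothetical function under test: usernames must be 3-20
    alphanumeric characters. The rule is invented for illustration."""
    return name.isalnum() and 3 <= len(name) <= 20

class TestUsernameValidation(unittest.TestCase):
    # Positive test: the given input yields the expected output.
    def test_valid_username_accepted(self):
        self.assertTrue(is_valid_username("kevin42"))

    # Negative test: an invalid input is rejected, not accepted.
    def test_invalid_username_rejected(self):
        self.assertFalse(is_valid_username("a!"))
```

The pairing matters: a positive test alone can pass against a function that accepts everything.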

What kinds of tests will be run?


Automated Tests

  1. Unit Testing
  2. Smoke Testing
  3. Regression Testing
  4. Functional Testing

Manual Tests

  1. Requirement Testing
  2. Negative Testing
  3. Exploratory Testing
  4. Jerk Testing

Unit Testing

Unit tests are tests against the code itself. They run without interfacing with the site at all and are executed locally against the actual code.


These tests are written by developers, since they are part of the code itself and run against functions and methods rather than through the user interface. They will still be executed by QA.
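As a sketch of what "against the code itself" means, here is a hypothetical business-logic function exercised directly, with no browser or site involved. The function and its rules are invented for illustration:

```python
def apply_discount(price: float, percent: float) -> float:
    """Hypothetical business-logic function; a unit test calls it
    directly rather than driving the site's UI."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    # Round to cents after applying the percentage discount.
    return round(price * (1 - percent / 100), 2)

# Run locally against the code itself:
assert apply_discount(100.0, 25) == 75.0
assert apply_discount(19.99, 10) == 17.99
```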

Smoke Testing


Smoke tests emulate a typical user's experience through a vertical. These tests are usually written per role and simply ensure that a typical path through the site isn't broken in any obvious way.
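A per-role smoke test might look like the following sketch. Everything here is a stand-in: `StubClient` fakes an HTTP client so the example is self-contained, and the paths are invented; a real smoke test would drive the actual site:

```python
class StubClient:
    """Stand-in for a real HTTP client, used only so this sketch runs."""
    def get(self, path: str):
        class Response:
            status_code = 200  # pretend every page loads
        return Response()

def smoke_test_subscriber(client) -> bool:
    """Walk a typical subscriber path through the vertical and report
    whether any step is obviously broken."""
    for path in ("/login", "/dashboard", "/reports"):
        if client.get(path).status_code != 200:
            return False
    return True

assert smoke_test_subscriber(StubClient())
```

The point is breadth, not depth: one quick pass per role, checking only that nothing is obviously on fire.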

Regression Testing

Regression testing involves running every test that we have ever written for a particular product. These tests will be written in CasperJS and run at the end of every sprint.

Functional Testing

Functional testing represents the testing done against a particular ticket. Functional tests are the automated tests derived from the test cases for that ticket.

Requirement Testing

Requirement tests are the test cases that get run manually against a ticket. After this testing phase, functional tests are written for this ticket. 

Negative Tests


Negative tests ensure that when a particular feature or requirement fails, the expected event occurs: a clear error rather than a crash or silently wrong behavior.
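A minimal negative test in Python might look like this sketch; `parse_age` is a made-up input handler used only to show the pattern:

```python
def parse_age(value: str) -> int:
    """Hypothetical input handler under test; invented for illustration."""
    age = int(value)  # raises ValueError on non-numeric input
    if age < 0:
        raise ValueError("age cannot be negative")
    return age

def rejects_garbage_input() -> bool:
    # Negative test: the expected event on bad input is a clean
    # ValueError, not a crash or a silently wrong value.
    try:
        parse_age("not-a-number")
    except ValueError:
        return True
    return False

assert rejects_garbage_input()
assert parse_age("42") == 42
```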

Exploratory Testing

Exploratory testing isn't associated with any particular ticket; instead, the QA analyst takes on one of the roles and tries to experience the site as that person would.

Jerk Testing

Jerk testing is testing done as a hostile party against the site. The mindset for this kind of testing is that of someone trying to hack the site, or a disgruntled user doing their best to break it.

Policies

We've learned a lot from Premium Reporting's QA session. Because of some failures during that session, we've created several policies to keep these issues from cropping up in the future.

Policy #1

A developer will never QA their own work.

One of the major issues we had was the developer also being the person to QA the ticket. This let in a lot of hubris, and things were not properly tested (from conversations with other companies, this is a big no-no).

Policy #2

Be as transparent as someone wants.

Transparency is important when it comes to the results of a test run. We have adopted a policy of as much transparency as someone wants: if they want broad strokes, those are available, but if they want fine-grained details, we have those reports as well.

Policy #3

Communication is key.

If a problem arises during the QA process (for example a lack of requirements or confusion about the reproducibility of an issue), then the QA analyst will contact the Project Manager immediately and bring these issues up.

Policy #4

If there is a need for access to third-party software, it will be requested immediately and the ticket will be blocked until access is granted.

The QA analyst should not be expected to guess at the data or information inside third-party software. If access to third-party software is needed, the affected test cases will be blocked until that access is granted.

Looking forward

The QA Department has a specific vision for the future to ensure quality of not only upcoming projects, but of the workflow and user experience as well.

Also, the QA team would like to create shortcuts to make everyone's lives easier.

Upcoming projects


  • User stories are absolutely necessary going forward.
    • Playing Jeopardy is hard.
    • If user stories are not present, QA will have to write them. That breaks any estimates made about the length of QA and takes longer, since an exploration phase must be injected to figure out those user stories.
  • With provided user stories, QA will create test cases and have completed guidelines before the ticket reaches QA.

Workflow

When tests or requirements fail, QA can determine where in the workflow the failure originated (or should have originated).
Currently, there are three types of failure originators:
  • Requirement Failure
    • Something was not included in the requirement. Kicked back to the Project Manager.
  • User Design Failure
    • Something in the functionality causes problems with the user's experience. Kicked back to design.
  • Code Failure
    • A code failure occurred, kicked back to the developer.

User Experience


In the future, QA would like to work with product to ensure the quality of the user experience by participating in various kinds of usability testing and review.

Shortcuts

In the future, QA would also like to create options and procedures to make the lives of the engineering department easier.

These shortcuts include:

  • Documentation of workflow (wiki pages).
  • Helper scripts for installation and committing.
  • Training manuals.
  • Introduction packets for the team.
  • Other items for those outside the department to consume to get a better idea of how this department works.

Interviews


To better put ourselves in a role, we would also like to start conducting very informal interviews with various stakeholders in the company.

This will allow us to "think like they do" while we are conducting exploratory tests and running through user stories.

Quality Assurance

By Kevin Baugh
