P5. Verify

Product Verification

Problems Experienced Without These Practices

  • Confusion

  • Defects (Bugs)

  • Disgruntled Employees

  • Impediments & Delays

  • Lack of Communication

  • Late-breaking Requirements

  • Low Output

  • Rework

  • Siloed “Not My Job” Thinking

  • Unhappy Stakeholders

Remote Work Tools

  • Smoke Tests

  • User Acceptance Testing (UAT)

Techniques

  • As frequently as possible, re-test previous batches of work to ensure that new changes have not broken behavior that previously met expectations. 

    • TIP: To save time, create a “smoke test,” a subset of highly visible functionality that is a good representation of the quality of the whole product (one way to tag such a subset is sketched just after this list). 

  • Keep an updated list of tests associated with each work item for quick traceability of potential new conflicts. 

    • CAUTION: Use automation wherever possible so that list maintenance does NOT rely on people remembering to document each new link (a sketch of one such automation also follows this list). 

  • Test the application in the same way a customer might use your product, flowing from feature to feature to accomplish their own goals.
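
A minimal sketch of one way to tag such a smoke subset, using Python and pytest; add_to_cart() below is an invented stand-in for real product code, not part of any specific product. High-visibility checks carry a “smoke” marker and can be run on their own, while everything else stays in the full regression run.

    # test_smoke.py -- minimal sketch; add_to_cart() is a stand-in for real product code
    import pytest

    def add_to_cart(cart, item):
        # placeholder for the real feature under test
        return cart + [item]

    @pytest.mark.smoke  # highly visible behavior, re-run on every change
    def test_item_can_be_added_to_cart():
        assert add_to_cart([], "book") == ["book"]

    def test_duplicate_items_are_kept():
        # edge cases like this one stay in the full regression run
        assert add_to_cart(["book"], "book") == ["book", "book"]

    # Run only the smoke subset with:  pytest -m smoke
    # Register the marker in pytest.ini so pytest does not warn about it:
    #   [pytest]
    #   markers = smoke: fast, high-visibility checks run on every change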

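Traceability between work items and tests can also be collected automatically instead of maintained by hand. A minimal pytest sketch, assuming tests reference their work item by a ticket id (the ids and file names below are illustrative): each test declares the work item it covers with a marker, and a collection hook rebuilds the mapping file on every run.

    # conftest.py -- minimal sketch; ticket ids and names are illustrative only
    import json
    from collections import defaultdict

    def pytest_configure(config):
        # register the custom marker so pytest does not warn about it
        config.addinivalue_line(
            "markers", "work_item(id): link this test to a work item / ticket id"
        )

    def pytest_collection_modifyitems(config, items):
        # rebuild the work-item -> tests map automatically on every test run
        trace = defaultdict(list)
        for item in items:
            for marker in item.iter_markers(name="work_item"):
                trace[marker.args[0]].append(item.nodeid)
        with open("work_item_traceability.json", "w") as fh:
            json.dump(trace, fh, indent=2, sort_keys=True)

    # test_login.py
    import pytest

    @pytest.mark.work_item("SHOP-142")  # hypothetical ticket id
    def test_user_can_log_in_with_valid_credentials():
        assert True  # real assertions for the work item go here
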
Guidance

P5.1 Regression Testing

Remote Work Tools

  • Leadership by Intent (see David Marquet, “Turn the Ship Around!”)

  • Gherkin Test Scripts

Techniques

  • How teams write tests can make a huge difference in their success; focus on defining the outcomes you want to see, rather than the execution steps used to get there (an outcome-style test is sketched after this list). 

    • TIP: The reason you don’t need to track execution is that if the steps taken by the team to achieve the outcome are wrong, the outcome will also not be achieved; if, however, they do achieve the desired outcome through different steps, often they’re better steps. 

    • TIP: If the team uses the “wrong” steps to achieve the right outcome, consider that there may be an outcome in your head that didn’t make it into the description; try defining additional outcomes that lead them in a better direction. 

  • User experience and usability of a product are also classified as outcomes; outline qualities that you would like your customers to experience, like “easy” vs “fast” or “optimized for experts” vs “optimized for novices”.

    • TIP: An effective technique you can use is to describe emotional states of the customer that you either want to engender or avoid, e.g. “the customer is very frustrated with the amount of time it takes to…” or “the customer is absolutely frustrated that it takes 10 clicks to…” 

  • Switching to outcome verification is fundamental to team empowerment and developing active, thinking partners that help you to move faster, instead of employees who only do what you tell them to do.

    • WARNING: In the early stages of this transition, collaborating through defined outcomes alone can feel much slower. Take a wider view and, if desired, record the total time invested across the entire team; the effectiveness and efficiency of empowerment are best seen at scale. 
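
A minimal sketch of what an outcome-style test can look like in Python; FakeUserStore and reset_password() are invented stand-ins for a team’s real code. The test asserts only the outcome that matters (the user can log in with the new password) and says nothing about which steps were taken to get there.

    # test_password_reset_outcome.py -- minimal sketch; all names are illustrative
    class FakeUserStore:
        """Tiny stand-in for wherever credentials are really kept."""
        def __init__(self):
            self._passwords = {}

        def set_password(self, user, password):
            self._passwords[user] = password

        def can_log_in(self, user, password):
            return self._passwords.get(user) == password

    def reset_password(store, user, new_password):
        # stand-in for the team's implementation; the test does not care how it works
        store.set_password(user, new_password)

    def test_user_can_log_in_with_new_password():
        store = FakeUserStore()
        reset_password(store, "alice", "correct horse battery staple")
        # the outcome we want to verify -- not which functions were called, or in what order
        assert store.can_log_in("alice", "correct horse battery staple")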

Guidance

P5.3 Outcome Verification

Remote Work Tools

  • DevOps

  • Test-Driven Development

  • Acceptance Test-Driven Development

Techniques

  • As teams create more features, the quantity of regression tests quickly grows too large for manual execution; automation is the easiest way to keep testing existing features with minimal effort.

  • Write automated tests first to define the scope of your solution, before you create the solution (a test-first sketch follows this list).

  • Review automated tests periodically and remove obsolete tests that no longer apply.
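
A minimal test-first sketch in Python with pytest; shipping_cost() and its pricing rules are invented purely for illustration. The tests are written before the function exists and define the scope of the solution; the implementation shown is simply the smallest one that makes them pass.

    # test_shipping_cost.py -- minimal sketch; the pricing rules are invented
    import pytest

    def shipping_cost(weight_kg):
        # smallest implementation that satisfies the tests written first
        if weight_kg <= 0:
            raise ValueError("weight must be positive")
        return 5.0 if weight_kg <= 1 else 5.0 + 2.0 * (weight_kg - 1)

    # Written before shipping_cost() existed, these tests defined (and limited)
    # the scope of the solution.

    def test_flat_rate_up_to_one_kilogram():
        assert shipping_cost(0.5) == 5.0

    def test_surcharge_above_one_kilogram():
        assert shipping_cost(2.0) == 7.0

    def test_rejects_non_positive_weight():
        with pytest.raises(ValueError):
            shipping_cost(0)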

Guidance

P5.2 Automated Verification

Remote Work Tools

  • Usability Testing

  • Design Thinking

Techniques

  • Before official release, test the release with actual users of the product.

    • WARNING: Do NOT test just anyone (no staff, relatives, or friends); they must be actual users of the product to avoid false positives, i.e. changes that look good to non-users but frustrate expert users. 

    • CAUTION: Do NOT use internal customers as representative of external customers; test with both types of customers. 

    • TIP: You will find 80% of the issues with 3-5 users; you do not need a lot of customers for effective customer testing. 

  • While catching the majority of bugs through internal testing first does make for a better experience, the earlier testing with customers occurs, the more effective it will be. 

  • Tell test subjects that they are evaluating you; the average user loves to give advice, but hates being evaluated.

    • WARNING: Never “lead” a test subject and tell them how to interact with your product; you want them to be making the decisions and you watch; after your product is released, customers must be able to use it without you being present. 

    • TIP: Always have two people running each test: one person to interact with the test subject, and one person to take notes; the person taking notes is called the “silent observer” and should never speak, only document what they see. 

  • Encourage test subjects to talk out loud about their experience and their thoughts during testing.

    • CAUTION: When people are really focused on thinking through a task, they will usually stop talking; you may need to prompt them multiple times (in a friendly tone) to continue talking out loud. 

    • TIP: Allow people to go on thought tangents when talking out loud; in our brains, “neurons that fire together wire together,” which means that whatever tangent they go on is related to the task they are currently doing; use the tangent to uncover how the related topic affects the user’s experience with your product! 

  • Never ask customers to predict their future behavior; if you find yourself talking in the future tense, find a way to either rephrase the query in the past tense, or better yet, simulate the situation and observe what the person actually does.

    • TIP: “Users are notoriously bad at predicting their own behavior.” - Anonymous 

Guidance

P5.4 Direct Customer Testing
