This interview with Alan Page is part of our series of “Testing Smarter with…” interviews. Our goal with these interviews is to highlight insights and experiences as told by many of the software testing field’s leading thinkers.

Alan Page has been a software tester (among other roles) since 1993 and is currently the Director of Quality for Services at Unity. Alan spent over twenty years at Microsoft working on a variety of operating systems and applications in nearly every Microsoft division.


Alan Page

Personal Background

Hexawise: Looking back on over 20 years in software testing at Microsoft can you describe a testing experience you are especially proud of?

Alan: There are a lot of great experiences to choose from - but I'm probably most proud of my experiences on the Xbox One team. My role was as much about building a community of testers across the Xbox ecosystem (console, services, game development) as it was about testing and test strategy for the console. We had a great team of testers, including a handful of us with a lot of testing experience under our belts. We worked really well together, leveraged each other's strengths, and delivered a product that has had nearly zero quality issues since the day of release.

Hexawise: What did you enjoy about working in such a large software company, Microsoft, for so long?

Alan: The only way I survived at Microsoft so long was that I could change jobs within the company and get new experiences and challenges whenever I felt I needed them. I love to take on new challenges and find things that are hard to do - and I always had that opportunity.


Hexawise: What new challenges are you looking forward to tackling in your new role at Unity?

Alan: I'm looking forward to any and all challenges I can find. Specifically, I want to build a services testing community at Unity. Building community is something I feel really strongly about, and from what I've seen so far, I think there are a lot of opportunities for Unity testers to learn a lot from each other while we play our part in the coming growth of Unity services.

Views on Software Testing

Hexawise: What thoughts do you have on involving testers in A/B and multivariate testing? That stretches the bounds of how many people categorize testers, but it could be a good use of the skills and knowledge some testers possess.

Alan: My approach to experimentation (A/B testing) is that there are three roles needed. Someone needs to design the experiment. A product owner / program manager often plays this role and will attempt to figure out the variation (or variations) for the experiment as well as how to measure the business or customer value of both the control (original implementation) and treatment / experiment.

The second role is the implementer - this is typically a developer or designer depending on the nature of the experiment. Implementing an experiment is really no different than implementing any other product functionality.

The final role is the analyst role - someone needs to look at the data from the experiment, as well as related data and attempt to prove whether the experiment is a success (i.e. the new idea / treatment is an improvement) or not. I've seen a lot of testers be successful in this third role, and statistics and data science in general, are great skills for testers to learn as they expand their skill set.
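Since Alan mentions statistics and data science as great skills for testers moving into this analyst role, here is a minimal sketch of the kind of check involved. The conversion counts are invented for illustration, and real experiments would use a proper experimentation platform; this is just a hand-rolled two-proportion z-test to show the shape of the analysis.

```python
# Hypothetical analyst-role sketch: did the treatment beat the control?
# A two-proportion z-test on invented conversion counts.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided tail area
    return z, p_value

# Control: 200 conversions in 5,000 visits; treatment: 260 in 5,000.
z, p = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) is evidence the treatment genuinely changed behavior; the analyst's harder job is deciding whether that change reflects real business or customer value.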

image of book cover for How We Test Software at Microsoft

Hexawise: During your 20 years at Microsoft you have been involved in hiring many software testers. What do you look for when choosing software testers? What suggestions do you have for those looking to advance in their software testing career?

Alan: The biggest thing I look for in testers is a passion and ability to learn. I've interviewed hundreds of testers, including many who came from top universities with advanced degrees who just weren't excited about learning. For them, maybe learning was a means to an end, but not something they were passionate about.

The testers who really impress me are those who love to learn - not just about testing, but about many different things. Critical thinking and problem solving are also quite important.

As far as suggestions go, keep building your tool box. As long as you're willing to try new things, you'll always be able to find challenging, fun work. As soon as you think you know it all, you will be stuck in your career.


Hexawise: Do you have a favorite example of a combinatorial bug and how it illuminated a challenge with software testing?

Alan: I was only indirectly involved in discovering this one, but the ridiculously complex font-picker dialog from the Office apps was a mess to test - though it was tested extensively (at least according to the person in charge of testing it). A colleague of mine showed them the all-pairs technique, and they used it to massively decrease their test suite - and found a few bugs that had existed for years.

Hexawise: It seems to me that testing games would have significant challenges not found in testing fairly straightforward business applications. Could you share some strategies for coping with those challenges?

Alan: Combinatorial testing is actually pretty useful in game testing. For example, consider a role-playing game with six races, ten character classes, four different factions, plus a choice for gender. That's 480 unique combinations to test! Fortunately, this has been proven to be an area where isolating pairs (or triples) of variations makes testing possible, while still finding critical bugs.
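As a rough illustration of the reduction Alan describes, here is a naive greedy pairwise generator in Python. This is not the algorithm Hexawise or any particular tool actually uses (real tools are far more sophisticated); the parameter names simply mirror the RPG example above.

```python
# Naive greedy pairwise reduction for the 6 x 10 x 4 x 2 RPG example.
from itertools import product, combinations

params = {
    "race": [f"race{i}" for i in range(6)],
    "char_class": [f"class{i}" for i in range(10)],
    "faction": [f"faction{i}" for i in range(4)],
    "gender": ["female", "male"],
}

names = list(params)
all_rows = list(product(*params.values()))   # 6 * 10 * 4 * 2 = 480 combinations

def pairs(row):
    # Every (parameter, value) pairing this test case covers.
    return {((names[i], row[i]), (names[j], row[j]))
            for i, j in combinations(range(len(row)), 2)}

uncovered = set().union(*(pairs(r) for r in all_rows))
suite = []
while uncovered:
    # Greedily add the candidate covering the most still-uncovered pairs.
    best = max(all_rows, key=lambda r: len(pairs(r) & uncovered))
    suite.append(best)
    uncovered -= pairs(best)

print(f"{len(all_rows)} exhaustive combinations reduced to {len(suite)} pairwise tests")
```

Since every race/class pair must appear somewhere, no pairwise suite for these parameters can have fewer than 60 tests; the greedy version lands close to that bound, a fraction of the 480 exhaustive combinations.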

Beyond that, testing games requires a lot of human eyeballs and critical thinking to ensure gameplay makes sense, objects are in the right places, etc. I've never seen a case where automating gameplay, for example, has been successful. I have, however, seen some really innovative tools written by testers to help make game testing much easier, and much more effective.

Hexawise: That sounds fascinating, could you describe one or more of those tools?

Alan: The games test organization at Microsoft wrote a few pretty remarkable tools. One linked the bug tracking system to a "teleportation" system in the game under test, including a protocol to communicate between a Windows PC and an Xbox console.

This enabled a cool two-way system, where a tester could log a bug directly from the game, and the world coordinates of where the bug occurred were stored automatically in the bug report. Then, during triage or debugging, a developer / designer could click on a link in the bug report, and automatically transport to the exact place on the map where the issue was occurring.

Hexawise: What type of efforts to automate gameplay came close to providing useful feedback? What makes automating gameplay for testing purposes ineffective?

Alan: I would never automate gameplay. There are too many human elements needed for a game to be successful - it has to be fun to play - ideally right at that point between too challenging and too easy. If the game has a story line, a tester needs to experience that story line, evaluate it, and use it as a reference when evaluating gameplay.

Tools to evaluate cpu load or framerate during gameplay are usually a much better investment than trying to simulate gameplay via automation.

That said, there are some "what if" scenarios in games that may reveal interesting bugs. For example, simulating a user action (e.g. adding and removing an item) thousands of times may reveal memory leaks in the application. It really comes down to test design and being smart about choosing what makes sense to automate (and what doesn't).
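A minimal sketch of that "repeat an action thousands of times" idea, using Python's tracemalloc to watch memory growth. Here `add_and_remove` is an invented stand-in for a real user action, with its cleanup deliberately broken so the leak is visible.

```python
# Stress-repeat a simulated user action and measure memory growth.
import tracemalloc

_cache = []  # application state that should be cleaned up, but isn't

def add_and_remove(item):
    """Invented stand-in for an add/remove user action; the remove is buggy."""
    _cache.append([item] * 100)   # "add an item"
    # the matching "remove" was forgotten, so memory grows on every call

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for i in range(10_000):
    add_and_remove(i)
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth_kb = (after - before) / 1024
print(f"Memory grew by roughly {growth_kb:.0f} KiB over 10,000 repetitions")
```

A healthy add/remove cycle should show roughly flat memory after thousands of repetitions; steady growth like this is the signal worth automating a check for.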

Hexawise: What is one thing you believe about software testing that many smart testers disagree with?

Alan: I don't believe there's any value from distinguishing "checks" from tests. I know a lot of people really like the distinction, but I don't see the value. Of course, I recognize that some testing is purely binary validation, but this is a(nother) example of where choosing more exact words in order to discern meaning leads to more confusion and weird looks than it benefits the craft of testing.

Industry Observations / Industry Trends

Hexawise: Do you believe testing is becoming (or will become) more integrated with the software development process? And how do you see the scope of testing extending all the way from understanding customer needs to reviewing the actual customer experience in order to drive an organization's testing efforts?

Alan: I believe testing has become more integrated into software development. In fact, I believe that testing must be integrated into software development. Long ship cycles are over for most organizations, and test-last approaches, or approaches that throw code to test to find-all-the-bugs are horribly inefficient. The role of a tester is to provide testing expertise to the rest of the team and accelerate the entire team's ability to ship high-quality software. Full integration is mandatory for this to occur.

It's also important for testers to use data from customer usage to help them develop new tests and prioritize existing bugs. We've all had conversations before about whether or not the really cool bug we found was "anything a customer would ever do" - but with sufficient diagnostic data in our applications or services, we can use data to prove exactly how many customers could (or would) hit the issue. We can also use data to discover unexpected usage patterns from customers, and use that knowledge to explore new tests and create new test ideas.


Hexawise: In your blog post, Watch out for the HiPPO, you stated: "What I’ve discovered is that no matter how strongly someone feels they “know what’s best for the customer”, without data, they’re almost always wrong." What advice do you have for testers for learning what customers actually care about? To me, this points out one of the challenges software testers (and everyone else, actually) face, which is the extent to which they are dependent on the management system they work within. Clayton Christensen's Theory of Jobs to Be Done is very relevant to this topic in my opinion. But many software testers would have difficulty achieving this level of customer understanding without a management system already very consistent with this idea.

Alan: Honestly, I don't know how a product can be successful without analysis and understanding of how customers use the product. The idea of a tester "pretending to be the customer" based purely on their tester-intuition is an incomplete solution to the software quality problem. If you hear, "No customer would ever do that", you can either argue based on your intuition, or go look at the data and prove yourself right (or wrong).

There's an important shift in product development that many companies have recognized. They've moved from "let's make something we think is awesome and that you'll love" - i.e. we-make-it-you-take-it to "we want to understand what you need so we can make you happy". This shift cannot happen without understanding (and wallowing in) customer data. Testers may not (and probably won't) create this system, but any product team interested in remaining in business should have a system for collecting and analyzing how customers use their product.

Hexawise: I agree, this shift is extremely important. Do you have an example of how such a deep understanding of users influenced testing, or how it was used to shift the software development focus? Since software testing is meant to help make sure the company provides software that users want, it certainly seems important to make sure software testers understand what users want (and don't want).

Alan: First off, I prefer to think of the role of test as accelerating the achievement of shipping quality software; which is similar to making sure the company provides software customers want, but (IMO), is a more focused goal.

As far as examples go... how many do you want? Here are a few to ponder:

  • We tracked how long people ran our app. 45% ran it continuously - all the time. This is the way we ran the app internally. But another 45% ran it for 10 minutes or less. We had a lot of tests that tried to mimic a week of usage and look for memory leaks and other weirdness, but after seeing the data, we ended up doing a lot more testing (and improving) of start up and shut down scenarios.
  • We once canceled a feature after seeing that the data told us that it was barely used. In this case, we didn't add the tracking data until late (a mistake on our part). Ideally, we'd discover something like this earlier, and then redesign (rather than cancel).
  • We saw that a brand new feature was being used a lot by our early adopters. This was pretty exciting... but as testers, we always validate our findings. It turned out, after a bit more investigation, that the command following usage of the brand new feature was almost always undo. Users were trying the feature... but they didn't like what it was doing.

Staying Current / Learning

Hexawise: What advice do you have for people attending software conferences so that they can get more out of the experience?

Alan: Number one bit of advice is to talk to people. In fact, I could argue that you get negative value (when balanced against the time investment) if you show up and only attend talks. If there's a talk you like in particular, get to know the speaker (even us introverts are highly approachable - especially if you bring beer). Make connections, talk about work, talk about challenges you have, and things you're proud of. Take advantage of the fact that there are a whole lot of people around you that you can learn from (or who can learn from you).

Additionally, if you're in a multi-track conference, feel free to jump from talk to talk if you're not getting what you need - or if it just happens that two talks that interest you are happening at the same time.

Hexawise: How do you stay current on improvements in software testing practices; or how would you suggest testers stay current?

Alan: I read a lot of testing blog posts (I use feedly for aggregation). I subscribe to at least fifty software development and test related blogs. I skim and discard liberally, but I usually find an idea or two every week that encourages me to dig deeper.

Biggest tip I have, however, is to know how essential learning is to your success. Learning is as important to the success of a knowledge worker as food is to human life. Anyone who thinks they're an "expert" and that learning isn't important anymore is someone on a career death spiral.

Profile

image of book cover for The A Word by Alan Page

Alan has been a software tester (among other roles) since 1993 and is currently the Director of Quality for Services at Unity. Alan spent over twenty years at Microsoft working on a variety of operating systems and applications in nearly every Microsoft division. Alan blogs at angryweasel.com, hosts a podcast (w/ Brent Jensen) at angryweasel.com/ABTesting, and on occasion, he speaks at industry testing and software engineering conferences.

Links

Books: How We Test Software at Microsoft, The A Word

Blog: Tooth of the Weasel

Podcast: AB Testing

Twitter: @alanpage


Related posts: Testing Smarter with Dorothy Graham - Testing Smarter with James Bach

By: John Hunter on Mar 16, 2017

Categories: Combinatorial Software Testing, Customer Success, Software Development, Software Testing, Testing Smarter with..., Interview

We are excited to announce an ongoing partnership with Datalex to improve software test efficiency and accuracy. Datalex has achieved extreme benefits in software quality assurance and speed-to-market through their use of Hexawise. Some of these benefits include:

  • Greater than 65 percent reduction in the Datalex test suite
  • Clearer understanding of test coverage
  • Higher confidence in the thoroughness of software tests
  • Complete and consistently formatted tests that are simple to automate

An airline company’s regression suite typically contains thousands of test cases. Hexawise is used by Datalex to optimize these test cases, leading to fewer tests as well as greater overall testing coverage. Hexawise also provides Datalex with a complete understanding of exactly what has been tested after each and every test, allowing them to make fact-based decisions about how much testing is enough on each project.

“Hexawise has been fundamental in improving the way we approach our Test Design, Test Coverage and Test Execution at Datalex... My team love using Hexawise given its intuitive interface and its ability to provide a risk-based approach to coverage which gives them more confidence during release sign-off.”


Áine Sherry

Global Test Manager at Datalex

“As a senior Engineer in a highly innovative company, I find Hexawise crucial in regards to achieving excellent coverage with a fraction of the time and effort. Hexawise will also facilitate us to scale onwards and upwards as we continue to innovate and grow,” – Dean Richardson, Software Test Engineer at Datalex.

By eliminating duplicative tests and optimizing the test coverage of each test case Hexawise provides great time savings in the test execution phase. Hexawise can generate fewer test scenarios compared to what testers would create on their own and those test cases provide more test coverage. Time savings in test execution come about simply because it takes less time to execute fewer tests.

Related: How to Pack More Coverage Into Fewer Software Tests - Large Benefits = Happy Hexawise Clients and Happy Colleagues

By: John Hunter on Nov 10, 2016

Categories: Business Case, Customer Success, Testing Case Studies

At Hexawise we aim to improve the way software is tested. Achieving that aim requires not only providing our clients with a wonderful software tool (which our customers say we’re succeeding at) but also a commitment from the users of our tool to adopt new ways of thinking about software testing.

We have written previously about our focus on the importance of the values of Bill Hunter (our founder's father) to Hexawise. That has led us to constantly focus on how to maximize the benefits our customers gain from using Hexawise. This focus has led us to realize that our customers who take advantage of the high-touch training services and on-demand expert test design support that we offer often realize unusually large benefits, and roll out Hexawise more quickly and broadly, than our customers who acquire licenses and simply try to “get the tool and make it available to the team.”

We are now looking for someone to take on the challenge of helping our clients succeed. The principles behind our decision to put so much focus on helping our customers succeed are obvious to those who understand the thinking of Bill Hunter, W. Edwards Deming, Russell Ackoff, etc., but they may seem a bit odd to others. The focus of this senior-level position really is to help our customers improve their software testing results. It isn't just a happy-sounding title that has no bearing on what the job actually entails.

The person holding this position will report to the CEO and work with other executives at Hexawise who all share a commitment to delighting our customers and improving the practice of software testing.

Hexawise is an innovative SaaS firm focused on helping large companies use smarter approaches to test their enterprise software systems. Teams using Hexawise get to market faster with higher quality products. We are the world’s leading firm in our niche market and have a growing client base of highly satisfied customers. Since we launched in 2009, we have grown both revenues and profits every year. Hexawise is changing the way that large companies test software. More than 100 Fortune 500 companies and hundreds of other smaller firms use our industry leading software.

Join our journey to transform how companies test their software systems.

Hexawise office

Description: VP of Customer Success

In the Weeks Prior to a Sale Closing

  • Partner with sales representatives to conduct virtual technical presentations and demonstrations of our Hexawise test design solution.

  • Clearly explain the benefits and limitations of combinatorial test design to potential customers using language and concepts relevant to their context by drawing upon your own “been there, done that” experiences of having successfully introduced combinatorial test design methods in multiple similar situations.

  • Identify and assess business and technical requirements, and position Hexawise solutions accordingly.

Immediately Upon a New Sale Closing

  • Assess a new client’s existing testing-related processes, tools, and methods (as well as their organizational structure) in order to provide the client with customized, actionable recommendations about how they can best incorporate Hexawise.

  • Collaborate with client stakeholders to proactively identify potential barriers to successful adoption and put plans in place to mitigate / overcome such barriers.

  • Provide remote, instructor-led training sessions via webinars.

  • Provide multi-day onsite instructor-led training sessions that: cover basic software test design concepts (such as Equivalence Class Partitioning, the definition of Pairwise-Testing coverage, etc.) as well as how to use the specific features of Hexawise.

  • Include industry-specific and customer-specific customized training modules and hands-on test design exercises to help make the sessions relevant to the testers and BA’s who attend the training sessions.

  • Collaborate with new users and help them iterate, improve, and finalize their first few sets of Hexawise-generated software tests.

  • Set rollout and adoption success criteria with clients and put plans in place to help them achieve their goals.

Months After a New Sale Closing

  • Continue to engage with customers onsite and virtually to understand their needs, answer their test design questions, and help them achieve large benefits from test optimization.

  • Monitor usage statistics of Hexawise clients and proactively reach out to clients, as appropriate, to provide proactive assistance at the first sign that they might be facing any potential adoption/rollout challenges.

  • Collaborate with stakeholders and end users at our clients to identify opportunities to improve the features and capabilities of Hexawise and then collaborate with our development team to share that feedback and implement improvements.

Required Skills and Experience

We are looking for a highly experienced combinatorial test design expert with outstanding analytical and communication skills to provide these high-touch on-boarding services and partner with our sales team in engaging prospective clients.

Education and Experience

  • Bachelor’s or technical university degree.

  • Deep experience successfully introducing combinatorial test design methods on multiple different kinds of projects to several different groups of testers.

  • Experience setting rollout and adoption success criteria with multiple teams and putting plans in place to achieve them.

  • Minimum 5 years in software testing, preferably at an IT consulting firm or large financial services firm.

Knowledge and Skills

  • Ability to present and demonstrate capabilities of the Hexawise tool, and the additional services we provide to our clients beyond our tool.
  • Exhibit excellent communication and presentation skills, including questioning techniques.
  • Demonstrate passion regarding consulting with customers.
  • Understand how IT and enterprise software is used to address the business and technical needs of customers.
  • Demonstrate hands-on level skills with relevant and/or related software technology domains.
  • Communicate the value of products and solutions in terms of financial return and impact on customer business goals.
  • Possess a solid level of industry acumen; keeping current with software testing trends and able to converse with customers at a detailed level on pertinent issues and challenges.
  • Represent Hexawise knowledgeably, based on a solid understanding of Hexawise’s business direction, portfolio, and capabilities.
  • Understand the competitive landscape for Hexawise and position Hexawise effectively.
  • Include with your application a cover letter that describes who you are, what you've done, and why you want to join Hexawise.
  • Ability to work and learn independently and as part of a team
  • Desire to work in a fast-paced, challenging start-up environment

Why join Hexawise?

Salary + bonus; medical and dental; 401(k) plans; free parking and a very slick Chapel Hill office! The opportunity to work with a fast-growing, innovative technology company that is changing the way software is tested.

Key Benefits:

Salary: Negotiable, but minimum of $100,000 + commissions based upon client license renewals
Benefits: Health and dental included, 401(k) plan
Travel: Average of no more than 2-3 days onsite per week
Location: Chapel Hill, NC*

*Working from our offices would be highly preferable. We might consider remote working arrangements for an exceptional candidate based in the US.

Apply for the VP of Customer Success position at Hexawise.

By: John Hunter on May 12, 2016

Categories: Hexawise, Career, Software Testing, Lean, Customer Success, Agile