The Testing Show: Assessing Quality Practices

January 18, 2023
/
Transcript

Agile Testing, and some would say Modern Testing, is built around understanding the processes and quality practices necessary to deliver a quality product to customers. How do we know if the practices our company or organization is using actually deliver what we hope they will?

To help answer that, Selena Delesie and Janet Gregory join Matthew Heusser and Michael Larsen to talk about their new book “Assessing Agile Practices With the Quality Practices Assessment Model (QPAM)” and to discuss how to determine whether the practices an organization uses are effective and will ultimately help deliver quality products.

Transcript:

Hello, and welcome to The Testing Show.

Episode 131.

Assessing Quality Practices

This show was recorded on Thursday, December 29, 2022.

Agile Testing, and some would say Modern Testing, is built around understanding the processes and quality practices necessary to deliver a quality product to customers. How do we know if the practices our company or organization is using actually deliver what we hope they will?

To help answer that, Selena Delesie and Janet Gregory join Matthew Heusser and Michael Larsen to talk about their new book “Assessing Agile Practices With the Quality Practices Assessment Model (QPAM)” and to discuss how to determine whether the practices an organization uses are effective and will ultimately help deliver quality products.

And with that, on with the show.

Matthew Heusser (00:00):
Well, welcome back everybody. Thanks for coming to the show. This week, we’ve got a couple of old friends, Janet Gregory and Selena Delesie, and I’ve known them… is forever a time period? I don’t know. Selena went to the University of Waterloo, which I believe is actually mentioned in Peopleware for its portfolio programs, with a degree in mathematics specializing in combinatorics, and then went to Research In Motion when they made the BlackBerry and was their primary test manager and quality executive as that company scaled from, I don’t know how many, a hundred people to 5,000. Something like that. Is that even close to right, Selena?

Selena Delesie (00:43):
Well, I’d say a few hundred to like 15,000 by the time I left. Yeah.

Matthew Heusser (00:48):
So tell us a little bit about that. What else am I missing from that story?

Selena Delesie (00:52):
I would just say I was one of many test managers there, but I was one of the primary leads for the handheld testing department. Once it got growing larger, it was pretty incredible to see the growth of the organization from maybe 30 <laugh> ish testers when I started to thousands. Yeah, it was a pretty incredible journey. I chose to leave before they… before they started to crumble, but it was definitely an opportunity that really shaped where my career went from there. It brought me into connection with Janet Gregory down the road and with the both of you and many others in the industry.

Matthew Heusser (01:36):
So if you haven’t picked up on it, Selena’s from Canada. Has a little bit of the accent. Janet… Selena from Toronto ish. Janet in Vancouver ish?

Janet Gregory (01:49):
<laugh> Vancouver ish is about 12 hours away from me.

Matthew Heusser (01:53):
Oh my goodness.

Janet Gregory (01:54):
Yeah. So how about Calgary ish?

Matthew Heusser (01:57):
Sure. And Janet being a longtime programmer that got involved in Agile testing very early on, has co-authored three books now on Agile testing?

Janet Gregory (02:08):
Three books on Agile testing and then the newest book that I just co-authored with Selena.

Matthew Heusser (02:14):
Yeah, let’s talk. So did I miss anything there? I mean you’ve done so much. It’s hard to sort of cover it in 30 seconds.

Janet Gregory (02:20):
Yeah, no, you covered it well enough. I think most people know who I am and it’s easy to figure it out if you Google me.

Matthew Heusser (02:27):
So how did you get the idea, and I’ll start with Janet, I guess. The book is “Assessing Agile Quality Practices with QPAM”. We’d love to hear about how that idea moved into a book idea and how you started working with Selena on it.

Janet Gregory (02:44):
Well, I’ve been doing quality assessments for companies for years and years and years. Most of the time it was very informal. I’d go into a company, I would observe, I would talk to people, and I had no structure as such. I was just looking as we went and asking questions. In 2020, I was asked to do a quality assessment for a company and I said, “Sure”. And I realized that I couldn’t do it the same way because it was going to be remote. I couldn’t just go into the company and observe and ask questions and sit with people. I had to have more structure. So I asked around, and Lisa Crispin, my other co-author, had been using Alan Page’s quality culture assessment guide, or something like that.

Selena Delesie (03:40):
Yeah, quality culture transition guide.

Janet Gregory (03:42):
That’s it. That’s it. Thank you, Selena. But it wasn’t quite enough for what I needed, so I started to adapt it, but I realized I needed help, and I knew Selena had done so much work with it, so I asked her if she could help me fill out this structure. So she very nicely helped me out. Anyhow, it worked really, really well and I shared it with Lisa Crispin and she did one using it and we said, “Okay, how can we share this?” So I was chatting with Selena again and we kind of juggled several things and came up with the idea of writing a book on it. That’s where it started and kind of came from.

Michael Larsen (04:22):
So if I can jump on this, cuz of course my role in this podcast is often to set up the softball or obvious questions. The easiest question to ask is if we’re looking at agile quality, you’re talking about the whole idea of when you’re working with a remote team, assessing quality is not as straightforward and I would argue and question, “Is assessing quality ever straightforward?” If I was somebody who was a manager in a group and said, I am looking to assess the quality of our practices or our organizations, where would you even start with that conversation?

Janet Gregory (05:00):
It’s not assessing quality, it’s assessing quality practices,

Michael Larsen (05:04):
Right!

Janet Gregory (05:04):
And it’s not necessarily Agile practices or anything. We really went down and thought about what kinds of things do we want to talk about? What are the most important things? So for example, we have categories; the highest categorization, I guess the one that we thought was the most important when we’re thinking about quality practices, is the feedback loops. How does a team work? How do they talk to each other? How is information passed from one to another? How fast are the feedback loops? Those sorts of things. So we have 10 different categories, or aspects as we call them, that we kind of look at and encourage people to think about.

Selena Delesie (05:47):
And looking at these practices, this is something that you can do with a team in person, in real life, or with remote teams. It’s not focused specifically on one or the other. It’s really just looking holistically at the practices that a team is moving through as they’re learning about and understanding and designing and creating and developing and delivering their products, and putting it together in a frame to understand how a team might move through and grow in those practices over time, and helping them to understand where they might wanna move to next.

Matthew Heusser (06:24):
Here’s a silly question, what does QPAM stand for?

Selena Delesie (06:28):
That’s a good question. <laugh> QPAM is Quality Practices Assessment Model. So we wanted to give our model its own unique name. QPAM <laugh>. It’s just the easiest and most direct translation for us, I think. Janet?

Janet Gregory (06:44):
<laugh>. Yes, I agree. <laugh>, it’s hard to put it into something that isn’t… that’s easy to remember as well. So the model actually has 10 quality aspects and then four… we call ’em dimensions, that you could move through or not. What we try to encourage people to think is that it’s not a maturity model. That’s the first thing people look for, but people might be perfectly content to be where they are. It’s just nice to know where you are in your practices. So the four dimensions that we talk about are Beginning, and that’s where people start, whatever that is. And then Unifying, which is we’re adopting some of the Agile practices and we’re learning how to do them and we’re trying to do things. Practicing, which is we’re pretty good at it and we understand what we’re supposed to be doing, we understand why we’re doing it. And then Innovating, which is pushing the boundaries, going beyond, sort of thing.

Matthew Heusser (07:44):
Can you give us an idea? What other practices did you expect the team to do with regards to quality?

Selena Delesie (07:50):
Like Janet was describing, we have 10 different quality aspects and then within each of those aspects there are different defined practices that you might consider. Taking the example of feedback loops, again, some of the practices might be how do we have feedback loops within our team? And then feedback loops between the team and management, feedback loops between the stakeholders or customers and the team. And then each of the other aspects that include culture, learning and improvements, the development approach, quality and test ownership, testing breadth, code quality and technical debt, test automation and tools, deployment pipeline and defect management. All of those have their own defined practices within them that we go into in detail in the book.
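
To make that structure concrete, here is a minimal sketch in Python of the ten aspects and four dimensions as they come up in this conversation; the type names and the example placements are illustrative only, not taken from the book.

```python
from dataclasses import dataclass, field
from enum import Enum


class Dimension(Enum):
    """The four QPAM dimensions Janet describes; explicitly not a maturity ladder."""
    BEGINNING = "Beginning"
    UNIFYING = "Unifying"
    PRACTICING = "Practicing"
    INNOVATING = "Innovating"


# The ten quality aspects Selena lists, in the order discussed.
ASPECTS = [
    "Feedback Loops",
    "Culture",
    "Learning and Improvements",
    "Development Approach",
    "Quality and Test Ownership",
    "Testing Breadth",
    "Code Quality and Technical Debt",
    "Test Automation and Tools",
    "Deployment Pipeline",
    "Defect Management",
]


@dataclass
class Assessment:
    """One team's snapshot: each aspect can sit in a different dimension."""
    team: str
    placements: dict[str, Dimension] = field(default_factory=dict)


# A team might be Innovating in one aspect and Beginning in another.
example = Assessment("Team A", {
    "Defect Management": Dimension.INNOVATING,
    "Deployment Pipeline": Dimension.BEGINNING,
})
```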

Matthew Heusser (08:43):
It feels like these are a little more high-level and abstract (and tell me if I’m wrong) than the old Capability Maturity Model Integration, which used to ask: do you do code review, do you actually have written requirements, are those requirements reviewed and approved, do you have a written project plan? I guess those are mostly artifacts, but they’re also practices. We’ve evolved past a lot of the old CMMI stuff, don’t get me wrong, but it seems like you’re more holistic or higher level than some of those things.

Janet Gregory (09:19):
Holistic is a good word. We don’t look at specific practices like, do you do a code review? We encourage people to think about how they are making sure their code is good. So a question that I might ask a team is, “How do you do your code reviews?” or “How do you do something?” It’s not “Do you do a code review?” but “How do you get your code reviewed?”, or something along that line. I’d have to go back and look. We have a whole series of open-ended questions that are meant for a facilitator, to let the team respond about how they do things versus a yes-or-no “do you do this?”. If that makes a little bit of sense.

Michael Larsen (10:00):
Yeah, I think it does. I’ve had experiences with companies over the years, especially companies that have either gone through an Agile transformation or have been on the… the long tail end of a tragile… Tragile <laugh> Freudian slip, there? <laugh>, tragic Agile transformation, <laugh> what I mean… I have actually been in organizations that have gone through Agile transformations or have been long-running Agile companies. Ostensibly we did these things for quality purposes to improve quality and yet occasionally I feel like we’re just holding onto the shells of practices that we agreed to. I mean sure we do standups or we do retros in some cases and I’ve been in organizations where those exist but they’re not very good and they don’t really seem to get very good quality information and I’m sure that I’m not the only one who’s experienced that. Let’s say that somebody’s… I wanna be able to utilize this. What would QPAM do or how would that help us to look at these and say, “Are we actually doing what we think we’re doing with this?”

Selena Delesie (11:12):
Let’s use the example that you just mentioned, like retrospectives, just going through the motions. Yes, we’re doing a retrospective, but maybe it’s not very good. And that’s actually one of the practices that we have in the learning and improvements aspect, where over the four dimensions we look at how might they improve over time. If we’re just doing them irregularly, maybe we’re in the Beginning dimension. If we have retrospectives that are rather formulaic, maybe you’re just constantly doing start/stop and maybe improve, or “I like this/I don’t like this” <laugh>, very basic, very formulaic kinds of retrospectives. That would be something that we would put under the Unifying dimension. You know, the team isn’t doing a great job of completing their improvements, or maybe they’re not even understanding their improvements very well or prioritizing them very well. And then moving through from there, Practicing retrospectives for a team might be that they focus retrospectives on different topics over time.

(12:20):
They schedule improvements into their iteration, they actually show up to get those things done and they plan for that as part of their planning practices. And maybe in the innovating dimension, the team is doing a lot of very frequent and very varied retrospectives. So they’re bringing in a lot of different types of perspectives, topics, approaches, so that they’re really engaged and improving as a team. It doesn’t feel like we’re just going through the motions. Everyone’s like in on this and excited about this, desiring to make these improvements. And that really shows in the style of retrospective that they’re doing. So using our model that we’ve developed kind of helps you to see how practices that the team does might improve over time as they progress and deepen in their practices. It’s not about yes or no, are we doing this, but are we growing in the depth and breadth of how we approach our thinking and our doing rather than just checking off, yes we do this activity but <laugh> maybe it’s not very good <laugh>.
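
As a rough illustration, paraphrasing Selena's description rather than quoting the book, the retrospectives practice could be written down as a mapping from dimension to observable behaviour that a facilitator weighs their observations against.

```python
# Illustrative only: paraphrased indicators for the "retrospectives" practice
# within the Learning and Improvements aspect, one entry per QPAM dimension.
RETROSPECTIVE_INDICATORS = {
    "Beginning": "Retrospectives happen irregularly, if at all.",
    "Unifying": "Formulaic sessions (start/stop, like/dislike); improvements "
                "are rarely prioritized or completed.",
    "Practicing": "Retrospectives vary by topic; improvements are scheduled "
                  "into the iteration and actually get done.",
    "Innovating": "Frequent, varied retrospectives with broad engagement; "
                  "the team drives its own improvement.",
}

# Placement is a facilitator's judgment call against these indicators,
# not a mechanical lookup.
observed = "Every retro is the same start/stop format and actions rarely land."
placement = "Unifying"  # matches the formulaic, low-follow-through indicator
```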

Matthew Heusser (13:37):
Speaking of which, and I think that’s great. You get to a qualitative difference between a good retro and a “meh” retro. I think that’s very important. Do you ever, as you do your work, find that there’s a difference in the answers you get on levels? Like there’s 10 teams. Senior management says, “We always do retros, of course we do retros.” You find one team that did a retrospective once and other people don’t know what it is. I was sitting in a cubicle and then there was a call once with the manager who told me his team always did unit testing an hour before and he was kind of condescending to me and telling me how to do things and he had a call with his team to show off and one of his team members on the call said, “What’s a unit test?” So do you ever find that difference between what is spoken and how it’s practiced and what do you do about it if you do?

Janet Gregory (14:26):
Oh absolutely. There’s a difference. In fact, what Selena and I are currently doing is writing a follow-up book on how to facilitate an assessment. I think it’s really important to ask the questions but also to observe what is actually happening because you will see differences. For example, I did an assessment for five different teams within a company. All five of them are in different places. Absolutely. So you have to treat them differently and look at it. But even between people, I like to start with a process retrospective, tell me what you are doing right now. And that gives a lot of insight into how people perceive things. One person might say, “You know what? We get our user stories this way” and somebody else will say, “Yeah, but before that this happens.” And when you get into what actually happens, you’ll ask questions. When I facilitate, I try to make it visible, open, so everybody sees what is happening and what I’m thinking because I will ask questions.

(15:37):
I then might choose to do some interviews with specific people to say, “You said this, can you tell me a little bit more about that?” And when I write up my report at the end to whoever has asked for these assessments, I will put observations, and one of the observations might say, “There’s a discrepancy in what people say and what they’re doing.” And so that puts them not in the Practicing for this particular quality aspect, but it would put them back in the Unifying, because they’re still trying to figure it out. You can’t just say yes or no. You have to go in-depth a little bit more and look at what is actually happening, because it’s often not what they say.

Selena Delesie (16:24):
In this book, each of the chapters for each of the quality aspects, and the practices specifically, includes a set of questions. So people can look at these questions and ask them to kind of get into understanding in the way that Janet’s describing. It’s not a yes or no; they’re open-ended questions to kind of understand what’s really going on. So that’s a gift for people, to take a look at these practices and aspects and learn to ask questions that are really powerful, to really understand what’s truly going on.

Matthew Heusser (17:01):
Did I understand you correctly that you might be innovating in defect management, which could be not just old-school bug reporting, but a very well-defined process. Say you’re an existing software company and you flow these things through and you write them up so well that customer service can use them when they get phone calls. “Oh yes, that’s a known issue, da da da da.” But you might be in the beginning of your deployment pipeline because you’re a traditional piece of software that’s been around for a very long time and you have old practices in place that are hard to change. So you could be in different dimensions for different practices or did I get that wrong?

Selena Delesie (17:39):
Yes, you can be in different dimensions for different practices. The descriptions you gave are a bit different than how we’ve defined them. But yeah, you could be all over the place. Absolutely. Depending on what you’ve chosen to focus on improving in your organization.

Janet Gregory (17:55):
Let’s take that. So defect management is our number 10 quality aspect. It’s the lowest. We don’t put a lot of importance on how you write up defects, for example. In Beginning, you write up every defect, and you’re thinking about the metrics and how many there are. As you go into Unifying, defects might not be fixed right away, or those reported by, say, the customer are entered into the defect tracking system and you think about how you wanna fix ’em, or maybe when you wanna fix them. Into Practicing, we’re hoping that people are concentrating more on defect prevention. We’re spending more time on understanding the stories and good coding practices so that there aren’t as many defects. The goal is to think about how we get them fixed immediately if they are there, and perhaps any defects that are found by the customer or found sometime in the cycle are fixed immediately, maybe in the next cycle, or the next iteration if you’re doing iterations. And Innovating is, we’re kinda looking at it as: we don’t have a defect backlog. We might not even need any kind of defect tracking system, because as soon as a bug is reported, we have a mechanism for dealing with it and we fix it immediately. So that’s kind of how we looked at defects, versus how well do you write a bug report.

Matthew Heusser (19:30):
Yeah, I can see that as a style and value system that works for many companies.

Janet Gregory (19:35):
Yeah, so it’s a little bit different than your typical, “Is my bug reporting getting better and better and better?” We wanna think about how can you improve that whole process before you even report a bug.

Michael Larsen (19:48):
So of course there’s a book that’s come from this, but I guess the simple question that I would want to ask: was there a moment or a series of moments, either on your end or on Selena’s end, that prompted you to say, “Hey, you know what? This might be a good opportunity for us to do a book about this.” Or was this something that was a long-time consideration? Or did a particular need make you go, “Huh, you know what, we can do something about this.”

Selena Delesie (20:17):
You know, <laugh>, so when Janet talked about how this came together earlier, that checkpoint I think was, I think it was early 2021. Yeah, Janet? It’s like April or so. And we’re talking about, like, what are we gonna do with this model that we’ve developed now? Like, we should do something with it. And we started looking at designing some workshops and maybe doing a class for it. We can teach people how to use this and certify people. It just got kind of complicated, and we’re like, “Well, how are we gonna actually do something with this? What if we just write a book and we just give it out for free?” I mean, people buy the book, right? But otherwise they can just take the book and use it. So that just seemed like the easiest approach for us to get it out into the world, because we saw the value in it. Lisa Crispin and some other people that we involved to do some reviews of the book for us

(21:12):
were getting a lot of value out of the model even before we started writing the book. So yeah, we just wanted to share it out in the world to help others to really be able to do a deeper dive into how are you really working. And it could benefit a lot of our peers who are also consultants and coaches in their work. Now, Janet did say we’re working on a follow-up book on how to facilitate assessments using our model. It could be helpful for facilitating assessments in general, and there are other things that we have in mind to come later, following these two.

Michael Larsen (21:51):
Fantastic, thank you.

Matthew Heusser (21:54):
So I’m assuming when you say the basics, that would be like Scrum standups every day, sprints, completed work that is predictably completed, handoffs and work process, retrospectives and planning.

Selena Delesie (22:12):
Yeah, just getting those things working at a regular rhythm for some organizations or for some parts of organizations can be difficult. I think we’ve all worked enough in the industry to know that there are various things that influence the ability of teams to make those transitions. Culture is a big part of that. What is talked about as being valued versus what is demonstrated as being valued can often be two different things. I’ve been consulting and coaching and training for 12, 13 years now. Those are common things that I see where there’s a clash between the vocalized values and desires versus the hidden values and desires. Especially with really large organizations, those can take longer to turn around. Hence the struggle to adapt those things.

Janet Gregory (23:06):
And sometimes it’s a matter of making it visible.

Selena Delesie (23:10):
Absolutely. And sometimes that visibility, if the value is on being busy, say there are middle to senior management layers who are used to directing work, who are not wanting to let go of that <laugh>, that can really get in the way. And even though it’s visible, it doesn’t mean it’s going to change very quickly necessarily. So there’s a lot of hidden coaching that needs… hidden to some people, let’s say. Coaching that needs to happen to help to change the mindsets of people who may be unduly influencing away from the transformation that’s been asked for. We kind of touched on that stuff in the Culture quality aspect in our book. That’s a whole other area of conversation that we could talk about maybe separately.

Janet Gregory (24:02):
Yes. Because without that culture in place, it’s really hard to make any kind of progress.

Selena Delesie (24:08):
Yeah. And that’s why we ordered our quality aspects the way we did, looking at feedback loops first and then culture, then learning and improvement. Those are the top three, and they’re ordered that way very intentionally, because if you can get those things working really well, then the things that teams or divisions of companies or whole companies are struggling with, those top three help smooth those struggles out and help them to really get moving a lot better. Getting the communication open and transparent and clear between members of the team, between the team and customers and stakeholders and management. The value and respect that we have for the work and for the people, like all of that stuff is really critical to having all of the rest go smoothly.

Matthew Heusser (24:56):
So let me ask you a question. Michael and I have been working on this theory. We’re starting to talk about it more and more in public: that in the bad old days, someone would say, here’s the software. A tester would get the software and have to test it with no context at all. If they’re lucky, there’s a poorly written specification that’s six months out of date, slid under the door with no communication. The goal was to minimize the need for communication between roles, so we could do analysis in the United Kingdom and development in the United States and testing in Mexico and operations in France, and we could then optimize by role, by price. So we could prevent communication. In grad school, I wrote a paper that was called “On the Optimization of Productive Resources by Region Through Communication Elimination” or something like that. And the body of the paper was shorter than the title.

(25:51):
The body of the paper was “You are screwed because it doesn’t work.” So the theory is, a huge part of what the Agile zeitgeist was, was just: what if we just talked to each other? What if, instead of preventing communication because it’s expensive, we figured out ways to make communication cheap? Today we have Zoom and we have all these other things, retrospectives and all these other tools, to use to actually talk to each other. So instead of, as a testing expert, mathematically figuring out the correct coverage analysis versus feature analysis versus risk versus reward to optimize our time, a lot of the agile testing stuff is, “Let’s just talk about where it might break and make a list and make sure we check on it. In fact, let’s make sure we figure that out while we’re developing it. So when it comes through for any final inspection, we’ve covered at least the high-level bases.” Is that a fair assessment, and how do you think your book contributes to that little world?

Janet Gregory (26:56):
I think you’re correct, because we think the feedback loops are the most important thing. And the feedback loops are really about the conversation, about talking about risk really early, when we can do something about it, when we can mitigate it where it makes sense. Testing after the code is written really doesn’t help. Our model is how can teams look, whether it’s having an outside facilitator or looking at it themselves, asking themselves the questions. Are we doing this at the right time? And trying to get, how do we get better at it? How do we set our, Selena mentioned mindset before, how do we learn and improve? Does our development approach support this? Those are what we want to think about, to trigger the questions to say, “How can we do this better?” Yeah, the feedback loops, that communication, that’s probably the most important thing.

Matthew Heusser (27:58):
You said testing after the code is written doesn’t really help. All these people with automation suites and Playwright and Selenium and Puppeteer and all this stuff, they’re wasting their time.

Janet Gregory (28:09):
That’s not quite what I meant. But if you “only”, and I’m putting only in quotes, test after, you cannot test quality in. A lot of those things that you just mentioned, the automation, is: you’ve talked about it beforehand. You know what you’re going to do. You know what you need to automate, because everybody has that shared understanding. The automation, to use probably one of your words, is a check: does it still do what it did yesterday? That becomes your regression suite. But the preventative stuff has already been done. We’ve had those conversations. We’ve talked about the risk, and the automation is just a way to make sure that we haven’t broken something.
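
For readers who want that distinction concrete, a regression check of the kind Janet describes might look like the sketch below. It assumes the pytest-playwright plugin, which supplies the page fixture; the URL, page, and element names are hypothetical placeholders, not from any real project.

```python
# A minimal "does it still do what it did yesterday?" check. The behaviour
# was agreed on before the code was written; this only guards against regressions.
from playwright.sync_api import Page, expect


def test_checkout_page_still_behaves_as_agreed(page: Page):
    page.goto("https://example.test/checkout")  # hypothetical URL
    expect(page.get_by_role("heading", name="Checkout")).to_be_visible()
    expect(page.get_by_role("button", name="Place order")).to_be_enabled()
```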

Matthew Heusser (28:49):
Well, thanks. And now it’s the part of the show where we say, what questions did I forget to ask? What are your final thoughts and where can people go to learn more about you? Where can they get the book?

Janet Gregory (28:58):
The book is available on LeanPub right now, so it’s only available as an ebook. Our second book we’re hoping to have out in the first half of next year; we don’t have a set date. You can look us up; Selena and I are pretty easy to find. My website is janetgregory.ca, CA for Canada, but you can also find me under agiletester.ca and a bunch of other places. Selena?

Selena Delesie (29:29):
Best place to find me is LinkedIn, honestly, right now, cuz I have a website overhaul to do, so find me there. That’s the best spot for the time being. And when the website is ready to go, it’ll be linked from there.

Michael Larsen (29:44):
All right, well hey, I wanted to say thank you to both Janet and Selena for joining us. We’re excited to see how the book is received, and we’re excited to see the follow-up, and we would absolutely love to have you back when that second book is done, cuz I think there’s plenty more to this conversation we can have. In the meantime, thanks, everybody, for joining us, and we will see you in a couple of weeks on the next episode of The Testing Show. Take care, everybody.

Matthew Heusser (30:10):
Thanks, Michael.

Janet Gregory (30:11):
Thanks, Michael.

Selena Delesie (30:11):
Thanks to you both. Thanks for having us.

Michael Larsen (OUTRO):
That concludes this episode of The Testing Show. We also want to encourage you, our listeners, to give us a rating and a review on Apple Podcasts or Google Podcasts, and we are also available on Spotify. Those ratings and reviews, as well as word of mouth and sharing, help raise the visibility of the show and let more people find us. Also, we want to invite you to come join us on The Testing Show Slack channel as a way to communicate about the show, talk to us about what you like and what you’d like to hear, and also to help us shape future shows. Please email us at thetestingshow (at) qualitestgroup (dot) com and we will send you an invite to join the group. The Testing Show is produced and edited by Michael Larsen, moderated by Matt Heusser, with frequent contributions from our many featured guests who bring the topics and expertise to make the show happen. Additionally, if you have questions you’d like to see addressed on The Testing Show, or if you would like to be a guest on the podcast, please email us at thetestingshow (at) qualitestgroup (dot) com.
