The Testing Show: Enhancing the Reality within Virtual Reality with Q Analysts

April 27, 2023 / Transcript

Michael Larsen (INTRO):

Hello, and welcome to The Testing Show.

Episode 135

Enhancing the Reality within Virtual Reality with Q Analysts

This show was recorded on Tuesday, March 28, 2023.

In this episode, the metaverse and VR are becoming more and more present and active, so it makes sense that there are companies working to make those interactions as usable and seamless as possible. To that end, Ross Fernandes, Vikul Gupta, and Prem Vishwanathan join Matthew Heusser and Michael Larsen to talk about the ways the virtual and the real are coming together and why it’s an exciting place for a tester to potentially be.

And with that, on with the show.

Matthew Heusser (00:00):
Thanks Michael. And this episode is a little bit different in that it’s actually a breaking news episode for the podcast and for Qualitest. By the time you’re hearing this, Qualitest will have completed its acquisition of Q Analysts, based in California. Now, we’ve been talking about AI and testing and ML and testing in probably, what, half of the episodes over the past couple of months. Q Analysts is a company that specializes in it. They’ve just joined with Qualitest to be part of Qualitest. We’re gonna start with the CEO of Q Analysts, Ross Fernandes, who actually founded the company back in 2003. Ross, what is Q Analysts? How do they specialize? What does specializing in AI testing even mean?

Ross Fernandes (00:51):
Sure, thanks for the question. With 20 years of history, we have evolved along with technology. From when we started, fast forward to where our presence is today: it’s with all the big technology players who are, as you can see, rapidly moving into this space and making massive investments. Our presence in the AI world really started in maybe the 2016 timeframe, when AR/VR was probably the buzzword. At that point, Meta had just acquired Oculus for about $2 billion, and they were like, “Oh, we’re going to create this virtual world.” So we got into that and started testing it. Then those AR/VR headsets and the experiences that go with them started to leverage AI, and that was a move into AI: now you can do things like type on a virtual keyboard and it’s picking up your typing movements. It can recognize your eye gestures, your face gestures. We are trying to put real life into avatars, for instance. So all of that started to push AI into the AR/VR world. From that, other devices evolved, you know, the home assistant devices, and now intelligence is going into all sorts of devices, and we became specialists in that area of AI-enabled devices; we’re calling it AI-infused devices today. So that was the testing part of it. Then, as that continued to evolve, we started getting requests for data sets. And the connection to quality is, you know, you can test the device and make sure that the experience of the device is real and solid, but when you put AI into a device, there’s an algorithm that actually runs that AI, and for that algorithm to work well, especially when it’s recognizing, let’s say, humans, or any part of the human, or a gesture from the human, or speech, or the environment itself, you need to capture data sets, get the algorithm ready, and then build the algorithm. There was an increasing need that started to come up, and when the first project came in, we were like, what is this even about? Then, as we understood it, we started to evolve into that space. Today we are market leaders in the world of data collection when it comes to humans, environments, and scenarios for AI. Any AI-enabled tech that has to interact with the real world will need that. And the facility we are sitting in is one of our facilities where we can build these real-world but simulated environments to collect data.

Matthew Heusser (03:23):
If I’m hearing you correctly, you’re saying whenever someone interacts with one of these new devices as simple as a mobile device or as complex as Oculus, you want to actually capture their hand movements, their eye movements, their gestures. You wanna do a whole bunch of them. You want to gather some data on it to figure out, say, things the users never click or are harder to reach or things that are intuitive or patterns that you can reuse.

Ross Fernandes (03:50):
Well, it’s more that if you want to build the capability of the device to recognize those movements and gestures, you need to collect data prior to that.

Matthew Heusser (04:01):
Oh, brilliant.

Ross Fernandes (04:02):
Yes. So for example, let’s say I no longer want to use gaming controllers in my hands, but instead I’m going to use different movements of my hands to trigger certain things on screen. How is a computer just gonna recognize that, right? It can record it through a camera, but to put intelligence into that, to recognize each gesture and move something, you need to capture hand gestures, and think of this across every possible population that could actually use the device. So the requests that typically come to us on the data collection side are not, “Hey, just go collect a bunch of random people.” It would be everything from skin tone to race to gender, to having somebody wear a watch, not wear a watch, wear rings, wear multiple rings. So a whole bunch of different combinations would need to be put into a data set to make the algorithm work effectively. Then, when someone is actually doing those gestures, the headset or the device can read them. Facial recognition is a very good example; all of us use that to open our phones. There was a bunch of data that was collected to make that happen across different demographics and a whole bunch of diverse attributes.
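
To make the combinatorics Ross describes concrete, here is a minimal sketch of the kind of coverage matrix a collection plan might enumerate. The attribute names and values are hypothetical illustrations, not Q Analysts’ actual schema.

```python
# A minimal sketch of a demographic/attribute coverage matrix for a
# gesture data-collection plan. Attribute names and values are
# hypothetical illustrations, not an actual collection schema.
from itertools import product

attributes = {
    "skin_tone": ["type_I", "type_III", "type_V", "type_VI"],  # e.g., Fitzpatrick scale
    "hand_wear": ["bare", "watch", "one_ring", "multiple_rings"],
    "gesture": ["pinch", "swipe", "fist", "open_palm"],
}

# Full cross-product: every combination a collection session should cover.
plan = [dict(zip(attributes, combo)) for combo in product(*attributes.values())]

print(f"{len(plan)} cells in the collection plan")  # 4 * 4 * 4 = 64
print(plan[0])  # {'skin_tone': 'type_I', 'hand_wear': 'bare', 'gesture': 'pinch'}
```

Even three attributes with four values each already yield 64 cells, which is one reason a large, diverse participant database matters for recruitment.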

Matthew Heusser (05:12):
Wow. So if someone has a watch on, or if they have a lot of rings on, that might actually make the gesture not recognized, or there could be all kinds of unconscious bias in the system that we design. “Oh, just click over here with your pinky.”

Ross Fernandes (05:26):
Yes. I give people this example all the time because it’s the real world I live in every day. My kids are mixed race, and facial recognition technologies do not work with them. My wife, my son, my daughter, all of them can open each other’s phones, because the algorithms were never optimized for mixed-race people. So think of the level of diversity you would need to collect data on in order to make even a simple facial recognition algorithm that opens a device work. It fails for them on both the Windows platform and the iOS platform, as an example. So there’s a huge need for real-world data sets as this AI space continues to grow. In the end, computers are not that intelligent. Every aspect of what you’re trying to do needs to be captured and fed to an algorithm in order for it to work effectively.
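
As a hedged illustration of how a team might quantify the kind of bias Ross describes, one common check is to compute the false rejection rate of a face-unlock feature separately per demographic group; the group labels and trial results below are invented for the example.

```python
# Sketch: checking whether a face-unlock feature fails genuine users in
# some demographic groups more than others. Groups and trials are invented.
from collections import defaultdict

# Each trial: (group_label, did_the_genuine_owner_unlock_successfully)
trials = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

stats = defaultdict(lambda: [0, 0])  # group -> [failures, total]
for group, unlocked in trials:
    stats[group][0] += 0 if unlocked else 1
    stats[group][1] += 1

for group, (failures, total) in sorted(stats.items()):
    # False rejection rate: how often the rightful owner is locked out.
    print(f"{group}: FRR = {failures / total:.0%} ({failures}/{total})")
```

A large gap between groups is exactly the signal that the training data under-represented someone.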

Michael Larsen (06:16):
What this sounds like to me (and I’m going back a long time here, to when I used to do this myself) is something we used to call human factors testing. I don’t know if that term is still in vogue or if it’s something that’s actively used anymore. But the human factors testing we used to do was: we would set up a camera, in many cases a detached webcam, right on the screen of the computer we were working with. And we would watch people go through workflows, and we would effectively track their eye movements and their facial movements, more from a personality and psychological background, because there were tells there that would say, “This is a usability problem.” Somebody might not say they’re having a usability problem. In other words, “Oh yeah, this is simple, I totally get this.” But we would go back and review the information and realize, “Oh no, these people were having a nightmare of a time using this app.”

(07:14):
They just didn’t want to admit it, but the visible tells were there. Now, I’m looking at this studio space, and maybe you might wanna describe a little bit about how this is set up. We have the cameras on, and for those listening to the audio, we may actually use this and put it up on YouTube so you can see what we’re looking at right now and how this works. This looks like a 3D mocap setup, minus the special suit and the golf balls being attached to you. I don’t see anybody here wearing mocap gear, so I’m assuming that’s not part of this equation.

Ross Fernandes (07:47):
No, the mocap’s happening in the next room.

Michael Larsen (07:50):
<laugh>. But again, my question is looking at the size and the scope of a room like this, are you still using any of those older human factors techniques as part of what you’re doing here?

Ross Fernandes (08:04):
Yeah, so good point that you brought up. I’m very familiar with that. I’m kind of dating myself here, but yes,

Michael Larsen (08:09):
<laugh>, welcome to the club.

Ross Fernandes (08:10):
Very, very familiar with human factors engineering. So think of what we do as reversing that: there, you captured the experience. We are creating the experience.

Michael Larsen (08:20):
Mm-hmm <affirmative>.

Ross Fernandes (08:21):
So we wanna say, “Hey, if you frown, what is the device going to do in terms of reacting to that?” Think of the metaverse: in a virtual metaverse, the whole focus right now from a lot of companies is to take your avatar, versus your real self, and make it as lifelike as possible. So now if you frown in a conversation, we want the avatar to frown, which it doesn’t do right now; it just looks like a cartoon character. To just make an avatar frown, you may need to capture thousands of individuals across the diverse population frowning, literally capture that, and build it into the algorithm. So now, if some unknown face frowns, the movements of the avatar will map to how that face moves.

(09:06):
And now you’ve built a capability for an avatar in the metaverse to actually frown, as a simple example. So this room here is surrounded by this OptiTrack camera system. This is a sixty-to-a-hundred-thousand-dollar piece of equipment, and there are multiple cameras all around this room. Now, this is commercially available, but how we would typically do data capture is: you come in with the device that wants to do the capture. Let’s say it’s a phone, and the phone has to do the facial recognition I was talking about earlier. So you would come in with the device, maybe a prototype for the most part, and you just have a bunch of people come in on a scheduled basis. That’s one of our value adds, because we’ve built this big, diverse database of potential participants in these studies across demographics.

(09:57):
And you just have them record different types of frowns per person. The device is capturing the frown, and say you have an experimental algorithm in there that’s trying to recognize the frown. All of these cameras are capturing from multiple angles. You compare the data from both sides and say, “Hey, here’s what the device recognized, here’s what actually happened from a 3D standpoint,” and see where it’s off. And then you tweak the algorithm to make it work more effectively. A lot of what we do is kind of like your human factors thing, but in reverse, right? We are fine-tuning an algorithm instead of just capturing behavior outright, because there wasn’t any baseline data before. And that’s how you make an algorithm better and better over time.
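
As a rough sketch of the comparison Ross describes, the device’s landmark estimates can be scored against the camera rig’s 3D ground truth frame by frame. The shapes and synthetic data here are stand-ins; a real pipeline would first align coordinate frames and timestamps between the device and the rig.

```python
# Sketch: scoring a device's estimated 3D landmarks against rig-captured
# ground truth for the same frames. Synthetic stand-in data throughout.
import numpy as np

rng = np.random.default_rng(0)
# (frames, landmarks, xyz): ground truth as reconstructed by the camera rig...
ground_truth = rng.normal(size=(100, 68, 3))
# ...and the device algorithm's estimate of the same landmarks, with error.
device_estimate = ground_truth + rng.normal(scale=0.02, size=ground_truth.shape)

# Euclidean error per landmark per frame.
error = np.linalg.norm(device_estimate - ground_truth, axis=-1)

print(f"mean landmark error: {error.mean():.4f}")
print(f"worst frame: {error.mean(axis=1).argmax()}, "
      f"worst landmark: {error.mean(axis=0).argmax()}")
```

The per-landmark averages point at which parts of the face (or hand) the algorithm handles worst, which is where the next round of tuning and data collection goes.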

Matthew Heusser (10:38):
Wow. Well I think I understand the value Q Analysts adds to the delivery universe. I think I understand that you’re checking these algorithms to make sure that they’re playing nice across all the demographics. Seems like there’s a big value add there. Why Qualitest? Seems like you’re doing just fine, got your own little niche.

Ross Fernandes (11:00):
Sure.

Matthew Heusser (11:00):
Why would you wanna partner with a company like Qualitest?

Ross Fernandes (11:02):
Well, thank you for that. So we are definitely, I feel, market leaders both when it comes to AR/VR testing, as we do it for all the major corporations in the world, and in this ground truth data collection space, with the investments we have made in facilities like this. We grew from a startup in a Silicon Valley guest bedroom to an international company with offices around the world. But that growth reached a certain limit where, in order to really scale up at a significant level, we decided strategically that we needed to find a partner, another firm to merge with, to take that to scale. And Qualitest was the perfect fit across a whole bunch of areas. Number one being culture: it was very important for us, for the continuity and retention of our people, that we find a company that had a similar value system to ours.

(11:56):
We saw that in Qualitest and in all the leadership folks we interacted with throughout the whole process. The next was focus, right? In the end, our focus is quality, and Qualitest is the world’s largest company in quality engineering, so we said, “Okay, that’s a perfect match as well,” rather than becoming part of another SI. And then there were a whole bunch of other things as we drilled down, like the fact that this was one of the missing pieces in Qualitest’s portfolio. Yes, they do some elements of AI, et cetera, but that’s further down the cycle; the front end of it, actually building the capability of machines to interact with the real world and with humans, they didn’t have, and we did. So it was a perfect match on multiple parameters that just made this a very smooth and efficient merger.

Matthew Heusser (12:40):
Thank you. With that, it’s probably time for us to pivot to Prem Vishwanathan, who’s the technology sector head at Qualitest. And I’ve gotta wonder: okay, we understand the partnership for Q Analysts. Tell me about the benefits to Qualitest, Prem.

Prem Vishwanathan (12:56):
Thank you for that, Matt. I think, as Ross mentioned, the entire focus around enhancing the experience for our customers was prime when we were looking at building capabilities. As you know, Qualitest today focuses on customers’ journeys from QA to QE to digital engineering. We became more relevant to our customers’ businesses in many ways. And as we look into the future, we are looking at the growth around AR, VR, MR, XR, what have you, and we felt we found the right partner in Q Analysts. The ability for us to enhance the reality within virtual reality and mixed reality came from this acquisition, and integrating it was more complementary than competitive with our focus in the market. I would say that was a primary reason we felt our customers would be excited. We were connecting much more to their businesses. The ability for us to enhance their experience with their end customers became even more viable, and we were a trusted advisor in that case. So this accelerates our journey there.

Michael Larsen (14:01):
So of course I get to play the devil’s advocate, or the “vision first” mind, and ask these questions. Of course we’re looking at this from the perspective of, “Hey, this is a great new partnership; Qualitest is getting involved in this and Q Analysts is getting involved in this.” But let me be the elephant in the room and ask: what do either of your customers get from this synergy, if you will? What is the main value add? Say I’m a customer looking at either one of you: why would this excite me, intrigue me, make me wanna say, “Hey, tell me more”?

Prem Vishwanathan (14:38):
Apart from the investment in leading-edge technologies, customers relate to us as Qualitest because we understand their business much better than many others in the industry. And the applicability of these technologies in a given industry becomes even more relevant through involving Qualitest. While technology can be an enabler, the applicability and the experience that are relevant to the industry become even stronger with Qualitest being a part of it. Just to allude to the “why”: because we have specialties in each of these industries, be it retail, banking, finance, travel and tourism, technology and high tech, the government sector, you name it, we have expertise in the industry. So how we actually make this relevant to those industries is a very, very critical aspect. People can talk about anything from a theory perspective; the practicality of making it much more usable is where we come into the picture.

Ross Fernandes (15:31):
And I’ll add to that. If you look at the portfolio of offerings we have for our key clients, we’re missing everything that Qualitest has: the automation, the tool sets, all of that, built through several of the acquisitions Qualitest has done. And Qualitest is missing our piece: the AR/VR testing capabilities, the AI data collection capabilities. Now we can offer our clients a whole soup-to-nuts portfolio of service offerings. We actually did our first client-facing presentation yesterday, and I think we pretty much wowed them: look at the capabilities we can bring to you across the board now as a combined entity. So I think that’s the benefit to the clients: we can really be a provider of pretty much anything in the world of QA, QE, and digital engineering that they might need, and become a one-stop shop.

Matthew Heusser (16:20):
Yeah. So what I’m hearing there is we can be your first phone call and we can get it done. Maybe it’s possible you want some tiny little sniggly thing in a corner or there are other companies that do this kind of work you could talk to. But our umbrella of services now is gonna be really wide and deep.

Ross Fernandes (16:37):
Yes. And then, as you look at where things are headed now, the focus is on AI, right? That’s the biggest buzz now: AI, generative AI, AGI, all of that stuff. Everybody’s made a pivot to that with all of the chatbots that just came out. And then the metaverse, the other big buzzword, let’s say from last year; that’s where technology is headed. We’re right at the forefront of that now, combined with Qualitest.

Matthew Heusser (16:59):
That’s pretty amazing. And finally, we have Vikul Gupta, who we’ve had on the show before. Welcome back, Vikul, who heads the next generation center of excellence work for Qualitest in North America. And I’m sure this qualifies; this just screams “next generation software testing work.” Maybe you could tell us a little bit more: what can customers expect coming out of this? What can we do today? What can we do in six months?

Vikul Gupta (17:28):
Just to recap from a technology perspective, right? I was involved in this acquisition from the get-go, and from the beginning this fit right into our overall strategy on what our focus areas would be. All the previous acquisitions we have done are in the next-gen space, whether it is ZenQ, whether it is QAIT; ZenQ was focused on drones and blockchain. Whenever we talked about Qualitest being the world’s largest AI-powered quality engineering company, what were we doing till that time, Matt? We talked about intelligent testing when I did the last podcast. So AI is core to whatever we do. Now, if you look at our offerings, we covered the real world. In the real world, we were testing the applications: mobile applications, desktop applications, web applications. We were testing the platforms, we were testing the networks. But with the Q Analysts acquisition, we are taking this story beyond the real world.

(18:23):
We are completing the real world by adding devices now. So we are testing devices, networks, and applications. The real world is done, but we are also the single largest testing vendor that also covers the virtual world. No other testing vendor can claim that. We do testing in both the real and the virtual world today. So if you look at Q Analysts’ capabilities, they can be grouped in two areas. One is AI-infused device assurance: whether it is headsets, whether it is watches, anything which has AI built in, focused primarily on XR for now, that is the kind of testing we do for those devices. On the other hand, like Ross said, we create the data for the AI algorithms. Now, last time we talked, I talked about how, from an AI standpoint, we had three offerings. We use AI to create models to solve business problems.

(19:21):
Then we test those AI models. And the third was we use AI to do intelligent testing. That was the focus of my last session with you guys. Now look at this marriage: how does it make us better together? What does Q Analysts do? They collect data, real data, to increase the efficacy of an algorithm, but then they stop there. Where do we start? We take that data to create an AI algorithm, test that AI algorithm, deploy that AI algorithm. So it helps us complete that whole lifecycle. So when Mike was asking, “What do customers get?” They get a single vendor to complete that whole lifecycle. The people who are creating the data are the people who are using the data, are the people who are testing the data, and deploying and governing it, too. That’s one aspect. The other part, which I think Ross alluded to, is our capabilities in the fields of performance testing, security testing, accessibility testing, DevOps, cloud assurance, data assurance, and automation assurance;

(20:26):
they’re pretty mature. In the world of device assurance, a lot of what Ross and team were doing relied on having intricate knowledge of these devices. Some of the devices we’ve been testing since inception; there is one group which knows those devices from infancy, and that’s Q Analysts. With this marriage, what we can bring in is how to do that task even better: how to bring in automation, how to bring in intelligence as they’re doing testing. I’ll give you an example. We were responding to a deal where we were doing data collection. Of course, we are the market leaders in data collection as Q Analysts. But now our proposal also says not only do we do the data collection, but we have an AI model which will automatically identify outliers, duplicates, null-value analysis, and bias, like I think, Mike, you and Ross were talking about. That combined capability together is where the value add for the customer is.
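
Here is a minimal sketch of the kind of automated data QA Vikul describes, flagging duplicates, missing values, and outliers in a collected dataset. The columns and thresholds are hypothetical, and a real pipeline would layer per-demographic bias metrics on top.

```python
# Sketch: automated QA checks over a collected dataset, flagging exact
# duplicates, missing values, and outliers. Columns are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "participant_id": [1, 2, 2, 3, 4],
    "skin_tone": ["III", "V", "V", None, "I"],
    "samples_collected": [120, 118, 118, 6000, 125],
})

duplicates = df[df.duplicated()]     # exact duplicate rows
missing = df[df.isna().any(axis=1)]  # rows with null values

# Crude outlier check on sample counts using the 1.5 * IQR rule.
col = df["samples_collected"]
q1, q3 = col.quantile([0.25, 0.75])
fence = 1.5 * (q3 - q1)
outliers = df[(col < q1 - fence) | (col > q3 + fence)]

print(f"{len(duplicates)} duplicate rows, {len(missing)} rows with nulls, "
      f"{len(outliers)} outlier rows")
```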

Michael Larsen (21:26):
So I would like to gear this just a little bit towards… of course, we have a lot of listeners who don’t use Qualitest as a vendor or a provider, but they appreciate Qualitest for hosting the show and for sharing knowledge about the testing world and how this all fits together. So I would like to approach this from the perspective of your everyday software tester. What does this mean to us? What are things that we should be paying attention to here? What are things that will directly affect, or perhaps enhance, how we view and approach testing, and what can we learn from this?

Ross Fernandes (22:07):
Sure. Think of conventional testing: really, everything we do today is 2D-focused. It’s a device, it’s a screen; you’re testing that. In the virtual world, you’re now moving to 3D. Someone asked me the question just the other day: so what’s the difference? It’s an evolution of the skill set, starting with the visual piece, which goes back to the experience we’re trying to create for the user. Let’s say in a virtual world something doesn’t work. Well, first of all, we need to be able to identify that. Let’s say a refresh on the screen didn’t work, or some pixels were not rendering correctly on the screen. Now the bug becomes a conventional bug that you’re gonna report. But then you have to go and debug it and figure out what’s really happening on the back end, because it’s not simply about the fact that, hey, the colors were off or something like that; what is the actual root cause? So it takes the skill set up a notch to go from just running test cases to adding this visual acuity. And we like to think that our testers who have made a career with us have developed all those enhanced skills for the virtual world, rather than just the regular 2D world. That’s definitely an evolution, and it’s, I think, a more exciting space to be in.
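
As a hedged illustration of catching the rendering defects Ross mentions, a basic visual check can diff a captured frame against a known-good “golden” render and report where pixels drift beyond a tolerance; the synthetic frames below stand in for real headset captures.

```python
# Sketch: diffing a captured frame against a golden reference render and
# locating pixels that differ beyond a tolerance. Frames are synthetic.
import numpy as np

rng = np.random.default_rng(1)
golden = rng.integers(0, 256, size=(1080, 1920, 3)).astype(np.int16)
actual = golden.copy()
actual[500:520, 900:940] = 0  # simulate a patch that failed to render

diff = np.abs(golden - actual).max(axis=-1)  # worst channel difference per pixel
bad = diff > 16                              # tolerance for codec/sensor noise

print(f"{bad.mean():.4%} of pixels differ beyond tolerance")
ys, xs = np.nonzero(bad)
if len(xs):
    print(f"defect bounding box: x [{xs.min()}, {xs.max()}], y [{ys.min()}, {ys.max()}]")
```

Locating the bad region is the easy part; as Ross notes, the harder work is tracing it back to a root cause in the rendering pipeline.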

Vikul Gupta (23:21):
Mike, from my perspective, if you talk to a typical tester, their world is functional testing, performance testing, security testing, accessibility testing, and in some cases customer experience testing. This just takes it a level beyond, which is the human and device experience testing. Customer experience is a one-way screen: you just look at what is done, and it doesn’t have an impact on the functional testing or the performance testing. Whereas imagine you’re wearing an XR headset: the weight of the headset will have an impact on your functional testing and performance testing. So the world is different now. XR experience testing, device experience testing, is another field which is getting added from a testing-world perspective. And it’s going to become more and more prevalent and relevant as more and more of the metaverse comes into the picture, whether from a gaming perspective or from a virtual-world perspective.

Prem Vishwanathan (24:17):
I can add to what Vikul and Ross just mentioned. Ross spoke from an activity perspective, Vikul spoke from a technology perspective. If I can come at it from the business and usability perspective of what the end user feels: it could be any one of us. We are consumers, we are the users in many ways. It is all about our experience. A wealth of information can sometimes create a poverty of attention; just imagine creating the right experience, allowing us to learn and train in a completely different fashion. That could be the impact on various industries, whatever their business envisions. And that’s what we harness and power, with both the technology and the activities that are important. So from the business, technology, and process angles, we touch on all of them to ensure that that experience is delivered to each one of us in whatever we do.

Matthew Heusser (25:05):
Okay, well, it’s about that time for us to wrap the show up. Usually we ask people for parting thoughts, and I’d like to do that here. If there’s a particular resource, a webpage, something someone can go to to learn more, how to do it themselves, open source tools, that sort of thing, or conferences we’re gonna be at, this would be a good time to talk about that. Maybe, I don’t know if we’re gonna be at STAREAST, but I’m sure there are a lot of people that would love to hear more about this. Ross, do you want the last word?

Ross Fernandes (25:36):
Sure. I think if you look at the technology trend, the next evolution of technology is going to be an AI-driven world, where AI is incorporated into everything we do in our daily lives, even beyond where we’re at today. And that’s what your listeners and folks listening to the podcast should be prepared for, from a career standpoint, a skills standpoint, et cetera. So I would say pay attention to webinars, seminars, and the evolution of technology itself. And as for the entry point, the thing of it is, we don’t have people with 10 years of experience in any of this; it’s all relatively new. So it’s easy to go, “Hey, I want to get into this world,” get a foothold, and then build your career from there. There are tons of opportunities that are going to come up for people who align with this direction in technology.

Vikul Gupta (26:27):
Matt, for more information on what we bring together from a capability standpoint, listeners can always go to qualitestgroup.com. There are blogs, there are webinars. Prem published one blog on this same topic, which they can go subscribe to, and there’ll be many more coming in the next three months, so it’ll be an exciting space to watch. We’ll be publishing some of those through LinkedIn and through our normal website channels.

Matthew Heusser (26:56):
Fantastic. Let’s try to get that in the show notes, Michael.

Michael Larsen (26:59):
Yes, definitely.

Matthew Heusser (26:59):
Not just Qualitest, but “here’s where you can go to read more about it, what we’re doing, how it works.” Okay.

Prem Vishwanathan (27:06):
So, just a parting thought, if we can include this: the way this is revolutionizing the industry is very, very critical, and we can be open-minded about it. There are things that we need to unlearn, but what we can learn here, in terms of what the futuristic experiences will be, is so much greater that we should harness and embrace it rather than be anxious about learning something new. So I see tremendous opportunities, as Ross mentioned, for us to have new streams of capabilities, new streams of work, new streams of experiences, and business opportunities that are immense in this area.

Matthew Heusser (27:40):
Sounds wonderful. Looking forward to more, but for today we’re gonna have to say goodbye. So thanks everybody, appreciate your time.

Prem Vishwanathan (27:46):
Thank you.

Matthew Heusser (27:47):
Thank you.

Vikul Gupta (27:47):
Thank you, guys. Thanks for having… Thank you both.

Michael Larsen (27:49):
Thank you <laugh>. Thanks for having us as always.

Michael Larsen (OUTRO):
That concludes this episode of The Testing Show. We also want to encourage you, our listeners, to give us a rating and a review on Apple Podcasts or Google Podcasts; we are also available on Spotify. Those ratings and reviews, as well as word of mouth and sharing, help raise the visibility of the show and let more people find us. Also, we want to invite you to come join us on The Testing Show Slack channel as a way to communicate about the show, talk to us about what you like and what you’d like to hear, and help us shape future shows. Please email us at thetestingshow (at) qualitestgroup (dot) com and we will send you an invite to join the group. The Testing Show is produced and edited by Michael Larsen, moderated by Matt Heusser, with frequent contributions from our many featured guests who bring the topics and expertise to make the show happen. Additionally, if you have questions you’d like to see addressed on The Testing Show, or if you would like to be a guest on the podcast, please email us at thetestingshow (at) qualitestgroup (dot) com.
