January 17th, 2025
39 Mins
Listen On
Andrea Jensen (Guest)
Tester & Team Lead, Navis

Kavya (Host)
Director of Product Marketing, LambdaTest

The Full Transcript
Kavya (Director of Product Marketing, LambdaTest) - In today's fast-paced remote work environment, collaboration has become more than just a buzzword; it's a necessity. Testing teams often find themselves spread across different time zones and domains, making effective collaboration even more challenging.
And traditional approaches to testing, which can sometimes feel isolated or disconnected, are evolving to embrace new methodologies. One such approach is ensemble testing, a collaborative technique where teams test together in real-time, leveraging diverse perspectives to uncover bugs and improve software quality.
And today, we'll explore more about it and how it not only boosts a team's productivity but also fosters a culture of shared ownership and continuous learning. Hi, everyone. Welcome to another exciting session of the LambdaTest XP Podcast Series. Through the XP Series, we dive into a world of insights and innovation featuring renowned industry experts and business leaders in the testing and QA ecosystem.
I'm your host, Kavya, Director of Product Marketing at LambdaTest, and it's a pleasure to have you with us today. Today's topic, collaborative remote testing and how to set up and run effective ensemble sessions, is all about discovering how to drive collaboration and innovation in remote testing environments.
But before we get started, let me introduce you to our guest on the show, Andrea Jensen. Andrea's journey into tech began by chance in 2011, but her passion for testing quickly turned into a thriving career. Over the years, she's not only honed her expertise as a tester but has also taken on leadership roles that have challenged and inspired her.
Currently working as a Test Lead, Andrea brings a wealth of experience in cross-functional collaboration and innovative testing methods. Beyond her professional life, she's a tea enthusiast and a fan of Tolkien's works, which speaks to her appreciation for both details and storytelling. We're thrilled to have you here, Andrea. Thank you so much for joining us today.
In today's session, Andrea will take us through the concept of ensemble testing and its transformative impact on remote teams. So let's jump straight into the heart of the discussion. Andrea, it would be great if you could help us learn more about your journey and, of course, share your testing journey with the audience.
Andrea Jensen (Tester & Team Lead, Navis) - Hey, Kavya. Thanks for having me. Sure, happy to share what I know about ensemble testing, what my personal experiences are, and what we learned as a team.
Kavya (Director of Product Marketing, LambdaTest) - Awesome. So, let's jump straight on to the first question. Can you elaborate on the specific challenges that you faced in getting buy-in for ensemble testing and how did you navigate those challenges?
Andrea Jensen (Tester & Team Lead, Navis) - Yeah, sure, I can. So the interesting thing is before we started ensemble testing in the team, I tried different approaches to get the team closer together. It all started when we were in the middle of the global pandemic called COVID.
We had all been working from home and hadn't seen each other for ages. I felt it would be nice to do something together, so first we tried different approaches. None of them had a really long-lasting effect; they would be nice the first time, maybe the second, but then quickly dissolved.
During the pandemic, through the amazing testing community, I came across the idea of mob programming, also known as mob testing, ensemble testing, or teaming, however you want to call it. And I thought, well, that would be something super interesting. I did not really ask for permission to do it. I just did it.
I'm a fan of asking for forgiveness rather than asking for permission. So I just set up a session where I invited everyone who wanted to join and didn't really reveal what we were going to do. I just said, hey, let's do something fun as a team. By coincidence, our director of engineering at the time was participating. I did not know, he did not know.
So the first session ran, and it was quite a success. Of course, it was a bit bumpy and rocky at the beginning; that's to be expected. But eventually, the session turned out very smooth and nice, and we all had great learnings, and we all had a good time together.
So in the feedback round of the retrospective, he instantly said, well, you know what? I like this; we should do this more often. And that was basically what I needed to get the momentum and the buy-in.
For me, it happened by coincidence. If I were to offer any suggestions to people hearing this now: maybe you can try my approach, if it fits your team, your culture, and your company; you know that far better than I do, because you live there daily. You could also try it on a small scale with just a couple of friendly office people.
People you know you can trust, so you can try something out on a small scale, a kind of experiment, and then see how it works. If that turns out well, maybe invite someone from management when you need the buy-in, or just start on your own. As an agile, self-organizing, self-responsible team, you may also try this approach.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Andrea. That's a very interesting perspective. And I think asking for forgiveness rather than permission seems to be something that team leads across organizations can implement.
And, of course, thank you for sharing those challenges and how you address them. Moving on, can you also give us a breakdown of the absolute essentials for teams to have in place so that they can get started with remote ensemble testing?
Andrea Jensen (Tester & Team Lead, Navis) - Yes, sure. So from my perspective, I would say it starts with the technical side, of course, because we are remote, so we need to adjust to those circumstances. What I would recommend on the infrastructure side is that you have good headsets and microphones.
I think by now, we all know how annoying it is if you are in an online meeting, five people trying to talk over each other, and you can't understand a single word anymore. So, good headsets and microphones.
Then, in a remote session, the nature of your application also matters from a technical perspective. Is it a web tool you can deploy on a server so everyone has access from their own machine? Is it something mobile where you really need a device, or can you emulate it? Or is it, as in my case, an on-prem desktop installation?
So in our case, what we did initially was I shared my screen and then also handed over remote control of my computer. That is all nice and fun until some funny guy opens your command line and wants to format your C drive. I had this happen. He didn't hit Enter, but you really need to trust the folks in your meeting. What you can also do is set up a dedicated computer that everyone can reach remotely, with everything installed in advance. That is something I did later on for the sessions as we moved on.
Of course, if you are the host, or maybe the driver of the initiative, you need to keep in mind that this is a bit of extra time for you. So that is the technical side. Then, of course, from the people side, there's also something to consider.
So when we started, I had a little checklist for myself to feel safe. I'm a checklist person. I had it next to my laptop, ticking items off as I went. You can use something like this as a structure when you're getting started: how do you want to organize the meeting?
Give it a bit of a kickoff, explain what is needed, and so on, so that everyone in the team knows what to expect in the 60 or 90 minutes, however long your session will be. That is something I can recommend.
Also, from the people perspective, it depends a bit on your personal dynamics in the team, how you want to run the ensemble testing and maybe think about this beforehand. Because if you just get started right away with five to six, eight, nine people, it can get a bit of a mess.
So think in advance: do you want a time structure, or do you want a rotating system? Can everyone who wants to talk chime in? What worked particularly well for us was to create a third role. By default, you have two roles: the driver and the navigator.
Maybe I will elaborate a bit on that so that people who are not familiar with it know about it; otherwise, just move forward in the podcast and skip the next minute. The navigator is the person who walks the folks through the interaction. What is the goal? What do we want to test?
What do we want to program? That goes for any job you want to do together. The navigator explains, at the highest level of abstraction, what they expect from the driver, while the driver is the person who controls the keyboard, the mouse, other input devices, and the application in general.
So the driver doesn't do anything on their own. They just get guidance on how to operate the system, the application, the mobile app, you name it. And then we had a third role, which I personally call the spectators. In a remote setting, it's really hard to keep track if more than two people are talking at the same time in a meeting.
So the spectators are the ones who watch and observe the interaction between the navigator and the driver. After the interaction is through, you can switch based on a timer, or you just say, once this test idea is explored and the question is asked and answered, we move on.
Then we rotate the roles. Whoever was the navigator can lean back, relax a bit, and observe, while someone else steps in, and the same goes for the driver. So that's the people side. Later on, I also created a one-pager for newbies: what is ensemble testing in our sense, how do we understand it? I sent it out beforehand, so if someone got curious, I could share it rather than explaining myself over and over again. And then, from the testing side itself, think about what you want to test in the session.
Initially, I started with something like, hey folks, let's explore this new feature we just created, or are in the middle of creating, and get some first feedback. That was quite nice, but some people later approached me and said, you know, you are a tester. I'm not.
We do this in a cross-functional way, and I have a lot of trouble understanding what you want me to do when you say, just explore this feature. So I reflected on that, sat down, thought about it, and came up with the idea of test charters, mainly shaped by Elisabeth Hendrickson. If you don't know it, look up her book, Explore It!, by Elisabeth Hendrickson; you will find a ton of information about test charters there. So I wanted to give it a try and started formulating test charters. I used them in our sessions as the mission to accomplish, adding some gamification to it.
What do we want to do? With that, I got the feedback that people had a much clearer understanding or idea of what the test should be about. So that helped. So I would basically say it's three factors: the tech, telling people what they can expect, and preparing the session.
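As a side note, the driver/navigator/spectator rotation Andrea describes can be sketched in a few lines of Python. This is purely a hypothetical illustration of the running order, not a tool mentioned in the episode; the names and structure are invented for the example:

```python
from collections import deque

def make_rotation(participants):
    """Set up a running order for an ensemble session.

    The first person drives, the second navigates, and everyone
    else starts as a spectator (the third role described above).
    """
    if len(participants) < 3:
        raise ValueError("an ensemble needs at least three people")
    return deque(participants)

def current_roles(rotation):
    """Map the running order onto the three roles for this round."""
    driver, navigator, *spectators = rotation
    return {"driver": driver, "navigator": navigator, "spectators": spectators}

def rotate(rotation):
    """Advance the order: the driver drops to the back of the line,
    the navigator takes the keyboard, and the first spectator
    becomes the new navigator."""
    rotation.rotate(-1)
    return current_roles(rotation)

# Example: with four people, Ada drives first while Ben navigates;
# after one rotation, Ben drives and Cleo navigates.
team = make_rotation(["Ada", "Ben", "Cleo", "Dev"])
```

Whether you advance the rotation on a timer or, as Andrea's team did, whenever a test idea is fully explored, is exactly the kind of adaptation she suggests each team decide for itself.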
Kavya (Director of Product Marketing, LambdaTest) - This is awesome, Andrea. Thanks for breaking that down so clearly. Surprisingly, these also sound like things that can be easily implemented within teams. And I'm sure our listeners can use what you just shared as a mini playbook to start off with, right?
Yeah. And could you also walk us through a specific example of how you might adapt the ensemble testing method for a smaller team versus a larger one, given that our listeners range from freelancers all the way up to those working in large enterprise team setups?
Andrea Jensen (Tester & Team Lead, Navis) - Yes, certainly. So when we started, we were maybe around eight. Later on, as we iterated, news traveled fast in our office; I believe the same goes for every office, every company. So more people got interested and joined. Sometimes we were 12 to 15.
And I'm not saying that is the ideal number of people for ensemble testing, but I did not want to exclude anyone. I was happy that people just wanted to test together with me, because back in the day, we did not have too many testers in the organization.
And I was just happy that I could show what I'm doing day by day. So if you are in a larger team, and especially at the beginning, when the team is not so mature and you haven't done it for years, I would say the team probably needs a bit more guidance and a few more rules.
Otherwise, it can quickly descend into anarchy, and those meetings are not fun for anyone. We don't want that, right? At least, I don't. So here is what I felt worked well for us; you could try it out and see if it works for you or adapt it to your own needs: I started with a bit of a running order.
So we did not do it based on a timer. The timer felt a bit artificial to us, although for some teams, setting a timer to five minutes and then changing the roles works super well. What we did instead was keep a running order: the first person was the navigator, ready to put their question to the driver.
And once that was answered, I just called on the next person. I made little checkmarks every time someone had their turn. That really helped for the first couple of iterations; we did it maybe for the first three sessions.
After that, people knew how it worked, and the crowd also got a bit smaller, because the novelty had worn off and now the people joining were those who had the time and the interest. And with a smaller group, you usually get more of a group dynamic where people naturally chime in.
Also, for a smaller group, I feel you need less time and preparation, such as the missions. In a four- or five-person ensemble, you can quickly state what you're going to do, and people have time to ask clarifying questions. With 15 people, if everyone asks for clarification, you're almost out of time.
So I would recommend creating a bit more guidance upfront when you're in a bigger team. And bear in mind, depending on what you're testing and on the setup you're using as a team, more people in the meeting software itself might be harder to handle, so think about that too.
Personally, I would recommend ensembles of four to eight people. That way, everyone can interact and has the time to take over the roles. With more than that, it's more likely that some people stay in the background observing, which is fine, but the essence of an ensemble is that you are working collaboratively together. So that's my bit of advice on that question.
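For larger groups, the four-to-eight guideline above can even be applied mechanically when splitting a big turnout into parallel ensembles. The following is a small hypothetical helper, not anything Andrea's team used; the function name and balancing strategy are illustrative assumptions:

```python
import math

def split_into_ensembles(people, max_size=8):
    """Split a roster into the fewest ensembles of at most max_size
    members, balancing sizes so no group ends up much smaller than
    the rest. Illustrates the four-to-eight guideline above."""
    n = len(people)
    if n == 0:
        return []
    groups = math.ceil(n / max_size)   # fewest groups that fit the cap
    base, extra = divmod(n, groups)    # spread members as evenly as possible
    out, start = [], 0
    for g in range(groups):
        size = base + (1 if g < extra else 0)
        out.append(list(people[start:start + size]))
        start += size
    return out
```

For example, a turnout of 15 becomes two balanced ensembles of 8 and 7 rather than one unwieldy group, which matches the observation that beyond eight people some participants inevitably drop into a purely spectating role.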
Kavya (Director of Product Marketing, LambdaTest) - Thanks, Andrea. I think that is pretty practical. So even within large teams, what you're recommending is that ensemble testing is more effective when it happens in a smaller group within that larger group. And as you mentioned, four to eight people is what you have found to be the optimum size.
Just out of curiosity, how do you define the running order? Just for my understanding: does it mean that within one session itself, people interchange roles, becoming the navigator and then moving on to the spectator role and so on?
Andrea Jensen (Tester & Team Lead, Navis) - Yes, so the running order just happened by coincidence. I simply put down the names in the order I saw people in the meeting. There was no pecking order, no alphabetical order, nothing. It was just for my own peace of mind in the beginning.
How could we operate this so that it is a nice experience for everyone? And talking about the roles: certainly, in every ensemble session, everyone can take on any role, depending a bit on the tech setup, whether you share your own computer and trust your people enough to give them remote control, or whether someone has access to the dedicated testing machine. Sometimes people chime in on short notice, and then we don't have the time to set up a login to the test environment for them.
In that case, they probably won't be the driver in that session for the moment, but they can be the navigator or a spectator. Whenever you have all the access, when it's installed on your own computer, you can take on any role.
And usually we also try to change the driver quite frequently. Although, a funny story: I once had someone who was observing a bit in the beginning because she was not too sure whether such a technical thing was something for her. She's based in the sales team but joined out of curiosity.
And of course, I invite every profession to join, because that is what makes ensemble testing so powerful in my understanding. She was hanging back a bit, observing. And after a while she said, well, you know what? Next time, I think I can be the driver.
So the next time it happened, she was the driver, but she did not want to give up the role at any time. So it ended up that she was running the entire software for 60 minutes in a row. I always checked in, like, hey, do you want to move? No, no, no, I'm fine. I'm enjoying this so much. I don't want to stop. But generally, you move the roles.
Kavya (Director of Product Marketing, LambdaTest) - That's a wonderful story; thank you so much for sharing it. Moving on to the next question: how has implementing ensemble testing impacted your team's efficiency and effectiveness in delivering high-quality software? I'm sure this is one question that a lot of test leads and QA leads think about.
Andrea Jensen (Tester & Team Lead, Navis) - Yes, that's a great question, although a very tough one for me personally, because when we think about effectiveness and efficiency, everyone in a test lead position or a director position thinks, we have KPIs here, right?
So how can we measure the effectiveness? Sorry to disappoint: I don't have any KPIs tracked. Back when I started, I had no idea that we were going to do this more than once, and I also had no interest back then in doing it for the sake of KPIs.
So I cannot offer a silver bullet and say it will increase your efficiency by X percent. But what I can say is that for us as a team, and not only the team, because it's always a different set of people when we do the sessions, I feel it has impacted us.
On different layers, in fact. There's the product layer, of course. It increases our quality, because if we test together, we find more, and I don't want to say bugs, because it's not always a bug. Sometimes it's a design flaw. Sometimes it's something that works fine but is far from how our users actually need it.
So we made the usefulness, the sense, and the purpose of our software better. And of course, we found some bugs. But it also impacted us individually: whenever I'm in a session, I learn something, you learn something, and you share your knowledge with others.
This has happened in every single ensemble testing session I've ever been part of, and I think it will continue, because learning never stops. So that is a really great thing. And on the team and culture side, we get a lot of training, hands-on communication training.
Whether or not you ever attend a communication course, if you are in an ensemble testing session, you just get it. Let me give an example. The navigator, who usually explains their thoughts at the highest level of abstraction, tells the driver to load the settings file into the application. If the driver stays silent and doesn't move anything, that's clear feedback: the driver does not understand what the navigator wants. So the navigator realizes, maybe my explanation was not clear enough, and can ask, hey, do you know how to load the settings file in our application? And the driver can say, no, never done that. Where's the entry point?
So through that, you learn a lot about your team and how every individual communicates. And speaking of communication, it's not only about being understood. We are now international teams, with many people working remotely and team members on different continents, so there's also the language we use.
Sometimes you use a word the other person is just not used to, so you can help clear up any doubts and share the clarification along the way. And there might be people in the room, the virtual room in our case, who had not heard it themselves but now think, okay.
So the driver has really deep domain knowledge, maybe not so much software knowledge, but the navigator can explain this very well. So you also learn who to approach later on with certain kinds of questions in your day-to-day work.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Andrea, for explaining that. What I'm also able to understand is that the ensemble testing framework as a whole, it also keeps evolving within the team as the team sort of comes together to work on various projects. Is that right? Is that the right understanding?
Andrea Jensen (Tester & Team Lead, Navis) - Yes. I mean, there is literature out there, and when I say literature, I'm talking about practitioners who have written up guides and ideas on how they do it. And there is no judgment if you do it in a different style, as long as it works for your team. I also adapted it very much to the needs of our team and our organization.
I mean, initially, when I started, there was not much out there on remote ensemble testing, but we were in the pandemic, so we had to figure out how to do this in an online setup. But yeah, there is clearly published material out there you can go through, see what works for you, and then pick the bits and pieces you can adapt for your team.
The only thing, coming back to one of your initial questions about what is essential in ensemble testing: I would say the essence is still that you have a driver, a navigator, and some other folks, because if you only have a driver and a navigator, you're just a pair.
And set up the technicalities so that everyone can potentially take on each of the roles. That, for me, is the essence, which is not really negotiable, but everything else you can adapt to your team's needs. Some teams maybe run ensemble sessions just for a quick interaction to clarify things; other teams do it maybe eight hours a day. So whatever floats your boat.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much. So it also means that it goes beyond collaboration at times, with a focus on so many different factors, all to drive quality. Moving on to the next question: how can teams maintain momentum and keep ensemble testing as a valuable practice?
Andrea Jensen (Tester & Team Lead, Navis) - Yes, that's a great question, and one I'm asking myself at the moment. We started doing that in 2020. Last year, I took on a new role, which leaves me less time to motivate others to do ensemble testing.
So what I realize at the moment is that you really need someone who feels responsible, who is the driver of the initiative, who makes sure this kind of collaboration and these get-togethers happen and who sets up the sessions. I would love to say that once the team is used to it, they just do it on their own.
Maybe one day it will happen. At the moment, we are not mature enough to do that. I know of teams that just do this naturally; they sit together every morning and start the ensemble. In our case, that is not happening.
And I'm now working on different ideas how to get the momentum up a bit, because to be honest, at the moment, I don't spend so much time in running ensemble sessions. So it's a bit like on pause at the moment. I feel, especially in the early days, you really need someone who has the time and the dedication to do that.
And time is one aspect, but you also really need the willingness to do it. Otherwise, if it feels like a burden, the team will know, and then it just feels like a chore you have to tick a checkmark on. Personally, I feel that is not fair to the folks, to yourself, or to your ensemble testing idea. So yeah, I'm still trying to figure it out; if I find a silver bullet, I will let you know.
Kavya (Director of Product Marketing, LambdaTest) - Absolutely, that would be great. And, of course, our audience will know where to reach you; we'll be adding your LinkedIn, so it would be great if you stay in touch with our audience. And thank you for sharing; it's a very thoughtful approach to keeping engagement sustained.
The idea of ensemble testing brings together different factors, as I mentioned in the previous question: collaboration, communication, and a joint mission to solve the task at hand. At the end of the day, it matters that you keep things aligned across the team.
Speaking of which, I also wanted to explore how this fits the specific challenges of the maritime domain itself. Which of your team's challenges does it help tackle?
Andrea Jensen (Tester & Team Lead, Navis) - So, speaking about the maritime industry, or the shipping domain we work in: you need a certain set of domain knowledge to understand our software, because we build software basically for merchant vessels, for stowage experts and stowage offices who plan how the vessel will be packed when it sails the ocean.
We also have other tools checking emissions compliance and so on. That requires some understanding of the domain and its particularities. Not everyone who walks out of university with a CS degree naturally has this knowledge.
On the other side, there are people who have the knowledge because they sailed the oceans as seafarers, or they stowed containers for a couple of years in terminals, or they worked hands-on in the yard and know how to build a vessel. They may not have all the computer science knowledge.
In our company, those folks come together. I'm basically from the business side, not a domain expert from the maritime industry. Before, I worked in different domains, and when I joined the company, well, I knew that a vessel floats, but that was it. I had no idea about the particularities.
What I feel it helps a lot with is gaining more understanding of each other's domain or field of expertise. I learn a lot from the developers because they show me little shortcuts for running and operating the system that I did not know before, while I share with them what, from a tester's perspective, I would try out to see if the software is any good.
And then we have the domain experts joining our sessions once in a while, seafarers, people who worked at yards and know how to build a vessel, so they can tell us, well, your new feature is nice, but it has nothing to do with reality. In reality, we do it differently. Or, let me tell you what a turnbuckle in the lashing system looks like.
And the first time, I was like, turnbuckle? I have no idea. But then they explain to us what the characteristics are and what we need to know. They share pictures, they share stories. So we learn from each other, and that is also the really great thing about ensemble testing, especially if you're working in a domain where you need a lot of background knowledge to judge whether the software you're testing is any good.
I would recommend pairing up or grouping up with domain experts. Yes, in the onboarding phase you have conversations with them, but you cannot grasp all the knowledge in the first six months; you will learn it on the go in the years to come. So if you do the sessions regularly, you will learn a lot from each other.
Kavya (Director of Product Marketing, LambdaTest) - It's incredible to hear how ensemble testing addresses very niche challenges in such a collaborative manner, especially in specialized domains such as maritime. That's not a mainstream industry where you'd expect to see it implemented, so kudos to you and your team for bringing it to the industry.
Looking ahead, what's next for remote ensemble testing in an evolving remote work landscape? As you mentioned, you started during the COVID-19 pandemic, and now we are here four years later. How do you see remote ensemble testing evolving with the rise of remote work?
Andrea Jensen (Tester & Team Lead, Navis) - Ensemble testing is a great opportunity. Things are changing now; yes, some companies want their folks back in the office, but not all of them, and some have moved to a kind of hybrid mode, where some days you're in the office and some days you're not.
My company just introduced a work-from-anywhere policy, where we can spend a couple of days a year working from wherever we want. On those days, I'm clearly not going to the office. So it creates the possibility and the opportunity to collaborate and team up with folks in a different part of the world.
We are an organization spread across different continents. So for me, it has the beauty that I can pair up with my team members in the Indian office, or with the folks living in the Americas, having sessions together that span continents.
Or you simply meet up with colleagues who are working from home that day because they cannot make it to the office. That is something I really see as a great, evolving idea. Of course, it would be nice if the tools and the tool vendors kept up with that idea and provided us with more possibilities to create a better ensemble.
I would also ask the tool vendors to give the folks who run ensemble tests, or who collaborate through their applications, a better collaborative experience. Like I said, I gave someone remote control of my computer, and he wanted to format my C drive. That is a risky thing, right?
So maybe we can have access controls that just limit access to a certain application or something like that. Also, I feel remote ensembling is something that just does not go away. I think it will evolve over time.
It is my sincere hope, and I'm trying to do my fair share toward it, that this becomes more of a standard practice for new people joining the IT and testing field: a way we can work in the future, not just sitting alone at our desks throwing pull requests at each other, but doing it together.
Kavya (Director of Product Marketing, LambdaTest) - That's a very interesting approach and outlook, Andrea. Just to add to my previous question: what new tools do you think might enhance remote collaboration?
Andrea Jensen (Tester & Team Lead, Navis) - So, as I said, the meeting tools could do a bit more. That would be super nice. And maybe we can also think about how we can better share the software we create with each other so that everyone can access it.
So that we do not need to set up a test computer somewhere in our server system, where everyone needs access to it, everyone needs to log in, and we need to ask our lovely IT people to create the logins, because that always creates a bit of upfront effort, and you always need to remember it.
And maybe it takes a couple of days to get started. I believe that with more and more applications becoming web-based and moving to cloud computing, that will also get easier. But with all this, don't forget the folks who are still working on on-prem applications; they are there.
And they will be there. Some have been there for 30 years, and they won't just go away within two months. So something that makes it a bit easier to create a joint platform, maybe; that would be something I would really love to try out.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much for sharing that; that's, again, great insight. As we wrap up today's discussion, I just want to say it has been such an insightful session, Andrea. We're really thankful to you for joining us today and sharing all this practical advice on ensemble testing in remote environments.
And I'm sure your stories and strategies would definitely provide our audience with a fresh perspective on how collaboration can actually elevate the quality of software and strengthen team dynamics. Really appreciate it, Andrea. So to our audience, thank you for joining us today.
We hope you found it as engaging and informative as we did. For those listening in, please subscribe to our YouTube channel and stay tuned for more episodes in the LambdaTest XP Podcast Series. Until then, keep testing. Thank you so much, Andrea; really appreciate this.
Andrea Jensen (Tester & Team Lead, Navis) - The pleasure was mine.
Guest
Andrea Jensen
Tester & Team Lead, Navis
Andrea began her tech career by chance in 2011 and has worked as a tester ever since. In 2023, she took on a leadership role, embracing the challenges it brought. Today, she is working as a tester & team lead in the maritime industry. Outside of work, she enjoys tea and the works of Tolkien.
Host
Kavya
Director of Product Marketing, LambdaTest
With over 8 years of marketing experience, Kavya is the Director of Product Marketing at LambdaTest. In her role, she leads various aspects, including product marketing, DevRel marketing, partnerships, GTM activities, field marketing, and branding. Prior to LambdaTest, Kavya played a key role at Internshala, a startup in Edtech and HRtech, where she managed media, PR, social media, content, and marketing across different verticals. Passionate about startups, technology, education, and social impact, Kavya excels in creating and executing marketing strategies that foster growth, engagement, and awareness.