XP Series Webinar

Your Testing Framework Is Incorrect, Incomplete, or Inefficient: And I'll Show You Why

March 6, 2025

30 Mins

Karla Mieses (Guest)

Head of Quality Engineering, Orbem

Kavya (Host)

Director of Product Marketing, LambdaTest
The Full Transcript

Kavya (Director of Product Marketing, LambdaTest) - Hi, everyone. Welcome to another exciting session of the LambdaTest XP Podcast Series. Through the XP Series, we dive into a world of insights and innovation featuring renowned industry experts and business leaders in the testing and QE ecosystem. I'm Kavya, Director of Product Marketing at LambdaTest, and it's a pleasure to have you all with us today.

Today's session is all about recognizing and fixing inefficiencies in your testing framework. Testing frameworks, as you know, are the backbone of any QA strategy, but when they are incorrect, incomplete, or inefficient, they can slow down progress and impact overall product quality.

Our discussion today will uncover the common pitfalls in testing frameworks, helping you optimize your strategy for better results. Let me introduce you to our incredible speaker for today's session, Karla Mieses, Head of Quality Engineering at Orbem. Karla is a highly experienced software quality engineer with nearly nine years of expertise in the field, and she has played a pivotal role in multiple startups.

So in today's session, she will be talking about how to uncover whether your testing framework is holding your team back. Without much delay, over to you, Karla. We have a lot to discuss today, but it would be great if you could start by sharing a bit about your quality engineering journey with our audience.

Karla Mieses (Head of Quality Engineering, Orbem) - Yeah, thank you so much for the invitation. And yes, we have a lot of things to go through today. As you mentioned, I have around nine years of experience working in the software quality environment. I have worked as a Manual Tester, Automation Test Engineer, Project Manager, QA Lead, and Head of Quality and Testing, and I'm currently Head of Quality Engineering at Orbem.

Kavya (Director of Product Marketing, LambdaTest) - That's amazing. Thank you so much, Karla, for sharing that. Moving on to the very first question we have in place: what are some clear warning signs that a testing framework is causing problems and slowing down development? How would the testing and development teams get to know about these warning signs?

Karla Mieses (Head of Quality Engineering, Orbem) - Yes, absolutely. I will say there are very obvious, well-known warning signs when it comes to slowing down development, like tests taking too long, a high flakiness rate, or excessive test maintenance.

But I really like going into the hidden ones, because those are the ones that tend to generate a lot of issues, yet they are not very easy to find, and finding them requires being very self-reflective. A good place to start is by asking yourself, or asking the team, different questions, such as: do your manual quality engineers or software testers know what is being automated?

Are they manually running tests that are already automated? These are very good questions, because here you can find overlaps between the manual testers and the automation test engineers, which adds extra time, rework, and overload to the software life cycle. There are other questions as well, such as: do automation engineers know what is covered in unit testing?

Are they duplicating test scenarios across multiple layers of the software life cycle? Are your automated tests finding bugs? Are you using proper tags on your tests to build smart automation test strategies? And when a bug leaks to production, do you analyze at which test stage the bug should have been identified, and address it at the right stage of testing?
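On the tagging point, here is a minimal sketch of what tag-based test selection can look like in a Playwright-style TypeScript suite; the routes, selectors, and tag names are illustrative assumptions, not from the discussion:

```typescript
// Tags embedded in test titles let CI run smart subsets, for example:
//   npx playwright test --grep @smoke
import { test, expect } from '@playwright/test';

test('login succeeds with valid credentials @smoke @auth', async ({ page }) => {
  await page.goto('/login'); // assumes a baseURL in playwright.config.ts
  await page.fill('#email', 'user@example.com');
  await page.fill('#password', 'secret');
  await page.click('button[type=submit]');
  await expect(page).toHaveURL(/dashboard/);
});

test('expired session redirects to login @regression @auth', async ({ page }) => {
  await page.goto('/dashboard'); // unauthenticated visit
  await expect(page).toHaveURL(/login/);
});
```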

These are also very important, because when something leaks to production, it's not only about fixing it and then manually testing it. You want to prevent it from happening again, and to prevent that, you can add a check to your manual tests, to your unit testing stage, or to your integration or end-to-end testing.

And more than warning signs, I like to put these as questions, because at the end of the day each development team has its own needs, and the best way to reveal the warning signs is to start by questioning the current process you have.

And lastly, to answer the question, it is very important to ask about the project managers too: are the requirements clear? Is this feature ready enough to continue doing the rest of the testing? If you ask these questions, you are going to be able to build your own customized warning signs based on the process you already have in place.

Kavya (Director of Product Marketing, LambdaTest) - Great answer, Karla. I think that is really insightful, and it also ties into the next question that we have in place: beyond recurring test failures, what are some other red flags that indicate a framework might be incomplete?

Karla Mieses (Head of Quality Engineering, Orbem) - Yes, absolutely. One of the red flags is when you need a lot of manual steps to trigger some part of your testing framework. As we already know, not all development teams or applications have the beautiful, straightforward process of development, QA, staging, and production environments where everything runs in CI. Software can get very complex, especially now that AI is integrated into the software development life cycle.

More companies are building their own AI models or interacting with third parties like ChatGPT or any other out there in the market. So it is very important to recognize when you need to manually trigger a part of your test framework, especially when you are interacting with third parties.
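One way teams remove that manual trigger is to stub the third-party dependency in CI so the suite runs unattended. A minimal sketch, assuming a hypothetical model endpoint and page selectors:

```typescript
import { test, expect } from '@playwright/test';

test.beforeEach(async ({ page }) => {
  // In CI, intercept calls to the (hypothetical) third-party model API so
  // no manual trigger or live external dependency is needed.
  if (process.env.CI) {
    await page.route('**/v1/completions', (route) =>
      route.fulfill({
        status: 200,
        contentType: 'application/json',
        body: JSON.stringify({ text: 'stubbed model response' }),
      }),
    );
  }
});

test('chat widget renders a model reply', async ({ page }) => {
  await page.goto('/chat');
  await page.fill('#prompt', 'Hello');
  await page.click('#send');
  await expect(page.locator('.reply')).not.toBeEmpty();
});
```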

So that is number one. Number two is a lack of scalability. Are you repeating yourself too much? Do you have everything properly configured? If you are an automation engineer, you could ask the software developers, "Hey, can you do a code review of what I have?" to see how you can improve the coding practices within your testing framework. So that's very important.
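As one example of the repeating-yourself problem, a shared helper keeps a flow like login in a single place; this is a sketch with illustrative names and selectors. Every spec then calls loginAs(...) instead of repeating the same five steps:

```typescript
import { Page } from '@playwright/test';

// Shared helper: when the login flow changes, the fix happens once here
// instead of in every spec that logs in.
export async function loginAs(page: Page, email: string, password: string) {
  await page.goto('/login');
  await page.fill('#email', email);
  await page.fill('#password', password);
  await page.click('button[type=submit]');
  await page.waitForURL('**/dashboard');
}
```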

And the other one, which for me is one of the most important signs, is making sure the manual software testers know what is automated. I think this is crucial because I have worked in teams where manual engineers worked as if there were no automated tests in place at all.

So it is extremely important that there is constant communication between the two, the person who is automating and the person who is doing manual tests, because that will make your framework very successful. And again, you are going to be able to reduce duplicated work and testing effort.

Kavya (Director of Product Marketing, LambdaTest) - Thanks, Karla. Interesting, because with GenAI transforming how testing is happening, I think there's so much that manual testers are also thinking about. So this would be food for thought, I would say, when it comes to the red flags they can catch very early, before they even start planning the test strategies. Moving on to the next question: how can you tell if a quality engineering team is using the wrong testing framework for their project's needs?

Karla Mieses (Head of Quality Engineering, Orbem) - Yeah, I will say number one is when your framework is unable to meet project complexity. By that I am thinking, for example: let's say you are working at a company and you are developing your testing framework, but you are building a testing framework that only satisfies a web application.

What happens when the company decides to launch a mobile app? What happens when the company decides to launch a desktop app? Quality engineering, or software testing, is very interesting because you need to be very close to the product and the business, and at the same time very close to the technical side and the software engineers, because that will define your strategy.

So you really need to know where the company is going, and you need to align that with the tech stack the software development team is using to build the application. I think it's very important to understand project complexity, to understand whether you have hardware dependencies or third-party integrations, and what the challenges are.

It's also very good to always choose a framework that has very good documentation, because that will help you be successful. And another way you can identify that you have chosen, or are using, the wrong testing framework is when you don't have parity with the developer team's tech stack. That is very important because you really want to allow developers to get closer to quality.

The best way is if everybody speaks the same language. I know that is not always going to be possible, but you need to aim for that, because most of the time you are going to be able to do it. So I will say those are the two indicators.

Kavya (Director of Product Marketing, LambdaTest) - Awesome. Thank you so much for those insights, Karla; that really adds a lot of value. Can you also elaborate on how a bad framework can affect developers and testers? We have touched upon the different metrics that get affected as part of the consequences, but how can even a decision like this derail the testing cycle?

Karla Mieses (Head of Quality Engineering, Orbem) - Yes, absolutely. A bad framework can impact the development team in many ways. I will say it can generate overwork, frustration, and burnout, and for me personally the worst is the lack of trust in the automation testing effort. I remember when I started to do automation testing, I had a lot of flaky tests.

And I remember there was one time when the tests started to fail and I thought, they are failing because they are just flaky. But there was actually a bug, and a huge one. And I only realized there was a bug hours after I noticed they were all failing. The reason I didn't look into the failures earlier that day was that they used to be flaky tests.

So for me it was just, oh, they are flaky. It is very important that the team never goes down that path. Automated tests are highly valuable, and this is the complete opposite of what a bad framework generates. So if there is overwork, if there is frustration and burnout, and a lack of trust in automation testing, that means you need to do something about it, because automation testing exists to alleviate, mitigate, and eliminate all that pain.
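One practical way to keep flakiness from eroding trust like this is to make it visible in the run configuration itself. A sketch of a playwright.config.ts, with illustrative numbers:

```typescript
import { defineConfig } from '@playwright/test';

export default defineConfig({
  // One retry in CI: a test that passes on retry is reported as "flaky"
  // rather than silently green, so flakiness stays visible.
  retries: process.env.CI ? 1 : 0,
  // A wall of simultaneous failures is more likely a real bug than noise;
  // stop early instead of training the team to ignore red runs.
  maxFailures: process.env.CI ? 10 : 0,
  reporter: [['list'], ['html']],
});
```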

Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Karla, for explaining how a bad framework can absolutely derail and affect the developer and testing teams. Now moving on to the next question, which might be something a lot of your peers are wondering too: how can inefficiencies in a testing framework lead to a decrease in overall software quality?

Karla Mieses (Head of Quality Engineering, Orbem) - Yeah, thank you so much for that. An inefficiency in the test framework leads to more inefficiency down the road, especially when it comes to manual testing. At the end of the day, you are going to have a team, or a few people, building automated tests that are probably not adding any value because they are flaky and nobody trusts them.

So you probably have high costs running your tests and maintaining them, but at the end of the day all the heavy work is still going to manual testing. Then you have this team that is overworked, that is bored now, that doesn't trust your automation testing. And you have the manual test engineers doing all the heavy work, and they are probably tired and frustrated, which could lead to overtime.

That can definitely impact the quality of the testing they are doing, and it also leads to excessive test maintenance. Instead of having a team expanding the automation framework, you have a team reworking the existing test framework over and over again, while the manual testers do all the work that should be handled by automation.

So that is definitely going to generate a decrease in quality. And even if quality doesn't decrease, it means quality is not going to improve, because the barrier keeping bugs from leaking into production is going to be the manual test engineers rather than the automated tests. So it is important to address this and put things in place the way they should be, so you are not constantly relying so much on manual testing to be the gatekeeper against bugs reaching the production environment.

Kavya (Director of Product Marketing, LambdaTest) - Those are some great phrases you have used; we will absolutely be resharing those with our audience. It also reminds me of something I read on Reddit a while back about how a tester's brain is wired after years of testing to spot mistakes, from a typo in a menu, which was the example in that post, all the way up to mistakes in, say, family recipes.

Very interesting, and aligned with what you just said. Moving on to the next question: when refactoring a testing framework seems necessary, what are some key steps to take to ensure a smooth transition?

Karla Mieses (Head of Quality Engineering, Orbem) - Yes, of course. Refactoring a test framework seems necessary when you have high test coverage, whether unit, integration, or end-to-end, whatever the layer, but the tests are not finding bugs. As Head of Quality Engineering at Orbem, or at any other company, I feel very suspicious when I see everything in green.

Especially if they have been green for a long time: how are these tests not finding bugs? That means you need to look into your tests and validate their quality. If you are a developer listening to this and your unit tests have never found a bug, you should probably ask one of the quality engineers to review your unit tests.

"Hey, I have these unit tests. They've been running for six months or a year, but they have never caught a bug." That means there is something wrong, because remember, tests are not there to give you a green validation; they are there to find bugs. If they are not doing that, something is wrong for sure, and it also means that at the end of the day you have all these tests, but you are pushing the bug-finding stage onto the manual test engineers.
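To make the point concrete, here is a Jest-style TypeScript sketch with a hypothetical calculateDiscount function: a test that can stay green forever versus one that can actually fail. Mutation testing tools such as StrykerJS are one way to measure this kind of test quality systematically:

```typescript
import { test, expect } from '@jest/globals';
// `calculateDiscount` is a hypothetical function under test.
import { calculateDiscount } from './pricing';

// Weak: stays green even if the discount math is completely wrong.
test('returns a number', () => {
  expect(typeof calculateDiscount(100, 'GOLD')).toBe('number');
});

// Stronger: pins down behaviour, including an edge case, so it can fail.
test('applies the 20% gold discount and never goes negative', () => {
  expect(calculateDiscount(100, 'GOLD')).toBe(80);
  expect(calculateDiscount(0, 'GOLD')).toBe(0);
});
```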

So that would be one of the reasons why I consider that a test framework probably needs to be refactored, and this applies to all stages of testing. Another one would be when your tests are taking too long to run. Usually this is something that has improved at each company I have worked at: tests are running faster and faster, and people are being more strategic when running them, parameterizing tests and running them in parallel.

Those are very good practices. But if your tests are not fast, you need to refactor. And the meaning of "fast" is going to depend on how many tests you have, what environment you are in, and what you are testing; AI models, for example, tend to take longer to run than typical tests.
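As a small illustration of the parameterization she mentions, this Jest-style sketch runs one test body over many inputs; validateEmail is a hypothetical function:

```typescript
import { test, expect } from '@jest/globals';
// `validateEmail` is a hypothetical function under test.
import { validateEmail } from './validators';

// One test body over many inputs instead of copy-pasted near-duplicates;
// independent cases like these also parallelise cleanly across workers.
test.each([
  ['user@example.com', true],
  ['missing-at-sign.com', false],
  ['', false],
])('validateEmail(%s) returns %s', (input, expected) => {
  expect(validateEmail(input)).toBe(expected);
});
```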

You also asked me about the steps to take to ensure a smooth transition. I will say: identify what works and what doesn't, and define a clear goal. Why are you refactoring? Are you refactoring because your tests are taking too long to run? Because you feel your tests are not finding bugs? Because you are migrating from JavaScript to TypeScript, or migrating to a different test framework?

Why are you refactoring? Those questions are extremely important, and those are things you need to keep in mind. Or are you refactoring for scalability: you created this test framework at the beginning, when you were learning how to do automation, and six months have passed and now you think things can be done better? Whichever way you do it, I recommend doing it slowly. Migrate gradually. That is going to give you time to move forward, adapt to any error, and do things smart rather than hard.

Kavya (Director of Product Marketing, LambdaTest) - Great point, Karla. Really appreciate you sharing it with our audience. The next question again touches upon what you have mentioned: in your experience building frameworks from scratch at startups, what are some common mistakes or pitfalls that companies make when choosing or designing a testing framework?

Karla Mieses (Head of Quality Engineering, Orbem) - Yes, this is very interesting, and I know the audience will have different experiences with this, but I have identified a few points. In my experience, I have seen that sometimes I've been hired way too early. It was not the right time, and when they hired me, they wanted me to start doing end-to-end testing.

If it is a startup, the feature may not be mature enough, or they probably haven't even released to production. You shouldn't be writing end-to-end tests like that, because your product, your feature, or your application can go through so many iterations and so many changes, and you can receive feedback from users.

You can have wrong assumptions about your feature that require modifications. Your company can pivot to provide a different service, or target a different audience from your first assumption. So there are many things, and it can happen that sometimes a quality engineer is hired too early. What I recommend in those cases, of course, is not to tell your boss, "Hey, you hired me, but you don't need me."

But I will absolutely suggest talking with your manager and saying, "Hey, we are not at end-to-end testing yet, but we can improve our unit testing." You involve your quality engineer in unit testing and integration testing to make those features robust without going all the way to end-to-end testing. This is very important because developers usually tend to feel very, very confident about the unit testing they have.

And I have realized that is true until quality engineers start looking into the unit tests, like, "Hey, I think you should add these scenarios," and so on. What I'm trying to say is that one of the points is that sometimes the quality engineer is hired too early. The other point is when the company has hired its first quality engineer too late. That is also a problem.

When this happens, I have noticed the company's expectation is: they hire you today, and tomorrow you need to start doing automation testing. That is a big mistake. It requires time to build a proper automation test framework. When you are hired late, it means things are stable; the company usually knows where at least this feature is going to develop, and where it needs to double down when it comes to business and strategy.

But this makes the learning curve for the quality engineer bigger. That means, before you start doing automation testing, get to know the product and get to know where you are going to start. What are the main features? What are the most important features, or the features users are complaining about the most?

So you know that if users are complaining too much about a feature, you probably shouldn't start from that feature. What is the main feature of the application, something that most likely is not going to change? You can start with that feature and expand from there. You need to know the flows, and which bugs or features tend to generate more bugs in production.

That is another important aspect: when you are hired too late, don't start automation right away. Get to know the product, and do exploratory testing as much as you can. You can combine exploratory testing with automation; I have done that and it really works. It will just require some rework as you continue getting to know the product.

And I have two more points on this. I have noticed that another error is not asking developers for what you need. When you are developing a testing framework, sometimes you need test attributes in the markup. It is perfect if you can add them yourself, but sometimes you will need developers to add them. So whatever you need, ask the developers.

Sometimes you need a piece of data, or a new field to be returned in one of the requests, because that will make your tests faster and easier to build. It could be an ID, it could be a flag, anything. Ask the developers, ask the DevOps team; they will help you. Never limit your test framework to the current knowledge you have just because you didn't ask questions.

There's always a way you can do things better, and if you don't know it now, that is perfectly fine, but always ask the questions. And if you don't have the time or the ability to ask right away, create a ticket, assign it to yourself, and work on it later. I can guarantee you that test frameworks can be more efficient, robust, and very enjoyable to work with when you approach them with this growth mentality.
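On the test-attribute point, here is a minimal sketch of how a stable attribute requested from developers keeps selectors robust; the markup, test IDs, and route are illustrative assumptions:

```typescript
// Assumed markup, added by (or requested from) developers:
//   <button data-testid="submit-order">Place order</button>
import { test, expect } from '@playwright/test';

test('an order can be placed', async ({ page }) => {
  await page.goto('/checkout');
  // getByTestId reads data-testid by default, so the selector survives
  // styling and copy changes.
  await page.getByTestId('submit-order').click();
  await expect(page.getByTestId('order-confirmation')).toBeVisible();
});
```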

And the other point is that when you don't think of your testing framework as a whole, you generate a lot of issues. That is why it is important to know what is in each test, so you are not repeating yourself, and so you can rely on other parts of your suite to take care of testing that functionality or that feature.

Remember that testing is a set of scopes that keeps growing and growing. Unit tests have one scope, integration has another, and end-to-end has another. It's not that in end-to-end you are going to test everything that happened in unit and integration. No, the scope has changed, and you need to trust that the unit tests did their work and integration did its work, so that end-to-end testing does only what it is supposed to do and you avoid overlapping in the application.
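As a sketch of those non-overlapping scopes, assuming a hypothetical isValidIban function: the unit layer owns the validation rules exhaustively, so the end-to-end layer only has to exercise the user flow once:

```typescript
import { test, expect } from '@jest/globals';
// `isValidIban` is a hypothetical validation function.
import { isValidIban } from './iban';

// Unit scope: every validation rule and edge case lives here.
test.each([
  ['DE89370400440532013000', true],
  ['GB29NWBK60161331926819', true],
  ['DE00INVALIDCHECKSUM00', false],
  ['', false],
])('isValidIban(%s) is %s', (iban, expected) => {
  expect(isValidIban(iban)).toBe(expected);
});

// End-to-end scope (separate spec, not shown): submit one transfer through
// the UI and trust the unit layer for every other IBAN variation.
```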

Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Karla. And now we have the last question of the day, which is: can you share an example of a real-world scenario where you identified and addressed framework issues, and the positive impact it had on a project?

Karla Mieses (Head of Quality Engineering, Orbem) - Yes, absolutely. I have many stories about that, but one is from a few years ago, when I joined a company as an automation test engineer and there was a lack of automation. Things were not running in the pipeline, and manual testing and automation testing were done independently, with everybody working on their own, which led to duplicated testing effort.

So the steps I took to address those issues: I worked closely with the manual testers to understand which bugs tended to leak to production, which features tended to break the most or were critical, and what they considered important to automate due to time or critical relevance. I did that especially closely in the first three to six months. And every Friday, what I used to do is...

Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Karla. That has been very insightful; in fact, the entire conversation was super insightful. I really appreciate you taking the time to shed light on the hidden inefficiencies in testing frameworks. It's crucial to continuously evaluate, refine, and redefine our testing strategies.

I'm sure the audience will look at your recommendations today and come away with some solid takeaways to put into action. And of course, to our audience, thank you so much for joining; we hope you found the session super valuable. Stay tuned for more episodes, and until next time, keep learning and happy testing. Karla, once again, thank you so much for joining us today. Thank you.

Guest

Karla Mieses

Head of Quality Engineering

Karla is a Software Quality Engineer with nearly 9 years of experience; her approach to quality is customer obsession translated into automation and manual testing. She has been instrumental in several startups, building software quality frameworks from the ground up. Karla is passionate about creating exceptional products, cultivating a culture of quality within teams, elevating quality standards across organizations, and building automation and manual testing strategies that are realistic, scalable, impactful, and useful.


Host

Kavya

Director of Product Marketing, LambdaTest

With over 8 years of marketing experience, Kavya is the Director of Product Marketing at LambdaTest. In her role, she leads various aspects, including product marketing, DevRel marketing, partnerships, GTM activities, field marketing, and branding. Prior to LambdaTest, Kavya played a key role at Internshala, a startup in Edtech and HRtech, where she managed media, PR, social media, content, and marketing across different verticals. Passionate about startups, technology, education, and social impact, Kavya excels in creating and executing marketing strategies that foster growth, engagement, and awareness.

