February 25th, 2025
49 Mins
Ruslan Strazhnyk (Guest)
Independent QA Consultant
Kavya (Host)
Director of Product Marketing,
LambdaTest
The Full Transcript
Kavya (Director of Product Marketing, LambdaTest) - Hi, everyone. Welcome to the LambdaTest XP Podcast Series. I'm your host, Kavya, Director of Product Marketing at LambdaTest, and it's a pleasure to have you with us today. We are here to explore an intriguing and transformational topic: Shift Happens, Driving Quality Lift, a real-world journey across five teams. Let me introduce you to our guest on the show, Ruslan Strazhnyk.
With over 17 years of experience as an engineering leader, Ruslan has worked with a range of tech startups, from digital healthcare companies to logistics services, and he has applied his expertise across cloud networking, social media, and database management software.
Today, Ruslan will share insights from his journey of integrating agile testing practices across five different engineering teams, each at a different stage of its quality evolution. So whether you are just starting your shift-left journey or looking to fine-tune your approach, Ruslan has actionable insights that will resonate with everyone. Over to you, Ruslan; please share a bit about your journey in the testing space with our audience.
Ruslan (Independent QA Consultant) - Thank you so much, Kavya, for introducing me and for inviting me to this webinar. I hope this session will be helpful for many QA folks, both people who are just starting and people who are experienced and want to revisit a couple of points, and I can share my perspective on the shift-left approach and quality.
I will introduce myself a little bit and talk about who I am. I am originally from Ukraine, and I started working in software quality assurance in 2007. By good chance, I began my journey with a big outsourcing company in Ukraine, and they already had a lot of processes established from the beginning.
Our first client, the end customer we worked for, was Cisco IronPort, and they already had really mature frameworks and approaches in their system. Everything we were exposed to was beautifully organized, and it was a very good opportunity to learn from. Over those years, starting from the beginning, I learned a lot of Python.
I had to learn basically everything from scratch: Python, MySQL, Linux, bash scripting. And I was exposed to test automation frameworks for the first time. The caveat was that the company was still using Waterfall, and it took a lot of time to evaluate all the requirements, proceed with development, and then with testing. The time to market from the requirements stage through the development stage was very long.
It often resulted in a large number of bugs and issues that were only discovered after the process, so we always had reiteration phases. It was unavoidable that something was not accounted for or some requirements were not properly handled, which resulted in defects.
That said, some things did work: the main requirements with Waterfall are usually fine and get settled and done in the end. Later I moved on to smaller teams and worked with other technologies. I worked with Ruby on Rails, and I worked a lot with Selenium. Then I was introduced to Scrum, which was the modern agile framework at that time.
I even had a chance to work with a very small team. We didn't have a dedicated Scrum Master, just the role, so one of our developers acted as Scrum Master by chance. The team was so small that we organized everything ourselves, and that worked really smoothly.
Yeah, I worked as an independent consultant from Ukraine, but then decided to move. I spent one and a half years in Sweden at a company called Apica, leading a team of engineers. Apica builds and sells monitoring and web performance tools. It was a pleasure to work in such an environment because they were already offering enterprise tools and enterprise software.
But there is also a downside: you have to accept that there are some tools you cannot change. You have to deal with legacy frameworks and legacy tools, support them, and keep them working no matter what. Even if something is very old, you still have to support it, because it may be a core part of the software.
So we were mostly just doing some updates on that. In later roles, I moved to Berlin and worked in digital finance startups on blockchain and banking apps. And that was the moment I realized there is a strong demand for agile testing, a strong demand for shifting quality left.
Having enough QA engineers on a team was always a constraint, always some kind of problem. Moving from Ukraine to Sweden and then to Germany, I also realized there is a difference in terms of hiring, and a difference in how outsourcing companies and product companies inside Europe work.
They are totally different models. With outsourcing, it's usually possible to apply some cost cutting, and the engineers provided by outsourcing companies usually cost much less than hiring people in Germany. That means you have to build really strong teams, and you cannot easily replace people, because they're not contractors, they're employees.
That is when I felt there is a strong need for building quality, strong bonding, and strong approaches to quality assurance inside the team. Building that expertise inside the team is essential. I'd be happy to move forward; I think Kavya has a lot of questions for me.
Kavya (Director of Product Marketing, LambdaTest) - This is amazing, Ruslan. Just listening and piecing it together: your journey started from being a system administrator, then entering the test automation space, and not just working with one kind of industry. You have worked across so many different industries, with so many different tech stacks. You started experimenting with Selenium and Python.
In fact, you taught yourself those technical skills over the course of 17 years. That's a very impressive journey. And the very first question I have is in line with the journey you just shared with us: what are the biggest challenges you encountered while implementing a shift-left approach across different teams with varying levels of maturity?
Ruslan (Independent QA Consultant) - Yeah, thank you. Great question. I think one of the biggest challenges is the different levels of maturity and technical knowledge across the teams. Another is that some teams have been established for a while, while others are still waiting for people to be hired or to start, and they just don't have good bonding inside the team yet.
Some teams have engineers coming from different backgrounds with little experience in automated testing, while others are already proficient and may know the domain very well. So I think it's very important to assess each team's current capabilities and understand where they are standing, what the temperature is in each team.
Then you can tailor your approach to each group's maturity level. Beginner teams probably need foundational training in test automation tools, how to do manual testing, how to do smoke testing, and so on. Advanced teams need to focus on integrating performance testing or more advanced test automation techniques.
For them, it's important to approach testing across the whole development lifecycle, to have well-structured pipelines, and so on. Basically, it's important to balance the pace of adoption, because it can feel quite overwhelming for teams.
You have to balance that and avoid overwhelming the less mature teams, because certain techniques need to be applied with time and diligence. If you overwhelm people with things they haven't experienced before, you will probably just not be popular.
You will lose popularity, and no matter how hard you try, if you push too much, you can get resistance, and that is a negative effect. So it's always important to keep a good balance, always listen to feedback, and give this some time.
Kavya (Director of Product Marketing, LambdaTest) - This is really interesting. Thank you so much, Ruslan. What stands out is how, depending on the growth stage of every organization, you also have to balance the pace of adoption, as you said.
Very interesting point, of course. Navigating that must also require a very tailored approach for every industry and every growth stage; engineering leaders and testing leaders have to adapt accordingly.
That also brings us to the next question, which is: can you provide specific examples of how you empowered teams with the necessary tools and training to embrace a shift-left mindset as you moved ahead?
Ruslan (Independent QA Consultant) - Yeah, I can share how I did it at my last place of work, Quandoo. Quandoo is a service that lets you book restaurants online. They have B2B and B2C applications, and they also offer widgets, so when you find a restaurant or cafe in Google Maps, you can instantly book a table and time slot from there.
These were also mobile apps, so we had plenty to test. I introduced Playwright as a unified test automation framework; I evaluated it as a valid framework for us to continue with. We also had a long history with other test frameworks, and there were a lot of tests already written in Ruby and in Cypress.
The decision to move to Playwright was made between me, the teams, and my manager, because it was ticking all of the boxes. Most importantly, it also enabled us to test single-segment functionality, which was important for testing some other products, and it fit other needs as well.
So I was hosting interactive workshops to teach engineers how to work with and maintain test automation tools and how to write automated tests with Playwright. I can really recommend Playwright; it's a framework with very good documentation of its own.
There is a lot available online, and it has good support and a really great community. In our team we created comprehensive documentation, including code samples, and showcased everything. It was also important that we integrated these frameworks into the CI/CD pipelines.
We also sat with the developers and their team leads to show them exactly how these test frameworks work and how the tests impact their changes, doing pair programming sessions and mentoring, which was a way to bridge the knowledge gap. That's how we introduced this culture of collaboration.
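To make this concrete, here is a minimal sketch of the kind of Playwright test such workshops typically introduce. The URL, page name, and selectors are hypothetical placeholders, not the actual application.

```typescript
// booking.spec.ts — a minimal illustrative Playwright test.
// The URL and selectors below are hypothetical placeholders.
import { test, expect } from '@playwright/test';

test('guest can open a restaurant page and see the booking widget', async ({ page }) => {
  // Navigate to a (hypothetical) restaurant detail page.
  await page.goto('https://example.com/restaurants/demo-bistro');

  // The page heading and booking button should be visible before any interaction.
  await expect(page.getByRole('heading', { name: 'Demo Bistro' })).toBeVisible();
  await expect(page.getByRole('button', { name: 'Book a table' })).toBeVisible();
});
```

Run locally with `npx playwright test`; wiring the framework into a CI/CD pipeline essentially means running that same command on every change.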
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Ruslan. These are great insights as well as tips that I think every QA manager or engineering lead can implement within their teams. We would also like to understand how you navigated the complexities of bringing newly hired engineers and testers up to speed on team-specific shift-left practices. You did speak about the collaboration element, right? So how did you ensure that these newly hired folks were up to speed with the changing practices?
Ruslan (Independent QA Consultant) - It's a good and valid question. I think it's important to have very good onboarding programs. When you have dedicated people and you embrace this culture, you also understand the needs of whoever is onboarding, and it's important to write good documentation and tutorials with specific onboarding steps that include practical examples of shift-left practices.
Our new engineers went through a guided task to write some tests or a small feature and integrate it into the CI pipeline. It also started when we hired new QA engineers into the team. For them, the task was to go through a whole series of steps, installing not just Playwright but also the previous frameworks.
That meant setting up all of the environments, everything they needed in order to test and execute everything manually. So they had to test everything manually first. With developers, it was a bit of a different approach, because they already had their environments set up.
You just show them how to install specific frameworks and how to work with the tests, and they picked this up really quickly because they already had good programming knowledge; for them it was important just to know where to look. We also did mentorship sessions with QA and senior engineers, ensuring that they could ask questions and receive guidance.
We also tried to keep the documentation and knowledge bases up to date with frequently asked questions and best practices. I think keeping the knowledge base current is really what matters.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Ruslan. Again, great insights. You know, onboarding can be a challenge at times, right? It sounds like you have built a strong foundation within your team so that anyone who joins can easily transition and start taking up newer responsibilities.
Moving on to the next question, how did you address the potential for increased pressure on developers when shifting quality responsibility to the left?
Ruslan (Independent QA Consultant) - Shifting quality responsibility to developers is a kind of cultural shift. In my experience, I also dealt with a certain perception, not one I share myself, but at times I felt there was a culture of developers versus QA, where developers could feel more senior or more important somehow, and QA less important or less experienced.
I have had teams like that. I wouldn't name names, it's not important, but I thought it was just me. Then, going to different conferences and speaking with many people, I learned it's actually something that happens a lot.
You have to deal with that, and you have to show that as a professional you are not a second-class citizen; it's just a different role. When you're a QA professional, you're also a software engineer, but your area of expertise is different: you are developing software, you're just not coding as much.
You are the one responsible for testing, for releasing, for requirements. And in many teams, QA may actually be even more experienced than the software developers; that's the reality. So when you shift quality responsibility, it's also a cultural shift. I often said that the developers just have to wear a tester hat.
They're not becoming testers; they're not suddenly taking on a different role. They're just wearing this hat for a moment. When they need to do some testing work, they put the hat on; when they don't need to test, they take it off. To reduce the pressure, we emphasized collaborative ownership of quality, so it wasn't just a QA responsibility.
We talked a lot about the importance of quality, showcasing historic examples of the cost of defects: how much bugs actually cost when they are discovered very late versus when they are discovered early. That is a point most people can understand very easily.
We also ensured that the testing tools were easily accessible and approachable, with good documentation, and that they could be reused. When developers feel there is a supportive environment, the teams speak the same language and have good collaboration and bonding, and then it happens naturally that they don't feel this pressure as much.
They understand that if they are also responsible for quality, it's actually much better for them in the end. Maybe in the beginning it feels like a lot of things to care about, but in the end they feel they're making a difference, and having more insight into testing is simply beneficial for them.
Kavya (Director of Product Marketing, LambdaTest) - That sounds very interesting, and it resonates a lot with what we hear very often on the XP Series, for instance, that quality is everyone's responsibility, including that of developers.
And as you rightly said, even if there is a challenge in adoption initially, from a long-term perspective it definitely makes more sense to ensure that developers as well as testers are equally responsible for quality. You also mentioned making testing tools more accessible. Could you highlight that a bit more? What exactly does that mean?
Ruslan (Independent QA Consultant) - Yeah, so first of all, let's talk about how the tools can be more accessible. In a typical team environment, teams have some kind of account on GitLab or GitHub; the exact platform is not so important, but they keep the test frameworks inside that repository.
It's important that quality engineers maintain the test automation tools and frameworks in very good shape, professionally: applying best practices, static code analysis tools, and tools that check for code duplication. They care about styling and about reusability of the code.
Basically, they care about the test automation tools the same as any other software, and they don't hide them. They are very open about the things they write and develop. It's also important not to be shy about this: when the team of quality engineers is developing a framework, it's important to show it to other people.
Don't hesitate to demo features, even if they're not something everyone needs at that exact moment. If developers know there is an existing framework that people are already working on, and that it already covers this and that, they may not need it today, but maybe they will need it tomorrow.
It's important to demo the software and celebrate the small wins. In some cases we even demoed small features of the test framework at company all-hands. We didn't have a big team, but it was very nice to share our perspective with 100 or more people inside the company, and the listeners were not only developers.
There were also people working in marketing, in product, different stakeholders. They were very interested and curious to know about it. So, as you said, gaining popularity is important, and maintaining everything in good shape is also quite important.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Ruslan. That really helps. And based on your experience, what are the next steps in the evolution of shift-left testing, and how can teams prepare for the future?
Ruslan (Independent QA Consultant) - Thank you. I think the next steps may include integrating AI and machine learning into testing practices. It's becoming possible to predict and prevent defects earlier in the development lifecycle. Teams can also try to adopt predictive analysis to identify high-risk areas of code and focus their testing efforts there.
They may also upskill themselves in AI-based testing tools. I think it's also important to build flexible, modular test frameworks like Playwright. Practically, in Playwright it's possible to identify flaky tests and suggest fixes with AI, and there is also the possibility of working with self-healing tests.
I think I have even seen demonstrations of these practices on the LambdaTest website, so it's something that is gaining a lot of popularity. For predictive analysis, it's possible to use static code analysis tools like SonarQube or SonarCloud; these tools give a lot of insight very cheaply, basically.
One can also integrate predictive models using the Jira API and Python with TensorFlow, if you are knowledgeable in that area. As I said, it's also important to have continuous quality feedback loops. Practically, you can integrate Playwright with GitLab so you get immediate testing during development.
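As one illustration of such a feedback loop, below is a minimal playwright.config.ts sketch of the kind of settings that make CI integration practical: retries only in CI, traces on retry for diagnosing flaky tests, and a machine-readable report that a pipeline such as GitLab can publish. The retry count and file path are assumptions for illustration, not values from any specific project.

```typescript
// playwright.config.ts — an illustrative CI-oriented configuration (values are assumptions).
import { defineConfig } from '@playwright/test';

export default defineConfig({
  // Retry tests only in CI, so a flaky test is retried instead of failing the pipeline outright.
  retries: process.env.CI ? 2 : 0,
  // Record a trace on the first retry to make flaky failures easier to diagnose.
  use: { trace: 'on-first-retry' },
  // Emit a human-readable HTML report plus JUnit XML that CI systems can publish as test results.
  reporter: [
    ['html', { open: 'never' }],
    ['junit', { outputFile: 'results/junit.xml' }],
  ],
});
```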
It's also important to hear what the market is offering. You can use adaptive test automation frameworks; Applitools, for example, offers AI-driven test script maintenance. So look at what is on the market, keep the documentation in good shape, keep the quality of the code in good shape, approach it seriously, and think not just about the day you are developing but also about the future.
It's important that the software, meaning the test frameworks you're developing, is of good quality, that you can reuse parts of the code, and that the libraries are well organized. Then you are always ready for the next steps and can explore plugins offered commercially, by communities, or as already tailored applications.
For manual and exploratory testing, I can also recommend Test IO; it's great for that. There are many, many testing tools on the market that can help with this.
Kavya (Director of Product Marketing, LambdaTest) - Awesome. Thank you so much, Ruslan. That really encompasses everything. And just to add for our audience: on LambdaTest, you can do both manual and automation testing, of course. But as Ruslan was sharing, we also help you with predictive analytics, for instance, through our Test Intelligence platform.
So if you're looking for a platform that can help you analyze as well as resolve testing challenges, you could give LambdaTest Test Intelligence a try. It has AI-powered RCA, flaky test detection, as you rightly said, and error-trend forecasting, for instance. And on top of it all, there are also actionable test insights that testers can get.
So yes, I think there are a lot of options out there for the testing community to try, and you've named quite a few of them. I'm sure our audience will find that super beneficial. Moving on to the next question: how did you measure and track the impact of shift-left testing on key metrics like defect rates, time to market, and even customer satisfaction? You have worked with quite a few organizations, some of them B2C-specific. So yes, over to you.
Ruslan (Independent QA Consultant) - Thank you. It's kind of a difficult question, and there may not be many different ways to answer it. We approached it by implementing dashboards in Jira and tracking defect density. We also independently tracked test coverage, build failure rates, and time to resolution, things you can actually track with Jira.
After adopting shift-left testing, we saw a reduction in production defects: I think it started around 15% and then slowly went to 20% and 30%. And there was, I think, a 10% and then a 20% improvement in development lifecycle times.
Not just at Quandoo, but I remember in other organizations as well, we ran customer satisfaction surveys, and through post-release surveys we were able to see that we were making improvements: user feedback showed fewer support issues, fewer bugs coming in after rollouts, and smoother releases.
Just to give a bit more insight: you can measure defect density with Jira very easily. You can track the total number of defects raised and then use custom fields to generate reports. One can also track defect density against lines of code, or even use the REST API with GitLab to calculate these defect density reports automatically.
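As a rough sketch of what such an automated report can look like, the snippet below pulls a bug count from the Jira REST API and divides it by a lines-of-code figure. The Jira site, project key, JQL filter, and KLOC number are hypothetical placeholders, and the same idea can be adapted to the GitLab API.

```typescript
// defect-density.ts — rough sketch of an automated defect-density report (Node 18+, global fetch).
// The Jira site, project key, JQL, and KLOC value are hypothetical placeholders.
const JIRA_BASE = 'https://your-company.atlassian.net';
const AUTH = Buffer.from(`${process.env.JIRA_USER}:${process.env.JIRA_TOKEN}`).toString('base64');

async function defectDensity(projectKey: string, kloc: number): Promise<number> {
  // Count bugs created in the last 30 days for the given project.
  const jql = encodeURIComponent(`project = ${projectKey} AND issuetype = Bug AND created >= -30d`);
  const res = await fetch(`${JIRA_BASE}/rest/api/3/search?jql=${jql}&maxResults=0`, {
    headers: { Authorization: `Basic ${AUTH}`, Accept: 'application/json' },
  });
  const body = (await res.json()) as { total: number };
  // Defect density here is defects per thousand lines of code.
  return body.total / kloc;
}

defectDensity('SHOP', 120).then((d) => console.log(`Defect density: ${d.toFixed(2)} bugs/KLOC`));
```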
With test coverage, I think it's pretty straightforward with Playwright. You can already use JSON or HTML coverage data and export it, so it's something you can get easily; it doesn't require a lot of effort. Then you can track the historical data and see how it evolves.
And with SonarCloud as well. After shifting quality left, thinking about those defects and trying to fix them as soon as possible, you have better tracking with tools, the pipelines fail at much earlier stages, and over time you see that production defects become quite rare.
Especially when everything is already settled in the requirements phase and everything is tested, discussed, and checked, you can still have production bugs, but they usually come from uncertainties or from problems with the setups.
Rather than, you know, from human errors. Bugs are basically unavoidable, but you can always minimize them. So I think these things can help a lot. With Jira you can also share a lot of dashboards, and you can export data to BI tools like Tableau for trend analysis.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Ruslan. That was really interesting. Moving on to the last question of the day: what advice would you give to other organizations that are just starting their shift-left journey, based on the lessons you've learned from your own experiences?
Ruslan (Independent QA Consultant) - I think it's important to start small and build momentum as you go. I would really suggest teams focus on quick wins in the beginning, the low-hanging fruit you can grab very easily: small wins that don't require a lot of technical effort but can show that the process is improving.
If teams are putting more focus on integration, they may already have test automation in place, but maybe it is not integrated into the CI/CD pipelines; and even if it is, you can still win very easily by implementing specific checks. You can have quality gates; you can implement quality gates with test automation.
You can add static analysis tools as quality gates as well; this is quite cheap to implement, doesn't require a lot of training or effort, and already shifts a lot of focus to the left. A simple example of such a gate is sketched below. Again, I would reiterate that it's important to tailor the approach to teams' maturity levels; one should think about how to apply the specific things the teams actually need.
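To make the quality-gate idea concrete, here is one simple, hypothetical form it can take: a small script that reads the JSON report Playwright can emit and fails the CI job if any test failed. The report path and the choice to tolerate flaky-but-recovered tests are assumptions for illustration.

```typescript
// quality-gate.ts — an illustrative quality gate over Playwright's JSON report.
// Assumes the tests ran with a JSON reporter writing to results/report.json.
import { readFileSync } from 'node:fs';

interface Stats { expected: number; unexpected: number; flaky: number; skipped: number }

const report = JSON.parse(readFileSync('results/report.json', 'utf8')) as { stats: Stats };
const { expected, unexpected, flaky } = report.stats;

console.log(`passed: ${expected}, failed: ${unexpected}, flaky: ${flaky}`);

// Gate: any hard failure blocks the pipeline; flaky-but-recovered tests are only reported here.
if (unexpected > 0) {
  console.error('Quality gate failed: there are failing tests.');
  process.exit(1);
}
```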
So it's important to hear the teams, hear their thoughts and experiences, know all these voices, and know what people actually need inside the teams. It's better to avoid a one-size-fits-all strategy, and you have to be thoughtful about that if you're implementing this shift.
As I said in the beginning, the point of shifting left is that you may have far fewer resources, but you treat those resources well. You may even have one QA engineer shared between different teams, but this QA expert can be really knowledgeable about everything and can provide guidance for the teams.
So it's still possible for multiple teams to work with one QA expert, as long as that expert is knowledgeable and can teach them and share a lot of insight into how to be independent, while balancing quality, providing robust tools, and keeping them in good shape with good documentation.
It's also important to have some visibility and good collaboration between those teams. And it's not just QA and development teams; I think AI or data teams can also work together a lot and gain something from each other. It's also important to measure progress with metric data, showcase the value of the shift to stakeholders, show them this metric data, and see how it improves over time.
Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Ruslan. Amazing insights; thank you for sharing them. I'm sure our audience will find it really helpful to learn from the real-world examples and advice that you shared. It has been an insightful session.
Thank you so much, Ruslan, for sharing your incredible journey and lessons on driving quality lift. All the practical advice that you shared and the inspiring success story behind your journey so far have been immensely valuable. I'm sure our audience will benefit from your insights, and of course, they can implement them in their workflow as well.
To the audience, thank you so much for joining us today. In case you'd like to get in touch with Ruslan, we'd be tagging him in all our social media posts. So feel free to reach out to him if you'd like to have more insightful chats with him.
Thank you once again, Ruslan, for joining us today. It's been a pleasure hosting you. And stay tuned for more such exciting episodes of the XP Podcast Series from LambdaTest. So until next time, take care and keep testing.
Guest
Ruslan Strazhnyk
Independent QA Consultant
Ruslan Strazhnyk is a seasoned engineering leader with 17 years of experience in quality assurance and software engineering. Originally from Ukraine, he has worked across Sweden and Germany, contributing to tech startups, digital healthcare, logistics, and more. With expertise in Shift-Left Testing, test automation, and Continuous Integration, he builds scalable testing frameworks and drives engineering excellence. Ruslan has a strong track record of leading cross-functional teams, mentoring engineers, and aligning testing strategies with business goals. His hands-on leadership and strategic approach enable organizations to enhance software quality, optimize workflows, and foster a culture of continuous improvement.
Host
Kavya
Director of Product Marketing, LambdaTest
With over 8 years of marketing experience, Kavya is the Director of Product Marketing at LambdaTest. In her role, she leads various aspects, including product marketing, DevRel marketing, partnerships, GTM activities, field marketing, and branding. Prior to LambdaTest, Kavya played a key role at Internshala, a startup in Edtech and HRtech, where she managed media, PR, social media, content, and marketing across different verticals. Passionate about startups, technology, education, and social impact, Kavya excels in creating and executing marketing strategies that foster growth, engagement, and awareness.