
Ask Me Anything — Dawid Dylowicz from Software Testing Weekly

Dec 10, 2021 · 8 min read · Dawid Dylowicz

In Software Testing Weekly’s 99th issue, I asked you to submit questions about running the newsletter, testing, or anything else you’d like to know.

I was happy to see you sent a dozen questions. So here they are, answered!

1. Diogo Nunes: What made you create this awesome newsletter? Do you remember how you first came up with the idea?

I’ve always wanted to create something that would bring value to people around the world.

I came up with the idea for the newsletter in 2018. I got heavily inspired by the widely successful iOS Dev Weekly by Dave Verwer who pioneered this type of tech news curation. I realised there was nothing like this for testers and decided to start it myself.

Fun fact: The platform I use for hosting and publishing the newsletter, Curated, was created by Dave!

2. Safinah: Do you run this newsletter as a one-man show? Or do you have a small team or close friend assisting?

Yes, I’ve been running it on my own so far. However, I don’t rule out the possibility of collaborating with someone in the future.

So if anyone has an idea, I’m happy to talk!

3. Enrique Matta: Where do you find the time to share all your resources? Do you have a specific set of feeds you monitor and share? Big fan, thanks for the hard work.

Time management is crucial for this project. I wouldn’t be able to deliver it consistently every week without rigid discipline and some personal sacrifices.

It’s a habit now. I break it down into manageable pieces of work that I can do daily. Every day I look at my RSS reader that subscribes to about 400 blog feeds. There are 50–80 links showing up every day which amounts to 350–550 links every week. As you can imagine, that’s a lot to go through!
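To make the daily triage concrete, here is a minimal sketch of what a feed-filtering step could look like. This is my own illustration, not Dawid’s actual setup; it parses an inline dummy RSS feed with the standard library and keeps only entries published in the last 24 hours.

```python
# Hypothetical sketch of a daily RSS triage: parse a feed and keep only
# entries published in the last 24 hours. The feed below is inline dummy
# data standing in for a reader subscribed to ~400 blogs.
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

FEED = """<rss version="2.0"><channel>
  <title>Example Testing Blog</title>
  <item><title>Fresh post</title>
        <link>https://example.com/fresh</link>
        <pubDate>{fresh}</pubDate></item>
  <item><title>Old post</title>
        <link>https://example.com/old</link>
        <pubDate>Mon, 01 Jan 2018 00:00:00 +0000</pubDate></item>
</channel></rss>""".format(
    fresh=datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S +0000")
)

def todays_links(feed_xml, max_age_hours=24):
    """Return (title, link) pairs for entries newer than max_age_hours."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=max_age_hours)
    picks = []
    for item in ET.fromstring(feed_xml).iter("item"):
        published = parsedate_to_datetime(item.findtext("pubDate"))
        if published >= cutoff:
            picks.append((item.findtext("title"), item.findtext("link")))
    return picks

print(todays_links(FEED))  # only the fresh entry survives
```

Run against 400 real feeds, a filter like this is what turns 350–550 weekly links into a manageable daily pile.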

4. Fahim: What’s your reading schedule like in the week to curate high-signal articles and resources in this newsletter? How do you find so many and organise your reading? Do you commit to X number of hours per day?

This is the schedule I follow:

  1. Monday:
    • Sourcing news. (15–30 min)
  2. Tuesday:
    • Sourcing news. (15–30 min)
  3. Wednesday:
    • Sourcing news. (15–30 min)
    • Selecting about 20 items from the 40–50 shortlisted ones. (45–60 min)
    • Finding the authors of the news. (15–25 min)
  4. Thursday:
    • Describing the news and preparing the issue. (2–3h)
  5. Friday:
    • Last review before publishing. (15–30 min)
    • Sourcing news for the next issue. (15–30 min)
  6. Saturday:
    • Sourcing news. (15–30 min)
    • Posting on social media. (30–60 min)
  7. Sunday:
    • Sourcing news. (15–30 min)

So in total, I spend 6–9 hours a week preparing each issue.

5. Ale: What’s your approach to picking articles for the newsletter?

I pick articles based on multiple factors. There’s no magic formula, but I usually try to answer these questions:

  • What is the news about? (e.g. Is it something you can easily find on Google, or something I’ve already covered recently?)
  • Why is this valuable? (e.g. Is it new, fresh, unique?)
  • How is it delivered? (e.g. Is it an in-depth overview or just a skimmed list of general points?)
  • Who created it? (e.g. A software tester or a technical writer on behalf of a company?)
  • When was it published? (To keep the newsletter fresh, I usually select news items that are one to two weeks old at most.)

Of course, there’s much more to it and it’s not easy to define. I always aim to make every issue informative, interesting, diverse and unique. As a result, some news that would make it into one issue might not make it into another.
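The questions above can be read as a crude shortlisting rubric. As a toy illustration (my own, not the newsletter’s real process), here is what turning them into a score might look like; all the field names and weights are invented:

```python
# Toy illustration of scoring a candidate link on the
# What/Why/How/Who/When questions. Fields and weights are invented.
from datetime import date

def score(item, today):
    """Return a crude shortlisting score for one candidate link."""
    points = 0
    points += 2 if item["unique"] else 0            # Why: new, fresh, unique?
    points += 2 if item["in_depth"] else 0          # How: in-depth or a skimmed list?
    points += 1 if item["author_is_tester"] else 0  # Who created it?
    age_days = (today - item["published"]).days
    if age_days <= 14:                              # When: one to two weeks old at most
        points += 1
    return points

candidate = {
    "title": "Taming flaky UI tests",
    "unique": True,
    "in_depth": True,
    "author_is_tester": True,
    "published": date(2021, 12, 3),
}
print(score(candidate, today=date(2021, 12, 10)))  # → 6
```

The real judgment call, of course, is in answering those questions for each link, which no formula captures.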

6. Diogo Nunes: How did you end up in Testing? Was it a choice at uni or a “happy accident”?

A bit of both! I studied Computer Science and I tried to get an internship as a software engineer after the second year.

I went to a few companies organising summer internships to do the qualification tests, mainly checking C++ and Java knowledge. I finished one of them faster and had time to do a QA test, too. I found it very intuitive and natural to me.

It turned out that I failed the programming tests but did quite well on the QA one, which got me invited to an interview. It went well and I was offered a three-month internship.

So I took that chance. After three months, I realised that I really enjoyed doing it and felt fairly good at it. As a result, I chose it as my career.

7. Safinah: What is your ultimate goal in software testing — to you, to the community and the industry?

It keeps changing.

I remember when I interviewed for my second job, I was asked where I see myself in the next five years. I said I wanted to have my own company that has a positive impact on software testing. It was 2015 and I was 23. I didn’t know what that would be about but that was the 5-year “goal”.

And sure enough, I did it. I founded More Than Testing on my 27th birthday — exactly 5 years later. Shortly after, I started Software Testing Weekly that now helps thousands of testers discover the best news every week.

Now that I’ve reached that goal, I’m thinking about how I can leverage what I’ve built to make even more impact. At the same time, I’ve recently taken over a leadership role and I’m growing in this field.

There are a few ways of answering that.

First of all, to find what type of content testers like most, I’d recommend checking out the 52nd issue with the most popular links of 2020. Also, the weekly Tweets with the three most popular links might be helpful.

Regarding the trends that I personally see in software testing:

  1. Shift-left testing (getting developers and teams more involved in testing and at earlier stages),
  2. Shift-right testing (ongoing monitoring, chaos engineering),
  3. Full-stack tester (e.g. performance testing is not a profession anymore, but a part of the skillset),
  4. Community-driven tools win it all (e.g. Cypress vs. TestCafe),
  5. Web and API testing are way more popular than mobile testing.

For more trends, I recommend checking out How They Test — a collection of articles on how the most popular tech companies test their software.

Also, Software Development and QA Trends to Watch in 2022 is another very interesting read that I found recently.

9. Ale: I really appreciate the work you do every week, but don’t you feel like our industry lacks a bit of innovation? Don’t get me wrong, there are a few jewels here and there, but there are also dozens of articles discussing the same old ideas from the past decade without providing a new angle.

I agree with you it may feel a bit like this but I don’t think it’s any different from other areas of software development (unless you look specifically at the rapidly changing JavaScript ecosystem!).

We need to look at a longer time horizon here. It takes time for innovation to be adopted and picked up by a wider audience.

For example, here are my personal top game-changers:

Even though they’re a few years old, all of them still apply today.

10. Anonymous: What are the reasonable, latest ways to measure QA performance, for performance evaluation, if metrics similar to “number of bugs” are vanity metrics and can be abused? There are a lot of guides on how to do performance evaluation of developers, but I can’t find good ones for QA.

Before we start, there’s one thing to remember — quality is hard to measure in numbers.

Also, there are no ideal metrics. The ones that work well in one team or project may not work well in another. It also depends on what you want to measure. And as you correctly pointed out, there are metrics to avoid, too.

Here are some examples:

  1. Product quality
    • The number of bugs reported by your customers — can tell you how customers perceive the quality of your service.
    • Product reviews score (e.g. on Apple/Google stores) — can tell you how your end-users see your service.
    • NPS (Net Promoter Score) rate — similarly, it can tell you how satisfied your end-users or customers are with your service.
  2. Team quality
    • Average time bugs stay open — can tell you about the state of your engineering culture (e.g. Are bugs triaged and prioritised? Is tech debt tackled regularly?).
    • The number of ignored tests — similarly, can tell you about the culture and respect for tests.
  3. Code quality
    • The number of release rollbacks — can tell you about the state of your software development life cycle process.
    • The number of hotfixes deployed — similarly, it can tell you about the state of your software development life cycle process.
  4. Test quality
    • Test flakiness rate — can tell you how stable and reliable your automated tests are.
  5. System performance
    • Uptime % of your services — can tell you about your system’s reliability.

The thing is, in most cases, they’re shared between Engineering, Product and Business. And quality plays a role in each of them.
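One metric from the list above, the test flakiness rate, is simple enough to compute directly. Here is a minimal sketch (my own illustration, with an invented input format): a test counts as flaky if it both passed and failed across repeated runs of the same code.

```python
# Minimal sketch of computing a test flakiness rate: the share of tests
# that produced more than one outcome across repeated runs of the same
# code. The input format is invented for illustration.
def flakiness_rate(runs):
    """runs: list of {test_name: "pass"/"fail"} dicts from repeated runs."""
    tests = set().union(*runs)
    flaky = 0
    for name in tests:
        outcomes = {run[name] for run in runs if name in run}
        if len(outcomes) > 1:  # both passed and failed -> flaky
            flaky += 1
    return flaky / len(tests)

runs = [
    {"login": "pass", "checkout": "pass", "search": "pass"},
    {"login": "pass", "checkout": "fail", "search": "pass"},
    {"login": "pass", "checkout": "pass", "search": "fail"},
]
print(flakiness_rate(runs))  # 2 of 3 tests flipped at least once
```

In practice you would pull the run history from your CI system rather than hand-writing it, but the rate itself is just this ratio.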

If you want to read more, have a look at the content posted on Software Testing Weekly using the search for “metric”.

It’s an interesting problem. I’ll have to ask some of my colleagues at work, since we deal with camera capture apps, and I’ll get back to you and update this post with more details.

What I’ve found so far is that some cloud device testing providers offer a camera injection solution that works with Appium:

  1. Sauce Labs’ Camera Image Injection.
  2. BrowserStack’s Camera Image Injection.

I believe in both cases you can create a free trial account to try it out.
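To give a feel for how this is driven from a test, here is a sketch of building the kind of JSON “executor” command BrowserStack’s Appium integration uses for camera image injection. The media URL is a placeholder, and you should check the vendor docs for the exact capability names and image upload flow before relying on this:

```python
# Sketch of a camera-image-injection command as sent from an Appium test
# via execute_script. The media URL is a placeholder; consult the vendor
# documentation for the actual upload flow and capability names.
import json

def camera_injection_command(media_url):
    """Build a browserstack_executor payload for injecting a camera image."""
    payload = {
        "action": "cameraImageInjection",
        "arguments": {"imageUrl": media_url},
    }
    return "browserstack_executor: " + json.dumps(payload)

cmd = camera_injection_command("media://<uploaded-image-id>")
# In a real test, this string would be passed to driver.execute_script(cmd)
print(cmd)
```

The point is that the injected image replaces the live camera feed on the remote device, so your app’s capture flow can be exercised end-to-end without a physical scene in front of the lens.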

A completely different approach to testing the camera feed with real data would be to use crowd-testing services. That way you specify exactly what you want tested, on what device and system version. It’s a proper end-to-end test with real people and objects on a real device. It’s a costly solution, however.

PS. Thanks for the coffee!

12. Primož: What is your favourite beer?

Zlatý Bažant 🍻


Thanks for all the questions, I really had fun answering them! Feel free to follow up on LinkedIn or Twitter.

Until next time!


Dawid Dylowicz

QA Lead, SDET with 8+ years of experience in software testing.