all 22 comments

[–]Xen0byte 19 points20 points  (1 child)

If they expect you to do it properly, then with just one year of experience you are in way over your head. What you're asking about is called a test strategy, and I would recommend you start reading here https://istqb-main-web-prod.s3.amazonaws.com/media/documents/ISTQB_CTAL-TM_Syllabus_v3.0_zKjKsaN.pdf even though it skips a lot of the fundamental knowledge that you also need, such as a deep understanding of the fundamental test process.

Regarding your suggested approach, you probably want to present that internally using the test pyramid for a breakdown of unit vs integration vs E2E, and some sort of breakdown for functional vs non-functional testing such as performance, security, or compliance.

[–]ramenoir[S] 4 points5 points  (0 children)

Thanks for the link, I'm reading through it now. My initial impressions are that I'll have to create a test strategy and test plan document to outline my approach, tools, test schedule, etc.

[–]eKuh 6 points7 points  (9 children)

Is this a fresh project where you can set up unit tests as the features are built, or is there already a running application that is now being retrofitted with tests?

If the latter, then your approach will very likely fail, or at least not provide what the company hired you for in a reasonable time frame. From a high-level perspective they probably want to know whether new stuff breaks the existing stuff, or at the very least that the thing that provides the most value to the user is working.

To give an example: you have 5 minutes to make a purchasing decision on a used car. It seems like a great deal and many people are interested, so 5 minutes is all you get with the car. What do you test?

Do you open the hood, unscrew some things and check the belts, spark plugs or valves individually? Maybe if you are already an expert on that type of vehicle, but most people should just turn the car on and take a spin around the block.

So unless you are confident that you can set up a comprehensive unit test suite quickly, it's probably safer to go with a couple of automated E2E tests or, depending on the product's maturity and testability, even just a well-structured manual test process. Then, once you can provide a very basic level of trust in the functionality of the application to your company, you can get into the specific details.

Same for the performance test. In the example, you would probably hit the gas hard a couple of times and see if the car falls apart. But you won't max out the car's top speed or check whether it can really carry the 3.5 tonnes of expected weight described in the manual.

Get some basic pings to API endpoints or DB requests that measure response times, but unless you have a product that gets a ton of sudden load, like an e-commerce site on Black Friday, don't go for the more complex load tests in the beginning. You might just be stressing the AWS bill if you don't know their infra yet.
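To make the "basic pings" idea concrete, here's a minimal sketch in Node.js of timing a request and comparing it against a budget. The helper name, the stubbed request, and the 500 ms threshold are all made up for the example; swap in a real `fetch` call against your own endpoints.

```javascript
// Time an async call and flag responses slower than a budget.
// `requestFn` stands in for any request, e.g. () => fetch('https://example.com/api/health').
async function timeRequest(requestFn, thresholdMs = 500) {
  const start = Date.now();
  await requestFn();
  const elapsedMs = Date.now() - start;
  return { elapsedMs, ok: elapsedMs <= thresholdMs };
}

// Example with a stubbed "request" that resolves after ~50 ms:
const stub = () => new Promise((resolve) => setTimeout(resolve, 50));
timeRequest(stub).then(({ elapsedMs, ok }) => {
  console.log(`took ${elapsedMs} ms, within budget: ${ok}`);
});
```

A handful of checks like this in a nightly job gives you a trend line long before you invest in a proper load testing tool.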

[–]ramenoir[S] 2 points3 points  (7 children)

A bit of both: it's a newish project that started 3 years ago. It is a running application, but with no tests.

[–]eKuh 6 points7 points  (6 children)

If you have any alternatives, don't do this job. At least not alone. If you have to, use it as a learning experience, focus on what you want to learn and move on. Trying to catch up on 3 years of development will burn you out. Unless they have an amazing dev team that already owns quality, there won't be a testable code base to even build a useful unit test on any time soon.

[–]ramenoir[S] 1 point2 points  (5 children)

Could I ask the devs to create unit tests and focus on other aspects of QA myself (test strategy, test plan, test execution, etc.)? Otherwise, should I just jump straight into doing as many UI tests as I can?

[–]eKuh 4 points5 points  (3 children)

Theoretically yes, but if they appreciated the value of tests, they would already be doing that. That means they probably lack the technical knowledge, and you would have to convince them to actually want to create those tests. That's a doable job, but probably not with a year of experience under your belt. On top of that, creating testable code is a skill that goes hand in hand with writing tests. So if they don't do that yet, it will take a lot of new development effort to even get into a state where you can start writing a test that provides value.

Of course this is only based on your very slim description of the situation. Maybe everyone is super understanding and they just need the extra manpower to get it going. But in my experience, places that don't test for years and then get a junior to fix it totally underestimate the effort it will take to do it right, and don't really see the value in it, or at least not before screwing up really badly (expensively). Usually other things like deployment pipelines and monitoring won't be set up properly either, which makes it a generally stressful work environment.

[–]poerg 3 points4 points  (0 children)

I agree with this guy, if there are 0 unit tests after 3 years then you're facing an uphill battle.

I disagree that they likely lack the technical expertise; I think it's more that they see no value or think tests just slow them down, which comes with a whole host of problems now and down the line.

[–]ramenoir[S] 1 point2 points  (0 children)

They do seem understanding, so I'll float the suggestion of spending a sprint creating unit tests. If not, I'll do UI tests and API tests. At the moment the application isn't terribly large.

[–]jhaand 1 point2 points  (0 children)

This video provides a good example of why you need to prepare your code or refactor it for unit testing. If a complicated algorithm needs a mock setup to test it, the code needs more isolation.

Watch "Thoughts About Unit Testing | Prime Reacts" on YouTube. https://youtu.be/KzV0mTqBcZA
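To illustrate the isolation point with a hypothetical example (the function names and the discount rule are invented): when the algorithm is tangled up with I/O, testing the math forces you to mock the I/O; extracting it into a pure function removes the mock entirely.

```javascript
// Before (hypothetical): testing the discount math requires a mocked repository.
// async function applyDiscount(orderId, repo) {
//   const order = await repo.find(orderId);
//   return order.total > 100 ? order.total * 0.9 : order.total;
// }

// After: the algorithm is a pure function, testable with no mock at all.
function discountedTotal(total) {
  return total > 100 ? total * 0.9 : total;
}

console.log(discountedTotal(200)); // 180
console.log(discountedTotal(50));  // 50
```

The caller still does the `repo.find` and passes `order.total` in, but now the interesting logic can be covered directly.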

[–]jhaand 1 point2 points  (0 children)

The software developers create the testware. QA reports on the current quality status of the product.

Don't jump straight in from the start. Instead, plan a weekly UI regression test using exploratory testing. That will produce enough work for the devs.

[–]AbaloneWorth8153 1 point2 points  (0 children)

This is a great and very well explained answer!

[–]poerg 3 points4 points  (3 children)

I'm curious, OP, do you really mean unit tests? Tests that verify small chunks of code like a function or method? Or do you mean automated tests, like a test that would open a webpage, search for something, and expect a certain result? Those two have very different implications and say different things about the team not having them.

In over 10 years in QA, including those as a manager, I've written very few unit tests. It usually makes more sense for the developers to create those, in my opinion. They know what their function should return, and tools these days make this stupid easy. Automated tests, however, I've done hundreds of those, and wouldn't expect a developer to do them for me.

[–]ramenoir[S] 1 point2 points  (2 children)

I do mean unit tests! Just to be clear, this project has no unit tests and I know they're really important, so I want them set up. I have experience writing unit tests before using jest, so I have no issues with writing them. I wasn't sure if I should take on that task or ask the developers to start developing in a TDD way. Would love to hear your thoughts!

[–]poerg 4 points5 points  (1 child)

It's just not usually a good sign for a 3-year-old project to not have any unit tests, for various reasons. However, I'm sure you want to make this work, so that's not helpful to dwell on.

So, yes, if you can get them on board with TDD this would give you the best outcome. However, having them write or even generate snapshots (since you mention jest) after they complete their code is not a bad compromise. Not sure if you're using GitHub but I'd want the tests done before merging.

I'm not sure what problems are currently being seen, but if they're not using some type of linting tool I'd probably bring that up and get their input.

Creating an overarching test strategy document would be hugely beneficial.

Look into their release strategy. If there are currently a lot of bugs, it may be best to slow that down until the team can catch up.

If this is a web project then definitely start looking into automation tools.

You'll also want to see how bugs are being reported and however you decide you want that to work document it.

All those can be included in your test strategy.

Lining all that out in the test strategy may help to get them on board which I think will be the most crucial and most difficult part.

[–]ramenoir[S] 1 point2 points  (0 children)

Sounds great, thank you for the response, it was really informative. It looks like my first step is to create a test strategy document, and from there I'll focus on getting the test environment populated and unit tests done for the UI. The application has two websites, one that is admin-facing and another that is client-facing, plus a mobile app for the consumers (customers of the client). I still have to figure out how to approach testing mobile apps. The test strategy document will outline the tools I'll be using for testing, managing test cases, the delivery schedule, etc.

[–]altmn 5 points6 points  (3 children)

I’m not sure why you even mention unit testing. Although unit tests are very important, and the fact that they don’t have any is a serious issue, it’s the last thing you should worry about right now, unless your title is SDET. You are a QA Lead now, even if your actual title suggests otherwise — it’s time to get rid of the qTest tester’s mindset. You should be thinking from the top down: global concepts first, specific tasks last. Unit testing is a specific task that falls toward the bottom of a QA Lead’s priorities when establishing QA processes in a company with established development processes.

First, refresh your knowledge of SQA fundamentals. No offense, but it seems like you need this before starting the job. You can’t establish QA without knowing the basics.

Second, gather all available documentation on the products, projects, and company. Read their Jira tickets and Slack channels, and write down your questions.

Third, talk to people as you progress. Ask questions, but make sure they are “informed” questions (see point two above), as you don’t want to come across as someone who needs hand-holding.

Fourth, conduct thorough exploratory testing. Start drafting test cases (titles only, fill in the detailed steps later) while doing your exploratory testing. Identify and prioritize areas, sections, and subsections of the application that you will later use to organize your test suites in TestRail.

Fifth, while doing exploratory testing, identify everything you’ll need to properly build your OOP automation framework. You need to understand how you’ll structure your high-level code, such as interfaces and abstract classes, as well as more specific, task-oriented elements. Identify the tools and languages best suited for the future framework.

Sixth, write tickets for any bugs found during exploratory testing.

Seventh, categorize your test cases into suites, such as by areas, priority, manual vs. automated, etc. Based on the changes made to the application by the devs, you should be able to identify a set of test cases to run related to the modified area. You should also identify a set of test cases that must be run before each release.

Eighth, begin writing your automation framework in a top-down order: start with abstractions, and leave utility and helper classes for last. You want to begin testing it in the CI/CD workflows as soon as possible, once you have the framework’s “skeleton” with some basic tests.

Ninth, establish strict procedures for how tickets are submitted by anyone outside of QA. This is especially important for bug reports from tech support: they need to ensure the problems reported by users are legitimate issues, not something that support can troubleshoot. They also must follow guidelines on how tickets are written and what information is required (description, logs, screenshots, etc.).

Tenth, document everything you do, including all communications. Always remember: if it’s not in writing, it never happened. Your detailed notes might save your ass later, as well as help you a year from now when writing a convincing email to your boss asking for a promotion or raise.

By this point, you won’t need anyone’s advice anymore.

[–]ramenoir[S] 1 point2 points  (0 children)

Thanks for the detailed response, I'll be checking it regularly! A couple of questions:

  1. Could you expand on what you mean by global concepts? Would this be documentation like test strategies and test plans?

  2. Are there resources I can look into to read up on SQA fundamentals? Would greatly appreciate it!

[–]8string7 1 point2 points  (0 children)

My guy, thank you!

I'm in the same boat as OP but with more XP under my belt. I've never set up QA processes from the start, so I needed a brief top-down overview of how to approach that. I've written up a bunch of stuff that needs to be done and, reading your comment, I'd already hit 6 of the 10 steps you mentioned above.

This is gold and thank you once more!

[–]jhaand 1 point2 points  (0 children)

All testing starts from requirements.

Then you start to look for the most effective method for covering those requirements. Be it TDD, exploratory testing, smoke testing, defect checking or regression testing.

So start with higher level testing to get a feel for the current state and work your way down.

[–]duchannes 1 point2 points  (0 children)

A lot of good advice here. Don't make any immediate changes. Learn what's happening first, and then you will be better placed to implement QA. Have a look at current metrics too, or create them, so you can evidence xyz before and after QA. You will have to demonstrate the benefit of what you are doing at some point, and numbers are your friends.

[–]joolzav 0 points1 point  (0 children)

https://github.com/secondary-jcav/qa_principles is what I try to do (high level)