No no no. They do not set the ratio like that knowing QA will do a bad job on purpose. Anyone with half a brain can explain to a board why that’s a bad idea and the risk around it. Do you have any proof at all that companies knowingly build up a department that costs them a bunch of money and only prevents issues some of the time? No, that’s absurd. I have seen first hand, as a developer, one QA resource cover up to 5 developers without issues. I have managed teams, and still do, that keep up at a 1:4 ratio. It can be done and has been; just because you’ve not experienced it doesn’t make it impossible.
Yes, this company takes on risk like in a movie, beating the competition to market. And what if their solution causes massive issues because QA, forced to cut corners, misses some critical bugs? They forever lose that entire market. The company could be out of business instead of still competing in that space. No investor would risk money like that. Where are you getting these ideas from?
Simple. I’ve been a QA, worked with QAs, been to conferences with QAs. We tell the boss we can’t cover the whole thing; they say just cover the most important stuff. The general advice from veteran QAs is to not even say that to the boss. They know, and they can’t get more resources. So veteran QAs advise others to get a feel for how much time you can spend on a thing before people start complaining that you’re holding it up, then work within that timeframe. As long as nothing major gets through, it’s all good. Your view is one of survivor bias. Nothing big got through, but that doesn’t mean it was completely tested. It’s good enough, until it isn’t.

Side note: I’ve seen product managers close bugs, not because they weren’t bugs, but because they weren’t bad enough compared to features they thought would sell more software. That was an outlier; usually they just wait a few years and mass close everything that is X years old. That, I have personally seen everywhere I have worked.
My view is survivor bias, sure.
And your view is based on anecdotes and assumptions, things you’ve seen that you assume apply to the entire industry because you’ve worked in QA. Well, I’ve worked at multiple companies, in roles from product to engineering, working my way up to CTO. I talk to other CTOs and understand how their teams run and how they fail. I have to make the decisions that keep our tech running, and deal with the consequences when it isn’t. So forgive me if I don’t put a ton of stock in your claim that “quality doesn’t matter” when I’ve had multiple conversations with executives and multiple experiences that prove that to be false.
Bottom line is, I’ve told you my point of view and you disagree; that’s fine. You don’t work for me, so I don’t need to worry about it. If you truly think quality doesn’t matter and that’s working for you, have at it.
https://www.abc.net.au/news/2024-07-19/technology-shutdown-abc-media-banks-institutions/104119960

They didn’t even test the update before pushing it. And the company will still exist in a month. They will take a stock hit, and might lose some customers, but in the end it will just be a blip.
Jesus dude, you’re still on this? I wrote this convo off forever ago.
This will destroy CrowdStrike. They will not exist in a year. This is not “just a blip” lol. Many companies have collapsed over much less than this.
You make so many assumptions. How do you know they didn’t test it? How do you know the wrong build didn’t go out? Your entire stance is based on assumptions fed by anecdotes from limited experience.
Remember SolarWinds… that was supposed to destroy them too. Still here though.
You seemed surprised that companies would risk a catastrophic bug because it would destroy them. This will be the evidence that it won’t. And yes, the most likely cause is that the copy process failed somewhere along the line. But verifying the hash of the update you ship against the build you actually tested is part of QA, and clearly that check didn’t happen somewhere critical.
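For what it’s worth, that check is trivial to automate. Here’s a minimal sketch in Python with hypothetical artifact paths (a real pipeline would compare against a digest recorded when QA signed off, not re-hash a local copy):

    import hashlib

    def sha256_of(path: str) -> str:
        # Hash the file in chunks so large update artifacts aren't loaded into memory at once.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical paths: the build QA signed off on vs. the artifact staged for release.
    tested = sha256_of("builds/update-qa-signed.bin")
    staged = sha256_of("staging/update-release.bin")

    if tested != staged:
        raise SystemExit("Staged artifact does not match the tested build; abort the push.")
    print("Hashes match: shipping the exact bytes QA tested.")

If the copy process really did corrupt or swap the file, a gate like this at the release step catches it before a single endpoint ever sees the update.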