by nonasktell on 8/25/22, 1:52 AM with 35 comments
It's not critical infrastructure or anything close to that, but we deal with the personal data of pretty important employees of those companies, and have an app installed on their phones.
And the "security" culture seems like the worst you could imagine. I wonder if it's (almost) the norm in the industry or just bad luck; I'd love to hear your opinions and stories on this topic.
A few months ago, I found a vulnerability allowing any user (and I do mean ANY user) to log in as an admin on our homemade administration space with a simple trick. That would let them change almost any "trusted" content users see in our app: text, images, videos, links, or files that users will be asked to open/download. It would also let them delete users, dump a list of all users...
Fixing this has been at the bottom of the todo list for months, and no one seems to care; no one is assigned to it. When it's brought up, people are like "ooh yeah, that's like really really bad, but what about [tiny useless feature that one user asked about]? We don't have the time to fix that!"
Hundreds of employees' passwords for what is, AFAIK, our largest client are stored in PLAINTEXT. (Same story as the vulnerability: bottom of the todo list, no one cares.)
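The usual fix for the plaintext-password issue above is salted, memory-hard hashing. A minimal sketch using only Python's standard library (function names here are illustrative, not from any codebase in the thread):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the plaintext."""
    salt = os.urandom(16)  # unique per user, so identical passwords differ
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

In practice a dedicated library (bcrypt, argon2) is preferable, but even this stdlib version means a database dump no longer exposes every credential directly.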
Oh, and did I mention that any logged-in user can call almost any API endpoint with almost zero verification that they're allowed to call it?
Those 3 issues are still there after months even though everybody is aware of them.
Would you just get the hell out without looking back? Or keep trying to improve things, even when it takes months just to get them to care enough to maybe start thinking about assigning a dev to a problem?
How is it at your current/past companies?
Thanks
by metadat on 8/25/22, 2:43 AM
The unfortunate reality is that the only requirement for the class of companies called "startups" is that someone convinced someone else to give them money, and now they're the boss. This can lead to all sorts of things... like what you're seeing.
I'd bail to somewhere else with more professional leadership. Even if this one instance gets fixed, the deficit is indicative of a cultural problem that I've yet to see really change.
Not taking the security of your customers seriously is guaranteed poison in the medium to long term.
by aintmeit on 8/25/22, 3:11 AM
Because companies have a wide range of options for dealing with the consequences of leaks, they'll prioritize security last instead of shifting left. Some common responses by companies include:
- denying there's a problem
- covering up the problem
- acknowledging the problem in a blog post and demanding to be praised for the disclosure
- blaming employees for the mistake
To make a good plan, you can break the concerns down piecemeal. What's the worst-case scenario if attackers get hold of employee passwords? What happens when users trust tampered content?
by raffraffraff on 8/25/22, 6:02 AM
by giantg2 on 8/25/22, 1:30 PM
More or less, yes. They generally talk about security, but it's mostly just lip service. Although I will say your examples are pretty extreme.
I once worked as the security champion on a team responsible for a major financial system the company uses internally (thankfully) for trading. There was a SQL injection problem on every page/input we built in that system. And it ran with schema-owner privileges, so you could drop tables and such. This could even happen by accident: trading-desk devs could hit a table-name collision and accidentally paste SQL into a field, to say nothing of the malicious possibilities.
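The injection class described above is eliminated by parameterized queries, which keep user input as data rather than executable SQL. A small stdlib sketch (sqlite3 here; table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, ticker TEXT)")
conn.execute("INSERT INTO trades VALUES (1, 'ACME')")

# Hostile input stays inert: the placeholder binds it as a literal string.
user_input = "ACME'; DROP TABLE trades; --"
rows = conn.execute(
    "SELECT id FROM trades WHERE ticker = ?", (user_input,)
).fetchall()

# No match for the weird literal, and the table was never dropped.
count = conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
```

The same placeholder pattern exists in every mainstream database driver; string concatenation into SQL is the bug, regardless of the engine.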
I brought this up with the principal to see which of the two remediation plans he wanted to pursue and what resources he would provide for the work. Apparently they preferred their own option: do nothing. They said there's an automatic near-realtime backup of the database (I forget the name). Would it replicate dropped tables? How far back can you restore? Has it ever been tested? What are the procedures for restoring? How long does a restore take (even a 15-minute outage is the end of the world according to the business)? They didn't know any of it, and they didn't really care to.
I promptly left that team. They just wanted a security champion to do paperwork for regulatory compliance. I had no real power to make improvements beyond the smaller stuff I could do myself.
by zivkovicp on 8/25/22, 8:25 AM
With that said, my experience is that security issues get taken care of when they're very dangerous, or only after the main product tasks are done. Fixing minor issues in a (NEW) product/company that might not survive the year is comparatively a waste of time... survival usually comes first.
by speedgoose on 8/25/22, 4:44 AM
If you can’t do that, or people get upset because you did, you should probably start looking for a better job.
by herbst on 8/25/22, 7:23 AM
I guess it's less about startup culture and more about bad culture in general.
by powerhour on 8/25/22, 4:47 AM
It's probably time to start interviewing. Ideally before your employer's security woes become headline news. (Of course, if you've only been there a few months you can easily leave them off your resume once word of the flaws gets out.)
by oreally on 8/25/22, 5:19 AM
by muzani on 8/25/22, 8:54 AM
That company still hasn't died, and they're popular in the community. Natural selection applies in the business world, but some startups really are the cockroaches of the world.
That said, it's not common; it happens in maybe 2-3 jobs out of 10. Often these are the shops that say yes to everything the customer asks for, which puts them in a peculiar niche.
by lasereyes136 on 8/25/22, 12:29 PM
AppSec is never the most important thing on any product manager's or product owner's list. No one says AppSec isn't important, just that other things are more important. Sometimes the best you can do is talk them into letting the people who care dedicate some of their time to fixing security issues. While it seems bad, if this is a product or company you care about, working to make it better makes sense. You might also consider adding security concerns to requirements, acceptance criteria, and code reviews (if you have those) to stop things from getting worse.
by superchroma on 8/25/22, 3:56 AM
by msarrel on 8/25/22, 8:12 PM
by sergiotapia on 8/25/22, 6:50 PM
by gitgud on 8/25/22, 8:31 AM
This means security might not be valued, but that's a risk the business takes in order to put more resources towards the multitude of problems start-ups face...
by zach_garwood on 8/25/22, 3:04 PM
by aprdm on 8/25/22, 5:31 AM
by danielmarkbruce on 8/25/22, 4:05 AM
by mountainriver on 8/25/22, 3:01 PM
by dev_0 on 8/25/22, 2:59 AM
by GentWhoCodes on 8/25/22, 9:47 AM
Poorly securing personal data in the UK/EU is rather illegal. So no, any reputable shop *will* care about security, ensuring personal data is kept secure and bugs are patched as a priority.
Going up the food chain, "huge clients which you all heard of, multinationals, some parts of governments" will not be impressed if you're found to be slack about protecting customers' data.
by cpach on 8/25/22, 11:44 AM