
Corporate Cultural Issues Hold Back Secure Software Development (betanews.com) 57

An anonymous reader shares a report: As the digital economy expands and software becomes more critical, security worries grow. In a new survey, 74 percent of respondents agree that security threats due to software and code issues are a growing concern. The study of over 1,200 IT leaders, conducted by analysts Freeform Dynamics for software company CA Technologies, finds 58 percent of respondents cite existing culture and lack of skills as hurdles to being able to embed security within processes. In addition, only 24 percent strongly agree that their organization's culture and practices support collaboration across development, operations and security. On top of cultural limitations, less than a quarter of respondents strongly agree that senior management understands the importance of not sacrificing security for time-to-market success.
  • by arth1 ( 260657 ) on Monday January 22, 2018 @03:04PM (#55980359) Homepage Journal

    I think it's likely worse than that. The problem is that the "respondents" aren't necessarily people who are good at spotting where the problems are or what their nature is.

  • by enjar ( 249223 ) on Monday January 22, 2018 @03:17PM (#55980473) Homepage

    In other news, Microsoft finds that adopting Windows will work best for your company, Monsanto funds a study to say their crops are the sure way to make money as a farmer, Ford funds a study that says they make the best cars and trucks, Coca-Cola funds a study that finds their products are the most liked, etc.

    I don't disagree that security is a problem. I just have a fair bit of skepticism that a study funded by Computer Associates, the takeover-and-neglect artists of the software world, is really going to get to the root issues that make integrating security into software development processes so hard, at least not without a fair helping of "we can send an army of consultants to help you for a fee, in addition to licensing some software we acquired and will resell/license to you at a pretty large markup".

    • Agreed, and furthermore, the article doesn't specifically state who they interviewed. I originally read that to mean it was an INTERNAL study.

      Also, I find "cultural issues" to quite possibly be the best euphemism ever used to describe "greed."

  • by Anonymous Coward

    I have seen this myself. I had a DevOps job (which I leaped from, to a far better place) where the Scrum master had the power to recommend terminations (and managers rubber-stamped them), and the dev team was always in a sprint. Every morning there would be a 4-6 hour stand-up meeting where each dev would be interrogated about their code and where their build stood. Each coder had to justify their existence and explain why they fell short of the standing order of a huge number of lines per day (around 10,000).

    • by zifn4b ( 1040588 ) on Monday January 22, 2018 @03:42PM (#55980641)

      I have seen this myself. I had a DevOps job (which I leaped from, to a far better place) where the Scrum master had the power to recommend terminations (and managers rubber-stamped them), and the dev team was always in a sprint

      And then one day, management and the Scrum master awoke on a bright, cheery morning. Sun shining, birds chirping. What a glorious morning! They got their Starbucks coffee, kissed their wives and kids, and drove into work with cheerful, positive music playing, thinking very highly of themselves and how fortunate their company was to have them. For without them, the company wouldn't be able to function.

      The Scrum master skipped through the door, a smile on their face and a spring in their step, and went into the empty meeting room where the daily stand-up was supposed to be. Five to ten minutes passed after the stand-up was supposed to start, and still there was no one there but the Scrum master. Why could this be? Where was everyone? They must be late! Those lazy, no-good slackers! And then it dawned on the Scrum master: "I fired everybody because they didn't meet my ridiculous expectations, and I can't write a line of code to save my life." Without anyone left to scapegoat, because they had all been fired, the Scrum master panicked, chaos ensued, angry customers called the CEO, and eventually the business closed its doors and faded into history as another failed startup.

    • by Junta ( 36770 ) on Monday January 22, 2018 @03:46PM (#55980693)

      Hyperbole aside, this isn't new to 'DevOps', though I will admit that in some circles DevOps gives that thought process an official blessing.

      For as long as humans have been doing things, processes in bad groups have devolved into this sort of blind, mad grasping at 'productivity', spending more time fretting over whether work is being done than actually doing the work. Each fad promises to 'correct' the overhead ratio of the previous fad, either never realizing or intentionally ignoring the reality that people are the problem and will pervert any methodology that purports to fix them.

      Meanwhile, good teams operating within good larger organizations will succeed with whatever project management/development fad they nominally use.

    • I have a feeling I've worked with you before.

  • by zifn4b ( 1040588 ) on Monday January 22, 2018 @03:34PM (#55980583)
    1) Corporate cultural issues, aka employee engagement. Seriously, if upper management is toxic and plays psychological games, who is going to give a shit about your software on any level, let alone security?
    2) Lack of software engineers with the appropriate level of skill, education and experience. But you know, it's because we "can't find qualified candidates", aka unicorns who will take minimum wage as compensation.
    3) Companies that don't take security and risk seriously. Hey, why do we need to take this seriously now? We didn't take it seriously 20-30 years ago, and now you're asking me to spend more money than I used to on "best practice"? You're just trying to trick me into giving away my precious money on things we really don't need, like all those RAD tools I've been pitched over the years...

    I could go on ad nauseam here, but the TL;DR is: if you treat your employees like expendable pieces of shit who can supposedly be replaced by interns and contractors, and tell them they should be thankful for it, your software is going to be shit on every level, not just security.
    • I think 3 is more insidious than you give it credit for: "We didn't take it seriously 20-30 years ago and now you're asking me to spend more money than I used to on 'best practice'?"

      There is a perverse logic to, "It's never bitten us in the ass before, why should we start worrying now?" That's doubly so when success before was predicated on not giving a shit about security, and beating someone else to the market who did.

      This works very, very well, up until the point it doesn't. The problem is that the data points build up supporting this flawed logic every time it's a successful gamble.

      And then, what if it's not a gamble? Look at Windows: it has spent 20+ years as a multi-billion-dollar-making piece of Swiss cheese. From a manager's standpoint, it's hard to point at that sort of success and say, "We should have held back some versions another six months to a year and fixed some of the bugs."

      It's not a corporate culture issue as much as it is a dollars-and-sense argument. If an early, flawed release is going to make you more dollars than a later, secure one, it makes sense to release your software early, holes and all. Over time this may harden into corporate culture, but I'd argue that once it stopped making financial sense, most companies would revisit the habit.
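
      To see the shape of that argument, here's a toy expected-value calculation in Python. Every figure is invented purely to illustrate the gamble; nothing here comes from the article or the survey.

          # Toy comparison: ship early with holes vs. ship late but hardened.
          # All numbers are made up for illustration.
          revenue_early = 10_000_000   # first-mover revenue from shipping now
          revenue_late = 6_000_000     # revenue after a rival beats you to market
          breach_probability = 0.15    # chance the known holes get exploited
          breach_cost = 8_000_000      # cleanup, fines, reputation damage

          ev_early = revenue_early - breach_probability * breach_cost
          ev_late = revenue_late

          print(f"ship early: ${ev_early:,.0f}   ship late: ${ev_late:,.0f}")
          # ship early: $8,800,000   ship late: $6,000,000

      With these made-up numbers the leaky release still wins, which is exactly the manager's logic; it only flips once breach_probability * breach_cost outgrows the first-mover advantage.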

    • by enjar ( 249223 )
      2) Is also difficult because you might need to re-architect and refactor parts of your code base. Even when you find a qualified person to work on the security issues turned up by a code audit, the fixes can mean API changes, behavior changes in critical functionality, and other such fallout. We have people on staff at my company who do this kind of work, but it's not something you bang out in an afternoon and call done.
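
      For a concrete (and entirely hypothetical) sketch of how an audit fix ripples into API changes, imagine a query helper flagged for SQL injection; the function names and schema below are invented:

          import sqlite3

          # Before: callers pass a raw WHERE clause, so any caller can inject SQL.
          def find_users_unsafe(conn: sqlite3.Connection, where_clause: str):
              return conn.execute(
                  f"SELECT id, name FROM users WHERE {where_clause}").fetchall()

          # After the audit: the signature itself changes. The value travels as
          # a bound parameter, never as SQL text, so every existing call site
          # has to be found and rewritten.
          def find_users_by_name(conn: sqlite3.Connection, name: str):
              return conn.execute(
                  "SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()

      The fix itself is a one-liner; chasing down every caller of the old signature is the part that doesn't fit in an afternoon.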
  • by fish_in_the_c ( 577259 ) on Monday January 22, 2018 @03:38PM (#55980605)

    "On top of cultural limitations, less than a quarter of respondents strongly agree that senior management understands the importance of not sacrificing security for time-to-market success."

    Time to market can make or break profitability. So how much liability are you willing to accept to make a profit? If the alternative is bankruptcy, I'd say most managers would 'go for broke' and accept whatever risk is necessary.

    That is much the same reason we have the FDA: large food companies, when left unregulated, have in the past killed more than a few people in the name of greater profit.

    Sooner or later there will be more regulation of security-critical software than there is now. That will slow down innovation but prevent loss of life and property. It's a trade-off that must be struck somewhere, and whether it's good for everyone or not, it will be settled by public perception.

  • by Junta ( 36770 ) on Monday January 22, 2018 @03:38PM (#55980611)

    less than a quarter of respondents strongly agree that senior management understands the importance of not sacrificing security for time-to-market success.

    So the problem may be that senior management *does* understand, and the answer just isn't one that security experts like. Financially speaking, it may make sense to be a little fast and loose with security, or at least faster and looser than the hardline security guys want. Security problems represent a liability, but how much varies: in some cases not much, while in others a breach *could* ruin your company, depending on what sort of company you are, the data you hold, and which part of that data could hypothetically be compromised through the subsystem at hand. That liability has to be weighed against the cost of prevention, both in staffing/consulting and in the opportunity cost when your paranoia makes you skip a scary feature that your competitor implements, or ship a year later than a competitor.

    Complicating things, there's a disconnect between paranoid security practices and where the largest breaches actually come from. The vast majority of breaches come from someone putting a crappy credential on something, which is overwhelmingly bad practice and overwhelmingly basic. Yet the industry's reaction to a breach is for the security guys to go to town, enforcing ever more draconian limitations through ever more inscrutable approaches, even though the existing processes would have defended them adequately if applied correctly. It's like never taking a shower because you keep reading about people drowning in the ocean: the risk isn't in your shower, but water-related death is a thing, so why take a chance?
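
    (For the concrete flavor of "crappy credential", here is a minimal Python sketch of the anti-pattern and the basic fix; the names are invented for illustration.)

        import os

        # The anti-pattern behind a huge share of real breaches: a default,
        # shared, or hardcoded credential that ships with every install.
        DB_PASSWORD = "admin123"

        # The overwhelmingly basic fix: demand a real secret from the
        # environment and refuse to start with a default.
        def get_db_password() -> str:
            password = os.environ.get("DB_PASSWORD")
            if not password:
                raise RuntimeError(
                    "DB_PASSWORD is not set; refusing to fall back to a default")
            return password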

    Meanwhile, time to market and availability both suffer when the security-focused guys rule. In their job description there is *insane* risk attached to ever saying 'yes' and generally not much risk attached to saying 'no'. They also know damn well that for all their effort they will *not* get the whole picture: whether the team being reviewed intends to hide things or not, the reviewers will never catch every poor security decision. That drives them to be even more paranoid, throwing general roadblocks in front of everything the company does in the hope of catching the mistakes they would otherwise miss.

    The general corporate reality of 'external' security teams reviewing the work of 'non-security' teams leaves a lot of room both for the worst security policies to strangle productivity and for insecure designs to slip through that the security team will be oblivious to. The answer is an embedded understanding of security principles in the day-to-day work, but that truth is too inconvenient, because it's an expensive proposition. Companies would rather take unskilled folks and duct-tape security on by having a small band of security 'experts' tick a checkbox in the process.
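
    Embedding security in the day-to-day can be as mundane as making the normal test run fail on known-bad patterns. A minimal sketch in Python, assuming the open-source Bandit linter is installed and the code lives under src/:

        import subprocess

        def test_no_known_insecure_patterns():
            # Run the Bandit static analyzer over the source tree as part of
            # the ordinary test suite; Bandit exits non-zero when it finds
            # issues, so insecure patterns fail the build like any other bug.
            result = subprocess.run(
                ["bandit", "-r", "src", "-q"],
                capture_output=True, text=True)
            assert result.returncode == 0, result.stdout

    The point isn't the specific tool; it's that the check runs where the developers already live, instead of in a quarterly review.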

  • Bottom Line (Score:5, Insightful)

    by jmccue ( 834797 ) on Monday January 22, 2018 @03:44PM (#55980673) Homepage

    Corporate Bottom Line Holds Back Secure Software Development

    FTFY

  • Animal psychology issues hold back a lot more than that... culture is quite secondary. There is no mystery as to causation. The brain stem is still the master.

  • by Gravis Zero ( 934156 ) on Monday January 22, 2018 @03:48PM (#55980709)

    Just tie the pay of the managers to security. If there are security issues then the managers start losing money. Problem solved!

    • by Anonymous Coward

      That already happens, but not in a vacuum.

      The manager has to decide which is worse: the added risk of a security lapse, or not delivering the product. And don't just say "any security lapse means the manager goes to jail". You either deliver a product or you don't. It'll never be 100% perfect, but if it isn't delivered, it's a 100% failure.

  • Maybe locks themselves, and hopefully safes and vaults, but virtually no other industry could afford to be as security-focused as we're requiring software to be here.

    My car door is easily opened without the key. The wipers are easily removed. A Jeep can be dismantled with an Allen key.

    My front door has a deadbolt, with a big glass window next to it -- no bars.

    My telecom services are connected in a box on the front lawn, right next to the water shutoff.

    Virtually everything in our lives is completely insecure.

    We enc

  • No one wants security, and not just in code.

    I mean, in order for people to use safety gear that can save their lives, we need strict regulations and punishments. And I don't mean employers not caring about employee safety; I mean employees having everything they need at their disposal and still not using it, despite the signs.

    Security is about spending time on non-functional stuff and restraining yourself. People hate that. And even if it is provably useful, it goes against human instinct, which is to prioritiz

  • As I wrote in my book High-Assurance Design:
    • The average programmer is woefully untrained in basic principles related to reliability and security.
    • The tools available to programmers are woefully inadequate to expect that the average programmer can produce reliable and secure applications.
    • Organizations that procure applications are woefully unaware of this state of affairs, and take far too much for granted with regard to security and reliability.

    Keep up the good work! But please don't ask me to help.
