Linus - Linux and Security - follow-up

Published: 2008-07-31
Last Updated: 2008-07-31 11:27:48 UTC
by Swa Frantzen (Version: 1)

As promised in an earlier diary, here is some follow-up based on your comments.

By far, most of the comments we received spoke out either in favor of or against full disclosure. Some selected comments:

  • Neal: "Unless you are doing a red-team audit, working exploits are never needed. I could build a bomb in my home to prove it can be done, but the actual construction is never needed. If your goal is only to explain that it is possible, then there is no need to do it. (At Defcon last year, one guy tried to explain why working exploits were needed. I told him to prove to me that gravity is dangerous -- go jump off a cliff.)"
  • Guy: "While full disclosure is far from perfect, it's the best solution in the absence of any alternative. For commercial vendors, it's often the only reason to quickly fix a security vulnerability (look at Microsoft's track record). It also protects the finder of the hole from being silenced instead of fixing the bug."
  • Joshua: "How are you going to get the full picture without exploit code, thinking the software manufacturers are going to release this information to security researchers is naive. This is kind of like making bullet proof vests without ever firing a gun or testing it." To which one of the other handlers replied: "Yea but who is wearing the vest during testing? In the FD world those of us wearing a vest can be "test" shot by anyone since FD provides the guns and ammo."

I guess we won't settle such widely diverging viewpoints on full disclosure anytime soon.

What stood out to me is that most of the pro-full-disclosure arguments concern issues with commercial and/or closed source software, where the vendors are unwilling to admit they have sold highly broken products to consumers at large. If you read back to the original subject, we were discussing Linux: open source software where the community at large is the developer ...

Regarding Linus' viewpoint of not providing any hint of a security bug aside from the fix itself in the source code, it stood out that nobody spoke up to defend it. On the contrary:

  • Alan: "Apparently those who develop code cannot be relied upon to provide the information needed by the public to manage their patch processes -- even for major open source projects."
  • Guy: "I do not believe that it is possible to keep security issues secret. Some attacker might find it during the time that it wasn't disclosed and exploit it on his own (or sell it)."
  • A reader wishing anonymity: "I am troubled by the security through obscurity approach Torvalds puts forward."

Similarly, I've seen little indication that our readers think security bugs should be treated just like ordinary bugs.

An important reminder came from Jos: "What exactly is a security bug? I think it's not only about unauthorized access, it's also every bug that could lead to data corruption or less availability of a time-critical application." We should all be able to agree that security is about managing risk to the "CIA" triad: Confidentiality, Integrity, and Availability. It is also something we all too often know but fail to apply in the way we speak about things.

  • Morten: "I think Aidan Thornton has an important point, although obvious in his own words. Bugs can hit you on three levels:
    1. You lose availability of your system (crash, just restart the program.)
    2. You lose data (silent data corruption, just use a backup, which you do of course have!)
    3. You lose your money!
    "Normal" bugs hit you on level 1 and possibly on level 2, security holes can hit you on *all* levels, so they ARE different, and need to be found, fixed and rolled out BEFORE they hit anyone. You can't just restart your money, or roll them back in from a backup (mmmmmm... 8-] if only...). Security bugs must be hunted actively, normal bugs you just catch in traps ..."
  • Niel had an interesting observation: "[Security bugs are different from normal bugs] due to intent, but not due to the fundamental cause." This is interesting in itself, as it shows a relationship between developers (who unwittingly create the bugs), QA (who try to find the bugs), and security people (who extend the view of bugs towards motivation). In Niel's words: "There is a big difference between developers, quality assurance (software testers), and security people.
    • Developers may not think about how the code will be used beyond the specifications.
    • QA must think differently, to make sure the code does what it should do (alpha testing) and does not do what it shouldn't (beta testing). In addition, not every QA person does development.
    • Security people extend QA to motivation. "Security staff" are a special extension to QA and usually need to have developer skills in order to evaluate exploits, create solutions, and implement test cases. In this regard, security people ARE better than normal developers and QA because they must see a bigger picture and be able to react to it.
    NOTE: I never said that developers were not important, nor that QA was not important. Just as every automobile driver does not need to be a mechanic, every developer does not need to be a security expert or QA guru."
  • An anonymous reader wrote: "Security bugs ARE different.  They require faster and sometimes untested repairs."

Few responses focused on finding a balance between full disclosure and obscurity; a notable exception was Morten:

"I have always upgraded my open source software/freeware as soon as I saw a newer version of it, important fixes or not (with a few exceptions), but I like to be given just some obscure reason like "Security bug fixed involving feature A used with feature B resulting in remote code execution or elevation of rights or ...", so I can make sure to upgrade to exactly this release, but I don't need to see POCs or exploits to be convinced. It is always a race, and I see no reason to change the rules of the game, before I have had a chance to enter the pit to change the tires as preparation to the new rules. Full disclosure, no thanks, just a rough summing up to tell normal bugs from security bugs."

I'd like to wrap up with a very important reminder from Jos that the business is what we're all about:

The main 'problem' I observe is that the discussions remain mainly on the tech side.
The main contributors to both the kernel and DNS discussion are technicians. The business side does not have a voice, and therefore people don't consider the implications of their suggestions for the business.
What impact does Full Disclosure have on a company that does not have the resources to patch at will, but needs several days or even weeks to fix?
Does it really matter from a business perspective whether a bug is marked as generic or security? How many companies have a dedicated 'security team' available to evaluate those kinds of bugs? And if they don't, how are they going to evaluate the bug and patch?
I don't believe in security by obscurity, but I also don't believe that making every security leak visible helps.
Be honest: tell that there is a problem in the software and indicate how much of a problem it is, so the business can determine the amount of effort they have to take to solve it. A 'security patch' label or the full details are not needed most of the time.

I very much like the call for an impact indication instead of labelling "Confidentiality" and "Integrity" problems as "security" while not doing the same for "Availability" issues!

--
Swa Frantzen -- Section 66

Keywords: linux