Month of Apple Bugs

MOAB has begun with less than a bang – coincidentally, MOAB is also the name of a 21,000-pound bomb (Massive Ordnance Air Blast). Matasano has a great post about this MOAB – the one about Apple bugs, not the big-ass bomb used in Tora Bora – read it! (here)

Even though it looks like there is less there there, as I have stated in the past (here), (here), and (here), I completely disagree with the decision by security companies, researchers, and individuals to make information on vulnerabilities, or how to exploit them, public without following responsible disclosure – this is not very responsible. Organizations should NOT do business with entities that put our networks at risk – period!


6 thoughts on “Month of Apple Bugs”

  1. Pingback: On Our Announcement, User Awareness, and A Whole Month of Apple Bugs… - RiskAnalys.is

  2. It seems to me that your presentation is very one-sided and biased against technical people. That’s OK, because I’m also biased against business-suit types, but you must consider how this affects your judgment. Would you also say that you shouldn’t work with the PR company that created the Oracle Unbreakable Linux campaign? Not a perfect comparison, but you understand my point: would you attack other parts of the business (finance, marketing, etc.) as well?

    Back to vulnerability disclosure: you must realize that those bugs are there regardless of whether they have been discovered or not. Their existence puts customers at risk (because there may be people who know about them and don’t share that knowledge). The company that produced the software is responsible for them (because they don’t follow secure development practices, they don’t train their programmers, etc.) – unfortunately only in a moral sense, because today’s EULAs relieve them of all legal responsibility. In this context, somebody who makes a flaw public is doing a service. He didn’t create the flaw; the company did. This is a distinction that should be made in people’s heads. Admittedly, “responsible disclosure” is the better thing to do, but any disclosure is still a good thing. And many people choose to disclose vulnerabilities because of their prior experience with companies treating the question as a PR/marketing matter rather than a technical one.

    As a business suit, you should think about the fact that these people are doing for free what your expensive QA department was unable to do. If you are a smart company, you hire them.

    A much more in-depth blog post on disclosure can be found here: http://securosis.com/2006/08/29/the-3-dirty-little-secrets-of-disclosure-no-one-wants-to-talk-about/

  3. I am not sure what your point is, or what difference it makes whether I wear a suit or am technical – you seem to assume that the debate is lined up that way. It isn’t, and it is ignorant to frame it that way.

    I have had this debate ad nauseam – you have absolutely nothing new to add, just more hot air from the research side about how you are simply doing a service to the IT world by pointing out flaws that others should have caught – screw the IT folks caught in the middle of your silly game.

    As for Rich’s thoughts, he also stated the same thing about disclosure, so you should at least point to someone who supports your argument that disclosure, responsible or not, is good – it isn’t!

  4. I included the link as an example of how an article about disclosure should look: well argued, instead of just throwing up a few inflammatory sentences that fail to recognize the contribution and work of these security researchers (again, very few people can do this, and you should be grateful to them for providing their services for free).

    Let me summarize my argument again:

    1. Bugs exist in a program from the time it is launched. Security researchers are not “creating” them.

    2. Some of these bugs get exploited by a small circle of blackhats who sell their services for financial gain. This is low-profile: many of the affected people don’t know how they were exploited, or even that they were. Because this stays under the radar, the vendor either (a) doesn’t know about the bugs or (b) considers them “low priority” because “they are not actively exploited”.

    3. Raising the bar of awareness is a very good thing. If the researcher hadn’t come forward with the vulnerability, the vendor (a) wouldn’t have known about it at all or (b) would have fixed it only after a very long period of time.

    While I don’t have hard numbers, my feeling is that most of the MS patches, for example, were responses to public exploits rather than to internal QA testing.

  5. Your points are weak and have never been argued successfully. Quite honestly, I don’t really give a shit what you think; disclosure is not helping IT organizations. As for my brief posting, if you had taken the time to follow the links you would have seen a very long post on the subject. As for other well-thought-out posts on disclosure, I suggest you read what Ranum has to say – your attitude is just adding to the problem, not to the solution.

  6. Pingback: How Fast Do Vendors Tackle Vulnerabilities? - Network Sentry
