Ramblings while reading Microsoft’s Security Intelligence Report

I just downloaded Microsoft’s Security Intelligence Report. Given my predisposition toward good stats, I am looking forward to reading it. Herewith is a running chronology of my thoughts as I read it:

  • opening pages – 25 authors! even more contributors! wow – it better be worth it…
  • 232 pages!
  • page 8: “the most significant trend in 1H09 was the large increase in worm infections detected in many countries and regions worldwide.” I wonder if this is due to more worms or better detection capabilities.
  • page 8: operating system trends – It would be cool if they tried to correlate infections with installed bases of the operating systems.
  • page 9: “Computers in enterprise environments (those running Microsoft Forefront™ Client Security) were much more likely to encounter worms during 1H09 than home computers running Windows Live™ OneCare™.” Again I wonder if this is an anomaly of better detection techniques.
  • page 9: Conficker is top threat with enterprises and not in top ten with consumers. Is this because Conficker targets enterprises or simply because consumers have many other threats that are well-controlled in enterprises?
  • page 9 (and elsewhere, I am sure): I really, really, really wish that statistics reports like this would discuss the overall change to a population prior to describing the % change of individual elements. I never know whether the absolute trend is growing or shrinking for any single element…
  • page 10: Interesting factoid – “Locations with smaller populations and fewer Internet hosts tend to have higher concentrations of phishing pages.” Not sure what to make of it – could it be a manifestation of Spire’s Fundamental Law of Internet Dynamics?
  • pages 10-11: pretty sparse information about SQL Injection attacks…
  • Microsoft really wants people to regret not moving to Vista. What might be more interesting is comparing the costs/benefits of the original move to XP with those of moving from XP to Vista.
  • page 12: “Compromised servers acting as exploit servers can have massive reach; one exploit server can be responsible for hundreds of thousands of infected Web pages.” I would not have characterized the number of affected web pages on a single server as “reach”. Am I reading this wrong?
  • page 12: first introduction of the word “disclosure”… uh, oh.
  • pages 12-13: undercover exploits are becoming more and more popular. It would be great for Microsoft to also capture information about how many vulnerabilities are identified through in-the-wild attacks versus various forms of code review and pen testing.
  • page 13: Ten out of 87 vulnerabilities were exploited during the first 30 days after release. Not sure how many patches this relates to. If only 10 of 87 see quick exploitation, we are doing a lot of patching work for limited reward.
  • page 13: I am not sure why “Microsoft recommends” is a phrase that pops up in a “key findings” section. Also, it appears that much of the update information is opinion and not based on any findings.
  • page 15: I am just reaching the “Executive Foreword” section? Ouch, I am already on my second sitting and haven’t reached the details. Gotta keep reading…
  • page 19: 1.7 billion Internet users. Akamai sees traffic of about 425 million unique IPs per day and estimates that comes from about a billion unique users.
  • page 23: history of Trustworthy Computing; “the memo that rocked the world”… it never gets old… well maybe never…
  • “the computing experience is much different—and much safer—in 2009 than it was in 1999.” I wonder if this is the general consensus of security professionals. I don’t disagree vehemently, but I suspect others would. A little context might be good, rather than a broad sweeping statement like this one.
  • page 36: a brief interlude to page 223 to check out data sources. It turns out to be more of a product data sheet. I hope there is more information on how each data item was collected.
  • page 37: Computers Cleaned per Thousand (CCM) – “infection rates in this report are expressed using a metric called computers cleaned per thousand, or CCM, which represents the number of computers cleaned for every 1,000 executions of the MSRT.” A Metric! I like it!
  • page 38: They tricked me! They define a great metric, and then the first chart reports “computers cleaned” but not “per thousand”. Rats! (It’s gotta be here somewhere…)
  • page 40: still no CCM! and now they are breaking down threats. Please don’t tell me they are going to report a 5.5% increase in computers cleaned for the US (for example) and NOT normalize it by scan volume.
  • page 41: a completely unsatisfying, though interesting ‘heatmap’ overlay of the world. Too general for my interests, and to be honest I don’t really understand why geography matters with a broad metric like this.
  • page 41: “See “Appendix A: Full Geographic Data,” on page 172, for a more comprehensive list with 212 locations.” AHA! I quickly flip to page 172 and find the US figures – a CCM of 9.1 for 2H08 and 8.6 for 1H09. Exactly what I was concerned about (two bullets above) has happened – while computers cleaned in the US “increased” by 5.5%, there was actually a DECREASE when normalized by scan volume! (I stopped here to check my work – can this really be true? Yep, it is – see the arithmetic sketch after this list.) This, btw, means that less than 1% of scans result in a computer requiring cleaning. (Note to self – compare this to data on likely botnet infection rates across the Internet.)
  • CCM is an excellent metric. I hope they make the best of it.
  • page 45: “We believe the low piracy rate, combined with a generally strict IT security enforcement of ISPs and the fact that updates are quickly installed due to fast Internet lines (broadband, cable connection) forms a basis for the generally low infection score in Austria.” Interesting.
  • page 50: good chart showing CCM by operating system. I mentioned earlier that it would be interesting to compare changes in risk and costs across operating systems. Using the chart on page 50, for example, we can see how within the XP family (and essentially for “free”) you can cut your infection rate from 3.25% (32.5 CCM) to 0.8% (8.0 CCM). Moving to Vista can take you from 0.8% to 0.31% (3.1 CCM), but licensing and upgrade costs may not support the move.
  • page 56: 20% of people actively “ignore” severe threats. The chart on this page could probably be expanded into a psychology thesis.
  • page 62: it’s getting late and I am getting frustrated. How come I can’t parse this: “Figure 20 shows the relative prevalence of different categories of malware and potentially unwanted software on infected computers running Windows Live OneCare and Forefront Client Security in 1H09, expressed as a percentage of the total number of infected computers cleaned by each program. Totals exceed 100 percent for each program because some computers were cleaned of more than one category of families.” And why on earth would they put side-by-side bar graphs when the populations are different and the y-axis shows percentages? Especially when totals can exceed 100% due to multiple infections? Arrghh!
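
Since I kept harping on the normalization point, here is a minimal sketch of the CCM arithmetic in Python. Only the CCM figures (9.1 for 2H08, 8.6 for 1H09) come from Appendix A; the scan volumes and cleaned counts below are hypothetical placeholders I picked to show how absolute cleanings can rise 5.5% while the normalized rate falls.

```python
# Minimal sketch of the CCM (computers cleaned per thousand) arithmetic.
# CCM = computers cleaned per 1,000 executions of the MSRT.

def ccm(computers_cleaned: float, msrt_executions: float) -> float:
    """Return computers cleaned per 1,000 MSRT executions."""
    return computers_cleaned / msrt_executions * 1000

# Hypothetical scan volumes (the report does not give them here),
# chosen so the numbers reproduce the Appendix A CCM figures:
scans_2h08 = 100_000_000                 # assumed MSRT executions, 2H08
scans_1h09 = 111_500_000                 # assumed MSRT executions, 1H09
cleaned_2h08 = scans_2h08 * 9.1 / 1000   # 910,000 computers cleaned (CCM 9.1)
cleaned_1h09 = cleaned_2h08 * 1.055      # a 5.5% "increase" in absolute cleanings

print(f"2H08 CCM: {ccm(cleaned_2h08, scans_2h08):.1f}")  # 9.1
print(f"1H09 CCM: {ccm(cleaned_1h09, scans_1h09):.1f}")  # 8.6 - a DECREASE once normalized

# Converting CCM to a percentage of scans: 8.6 CCM = 0.86%, i.e. less than
# 1% of scans result in a computer requiring cleaning. Likewise for the
# page 50 comparison: 32.5 CCM = 3.25%, 8.0 CCM = 0.8%, 3.1 CCM = 0.31%.
```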

Well, it is 11:25, the Phillies are about to win game 5 of the World Series (I better be right on this one)… I am about 25% through the report… I guess I’ll need to continue this (or not) tomorrow…

Go Phillies!

1 comment for “Ramblings while reading Microsoft’s Security Intelligence Report”

  1. November 16, 2009 at 4:26 pm

    “page 8: “the most significant trend in 1H09 was the large increase in worm infections detected in many countries and regions worldwide.” I wonder if this is due to more worms or better detection capabilities.”

    not sure, but i have long been predicting a renaissance for self-replicative malware – the economy of effort is just too tempting for the practice to stay out of favour.

    “page 9: “Computers in enterprise environments (those running Microsoft Forefront™ Client Security) were much more likely to encounter worms during 1H09 than home computers running Windows Live™ OneCare™.” Again I wonder if this is an anomaly of better detection techniques.
    page 9: Conficker is top threat with enterprises and not in top ten with consumers. Is this because Conficker targets enterprises or simply because consumers have many other threats that are well-controlled in enterprises?”

    one word – patching. enterprises are much less likely to leave automatic updates turned on. as a result vulnerabilities stay unpatched longer in those environments and worms exploiting those vulnerabilities spread better.

    “page 12: “Compromised servers acting as exploit servers can have massive reach; one exploit server can be responsible for hundreds of thousands of infected Web pages.” I would not have characterized the number of affected web pages on a single server as “reach”. Am I reading this wrong?”

    i wouldn’t characterize it that way either. perhaps the affected webpage stat is to demonstrate *how* such a reach could be accomplished.
