Regulation Re-re-re-redux

Ira Winkler has a guest commentary on SearchSecurity.com today. It touts the need for regulation because of the “security problem.” He is promoting his “Winkler Act,” which has the following points:

Computers attached to external networks must be installed and hardened per the vendor’s, the Center for Internet Security’s, and/or the National Institute of Standards and Technology’s security baselines.

Vendor patches must be applied to systems attached to external networks within a specified time period based on the criticality of the associated vulnerability. For example, 30 days for a critical vulnerability and quarterly for low-risk vulnerabilities.

Vulnerability assessment tools for identifying the presence of known threats should be utilized on a quarterly basis (at a minimum).

System administrators, or anyone responsible for maintaining operating systems and applications, must take a security course approved by the vendor or a legitimate certifying authority.

Vendors must have an established and documented software test program in place that accounts for common security problems and represents a measurable component of the development process. In lieu of an oversight group, a quantifiable measure written into law would place the burden on vendors to demonstrate proper security testing of their software.

Internet service providers must implement software to detect and deactivate systems used to launch well-known attacks, such as denial-of-service attacks or the distribution of spam and viruses, until those systems are fixed.

The civil liability resulting from failure to implement these rules (i.e., the monetary loss from malicious system use by a third party) would ensure regulatory enforcement; criminal liability may even apply in some situations. Clearly, this part of the law would need to be specific, including penalties, oversight groups, language addressing commerce that crosses state boundaries, etc.

Ultimately, this is misguided, as is most pro-regulation commentary in the security space, for two main reasons: 1) We keep trying to equate Internet security issues with death and physical harm to people, and it just doesn’t work (nor do the borderline-ridiculous vehicle-safety analogies that keep coming up). We need to reorient ourselves by looking for parallels to other kinds of crime, theft primarily; and 2) Things generally work out okay on the Internet. Sure, there is a risk that they won’t in the future, and sure, professionals need to stay wary of the activity going on, but to suggest there is a “security problem” is more about FUD than reality. Sort of like the West Nile virus “epidemic” that doesn’t exist. Don’t get me wrong: I certainly believe it is prudent for enterprises to protect themselves, but that is their responsibility. Suggesting this is a community problem is a different story (ultimately, we are talking about the outer edges of the cloud here, when we should be worrying about the core). It would be like regulating locks on doors, not going down dark alleys at night, and always driving with your windows up (think theft, not death, here).

The single important element of Winkler’s “Act” is getting ISPs more involved in identifying and deactivating violating systems (which I will interpret to mean blocking/dropping at the network level without touching the endpoint). My caveat is that this should be reserved for mass-worm outbreaks like Blaster, with the government sounding the alarm for ISPs to act. I am not opposed to something slightly more aggressive, like filtering out Code Red traffic, but I think that can get problematic.
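For concreteness, here is a minimal sketch of the kind of signature matching that network-level filtering implies, using the well-known Code Red probe (a GET for /default.ida followed by a long run of filler characters). The function name, threshold, and example are illustrative assumptions of mine, not anything specified by Winkler or by any ISP.

    import re

    # The /default.ida probe padded with N (Code Red) or X (Code Red II)
    # is the well-known signature; the 20-character threshold is arbitrary.
    CODE_RED = re.compile(rb"GET /default\.ida\?[NX]{20,}", re.IGNORECASE)

    def looks_like_code_red(request_line: bytes) -> bool:
        # True if an HTTP request line matches the Code Red probe signature.
        return CODE_RED.search(request_line) is not None

    # An edge device would drop the packet or flag the source for follow-up,
    # never touching the infected endpoint itself.
    print(looks_like_code_red(b"GET /default.ida?" + b"N" * 224 + b" HTTP/1.0"))

The point of the sketch is only that this sort of match-and-drop decision can live entirely at the ISP edge, which is what makes it the least intrusive piece of the proposal.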