In response to a post of Drazen's, quoting Peter Benson on Disclosure Laws; http://beastorbuddha.blogspot.com/2007/04/disclosure-laws-impacts-and-things-to.html
I offer the rant below.
I love this kind of topic, if only because it highlights both the macro and the micro issues. One must look outside one's own discipline to find answers; becoming too specialised often stops one seeing the forest for the trees.
I would like to try to address the issue, if I may, with some history, a dash of the present and a dab of the future.
This is what's starting to happen in our society and industry in terms of complexity and economics: http://dieoff.org/page134.htm . Even though that paper is focussed on natural ecosystems and civilisations, the internet and its constituent networks form a wonderfully rich, representative ecosystem within our civilisation.
As complexity increases, any system needs more energy. This either produces new paradigms that address diminishing marginal returns, or the system collapses under the weight of trying to cope with the complexity. Thus what is required is either non-reductionist thought to address the complexity, e.g. "Defense in Depth" (which happens to be extremely costly), or a reduction in the complexity and in the kind of energy required to solve the problems, resulting in a new paradigm or paradigms. To introduce the next section, I'll quote Marcus Ranum (http://www.ranum.com/): "Your job, as a security practitioner, is to question - if not outright challenge - the conventional wisdom and the status quo. After all, if the conventional wisdom was working, the rate of systems being compromised would be going down, wouldn't it?"
Present: Quality and Cost Benefit Analysis
Sometimes when you have been travelling along a certain path, there are a few signposts as to why you are potentially lost.
Personally, I believe the tools and processes are out there, but the problem is being enumerated somewhat incorrectly, hampered by the 'old guard' of IT who don't really get it! They are suffering extreme forms of 'Future Shock' (http://en.wikipedia.org/wiki/Future_Shock): "too much change in too short a period of time".

This is in fact a wider social issue that is very hard to address, as people are afraid to challenge the status quo or cannot effect change within their existing roles. Change must happen more quickly than a generational 'breeding out' of the less savvy CIOs, CTOs, CSOs and below would allow, as things are speeding up, not slowing down. It will only happen with economic motivation. Economics is based on theories of scarcity and the perceived value of goods and services, and we are having huge trouble valuing data over its lifecycle and putting a price on the ensuing issues and costs of a breach, disclosure or unintended manipulation of data.
As Grace Murray Hopper, USN (Ret) points out:
'Some day, on the corporate balance sheet,
there will be an entry which reads, "Information";
for in most cases, the information is more valuable
than the hardware which processes it.'
Dan Geer re-introduces this in his wonderful paper "The Shrinking Perimeter: Making the Case for Data-Level Risk Management" (http://www.verdasys.com/pdf/ShrinkPerim.pdf), which opens with that quote and argues for object-level protection and data valuation.
Another interesting topic is that of time and physics at play in our new world. Time-based security and convergence argue for new paradigms; Dan Geer's "Convergence" (http://geer.tinho.net/ieee.geer.0606.pdf) highlights the new effects of this highly connected, information-based economy.
To understand the infrastructure and ecosystems out there, one must constantly sample and baseline traffic in the face of constant change. Some change is valid, some invalid. One cannot manage what one cannot measure, and change management is at the heart of it all. Metrics need to be standardised, and individual nodes or systems need to become simpler, i.e. more easily defined and controlled.
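As a toy illustration of sampling and baselining (my own sketch, not any particular product's method), the snippet below flags traffic intervals that drift too far from a trailing baseline; the per-minute byte counts and the three-sigma threshold are invented for the example:

```python
from statistics import mean, stdev

def flag_anomalies(samples, window=5, threshold=3.0):
    """Flag intervals whose traffic volume deviates more than
    `threshold` standard deviations from the trailing baseline."""
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(samples[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Hypothetical per-minute byte counts; the spike at index 7 stands out.
traffic = [1000, 1100, 980, 1050, 1020, 990, 1010, 9000, 1005]
print(flag_anomalies(traffic))  # -> [7]
```

The point is not the statistics but the discipline: without a continuously refreshed baseline, "valid" and "invalid" change are indistinguishable.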
MTTR (Mean Time To Repair, http://en.wikipedia.org/wiki/Mean_time_to_repair), for example, requires that one actually knows something is broken and/or performing incorrectly in the first place (be it malicious or benign!).
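That catch is visible in the arithmetic itself: the minimal MTTR calculation below (with made-up incident times) only counts downtime from the moment a failure is detected, so anything never noticed never enters the metric at all:

```python
def mttr(incidents):
    """Mean Time To Repair: average of (repaired - detected) across
    incidents. Each incident is a (detected_at, repaired_at) pair in
    hours. Undetected breakage has no pair, so it is silently excluded."""
    downtimes = [repaired - detected for detected, repaired in incidents]
    return sum(downtimes) / len(downtimes)

# Hypothetical incident log, in hours since the start of the period.
log = [(0.0, 2.0), (10.0, 11.0), (20.0, 26.0)]
print(mttr(log))  # -> 3.0 hours: (2 + 1 + 6) / 3
```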
Even though technology changes, the challenge of information management stays the same.
Sampling and surveillance, tied to regulation and compliance? Whose pocket gets hurt, and what can they then do about it? Does public shaming exact the financial penalties warranted, or is public memory short-lived when entities change and re-form as different companies?
I do believe it's the start of building a baseline awareness. But honestly, without a form of Total Information Awareness (massive indexing and far-reaching information asset management), how do you know:
a) what you've lost
b) when you've lost it
c) how you've lost it
d) how not to lose it again
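One sketch of what (a) and (b) demand in practice: a content-hash inventory of information assets, diffed between snapshots. The structure and the choice of SHA-256 here are my own illustration of the idea, not a prescription:

```python
import hashlib
import os

def snapshot(root):
    """Index every file under `root` by content hash -- a crude
    information-asset inventory."""
    index = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                index[path] = hashlib.sha256(f.read()).hexdigest()
    return index

def diff(before, after):
    """Report what changed between two snapshots: the start of an
    answer to 'what was lost, and when'."""
    return {
        "missing": sorted(set(before) - set(after)),
        "altered": sorted(p for p in before
                          if p in after and before[p] != after[p]),
    }
```

Run `snapshot` on a schedule and `diff` consecutive results, and you at least have timestamps bracketing a loss or alteration; (c) and (d) still require the surrounding change-management and forensics.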
Where does the burden of liability fall and how big is the carrot or stick?
Hopefully we don't start to litigate. http://www.ranum.com/security/computer_security/editorials/lawyers/index.html
I am beginning to be more optimistic with good folks like SA (http://www.security-assessment.com/) on the case!