a) 1974 http://web.mit.edu/Saltzer/www/publications/protection/
and
b) SecurityMetrics mailing list going round in circles.....
including
c) "It'll be just as insecure as it possibly can, while still continuing to function."
http://www.ranum.com/security/computer_security/editorials/point-counterpoint/homeusers.htm
One does worry.
Until we can assign a value to shared and dedicated nodes/messages, and to the organisational superorganism as a whole, risk and the quantification thereof is a joke.... unfortunately shared infrastructure and services such as routing/DNS/SNMP/NTP/logging *are* business critical, e.g. the data and control planes, including management control planes. http://twitter.com/irldexter/status/1087480944
Here's to 2009! And some standardisation of code development and testing, including liability etc., as per David Rice's arguments in Geekonomics. http://my.safaribooksonline.com/9780321477897
Peace.
Wednesday, December 31, 2008
Saturday, December 27, 2008
Aussie Filtering Meme
Gridlock '09: what do you think happens when every car is searched on the information superhighway?
Note: potentially breath testing on the freeway, what do you think will happen?
NoteII: can use trucks, cars, buses and motorbikes to refer to packets/QOS etc.
NoteIII: easily accessible meme... 87% speed limit reduction on motorways for everyone..
NoteIIII: Australia, going nowhere fast.
NoteIIIII: Australia = Auto-BAN!
Monday, December 22, 2008
Sunday, December 21, 2008
Cavity searches and Internet Filtering
Proxying and tunnels will always get around filters. Full stop. I do not support censorship. Full stop. I do not support child pornography. Full stop. If child pornography was served from static webservers it would be easy to pull down. Full stop. There are 65,535 usable TCP ports. There are 65,535 usable UDP ports. There are ~4 billion usable IP addresses spread around different regions on the planet and advertised by different Autonomous Systems.
However, some thoughts:
a) From a Cisco perspective on a CRS/SSG/SCE, a thousand-plus-line ACL and Policy Map that routes requests for certain IP addresses to 'null 0', or sets the next hop for traffic to a logging/404 host, would be feasible at peering edges. Feasible. As would a live feed akin to the Team Cymru dynamic bogon and martian feed. Feasible, but very dangerous to centralise such dynamic control. One could also insert more specific /32 routes on the fly, akin to 'clean pipes' solutions that sinkhole and try to scrub distributed denial of service attacks.
The above points in (a) are equally achievable with Juniper high-end gear, and probably others' too.
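As a rough sketch of the null-route and policy-route options in (a), in IOS-style syntax (the prefixes and next hop below are documentation-range examples only; exact commands vary by platform and code train):

```
! Hypothetical blocklist entries: route traffic for listed hosts to the bit bucket
ip route 192.0.2.10 255.255.255.255 Null0
ip route 192.0.2.11 255.255.255.255 Null0
!
! Alternative: policy-route matched web requests to a logging/404 host instead
access-list 150 permit tcp any host 192.0.2.10 eq www
route-map BLOCKLIST-PBR permit 10
 match ip address 150
 set ip next-hop 203.0.113.1
```

A live feed could push such entries dynamically, much like the Team Cymru bogon feed, which is exactly why centralising that control is so dangerous.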
b) Any appliance/blade-based content filtering or 'inline/OOB' IDS (Intrusion Detection System) will result in an epic fail, as opponents could shunt bad packets and malformed/obfuscated HTTP GETs at the device from multiple shifting sources all day and night to bring it to its knees! (Unless it only runs a DST IP block list...) There would also be massive issues with scaling and redundancy for most ISPs, let alone power and space constraints.
Both of the above techniques with the exception perhaps of the Null routing option for option(a) would allow one to reverse engineer the list with web spidering and/or large scale scanning.
c) AHTCC (Australian High Tech Crime Centre) and AFP (Australian Federal Police): if they are aware of known IP source addresses of 'kiddie porn', they should already be looking for these traffic flows via a form of (a) that only provides them logging, i.e. fully honors the traffic request, and then investigating which Australian devices have sent said illegal requests via standard process. Unfortunately there can be issues with malware and establishment of identity/intent; however, subsequent forensic investigations on offending hard disks *should* confirm innocence or guilt (though costly to pursue, as most good investigations are). If the list of DST IP addresses were leaked, miscreants could play havoc with the authorities by spoofing requests from local ISP IP addresses, i.e. valid local SRC ranges (which an ISP cannot filter), and/or use botnets on Australian networks to achieve denial of service. Devices would have to fail open.
If logging were bidirectional, then both inbound requests to Australian-hosted content and requests to external countries would be enumerated. Unfortunately one could only effectively begin with IPs reported to authorities, as any form of deeper content inspection would invite the issues and abuses mentioned in (b).
Essentially, as with real-world crime, one must sometimes balance the benefit of blocking or pre-empting an ongoing 'crime' against the information garnered by passively monitoring and going after the 'crimebosses' or actual perpetrators. This is a tough topic. No one is suggesting child pornography is a victimless crime, however are we going after paedophiles or trying to block what kids *might* see?
I do not support censorship. I am somewhat across LI (Lawful Interception) in mobile/PSTN and data networks, and one should have warrants to actively tap citizens' communications based upon probable cause. If one can prove that an IP address serves or accesses child pornography, individuals should be prosecuted as per the judicial system. Unfortunately filtering *all* flows is a slippery slope, especially for those that control the blacklist(s).
d) How does one vet content and keep the IP addresses in (a) up to date, and based upon what criteria? This can be especially difficult when recent research found that 69.8% of the websites on .com, .org and .net domains shared an IP address with 50 or more other websites. Were a device to serve illegal content from a fixed IP, the content would be brought down very quickly by the authorities in the hosting country, provided that country had laws governing hosting of said content.
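The collateral damage in (d) is easy to demonstrate: blocking one IP blocks every site hosted behind it. A toy sketch (the domains and addresses are hypothetical, drawn from documentation ranges):

```python
def collateral_block(domain_to_ip, blocked_ip):
    """Return every domain that goes dark when `blocked_ip` is null-routed,
    not just the intended target."""
    return sorted(d for d, ip in domain_to_ip.items() if ip == blocked_ip)

# Hypothetical shared-hosting mapping: three sites share one address
hosting = {
    "target.example": "192.0.2.10",
    "innocent-shop.example": "192.0.2.10",
    "innocent-blog.example": "192.0.2.10",
    "unrelated.example": "198.51.100.7",
}

print(collateral_block(hosting, "192.0.2.10"))
# blocking the one 'bad' IP takes all three sites down together
```

With 50+ sites per IP being common, a single blocklist entry can silence dozens of innocent sites.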
e) DNS poisoning = epic fail. Go direct to IP. Do not pass go. Do not collect $200.
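Point (e) is trivial to show: a client that already knows the destination IP never consults DNS, so a poisoned resolver changes nothing. A minimal sketch (the hostname and address are hypothetical):

```python
def build_direct_request(ip, hostname, path="/"):
    """Raw HTTP/1.1 request a client sends straight to `ip`, bypassing
    DNS entirely: the filtered hostname appears only in the Host header
    (so shared/virtual hosting still works), never in a DNS query."""
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {hostname}\r\n"
        f"Connection: close\r\n"
        f"\r\n"
    )
    return ip, 80, request.encode()

# Poisoning 'blocked.example' in the ISP resolver is irrelevant here:
ip, port, raw = build_direct_request("192.0.2.10", "blocked.example")
```

Open a TCP socket to that tuple and the filter's DNS layer never sees a thing.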
I do not support child pornography. I do not support censorship. I do not support wasting millions of tax payers hard earned money on a fallacy.
I support law enforcement that works. I support the AHTCC and AFP. I support democracy. I do not support strip searching everyone that enters and leaves the country. I do not support all phone calls being tapped for illegal seditious conversations. I do not support our mail service opening every one of our letters. What are we doing about child pornography being physically mailed around in hardcopy or encrypted on USB thumb drives in the regular mail, or is that too hard?
And yes I am having seditious thoughts.
Cyberlaw, cybercops and cyber-democracy are required for cyberspace. Not cyber kid gloves and cyber sledge hammers.
Saturday, December 06, 2008
Some light relief?
I might get in trouble for this with the natives but I love this guy. Protect the kids from predators but that's about it. Industrial schooling, FUCK YOU SOCIETY!
Read Summerhill and grow. Link to website here: http://www.summerhillschool.co.uk/
Wordle.net tres-cool-super-sexy
Thursday, December 04, 2008
Thursday, November 20, 2008
Tuesday, November 18, 2008
Arrogance of obfuscation.
http://spittoon.23andme.com/2008/08/29/labs-remove-genetic-data-from-public-databases-after-forensic-breakthrough/#more-1174
"a new statistical method that can establish the presence of a single
individual's genetic signature in a sample containing DNA from
hundreds of different people."
"It was previously assumed that aggregating the data of hundreds or
even thousands of people — essentially giving the overall genetic
composition of the group as a whole — would make it impossible to
identify any one person in it."
"Hypothetical scenarios aside, it is highly unlikely that any person
has ever actually been picked out of an aggregate database, and not
just because the mathematics of the new method are so complex."
I sometimes worry more about DNA being 'stored' and derivative
contextual information and the associated information security thereof
rather than any credit card data etc. This is prompted by a great new
service https://www.23andme.com/
In the future, when a genomic representation of me can be anywhere and
stored in multiple locations, how do I assert that I am me?
The biometric system *must* be localised and not
connected to any other networks... erm, a bit like SCADA systems I hear
you say? Will we need an X-point check system... something I have,
something I know, macro am, micro am, quantum am, how Y is done/performed,
how Z is done/performed, how K is done/performed..
Fun: I spoke before about nanobots; I now present "nano-identbots",
which are like mayflies, i.e. they die once they identify a host, and
have a localised PGP relationship with the system to which identity
is supposed to be represented. They generate keys on birth and
immediately go out to identify the subject etc...
Strategy and IT Architecture
Greenfields accommodates top down thinking.
Legacy constraints force the dichotomy of top down and bottom up, for the devil is in the detail!
Monday, November 10, 2008
Thursday, November 06, 2008
Had to re-post this. I love it.
This is very like Open University or BBC2 programs for schools. Wonderful sardonic stuff!
Tuesday, October 28, 2008
Cloud Computing
A crackin' good read from Amrit Williams on a topic close to our hearts!
http://techbuddha.wordpress.com/2008/10/26/cloud-computing-the-good-the-bad-and-the-cloudy/
And on the second day God said “let there be computing - in the cloud” and he gave unto man cloud computing…on the seventh day man said “hey, uhmm, dude where’s my data?”
Sunday, October 26, 2008
Thursday, October 23, 2008
Oyster
Sunday, October 19, 2008
Wednesday, September 24, 2008
Saturday, August 30, 2008
Bacterial IT Security
Imagine, if you will, lots of people playing Will Wright's new game SPORE
(http://www.ted.com/index.php/talks/will_wright_makes_toys_that_make_worlds.html).
There is no common species to speak of. Everyone who plays uses
COTS(yes off the shelf :) but every organism looks, feels and acts
differently. Sure they move, amble, fly, walk, run, eat, shit,
procreate etc... however we now have many, many unique entities that
interact. They can all be affected by lack of water, food,
temperature, disease, but how exactly? What do we measure and what do
we focus on? Now let's go macro to micro.
How is any system sustainable? Is there a net gain or loss in
energy/entropy? Will it sort itself out if we just sit-back and wait
for certain breeds to die out? What is it that allows some to succeed
and others to become extinct? We don't need to measure *all* factors,
however perhaps just the successful candidates, and what defines
success? Success = survivability and adaptability?
Assume fluid/shifting environment thus mobile, morphing, modular,
ability to react, change, multiply, access to resources.
Erm, again a stream of consciousness... I think you see where I am
going. We fucked up in IT. We thought we were building pyramids and
fort knox's when in actual fact we needed flocks of birds and 'ships
of the desert'....
What is the longest surviving species and why? -> symbiotic virii?
reptiles? co-existent properties for good guys/bad guys?
Ranum: Will the future be more secure? It'll be just as insecure as it
possibly can, while still continuing to function. Just like it is
today.
OMFG-SPAM
I have to opt-out of corporate communication in Australia rather than opt-in, fair nuff'. I have already opted out of all Qantas UCE(Unsolicited Commercial Email) as stated in the junk mail I just received the other day. I will let it speak for itself. Corporates are out of control.
"Dear MR X,
We regularly send exciting offers and news via email. But sadly, we haven't been able to get them to you, because while we have your email address as xxxxxx@xxxxxxx.com.au you have not opted-in to receive any of our email communications."
So they send me physical junk mail... Pure and utter, MADNESS! OMFG!
Taking a leaf out of Wade's book, I have lodged a complaint here http://www.acma.gov.au/WEB/STANDARD/pc=PC_310369 but this might not work based upon a technicality/convergence?
Aside: Click on the screenshot for a somewhat better image, apologies about quality as was taken on camera phone in bad light.
More DNS jiggery!
So here is a brain-fart that prolly' needs cleaning up and is awaiting moderation on Kaminsky's site! It's basically a challenge-response mechanism: a DNS-tunnelled type of CHAP for DNS... and I know, I know, separate DNS server functions and all that... however, ideas are ideas...
http://www.doxpara.com/?p=1237
"I’m afraid it’s always a computational cost or time trade-off. Essentially Penny Black project style challenge or LaBrea tarpitting is required.
What *DO* we own re: logical assets? RRs!!!!!
Could we ask the remote server somehow to lookup a temp record in our own domain, generate one(as we own our own servers hopefully) and then wait for the lookup from the remote NETBLOCK? Kinda like SMTP authentication for websites and mailing lists with an OTP. One would have to be in-path to know the variable or understand some characteristic of the remote network/domain.
Let’s use the attack in reverse to secure the attack? Use the attack to secure ourselves as we can generate an arbitrary RR in our own domain e.g. our resolver talks to our domain NS and tells it to inject a local variable/record… think of it as a magic number, assume the attacker is not in-path.. then force the remote domain to ask our domain about it, before they give us the original new query for a host… haven’t totally thought this through fully :) might be worse re: BW :(
Or maybe debounce but record “IP TTL” tolerance. Sure, IP TTLs can change with backbone routing updates, but less likely in the course of lookups for random/new hosts not actually in local cache already.
Firewalls or any policy point invalidates host based rate limiting somewhat."
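A rough sketch of the challenge-response idea in the comment above, with all names hypothetical and the DNS machinery reduced to a toy model: before trusting a new answer, the resolver plants a one-time token record in a zone it controls, then checks that the remote server's netblock actually looks the token up. An off-path spoofer cannot know the token.

```python
import secrets

class ChallengeTracker:
    """Toy model of the proposed 'CHAP for DNS'. issue_challenge() stands
    in for publishing a temp RR in our own zone; observe_lookup() stands
    in for our authoritative server seeing the remote side query it."""

    def __init__(self):
        self.pending = {}  # token -> netblock expected to query it

    def issue_challenge(self, remote_netblock):
        token = secrets.token_hex(8)
        self.pending[token] = remote_netblock
        # In reality: publish <token>.challenge.ourdomain.example in our zone
        # and induce the remote domain to look it up before we accept data.
        return f"{token}.challenge.ourdomain.example"

    def observe_lookup(self, token, source_netblock):
        # Only the genuine remote server (in-path for our zone) can have
        # learned the token; a blind spoofer fails this check.
        return self.pending.get(token) == source_netblock
```

As the original comment concedes, this costs extra round trips and bandwidth, and NAT/firewall policy points muddy the netblock comparison.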
Wednesday, August 27, 2008
Thursday, August 21, 2008
Request for flows
Problem Statement/Overview of Initiative:
- information has value though it is subjective to the possessor and the dispersion/dilution of instances of said data
- data at rest and data in motion have differing utilities/values
- shared infrastructures are often abstracted or ignored when in actual fact their intrinsic and aggregate value is a multiple of any individual member system or atomic piece of data
- organisations do not share security or incident data due to perceived reputational issues
- security posture and security spend is predicated upon the real/perceived value of the presence/absence of data/information, but not upon complete systems/networks, fabrics or reachability to said data
- to generate even an interim or arbitrary currency of data value, the context of data both at rest and in motion must be known
- fabrics provide connectivity, utility, and add-value services to endpoint/virtual nodes
- foundational values and attributes must be quantified for shared infrastructures/fabrics as a pre-requisite to factoring value of endpoint systems and subsequently data -> otherwise all value propositions are independent, incorrect and removed from their actual interdependency and 'real' cost
- once the concept of value can be assigned to classifications of infrastructure nodes based primarily upon their importance/utility, subsequent paths/flows and interdependencies may be weighted and valued (for example similar to metrics used for routing algorithms at a link, distance metric etc)
- *no large flow datasets are available to the security/network community (as per logs to dshield.org etc) with the exception of the Arbor ATLAS project from which primarily only Arbor benefit directly for such types of research.
Proposition:
To generate metrics for different classifications of nodes based upon a simple taxonomy of flows and weighted relationships to infrastructure services (also utilising reachability/relationships with numbers of endpoints over time). Reachable endpoints/IPs may be virtual or physical interfaces, and may also be subsets of greater nodes such as clusters/load balancers/virtual endpoints (EHV) etc.
Sample datasets of netflow/IPFIX data are required from a range of organisations over as long a time period as possible. Anonymisation will be applied to protect the innocent.
The author hopes to research the contextual relationships between nodes in an organisation and to weight and attribute value to flows, in the hope of arriving at something akin to the ramble @ http://bsdosx.blogspot.com/2006/06/byo-rfc.html , also perhaps using a derivative of Metcalfe's law based upon reachable/live endpoints to generate a 'total' value of the fabric and services to the business or organisation.
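One naive way to bootstrap the proposed metric, sketched under heavy assumptions (placeholder weighting, anonymised flow pairs only): score each node by its distinct peers over the observation window, and value the fabric as a whole in the spirit of Metcalfe's law, n(n-1)/2 for n live endpoints.

```python
from collections import defaultdict

def node_values(flows):
    """flows: iterable of (src, dst) pairs from anonymised flow records.
    Returns (per-node score = count of distinct peers seen,
             crude Metcalfe-style value for the whole fabric).
    A real metric would weight by service type, volume, and time."""
    peers = defaultdict(set)
    for src, dst in flows:
        peers[src].add(dst)
        peers[dst].add(src)
    per_node = {node: len(p) for node, p in peers.items()}
    n = len(peers)
    fabric_value = n * (n - 1) / 2  # Metcalfe-style: potential pairwise links
    return per_node, fabric_value

# Tiny hypothetical sample: three nodes, one duplicate flow
scores, fabric = node_values([("a", "b"), ("a", "c"), ("b", "c"), ("a", "b")])
```

Shared-infrastructure nodes (DNS, NTP, logging) would surface immediately as the highest-degree nodes, which is precisely the aggregate value the problem statement says gets ignored.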
Tuesday, August 19, 2008
Buller08..Snowboarding
A video encoding education! Hope you enjoy, the music is fun!
If you want a really good quality stream click here and select the "Watch in High Quality" link under the "Views" keyword!
Thursday, August 14, 2008
Tuesday, August 12, 2008
Enterprise Management and Provisioning
Virtualisation of interfaces between systems and entities will actually provide more robust and persistent characteristics of nodes and resources. It will also allow for dynamic and non-disruptive "adds, moves and changes", but will be a troubleshooting nightmare whereupon convergence of skills and depth of code shall introduce more complexity. A new paradigm of defensible, self monitoring and self diagnosing code will be required. I also believe we need a virtual internet for testing, but that's going to be the next big thing! Donal.
Note:
EHV = End Host Virtualiser
NPV/NPIV = N_Port Virtualisation / N_Port Identifier Virtualisation
This is a pretty good overview, "The Evolving Data Center", from Cisco:
http://www.cisco.com/en/US/solutions/collateral/ns340/ns517/ns224/ns783/white_paper_c11-473501.html
Friday, August 08, 2008
Google "You can make money without doing evil." ?
Google acquires Postini.
Postini pass traffic to or route through the US ARMY domestic spying ring INSCOM.
My GMAIL account is blocked when trying to send to a mailing list in the US which discusses security metrics.
Details in previous post. 'NGILBEVSMB002CL.il.ng.ds.army.mil' is effectively re-routing and blocking my emails.
Wired story on AT&T splitting backbone fibres: http://www.wired.com/science/discoveries/news/2006/05/70908
Sunday, August 03, 2008
Service transparency needed in Oz too
“The blunt means was referring to how some Deep Packet Inspection (DPI) platforms manage traffic when placed out of line. If a device is out of line, the only way to control traffic is to directly or indirectly signal the sender to slow down or terminate the communication session.”
“Consumers must be fully informed about the exact nature of the service they are purchasing and any potential limitations associated with that service.”
http://asert.arbornetworks.com/2008/08/lessons-learned-from-the-fcc-decision/
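The "indirect signalling" Arbor describe is usually a forged TCP reset injected from beside the path, since an out-of-line box cannot drop packets itself. A minimal sketch of the segment such a box emits (illustrative only: no IP header, checksum left at zero, and the ports and sequence number are placeholders):

```python
import struct

def forged_rst(src_port, dst_port, seq):
    """Build a bare 20-byte TCP header with the RST flag set, the kind of
    segment an out-of-line DPI platform injects to tear down a session."""
    offset_flags = (5 << 12) | 0x04      # data offset = 5 words, RST bit
    return struct.pack("!HHIIHHHH",
                       src_port, dst_port,
                       seq,               # must land inside the victim's window
                       0,                 # ack number (unused for a bare RST)
                       offset_flags,
                       0, 0, 0)           # window, checksum (unset), urgent ptr
```

For the reset to be honoured, the forged sequence number has to fall within the receiver's current window, which is exactly why these boxes sit where they can observe both directions of the flow.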
Friday, August 01, 2008
SF-SJ-SF-SFA
So I'm sitting in San Fran International terminal awaiting my flight to Melbourne via Auckland and reflecting on some fun stuff from the USENIX Security Symposium in San Jose I just attended (more specifically a special part of the conference on Security Metrics called Metricon):
a) SF Airport is at "Department of Homeland Security Threat Level ORANGE", who gives a sweet flying f**k.. and what does that mean anyway? Security traffic lights 'go slow' perhaps?
b) My GMAIL (web based email obviously.. AJAX'y port 80) to my securitymetrics.org group didn't get through from either the hotel's wireless nor the free wireless at the conference as the military have me blocked as a 'prohibited sender'... cool huh? Must be that email I sent a while back with as many keywords as possible in it. Nice one Draz! Anyway, see below for more details re: SMTP headers (I'm hoping it's not a redirect or filter by the hosting company of securitymetrics.org but I'm not sure yet) (NGILBEVSMB002CL.il.ng.ds.army.mil is in there somewhere...)
c) Spent an hour listening to the radio in San Fran today where a liberal radio station was interviewing Vincent Bugliosi, author of 'The Prosecution of George W. Bush for Murder', which is effectively being blacklisted by the weak-ass delusional US media. Rock on Vince... he wants the death penalty for Bush + cronies. The fact that he carries such weight in terms of his background and history is one reason the mainstream invertebrates in the media are sidestepping the gent.
d) met some cool amurican Lockheed Martin R&D dudes at the conference, including some peeps from Darpa, CIS(Center for Internet Security) and security bloggers I follow.
e) Met Dan Geer. Mission accomplished. Met Andrew Jaquith, who was like 'ahhhh Donal'... when he saw my name badge.
f) conference was pretty weak on the ground in terms of actual content but I didn't really care as I was just there for a holiday and to say hello to some peeps.
g) ebay security chick was cute, bigfix security chick looked like my mate micanders missus Holly
h) I came up with the idea of temporarily revoking NETBLOCKS as a punitive measure for orgs on the internets
i) Myself and Russell hit the bars twice chasing Asian-American chicks and had our fair share of Coronas and Mojitos, interesting discussions, great food, phone numbers, but didn't seal the deal. What's the story again with 2am closing?
j) I was reminded of the mass delusional insular conscious state most Amuricans live in
k) I was reminded of the smell of 'sewage' that wafts through certain areas of SF, and of the abundance of homeless peeps around certain neighbourhoods.
l) I was pleased to see randomers walking around Haight-Ashbury in homemade superhero capes, some in wizard hats... ain't it great that neither I nor most of the public were fazed...
m) I was reminded how beautiful parts of California are and how cool and kooky SF still is.
n) I didn't get to the Green Gulch Zen Center, maybe next time! I seem to have ended up in San Fran every 1.5 to 2 years since around 1998-1999
What follows are the SMTP headers from the f**'ing dopey military, almost like they want to expose their internal MTAs....
Delivered-To: irldexter@gmail.com
Received: by 10.110.39.19 with SMTP id m19cs75264tim;
Tue, 29 Jul 2008 16:29:07 -0700 (PDT)
Received: by 10.100.41.1 with SMTP id o1mr11435736ano.10.1217374144604;
Tue, 29 Jul 2008 16:29:04 -0700 (PDT)
Return-Path:
Received: from mail06.ng.army.mil (mail14.ng.army.mil [132.79.8.26])
by mx.google.com with ESMTP id 6si327532yxg.6.2008.07.29.16.29.03;
Tue, 29 Jul 2008 16:29:04 -0700 (PDT)
Received-SPF: pass (google.com: best guess record for domain of IL-ExchangeService@ng.army.mil designates 132.79.8.26 as permitted sender) client-ip=132.79.8.26;
Authentication-Results: mx.google.com; spf=pass (google.com: best guess record for domain of IL-ExchangeService@ng.army.mil designates 132.79.8.26 as permitted sender) smtp.mail=IL-ExchangeService@ng.army.mil
Received: from mail06.ng.army.mil (unknown [127.0.0.1])
by mail06.ng.army.mil (Symantec Mail Security) with ESMTP id 9D2A4520007
for; Tue, 29 Jul 2008 18:23:31 -0500 (CDT)
X-AuditID: 844f0819-ac13fbb000001122-9a-488fa673712f
Received: from NGIAFESTBH002.ng.ds.army.mil (unknown [132.79.8.28])
by mail06.ng.army.mil (ARNG Mail Security Out) with ESMTP id 880194DC002
for; Tue, 29 Jul 2008 18:23:31 -0500 (CDT)
Received: from NGILFESTBH002.il.ng.ds.army.mil ([55.70.177.222]) by NGIAFESTBH002.ng.ds.army.mil with Microsoft SMTPSVC(6.0.3790.3959);
Tue, 29 Jul 2008 18:29:02 -0500
Received: from NGILBEVSMB002CL.il.ng.ds.army.mil ([55.70.177.218]) by NGILFESTBH002.il.ng.ds.army.mil with Microsoft SMTPSVC(6.0.3790.3959);
Tue, 29 Jul 2008 18:29:01 -0500
Received: from mail pickup service by NGILBEVSMB002CL.il.ng.ds.army.mil with Microsoft SMTPSVC;
Tue, 29 Jul 2008 18:29:01 -0500
thread-index: Acjx0uGiIpV20940QBOD9qKfljCyIw==
Thread-Topic: Symantec Mail Security detected a prohibited sender in a message sent (SYM:07622397080654417781)
From:
To:
Subject: Symantec Mail Security detected a prohibited sender in a message sent (SYM:07622397080654417781)
Date: Tue, 29 Jul 2008 18:29:01 -0500
Message-ID: <45678D748EE142BB9E065FE101891564@il.ng.ds.army.mil>
MIME-Version: 1.0
Content-Type: text/plain;
charset="utf-8"
Content-Transfer-Encoding: 7bit
X-Mailer: Microsoft CDO for Exchange 2000
Content-Class: urn:content-classes:message
Importance: normal
Priority: normal
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.3790.4133
X-OriginalArrivalTime: 29 Jul 2008 23:29:01.0304 (UTC) FILETIME=[E1C17F80:01C8F1D2]
X-Brightmail-Tracker: AAAAAA==
Subject of the message: Re: [securitymetrics] Security awareness metrics
Recipient of the message: "Imran Mushtaq";"discuss@securitymetrics.org"
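As an aside, the relay chain in a dump like the above can be recovered with Python's standard email parser. A quick sketch using two Received lines lifted from the headers (they stack newest-first, so the list is reversed to get the actual path):

```python
from email.parser import HeaderParser

# Two Received headers from the dump above, folded as they arrive on the wire.
raw = (
    "Received: from mail06.ng.army.mil (mail14.ng.army.mil [132.79.8.26])\n"
    " by mx.google.com with ESMTP id 6si327532yxg.6.2008.07.29.16.29.03;\n"
    " Tue, 29 Jul 2008 16:29:04 -0700 (PDT)\n"
    "Received: from NGILBEVSMB002CL.il.ng.ds.army.mil ([55.70.177.218])"
    " by NGILFESTBH002.il.ng.ds.army.mil with Microsoft SMTPSVC;\n"
    " Tue, 29 Jul 2008 18:29:01 -0500\n"
)

def relay_path(raw_headers):
    """Return the Received hops oldest-first (MTAs prepend, newest on top)."""
    msg = HeaderParser().parsestr(raw_headers)
    hops = msg.get_all("Received") or []
    # Drop the timestamp after ';' and unfold the continuation lines.
    return [" ".join(h.split(";")[0].split()) for h in reversed(hops)]

for hop in relay_path(raw):
    print(hop)
```

Each hop names the handing-off host and the receiving MTA, which is how the internal army relays end up visible to the outside world.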
Wednesday, July 30, 2008
Future Shock Security 2.0
Utility/cloud computing will not take over, but will drive price comparison in internal IT shops. Atomic metrics must have abstract units convertible to associated costs, perhaps in a financial-market-like brokered environment.
It is nuts. It is scary. Breeding out the 'old guard' will happen too, as currently security is also a social and geo-political problem. Incentives and penalties will need to be introduced, initially per country. Once RIRs become fully authoritative and sBGP and DNSSEC happen, we may look at penalising entities! Virtual hosts, virtual servers, virtual networks and virtual storage will also drive fluidity in IT yet increase the static nature and characteristics of 'virtual nodes' which transact with each other.
More: What if a *national* security board/organisation could instruct an RIR (Regional Internet Registry), based upon an IRR (Internet Routing Registry) recorded NETBLOCK, to revoke it as punishment for X... e.g. sinkhole/null route at Tier1/2/3 ISP/INEX? Thus an organisation would lose its internet presence. Maybe we could use this to force 'em to supply their anonymized *logs* and *survey* data (signed by the CIO of course)....
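What the enforcement check at a Tier1/2/3 edge might reduce to, assuming some authoritative feed of revoked prefixes existed. A toy sketch (the prefixes are RFC 5737 documentation ranges, and `REVOKED` is an invented name):

```python
import ipaddress

# Hypothetical revocation feed: netblocks an RIR/IRR has flagged for
# sinkholing. These are documentation ranges, not real assignments.
REVOKED = [ipaddress.ip_network(n) for n in ("192.0.2.0/24", "198.51.100.0/24")]

def should_sinkhole(addr):
    """Would the edge null-route traffic to/from this address?"""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in REVOKED)
```

In practice the punishment is a routing decision, not a packet filter: the revoked prefix simply stops being announced or gets pointed at a sinkhole next-hop.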
Monday, July 14, 2008
Sunday, July 13, 2008
Thursday, July 10, 2008
Top 5 Abused/Misused/Miscontrued Terms in Information Security
Paradigm Shift
Game Theory
l337 5p34k
* is dead
Security ROI
http://techbuddha.wordpress.com/2008/05/21/top-5-abusedmisusedmiscontrued-terms-in-information-security/
Tuesday, July 08, 2008
Monday, June 30, 2008
Saturday, June 21, 2008
Verizon report says it all...
I'll let Richard from TaoSecurity sum it up:
http://taosecurity.blogspot.com/2008/06/verizon-business-report-speaks-volumes.html
Thursday, June 12, 2008
P0wned
MCP(Management Control Plane)
CP(Control Plane)
DP(Data Plane)
...should all be separate or as near as, especially in Tier1/2/3 ISP, INEX etc...
however why not try... with a global botnet, BGP announce your local SRC address for all DNS root servers sequentially while including malformed BGP/exploits with decreasing TTLs from the hopcount down to the first layer 3 hop. Lather. Rinse. Repeat. (including multipathed responses)
Hmmmm.....
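The separation argument can be sketched as a per-plane classifier, so each class can be policed independently and a data-plane flood can't starve the control or management planes. The port-to-plane mapping below is only indicative (well-known ports, invented policy):

```python
# Indicative mapping of well-known ports to planes; anything unmatched is
# treated as data plane. A real box would rate-limit each class separately.
PLANES = {
    ("udp", 161): "MCP",  # SNMP -> management control plane
    ("tcp", 22):  "MCP",  # SSH management sessions
    ("tcp", 179): "CP",   # BGP -> control plane
    ("udp", 123): "CP",   # NTP feeding control-plane timing
}

def plane(proto, dst_port):
    return PLANES.get((proto, dst_port), "DP")
```
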
Wednesday, June 11, 2008
Sunday, June 08, 2008
Nanobioinfotechnology
Nothing to see here yet, move along. Just thinking out loud about convergence.
Friday, June 06, 2008
How do you teach a child that fire is hot?
How do you teach management and the 'old guard' of IT about router root kits?
How do you teach Mom and Pop about 'drive-by pharming'? (mine excluded :)
How do you teach kiddies that 'bad people' exist and 'bad things' should not be looked at on the internet without prematurely shattering their innocence or attempting to remove their access to the internet?
Well. You don't. Manage to the edge and offer a managed service. Remove default permit.
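'Remove default permit' is the inverse of the usual home setup: instead of blocking a short list and allowing the rest, the managed edge allows only the services it actually offers and denies everything else. A toy policy function (the service list is invented for illustration):

```python
# Anything not explicitly offered as a managed service is denied.
# The protocol/port pairs here are invented examples.
ALLOW = {
    ("tcp", 443),   # the managed web service
    ("udp", 53),    # the resolver the provider runs for you
}

def verdict(proto, dst_port):
    return "permit" if (proto, dst_port) in ALLOW else "deny"
```
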
Thursday, June 05, 2008
Sunday, June 01, 2008
Architecture by any other name
Physics and space-time compression in cyberspace? What are the cost model, fundamental units and atomic entities in IT, and are they static? Would buildings or other structures look or be designed differently if they were intended to run software that inherently changed their function and application, or if they had to resist sentient attackers and unforeseen loads? Viva la revolution!
It isn't great practice to argue by analogy, but when the terminology inherent in a concept, system or discipline heavily borrows from, and indeed fragments, an existing well-defined discipline, it is hard not to use the fundamentals of the original discipline as a starting point. This post will focus on the relationship between information technology architecture and traditional architecture (though it may be applied to biotech and nanotech in the future, once the ubiquity of IT is also reached in those fields).
When the grey goo starts spreading it will be too late to fire the "architect".
Two things to remember throughout this post a) the fact that traditional architects must be licensed in most countries and are held accountable and responsible for what they produce and how and b) materials science does not accelerate at the pace that information technology does.
Rather than re-invent the wheel let's look at the traditional field of the roles of architects and structural engineers: http://www.helium.com/items/1028268-senior-school-heard-terms
"1 - Architects are generally responsible for design of buildings used primarily for everyday use by people. Structural engineers are responsible for design of a wide range of structures, such as bridges and power plants, for which an architect is not usually involved.
2 - Architects are responsible for design of the building shape, layout and appearance. Structural engineers are responsible for design of the building elements (foundations, columns, beams) that support all other building elements.
3 - In general, the results of architecture are visible when the building is completed. In general, the results of engineering for buildings are not visible after construction.
4 - When the owner hires an architect, the architect manages the overall design process. For buildings, structural engineers most often works for the architect. In general, the architect defines parameters (criteria) that the structural engineer must use for design of the structural elements.
5 - The structural engineer uses relatively complex mathematics and computer software for design. The architect uses graphical techniques primarily, along with basic math. "
From: http://www.boxesandarrows.com/view/enterprise
"Having IT, business analysts, and subject matter experts involved is important and necessary, but none of those three groups understands information and knowledge at a sufficiently deep level to offer truly creative and innovative alternatives that make information and knowledge systems work across the whole enterprise."
And a comment from Trilochan Chhaya in response to a related post http://archnet.org/forum/view.jsp?message_id=201 :
"Information and Technology.........
Knowledge and Architecture.........
Information and Technology have existed for ages,and have challenged the Creativity of the Humans all the time.
Fortunately, it needs a creative thought to change Information to Knowledge and Technology to Architecture."
Let me get back to this one as I ran out of time. But if you are in any way IT inclined, you probably know where I am going already... and as an aside,
Combine "internet 0", with nanotech/nanofabrication... yes there is indeed "Plenty of Room at the Bottom"
Architecture definition from the Oxford English Dictionary in OSX.
architecture |ˈärkiˌtek ch ər|
noun
1 the art or practice of designing and constructing buildings.
• the style in which a building is designed or constructed, esp. with regard to a specific period, place, or culture : Victorian architecture.
2 the complex or carefully designed structure of something : the chemical architecture of the human brain.
• the conceptual structure and logical organization of a computer or computer-based system : a client/server architecture
Tuesday, May 20, 2008
Time to return to Wing Chun chop foooey
"Man, I see in Fight Club the strongest and smartest men who've ever lived.
I see all this potential, and I see it squandered.
God damn it, an entire generation pumping gas, waiting tables; slaves with white collars.
Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don't need.
We're the middle children of history, man.
No purpose or place.
We have no Great War.
No Great Depression.
Our Great War's a spiritual war... our Great Depression is our lives.
We've all been raised on television to believe that one day we'd all be millionaires, and movie gods, and rock stars.
But we won't.
And we're slowly learning that fact. And we're very, very pissed off."
Sunday, May 18, 2008
Wonderful... the science of happiness
I inherited my love of radio from my Dad.
I find myself switching on ABC Radio National every day before any other station. On Saturday the 17th of May 2008 I stumbled upon this gem. http://www.abc.net.au/rn/allinthemind/stories/2008/2243006.htm
"The pursuit of happiness is a global obsession. But can science investigate its slippery, subjective nature? What are the metrics—self report, brain activity, or the good deeds we do? Five world leaders in the field join Natasha Mitchell in conversation—neuroscientist Richard Davidson, Buddhist monk Matthieu Ricard, Buddhist scholar B. Alan Wallace, psychologist Daniel Gilbert and philosopher David Chalmers."
Considering I missed the end of the segment, the concept of podcasting reinforces itself in the spreading of enlightenment via information and ideas. A segue, perhaps, to this:
"To create a social business which reduces all suffering by facilitating the access to and dissemination of enlightened information and ideas."
Note: It's coming.
Saturday, May 17, 2008
Tuesday, May 13, 2008
Something else is missing... need a pointer
Was thinking today... since school and university have come and gone (from my perspective), and family are in a different hemisphere; why is it so difficult to find or stumble across a mentor in this industry, or even in this life? Is it just when you are younger that you feel like there are people around and above you who know what's going on, or have a plan? Once one becomes more self-aware, do we see everyone stumbling along together? One wonders what structure or guidance we as a society will need, or will cling to, to ensure cohesion, or what will emerge in the place of organised religions in the West?
Where are the few who espouse 'enlightened self-interest' and 'non-zero sum games' for the world? Why is it not immediately obvious in our leaders and politicians? What is broken and how can we fix it? ( I tend to start these days with http://www.ted.com/ or the http://www.worldchanging.com/book/ )
Unfortunately Information Technology is extremely lacking in individuals who can offer wisdom and guidance, presumably being such a young discipline/art itself?
Life is difficult to figure out in this generation of mass media, with a constant bombardment of encoded information which mainly focuses on mass materialism and consumption. Is it any wonder that graffiti and street art have tried to fill the gap left by the press, advertising and the media at large?
Everything has sped up. Most people are disembodied from themselves and society. What could you prescribe? Diet, exercise, preventative mental health, meditation, different types of education, emotional intelligence...etc?
If you would like to be a mentor please send a S.A.E. to PO BOX 564 :)
I probably need my own personal technical version of one of these guys: J Krishnamurti, Anthony DeMello, Eckhart Tolle, Matthieu Ricard... or maybe an aggregate avatar of all of them that follows you online or is a 3d hologram buddy in real life ;)
Note: The image above is of the 3d holographic shark who assists the kid in one of Discovery's 2057 programs. An episode that happens to focus on a city wide virus that gets in to everything including digital signage. Not on my shift I tells ya!
Monday, May 05, 2008
How much is enough?
Is IT Security/Technology Risk Management a discipline or an art? Is it subjective or objective? (Is information technology deterministic or just overly complex?)
Are IT systems and frameworks closed systems? What comparable frameworks or systems (through which value transits) must defend against sentient attackers who attempt to subvert, control or disable services?
Can organisations quantify the value of information in motion or at rest within their managed footprint? Can they independently verify/audit the flows and data objects present? Somehow the bad guys have a better appreciation for CPU, disk and BW and SERVICE than we have!
Does it come down to simple economics? How to incentivise and penalise?
Surely 'Critical Infrastructure' should be held to extremely high standards by an independent body of technical auditors?
Does it really come back to accountability? Do we/they/us/them need to get burned badly (which the miscreants don't want either!) before we are enlightened...
Can the little guys afford the head count of the big boys? (big boys who actually sometimes have *less* of a clue about their systems than the little guys in the first place!). Is it possible that sink-holing traffic centrally in the cloud will give us the visibility/control we have hoped for? Thin offices perhaps staffed with 'thin' people :)
For me it comes back to a simple paradigm. You can't manage what you can't measure. We need to return to atomic units via reductionist thought. This is what I hope shall come with cloud and utility computing. Can you or the cloud provider "afford" NON-integral CPU, DISK, FLOWS, BW, KILOWATTS... runaway code.. such that it now becomes a billing issue? Once IT shops in enterprises start properly implementing "charge-back" rather than a flat rate service we may see some changes.... this coupled with a metric/cost applicable to shared infrastructure such as network fabrics, DNS, NTP, control planes etc...
How can we secure a service when we can't even charge for a service?
Billing 2.0, Utility 2.0, Employment 2.0
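A back-of-the-envelope sketch of the charge-back idea: atomic usage metrics multiplied by unit rates, plus a loading for the shared infrastructure (fabrics, DNS, NTP, control planes) that flat-rate billing hides. All the rates and the 15% loading are invented placeholders:

```python
# Invented unit rates for atomic metrics; a real scheme would price these
# through something like a brokered market.
RATES = {"cpu_hours": 0.12, "gb_disk": 0.05, "gb_transfer": 0.08, "kwh": 0.20}
SHARED_INFRA_LOADING = 0.15   # surcharge covering DNS, NTP, fabric, planes

def monthly_bill(usage):
    """Charge-back: direct metered cost plus the shared-infrastructure loading."""
    direct = sum(RATES[metric] * amount for metric, amount in usage.items())
    return round(direct * (1 + SHARED_INFRA_LOADING), 2)
```

Once runaway code or non-integral CPU, disk and flows show up as line items on a bill like this, they become someone's budget problem rather than nobody's.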
Saturday, May 03, 2008
"Toffler-esque", wondering about IT churn?
An interesting look at employee churn, and very apt in the IT arena, methinks:
"Employees – especially the most talented ones – are not “dating around” and moving from place to place in search of the Perfect Company at which they can grow old and retire at. They’ve already aced the first four rungs of Maslow’s hierarchy and are in search of self-actualization: the instinctual need of humans to make the most of their abilities and to strive to be the best they can."
http://thedailywtf.com/Articles/Up-or-Out-Solving-the-IT-Turnover-Crisis.aspx
Thanks Wade. All we have to worry about now is the predictions of Malthus.
Wednesday, April 30, 2008
Phrase of the day...
Reading away in some official blueprint documents for a client... came across the phrase "opportunistic use of automation"... tee hee.
Definition of opportunistic from my Mac's 'Dictionary and Thesaurus':
opportunistic |ˌäpərt(y)oōˈnistik|
adjective
exploiting chances offered by immediate circumstances without reference to a general plan or moral principle: the change was cynical and opportunistic.
Tuesday, April 15, 2008
Internet Infrastructure Report from Arbor
Nearly forgot to read/listen to this this year. The sound ain't great, and I'd have expected more from the guys, but the content is worth a listen, or the report a read.
PDF below:
http://www.arbornetworks.com/index.php?option=com_content&task=view&id=1034&Itemid=525
Monday, April 14, 2008
Screwing with perception.. quality
So basically time stops and challenges people's perception of reality. Wonderful really.
Friday, April 11, 2008
Microsoft end-to-end rebuttal, trust me
From: http://www.microsoft.com/mscorp/twc/endtoendtrust/default.mspx
http://tinyurl.com/53psbo "Establishing_End_to_End_Trust.pdf"
The word transitive is not used once though hinted at. Let me preface the below rant with the fact that I don't have the answer(s). As pointed out in the article on Page 7: "As noted below, there are historic, economic, social, and political forces that suggest a well-constructed regime is better than none at all, especially in light of the challenges we face on the Internet and the desire of people to be more secure in their daily lives"... and "We must create an environment where reasonable and effective trust decisions can be made." Agreed.
However: (Christ, their lingo could do with some updating and accuracy though... read on!)
Apparently sloppy code, the pace of technology, a bloated, incestuous, self-serving IT industry lacking in basic engineering discipline (around tolerances and expected usage) coupled with complexity, default permit, the low cost of massive parallel attacks, jurisdictional immunity.. yadda, yadda have *nothing* to do with our current predicament.... it's *all* about end-to-end trust... hmm, subjective or objective, guys? I'm hoping end-to-end includes dependencies arising from transitive trust...
Apparently though "Experience shows that most cybercriminal schemes are successful because people, machines, software, and data are not well authenticated and this fact, combined with the lack of auditing and traceability, means that criminals will neither be deterred at the outset nor held accountable after the fact. Thus the answer must lie in better authentication that allows a fundamentally more trustworthy Internet and audit that introduces real accountability"
Page 4 "But staying the current course will not be sufficient; the real issue is that the current strategy does not address effectively the most important issue: a globally connected, anonymous, untraceable Internet with rich targets is a magnet for criminal activity—criminal activity that is undeterred due to a lack of accountability.".... hmmmm yeah but what about http://geer.tinho.net/ieee.geer.0606.pdf
Geer:"Everything about digital security
has time constants that are three or-
ders of magnitude different from the
time constants of physical security:
break into my computer in 500 mil-
liseconds but into my house in 5 to
10 minutes."
Geer:"Human-scale time and rate con-
stants underlie the law enforcement
model of security. The crime happens
and the wheels of detection, analysis,
pursuit, apprehension, jurisprudence,
and, perhaps, penal servitude then,
paraphrasing Longfellow, “grind
slowly, yet they grind exceeding
fine.” In other words, law enforce-
ment generally has all the time in
the world, and its opponent, the
criminal, thus must commit the
perfect crime to cleanly profit from
that crime."
Geer:"If the physics of digital space and
digital time mean we ally ourselves
with the intelligence world view
and not the law enforcement world
view, we have to ask ourselves two
things: is the price of digital sur-
veillance a bearable price for the
benefit of digital safety? And, if so,
what is the unit of digital surveil-
lance? What do we watch—people
or bits?"
Back to Microsoft's Establishing_End_to_End_Trust...
Page 8: Misuse of the word hacker: "external hackers with access to their systems, in large part because a hacker".. will they ever learn, or at least work harder for a little respect? Apparently "device-to-device authentication" will foil the "hackers"... and scripting attacks facilitate "thus making anyone an “expert” hacker; and the amount of data that can be stolen is limited only by bandwidth." (Page 10)
Page 8: "robust management tools" are predicated on trusted management tools and one would hope them to be robust to begin with :)
Page 8: "depending on the threat level" , define threat and what metric is employed to address the level?
Page 8: "flooding and probing attacks"... is probing an attack?
Page 8: "Autonomous defense would be possible if, for example, packets likely to be malicious (because they are reliably identified as coming from a dangerous source) could be dropped shortly after entering the network or at a computer’s interface to the network." erm, a self-defending network by virtue of a firewall?
Page 8: "Even the intractable insider threat could be more successfully addressed because better audit tools would make it easier to identify suspicious access patterns for employees in a timely manner." ok so this is anomaly detection now via trust?
Page 8: "The authentication of identity, device (and its state), software, and data could be used to generate trust measurements that could also be used to reduce risk to the ecosystem." don't get me started although 'trusting the state' is a good concept, trust measurements spins me out considering 'trust' is somewhat boolean or a nominal measurement :)
Page 8: Apparently "one of the reasons that large enterprises manage risk relatively well is that they have dedicated IT staff implementing risk management programs." hmmmmm.... so throw some people at the issue and get the job done 'relatively' well :)
Page 8: Here comes the NAP pitch: "Yet there is no chief information officer for the public, and no mechanism for protecting the broader Internet by taking best practices from enterprises, such as Network Access Protection, and applying those practices to the public." What about FIRST, NSP-SEC, Arbor, Internet Motion Sensor, Network Telescopes and ISPs in general? Ummm... NIST? SANS? Sorry guys, Net Neutrality vs extreme ISP Hostility with NAP?.. the oul' internet would break, methinks... erm... we don't all conform on the endpoints, ladies and gents, good luck on that one... the internet is a superorganism that is evolving its immune system. Surveillance and telemetry are the first step.
Page 8: We can fix it all "With better authentication and audit, dynamic trust decisions could be made (based upon, for example, the state of a machine) and Internet service providers could use network access controls to limit the activities of “untrustworthy” machines until they were updated."
So Internet-wide NAC/NAP is the answer: don't let the "bad guys" or "bad nodes" on in the first place, and kick 'em off if they're bad(tm)
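The NAC/NAP pitch boils down to a gatekeeper function. A deliberately toy sketch of that admit/quarantine/deny logic (all field names and thresholds here are mine for illustration, not from any Microsoft product):

```python
from dataclasses import dataclass

# Toy model of the paper's NAC/NAP pitch: admit, quarantine, or refuse a
# machine based on authenticated identity and reported patch state.
# Every field and threshold is illustrative, not a real product's schema.

@dataclass
class Endpoint:
    authenticated: bool  # machine identity verified (e.g. rooted in TPM keys)
    patch_level: int     # 0 = unpatched ... 10 = fully current
    known_bad: bool      # flagged as an "untrustworthy" machine

def admission_decision(ep: Endpoint, min_patch: int = 7) -> str:
    if ep.known_bad or not ep.authenticated:
        return "deny"
    if ep.patch_level < min_patch:
        return "quarantine"  # limited access until updated
    return "admit"

print(admission_decision(Endpoint(True, 9, False)))    # admit
print(admission_decision(Endpoint(True, 3, False)))    # quarantine
print(admission_decision(Endpoint(False, 10, False)))  # deny
```

Of course the hard part is everything this sketch hand-waves away: who vouches for `authenticated`, who sets `min_patch`, and what happens to the half of the internet that will never conform.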
Page 9: Hmm, "Second, absent the ability to identify and prove the source of misconduct, there can be no effective deterrent—no effective law enforcement response to cybercrime and no meaningful political response to address international issues relating to cyberabuse." true.
Page 10: "Because all software operates in an environment defined by hardware, it is critical to root trust in hardware." hmmmmmm "If machines did a machine-to-machine-based authentication rooted in TPM keys before allowing a network connection, then one could arguably exclude unapproved machines from accessing network resources. Using new cryptographic techniques, this can be done in privacy-compliant ways." hmmmm cold boot?
Page 14 states: "As the firewall continues to diminish in importance, it is important to focus on protecting data as opposed to simply protecting the machines that store such data." Not sure I'd use this phrasing; rather, the focus from all sides is moving up the stack, but by no means obviating the need for firewalling.
Page 15: "standardizing audit data formats and tools".. erm syslog? http://www.loganalysis.org/sections/standards/index.html
Page 15: "one can call or send mail to millions of victims, but the time and cost makes this infeasible. " ye think? yeah SPAM is really on the decline, NOT! Micrsofts Penny Black project made sense though... http://research.microsoft.com/research/sv/PennyBlack/
Ok, at this point I give up or this will get too long..... the whole section on F. Audit, Page 14/15, is puerile and so far behind the times it's scary... they need to look outside of their Microsoft-shaped box in Redmond.
No mention of the network.
As Ranum states on http://www.beastorbuddha.com/ "The Internet applications stack depends heavily on ARP and DNS and those protocols depend on a tamper-free network. It’s just silly to think your end-point can secure itself if the network fabric is untrustworthy! If the network is untrustworthy, it’s “game over, man!” as Private Hudson would say."
At the end of the piece a question is posed "can we maintain a globally connected, anonymous, untraceable Internet and be dependent on devices that run arbitrary code of unknown provenance?"... Apparently if the answer is no, then " we need to create a more authenticated and audited Internet environment"... DOH!
"it is important to address all of the complicated social, political, economic, and technical issues raised to ensure we end up with the Internet we want, one which empowers individuals and businesses, and at the same time protects the social values we cherish. " Agreed but which *we* is that? And do we want backwards compatibility?
Thursday, April 10, 2008
Counterpoint to the generational divide...
So my Mum sends me this email. She was only introduced to the Internet in 2000.
"I suspected something was running in the background. And fssm32.exe was quoting 92-95 under CPU usage. I googled "fssm32.exe and CPU usage" and found this:
http://www.geekstogo.com/forum/fssm32-exe-taking-all-CPU-resolved-t5718.html
which was exactly my problem. I had had an error message from F-Secure saying it
couldn't connect to update. I updated it manually, after a few tries, and ran a
scan.
It told me I had two trojan keyloggers, and listed them in the TIF but said it
couldn't delete. But they didn't show up under my identity. I found them under
the Admin identity, and deleted them. But of course when I rebooted and ran a
scan, they were back, only this time in a .dbx of the administrator. The
Phishing Email folder. So I deleted both .dbx (me and the admin, who are the
same person) cos I knew new folders would be recreated in Outlook. Then I ran
"Window Washer" with "bleach" (which means it overwrites files three times) and
included 'free space', as well as TIF and the rest ... Then rebooted and ran
F-Secure again.
When F-Secure said I was clean, I confirmed it with two online scans --
TrendMicro and Panda. The sluggishness has disappeared. And CPU for fssm32.exe
is now saying 02 or 03 when I have only Outlook open. I'm *assuming* for now
that if there were any files still in registry, that F-Secure should be telling
me. Maybe I shouldn't assume.
On the F-Secure info page about the type of trojan, it said I'd better change
all my passwords when I was sure I was clean.
I have "HijackThis" but am nervous of using it without guidance.
The problem *seems* to be related to Windows Automatic Updates. I'm set to
Automatic, but when I checked last night, it downloaded 9 Updates, which was a
shock. No idea at all how that happened. I'm still set to Automatic Updates."
Noice huh?
"I suspected something was running in the background. And fssm32.exe was quoting 92-95 under CPU usage. I googled "fssm32.exe and CPU usage" and found this:
http://www.geekstogo.com/forum/fssm32-exe-taking-all-CPU-resolved-t5718.html
which was exactly my problem. I had had an error message from F-Secure saying it
couldn't connect to update. I updated it manually, after a few tries, and ran a
scan.
It told me I had two trojan keyloggers, and listed them in the TIF but said it
couldn't delete. But they didn't show up under my identity. I found them under
the Admin identity, and deleted them. But of course when I rebooted and ran a
scan, they were back, only this time in a .dbx of the administrator. The
Phishing Email folder. So I deleted both .dbx (me and the admin, who are the
same person) cos I knew new folders would be recreated in Outlook. Then I ran
"Window Washer" with "bleach" (which means it overwrites files three times) and
included 'free space', as well as TIF and the rest ... Then rebooted and ran
F-Secure again.
When F-Secure said I was clean, I confirmed it with two online scans --
TrendMicro and Panda. The sluggishness has disappeared. And CPU for fssm32.exe
is now saying 02 or 03 when I have only Outlook open. I'm *assuming* for now
that if there were any files still in registry, that F-Secure should be telling
me. Maybe I shouldn't assume.
On the F-Secure info page about the type of trojan, it said I'd better change
all my passwords when I was sure I was clean.
I have "HijackThis" but am nervous of using it without guidance.
The problem *seems* to be related to Windows Automatic Updates. I'm set to
Automatic, but when I checked last night, it downloaded 9 Updates, which was a
shock. No idea at all how that happened. I'm still set to Automatic Updates."
Noice huh?
Tuesday, April 08, 2008
Sentence of the day, even if I do say...
To a colleague today about IT security and information assurance.
"theory" looking for visionary leadership in a world gone sour with an inverted pyramidal house of cards being built on yet smaller physical footrprints with sedimentary protocols forming ingrained foundations whereupon we dance with virtualisation in expanded cyberspace with even less capacity for visibility and management, let alone surveillance and optimsation. L2/L3/L2 -> Ethernet/IP-MPLS/VPLS... ESX/VSwitch/Windows = layers of complexity, layers of code, and yet fully fledged OS's pushed further away from the networking stack... abstracted in to inner space.....
Sunday, April 06, 2008
Great talk by Richard A. Clarke at Source Boston 2008
As Ranum et al have been banging on about for ages, Richard has actually been in the belly of the beast! (I think I'm gonna go read Richard's book, a "fictitious" account of state sponsored cyber-terrorism.)
Saturday, April 05, 2008
Safety for kids and a trip to Mars?
Kids:
I have been looking for something like this for a while to point parents towards to help them with some direction around their children's online activities.
It's a scary topic when you delve deeply into the tech: how to protect the kids. Personally I think parents should keylog kids' machines, but what about outside the home?
http://www.google.com/intl/en/landing/familysafety/
Mars:
Virgin and Google team up to go to Mars.
http://www.google.com/virgle/index.html
Tuesday, March 25, 2008
ipv6 trix
More notes for myself for future reading, mesh, mobility and stuff...
Google v6 Tech Talks
http://tinyurl.com/2afeqc
What the US is missing by ignoring v6
http://www.infoworld.com/article/08/03/12/11NF-ipv6_2.html
Monday, March 24, 2008
Sunday, March 02, 2008
Anchored in time and tech, need new flows
Information Technology is fluid. IT is a capability whose ultimate goal stays the same, i.e. that of managing information. Unfortunately its operating environment, rules and players constantly change. Essentially what is being dealt with is a 'sliding window' of services constantly being built, tweaked and evolved on a platform of aging non-modular equipment and code.
Sliding windows suffer from extreme lag when they are consistently anchored by non-modular, non-extensible technology *and* people. We find ourselves constrained generally by the long tail of the process, thus consuming inordinate amounts of time and resources which could be better allocated and more productive elsewhere.
One of the foremost problems facing our society today from a technological perspective is not power consumption, general acceptance, awareness or learning, it's actually that of being trapped in the past, the near past. We are not so much trapped per se, but beholden to the constraints imposed upon us by the previous architects, engineers, management and chosen technologies. One must ask oneself: why be so short-sighted? Did they really have a choice? Did they not factor in the costs to maintain and deal with change? How does one manage change in an environment where the priorities seem to change daily and technology evolves almost independently, while we wait for the Darwinian champion of the 'most adaptable' to succeed?
Once more we should look to nature to see what the criteria for success are in an ever-changing environment. Perhaps with this technological challenge we will be more aware of the interconnectedness and influence we exert in the evolution of cyberspace. What is it that we can manage? What is it that we can measure? Either the code needs to start taking care of itself or we need to embrace more fully an old engineering paradigm of loosely coupled, replaceable sub-components. I would enjoy seeing both more! Don't get me wrong, we will always need specialists and specialist systems, just built more so from re-deployable units or resources. I am not advocating a monoculture, but a viewpoint or perspective on how we build, considering the future caretakers of our digital creations from the outset.
At this point let me ask you a direct question, dear reader: how many times has legacy code, legacy infrastructure or a tightly coupled system thrown a virtual spanner in the works of your projects?
Virtualisation itself has started to offer some of the desired benefits alluded to above in relation to extensibility and modularity, but many in management or leadership roles cannot tell you why or how virtualisation will and can benefit us, just that everyone else is doing it and it saves on the power bill.
Until we have our grey goo, a version of true utility computing whereupon perhaps we can 'pour' more computing in or on, or have any node re-purpose itself on the fly in another role, we will continue to build ourselves into cul-de-sacs of wasteful practices. How much time and how many resources are spent trying to manage, measure or repair (while excessively consuming energy) the wrongs of the near past in our IT footprints?
We waste fossil fuels needlessly all the time within IT, but we also waste human capital trying to clean up after an unconscious breed of Information Technology 'professionals' who haven't seen the obvious staring them right in the face... survival of the most adaptable! Corporate memory, just like public memory, is short-lived; however, techs, just like civil servants, see the politics at play and the players only trying to further themselves. There is a new breed coming, an undercurrent of massively distributed techs with instant communication and new paradigms slowly trying to strip away the ineffectual practices of old. If you are the equivalent of a paper(email) shuffler in the office, adding no value, watch out I tells ya'... the language and sands are shifting and buzzwords just don't cut it any more!
Friday, February 15, 2008
My path, your path?
It exists inside. The gateless gate. It is already there. There is no path. It begins and ends within. There is no formal path. Some need training. Some need challenges. Some need to allow themselves to see further only to see closer. Whether lay or not is not the issue. Practice is all around. Formality can assist, can speed the path. It however is a pathless path, a gateless gate. We have already stepped through. The point at which one embarks on the journey is when they have both left and arrived. You get what I'm saying? YOU are awake already, once you question and ask if you are awake! The next step is only the depth, path and continuing effort or style. Sometimes thinking too much is destructive. Sometimes not thinking at all is destructive. To find the middle way is to have walked the edge and reached many extremes. Extremes cannot be found in comfortable places. The most uncomfortable places are in the mind, not in a geographical space, place or time.
My 0.02 brain cycles worth... my subjectivity is built from "our" objectivity and your subjectivity ;)
Thursday, February 14, 2008
Simplicity
What is it that defines us?
What is the most important thing to us in our short existence?
What is the thing we should cherish most in our lives?
What do we have from birth to death and has the power to colour our lives for better or for worse?
Easy.... our minds, our consciousness...
So why do we neglect something so deeply important to our quality of life and base existence?
Surely we need to engage in some form of mind training or develop more tools to address and deal with our perception of reality?
C'mon guys, why allow a crazy world to passively pollute our minds unnecessarily? Why not focus a little bit on awareness and mindfulness? Start by observing yourself. Then take the time to quietly observe others without judging. Remove yourself from your preconceptions and look with clearer, neutral eyes. Perceive from a neutral standpoint and quieten your monkey mind for a moment.
A good, easily digested, palatable first step in this modern age is the talks online by people like Matthieu Ricard http://www.youtube.com/results?search_query=matthieu+ricard+happiness&search_type= and Anthony De Mello.
http://goldfusion.wordpress.com/2007/08/22/tony-de-mello-videos-online/
Tuesday, February 12, 2008
3 IS the magic number - Mobile, Mesh, Multicast
Real time feedback loops to help the world.
a) have a read of this, link from Wade(bit long but worth it):
http://www.cityofsound.com/blog/2008/02/the-street-as-p.html
b) watch this from a Multimedia perspective to round out the concepts
http://www.albinoblacksheep.com/flash/epic
c) as I'm reading Arthur C Clarke's "The Light of Other Days" http://en.wikipedia.org/wiki/The_Light_of_Other_Days
it hammers the point home. Transparency. The multitude of data already out there. Our re-interpretation thereof. Intent. Information management and the integrity thereof. Interdependence demonstrated. Wake up. Welcome to the future. See the MESH. Feel the quantum foam ;)
Tuesday, January 15, 2008