Sunday, September 25, 2005

CIA

Confidentiality, Integrity and Availability...
Sometimes we forget exactly how to tackle the last one: HA, Load-Balancing, BCP, Geographical Redundancy, Clustering, Primary/Secondary, Active/Active etc etc...

Don't forget 'backups', http://taobackup.com/ ( nice but vendor related! )

Fun http://www.backuptrauma.com/video/default2.aspx?r=1 from John Cleese!

Sunday, September 18, 2005

Once more unto the breach....

So I have started to recite this phrase to myself on a Sunday evening ( over a beer.. or two.. ) before stepping once again into my job on a Monday morning...

I am an 'Information Security' practitioner for a large national mobile Telco and the landscape _is_ always changing... ( though we face the most basic challenges of yesteryear also..)

...out of the trenches and march forward into the (semi-)unknown! Perhaps someone will allow the 'Red Cross' in and sing 'Stille Nacht' over Christmas, while we bunker down and play a MMORPG.. however I doubt it as the Internet never sleeps! ( And nor should SecOps! )...

I have been aware of 'Marcus Ranum' for a while but revisited his site recently after a link was sent around for 'The Six Dumbest Ideas in Computer Security'.. http://www.ranum.com/security/computer_security/index.html

I would like to share with you some of the 'nuggets' on this 'Prophet's' site, which not only _pre-date_ but echo most of my sentiments -> if you have been here before:

Aside: I am only a mere mortal vs. this 'security-techno-demi-god' !

Quotes like:

1) Set up the production systems
2) Make them work
3) Test them
4) While true; do
If they are working; Continue; Endif
If they are not working; GOTO 2; Endif
5) Done

( Maybe OpenBSD + layered security + quality userland software.. )

or:

The mainframe programmers of the 70's and 80's used to write of a practice called "Change Control" - in which production systems were managed with care and forethought. During the late 90's the last of the Change Control believers were taken out and shot, and their cubicles were given to the consultants who were there to mark everything up in XML in order to make everything better in some manner nobody understands yet.

maybe the 'calendar' based upon the classic 'Motivational' calendars:

http://www.ranum.com/security/computer_security/calendar/index.html

Friday, September 09, 2005

Anomaly or progress...

Hmm.. again I love the advances in 'polymorphic worm' behaviour, traffic normalization, IDS, IPS etc etc etc...

But I really think we are missing the fundamental point entirely. My favourite phrase is 'Complexity is the Enemy', especially as it relates to fast-paced, ever-changing environments. 'Change Control', 'Change Management' or 'Release Management' is great.. but I have never seen it done really effectively. Even in one of the best networking companies in the world, it is still a form of controlled chaos, as best-effort / guesstimate work is done in identifying host dependencies in downstream networks or similar service dependencies in downstream / upstream applications or code. ( Let alone full appreciation for business and supporting processes... ). Who _are_ these guardians of 'Change Control' who _really_ understand the _Infrastructure_ in all its glorious levels and depths... -----> 'techno-demigods' I think they would be called :)

"Well, that's the security guys / operations manager's role... oh, well then, it's the um administrators or engineering, or implementations guys....", I hear you all say in tandem.... well perhaps, but do they really know what's going on? Who actually did what, when, where and why? And could you really tell what was done and how?

Who are the implementors? Are they insourced, outsourced or was the update or change performed by some 'fly-by-night' technorati....? Relax my friends, it's all ok you uber-geeks, we all know the CIO knows exactly what's happening and is responsible for the whole shebang!

Take for example a business with a large dependency on IT ( any medium to large business, desperate to bring an IT based service or product to market -> think of Microsoft in the early days, some may argue still now...! ) and sprinkle that with a lack of _quality_ in employees' experience, training and a lagging behind the pace of technology... then add a dollop of rapidly trying to use said latest and greatest technology, and has _anyone_ really got a handle on what's going on? Do they have the policies, management support / comprehension and business backing to inherently understand the risks to existing and future services? The risk to the products and current or projected revenue streams is vast while driving the pace at full tilt. Only experience lends itself to an instinctual appreciation of the hidden costs of _rushing_ something out the door without the necessary QA, UAT, SIT.... ( Quality Assurance, User Acceptance Testing, Systems Integration Testing )....

Remember that millions of lines of code are wrapped around all Operating Systems and Applications or Services, whether in supporting the business or tied up in the business' delivery of products and services to its customers... then introduce the standard network users - driving the equivalent of virtual computer tanks and nuclear warheads with no proof of 'licensed to operate' or without the requisite training and experience. Mix this with network and system administrators, developers and database administrators with about as much scientific appreciation of computational logic and determinism ( in so far as _computer-systems_ are deterministic :) as the Incas had in believing in Sun Gods and that engaging in human sacrifice and voodoo like 'hibbidy-gibbidy' would appease said Gods of the time. Add to this a light sprinkling of 'management' who now find themselves in some _key_ technically related role, who have about as much experience with technology as those assembling their first 'Kinder Egg', with similar measures of people management skills, akin if you will to the archetypal high school gym 'last pick' ability to inspire confidence, lead a team or score goals.

You are now ready to bake in the binary oven of success or failure, wait 30 minutes at 'Homeland Security' defcon 4 for the inevitable results.

'Baked Alaska' is not something you can get right with beginner's luck...

So back to the key theme: with such complexity and general lack of appreciation of said complexity.. it actually needs to be reduced to facilitate some form of control. Most solutions these days actually _increase_ the complexity to try and control the complexity! ( which doesn't really work without the correct resourcing, comprehension and management! )

Let's take a step back and focus on the basics. Let's cut out the fluff and focus on solid and secure systems and services that allow us to work on the real 'add-value' to the business or customers. Why is it we require an army of incompetents who create their own microcosms of increased complexity, entropy and cost, when computers are supposed to save us time so we can get on with what we're actually really good at?

Sunday, August 21, 2005

S.O.E. ( Standard Operating Environment )

Well, even if you use NetFlow on routers / switches why not include something like Argus [ http://www.qosient.com/argus/index.htm ] in all your standard host builds limited to its own slice / filesystem ( or implement some log rotation.. ) so the system or host itself builds a historical log of network relationships for troubleshooting, forensics etc etc
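To make the "own slice / filesystem plus log rotation" idea concrete, here is a rough helper sketch for rotating a per-host Argus capture file by date and pruning old captures. The paths, the retention period and the filename scheme are all assumptions for illustration, not Argus defaults:

```python
#!/usr/bin/env python
# Hypothetical rotation helper for a per-host Argus capture file.
# BASE path, naming scheme and retention are invented for illustration.
import os
import time

BASE = "/var/log/argus/argus.out"
KEEP_DAYS = 30  # keep roughly a month of network-relationship history

def rotated_name(base, ts=None):
    """Return the dated filename the current capture should be moved to."""
    ts = ts if ts is not None else time.time()
    return "%s.%s" % (base, time.strftime("%Y%m%d", time.gmtime(ts)))

def prune(directory, keep_days=KEEP_DAYS):
    """Delete rotated captures older than keep_days; return what was removed."""
    cutoff = time.time() - keep_days * 86400
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if name.startswith("argus.out.") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed

if __name__ == "__main__":
    target = rotated_name(BASE)
    if os.path.exists(BASE):
        os.rename(BASE, target)  # then signal argus to reopen its output file
    if os.path.isdir(os.path.dirname(BASE)):
        prune(os.path.dirname(BASE))
```

Run from cron nightly, this keeps the historical log bounded to its own slice while preserving a rolling window for troubleshooting and forensics.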

Sunday, August 07, 2005

How do we know about History? What are we doing wrong today....

Hmmmm.. simple premise.... we only uncovered much of what we know today about previous civilisations due to the mark they made upon the world, whether the information was intentionally created for recording purposes or that which was an unintentional byproduct of something else they did / used or created.

Here, the concepts of the intentional lifetime of data and the medium of storage chosen are of utmost importance. ( additionally data format / language and physical / logical interface to the data are of concern )

Some remnants of a society such as architecture may be considered a byproduct, however many buildings such as the pyramids of Egypt and South America were built to last the ages and were intended to be a legacy of the then rulers or of the civilisation itself. Funny that in the current modern era, we have sprawling metropolises of concrete and steel which will in theory also last the test of time, but we don't in essence continue to write or record anything on mediums with similar longevity. Cave paintings and vellum scrolls, when in the right conditions, can last for thousands of years and convey stories and records of life as it was, and lessons for future generations whether intentional or not. Imagine if you will if the Rosetta Stone was written on a sheet of modern paper, saved on a hard drive in a proprietary format or burnt to a CD-R or DVD...... how long would it last, and what would future generations lose out on or be deprived of?

Maybe you are starting to see my point? We have seen amazing advances in the current and last century, mainly attributable to the rapid increase and spread of information. Cumulative knowledge allows for rapid progress. More raw human processing power, if you will, all connected and digesting reams of information, making inferences, connections, theories, statistical observations and basically learning, refining and increasing the sum knowledge of all humankind. Hopefully making things better and not worse!

Now imagine .... we destroy ourselves in a nuclear holocaust accompanied by a huge EM pulse ( electromagnetic pulse ) that wipes out most, but not all, of the digital data on the planet. The end of the current global age of the internet and digital data. We need to start from scratch, but most of the engineers, the basic information for building complex circuits and the means to access any of the remaining survivable information are lost to us. Operating systems, source code and close to all raw data would be gone.

One could well ask if it ever actually existed. How much are we missing from that which was daily life for the Egyptians, Greeks, Romans, Incas etc etc... From what we have found thus far.. e.g. ruins, artifacts, personal effects, architecture, farming and certain amounts of business and governmental records of the time - we build a picture of the politics, philosophy, medicine, science, mathematics, law, ethics etc of these people's lives and overall civilisations....

Many of these civilisations either fell, mutated, diverged or imploded.... again imagine if you will the sum of all human knowledge available to us should we have had a cumulative repository over the past few thousand years... maybe we would be more advanced or maybe we would have wiped ourselves out properly, once and for all!!!!

We are creating and learning at a pace never seen before in human history ( well, to the best of our current knowledge based upon what we have found.... how would we really know unless they wrote it down or *all* the information and records had survived? ). What happens when our civilisation comes to an abrupt or bitter end?

Should we be doing more to ensure the information we create and learn about ourselves, our civilisation and our environment is given the intrinsic longevity it deserves.. if not for our children, for future races of humans, for the historians, researchers, teachers and perhaps those that once again one day try to rebuild society out of the dark ages?

Aside: This question also begs an answer to the issue of complexity in many of our sciences and systems and how we represent them. Ask yourself if it would be easy to rebuild, reproduce or look at a current high level system, grasp the underlying concepts and reproduce the outcomes. These systems I speak of could be anything from computer systems, to law systems to social systems. We are building a house of cards with no thought for my favourite question, not "Why" but "What if?".....

Aside II: Do we really actually care about the "What if?" or would it stifle our creativity and the speed of advancements if we were to spend more time ensuring the integrity and longevity of our cumulative knowledge? Why are we rushing so far and fast ahead in to the unknown, we'll still get there eventually... it will still be unknown a week from next Tuesday! Maybe it's time to slow down and take a timeout, have a 'kit-kat' and then take a really good long hard look at what we as a race are really doing and trying to achieve... Also is it sustainable and recreatable should we break it? The wonderful concept not of "How well does it work?", but "How well does it break?" comes to mind...

Right now I see a huge risk to society at large.

"How we represent, store, archive and share digital information. "

I believe the sum of all human knowledge is in danger... let's at least start by ensuring we could start from scratch again. ( or someone or something else could.. what would an alien archaeologist make of all this should they visit earth after we have reduced ourselves to another pre-industrial age again? "Bloody amateurs!" )... oh and here's an interesting byproduct.... maybe a new paradigm would create a template for human learning that creates a code or method for how to educate children so they do not need to spend X times as long comprehending a mish-mash of overlapping disciplines by starting from scratch each time they enter a new field of study. Some may argue that that is the role language fulfils, a symbolic representation of ideas and concepts to allow them to be expressed and communicated.

There is no handbook for parents, there is no common teaching system other than repetitively hammering information into children's skulls. Imagine the advances we could make as a race if we harnessed and guided the abstract thought processes of children. ( possible focus "edutainment" )

Libraries and museums perhaps need a bit of a rethink and some real funding?

If we want to build a new society the 'Tipping Point' will start with the children.
If we want to 'keep' and progress our society we need to focus on 'keeping' the cumulative information safe and healthy.
If we want to advance our society we need to eliminate fear, greed and inequality.

Hmmm... rant over.. time to watch the Simpsons....

Some fun links to projects / papers:

Information Longevity http://sunsite.berkeley.edu/Longevity/
NARA National Archives and Records Administration ( American, http://www.archives.gov/ ) ERA ( Electronics Records Archive ) http://www.archives.gov/era/index.html
OSTA Optical Storage Technology Association ( http://www.osta.org/ )

Sunday, July 31, 2005

Meta-info... coz' my time is short...

Great aggregated blog @ Planet Security http://www.dayioglu.net/planet/ for all things Information Security.

And another @ InfosecDaily http://infosecdaily.net/securitynews/ .... and.. TaoSecurity http://taosecurity.blogspot.com/

Here's a fun WormBlog @ http://www.wormblog.com/ and here's a similar one from F-Secure http://www.f-secure.com/weblog/

Microsoft Response Center Blog http://blogs.technet.com/msrc/

Microsoft Security Wiki http://channel9.msdn.com/wiki/default.aspx/SecurityWiki.HomePage

All I need to do now is get my new team membership in FIRST .... I miss my FIRST list with my coffee in the mornings!

Saturday, July 30, 2005

I wish.....

The network _is_ the computer....

I have really good spatial comprehension, and am mostly a visual person. This is how I think. Currently careerwise I am an Information Security practitioner and Network Engineer. I like building, fixing and securing / protecting things. ( Substitute paternal instinct as I have no kids? )

Anyway, I digress.... as you move from job to job you tend to build up your stash of happy tools, resources, methods etc etc... _however_ when arriving in a large organisation it is very hard to get a handle on what's going on and build a map in your head of the network and nodes that contain information you are supposed to be securing / defending / protecting..... ( especially if that company's documentation is bad, non-existent or they have never used any visualisation tools or mapped / diagrammed anything! ) Also, sometimes the company can be in a high growth phase, where things change daily or weekly - and we all know that devices are not always built, deployed, alarmed or documented properly..

For quite some time I have been formulating an idea on how to get a handle on this .. it also applies to the actual NOC / SOC [ Security Operations Center ] guys too and how they view their operating world.... these days we need to know what's going on second by second, not day by day, or week by week.. internet time is just too fast, and so are the releases of worms following proof of concept code, 0-day exploits, or reverse engineered vendor patches.

Complexity is also the enemy - however that beast is getting larger not smaller ( as node numbers, services and depth of code / processes on hosts increases.. ) which I believe leads to the true gap right now; the ability [or inability] of us mere mortals to ingest, comprehend, correlate and appreciate changes / incidents and outages _properly_, including the ability to take the decisive actions to mitigate, fix or even just improve the situation. Inherent in this model is the ultimate accountability or responsibility for the decisions made in mitigating or remediating said issues. This is where supposed 'silver bullets' like intelligent IPS's, intelligent networks, sandboxing policies will invariably fail. Too many overheads. Configuration needs to be done before the fact, and this administration can be forgotten / overlooked or just ignored. We still need to create the rules, tune the IDS, define the actions for them to take and then still no one I know in the industry will let a system issue of its own accord an ACL [ Access Control List ] change, TCP reset or blackhole / sinkhole routing to /dev/null, Null0 or a 'scrubber' of sorts. They are too worried about customers and mission critical platforms, and rightly so? A.I. is still rule based / heuristic and often incomplete, as humans still need to re-write or tweak the frameworks, sample spaces to achieve the desired results. Neural networks still rely on 'us' humans for their playing fields.

I don't believe machines will ever be able to do real-time business risk modelling by drawing the correct inferences at the right times; this is still a skill humans are better at. When associating patterns, schedules and dependencies from information we are presented with, what's fundamental is the type, quality, amount and correctness of the data presented to the human operator. Most humans are visual creatures, even the blind who build connections and patterns in their minds....

Aside: ( one of the best Cisco Routing and Switching CCIE's in the TAC [ Technical Assistance Centre ] they had in Brussels, Belgium was actually blind and supported large complex enterprises remotely on the phone! )

For now though, let's think about having the right information, easily represented and at the right time. Take a peek at the OODA loop ( in previous post below ) and the concept of a CERT or CSIRT, if you are not familiar with them. ( I am bundling the NOC / SOC and concept of a CERT in to the same teams / functions here... )

The pitch: A near-realtime 3D network map, separating out a rough OSI / ISO 7-layer model into 2D connected visualisation planes that can be manipulated in real-time, possibly with a touch screen. ( Alternatively and probably more pragmatic would be that of the 5-layer TCP/IP Sun/DoD model ) Other features would include nodes giving off visual alarms when there are issues and when thresholds are reached. Screens could be split to render multiple parts of the network simultaneously. Employees / clients could access standard templates / defined sub-maps remotely. These clients may be run on normal users' or operators' desktops, with the realtime rendering done on the client. Clients may have different roles as it relates to the network and get separate streams overlaid to their maps. ( Traps, Anti-Virus, IDS, Flows with filters, syslog alerts.... )

DBA's see overlaid maps of JDBC, ODBC, SQLNet etc
Network Operators see ICMP, SNMP, SSH, SCP, TFTP, RCP, RSH, Telnet, Syslog etc..
Security can see everything but pick known 'bad' ports or recent outbreaks that use certain ports?
Content guys can see their product moving around...
Web guys can see their piece of the pie etc etc etc

Note: Suddenly at any point in time, all your observers become your distributed operations and network monitors!!! An Open Source model to keep the _network_ smooth and efficient...

Client / Server architecture similar in a sense to that of a MMORPG's methods of passing state and object information in a highly compressed format whereby the rendering engine primarily uses the client-side resources. Included may be the concept of Multicast or Peer-to-Peer to distribute information reducing bandwidth consumption. As with the gaming model, administrators may change information in realtime or influence the network also in realtime. Operators could push, only clients could pull. As this mapping would be graph based, holding state information and inter node relationship information ( think link-state / hybrid routing protocols ) each client would have a world view but _build_ his or her own "routing table" or view of the world as a normal router would ( including endpoints too though..! ) and then receive _state_ changes, which, in the message passing syntax would be anything from a threshold alert to a node state change, to a change in the graphical representation of a node in relation to some pre-defined event etc...
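The "full world view once, then state changes only" model above can be sketched in a few lines. This is a toy illustration of the message-passing idea, not a real NMS: the node names, message fields and class names are all invented.

```python
# Toy sketch of the state-change message passing described above: the server
# holds the full topology graph, clients bootstrap their own view once and
# then receive only deltas, much like a link-state protocol flooding updates.

class TopologyServer:
    def __init__(self):
        self.nodes = {}      # node name -> state, e.g. "up" / "alarm"
        self.edges = set()   # (a, b) links, stored sorted for uniqueness
        self.clients = []

    def add_link(self, a, b):
        self.nodes.setdefault(a, "up")
        self.nodes.setdefault(b, "up")
        self.edges.add(tuple(sorted((a, b))))

    def subscribe(self, client):
        self.clients.append(client)
        client.bootstrap(dict(self.nodes), set(self.edges))  # full view, once

    def set_state(self, node, state):
        if self.nodes.get(node) != state:
            self.nodes[node] = state
            for c in self.clients:
                # push only the delta, not the whole topology
                c.apply({"node": node, "state": state})

class MapClient:
    def bootstrap(self, nodes, edges):
        self.view = nodes
        self.edges = edges

    def apply(self, msg):
        self.view[msg["node"]] = msg["state"]

server = TopologyServer()
server.add_link("core-rtr-1", "dist-sw-3")
noc = MapClient()
server.subscribe(noc)
server.set_state("dist-sw-3", "alarm")  # threshold breached -> one small delta
```

The rendering engine would then only re-draw the nodes whose state changed, which is what keeps the bandwidth and client-side cost closer to an MMORPG than to polling an NMS.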

So to 're-cap', the 'network game server' as we'll call it handles most of the topology information, message scrubbing and overall admin rights. ( Think of it as a shiny front end MOM / NMS / Event Correlation engine that understands flows... ) Clients, be they desktop users, network administrators, remote NOC teleworkers or customers who wish to see how their relevant part of the network or hosts is performing from a network perspective, all get to see what's going on when, where and _hopefully_ in a distributed environment _we_ can get to the ever more elusive why in a reduced amount of time?

Transparency drives growth, change and improvement.

As information and events are all realtime and streamed in somewhat of a pipeline ( including flows ) it should be possible to ( with accurate network wide NTP ) perform limited tracebacks of incidents, albeit the event must be recognised or pre-defined in some form. This is where baselining and normalisation are extremely important. SourceFire are doing pretty well in this regard it seems with RNA...

Sounds futuristic? Maybe it's out there already?

Perhaps, but most of my previous posts, in theory, contain close to the correct tools to do this ( well nearly anyway ).... the closest I have seen in operation thus far is a good independent 2D map built by QualysGuard's Vulnerability Assessment tool, and OpNet's SPGuru ( perhaps their new 3DNV product? ) that feeds itself from existing NMS's and MOM's like CiscoWorks Information Centre, HP Openview etc..

a) get all related SNMP read strings for routers, switches and firewalls ( if you so do...)
b) ensure your platform has full ACL rights for the above
c) ensure your platform has full port connectivity through firewalls etc to achieve connectivity... ICMP/TCP/UDP
d) allow your platform to fingerprint hosts and nodes and make it an iterative behaviour...
e) allow your object orientated mapping engine to attribute status to graph leaves in real-time as it's rendered
f) have a concept of trending / difference
g) allow your platform to parse routing tables and understand topology ( Hmmmm, stateful or stateless mapping.. guess it needs to build a consistent view rather than rebuild each time to reduce overheads... as with gaming, build the world.. then interpret changes? )
h) perhaps overlay NetFlow (tm) information for close to real-time ( 5min+- ) traffic overlays.. top talkers etc. ( NetFlow(tm) is not realtime but exported in time intervals to collectors where it can be aggregated.. )
i) perhaps use this engine to allow you to do a form of touchscreen IPS ( Intrusion Prevention System ) on your whole network, thus the final realtime responsibility lies with the Network Operators?
j) X3D http://www.web3d.org/ as a framework instead of the supposedly outdated VRML ?
k) you would possibly need a fast rendering game engine to achieve basic visualisation depending on network size and complexity if not using X3D / VRML.
l) could feed and help with Capacity Management ? RMON + real-time fault-tracking ( ICMP sweeps / SNMP traps )?
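Point (g) above - build the world once, then interpret changes - is really just a diff between discovery snapshots. A minimal sketch, with invented host/port data:

```python
# Rough illustration of point (g): keep a consistent view and compute only the
# differences between discovery passes rather than rebuilding the whole map.
# The snapshot format {host: set(open_ports)} and the data are invented.

def diff_snapshots(old, new):
    """Compare {host: set(open_ports)} snapshots; return adds/removes/changes."""
    added = {h: new[h] for h in new if h not in old}
    removed = {h: old[h] for h in old if h not in new}
    changed = {h: (old[h], new[h]) for h in old
               if h in new and old[h] != new[h]}
    return added, removed, changed

yesterday = {"10.0.0.1": {22, 161}, "10.0.0.2": {80}}
today     = {"10.0.0.1": {22, 161, 23}, "10.0.0.3": {443}}

added, removed, changed = diff_snapshots(yesterday, today)
# 10.0.0.3 appeared, 10.0.0.2 vanished, and telnet (23) opened on 10.0.0.1 --
# exactly the kind of delta the rendering engine should flag, not re-draw.
```

The mapping engine then only has to attribute status to the handful of graph leaves that actually changed, which is what makes real-time rendering plausible on a large network.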

Just a thought, but it's kinda where I see the defensive perimeter paradigm being turned inside out as it relates to Information Security with the keywords _realtime_ _complexity_ _perimeter_ _defense_ _ips_... imagine also if the host OS or NOS could tag confidential enterprise information and insert this boolean based tag in the TCP header somewhere ( DSCP / TOS -> QOS -> Public || Confidential ) and then NetFlow also had a header that could see and report on this... you could then see when the information was walking out the network door? This is hugely simplified from the host, file and application context I know.. but it's a thought as it would need to be a standard and built in to document formats. Users could then turn it off perhaps... maybe it could be enforced at a policy level, but most host based agents don't run on all platforms or aren't supported etc.. alas engineers will always want to run their own OS.... or have root privileges anyway.
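The DSCP / TOS half of that idea is actually doable today from userland. A hedged sketch: the "confidential" codepoint below is completely made up ( it is not a standard DSCP label ), but the socket option itself is real, and since NetFlow exports the TOS byte per flow a collector could in principle report on it.

```python
# Sketch of the DSCP/TOS tagging idea: mark a socket's traffic with a
# (made-up) "confidential" codepoint so flow collectors could report on it.
import socket

CONFIDENTIAL_DSCP = 0x10             # illustrative codepoint, not a standard
TOS_VALUE = CONFIDENTIAL_DSCP << 2   # DSCP sits in the top 6 bits of the TOS byte

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
# Every packet this socket now sends carries the tag in its IP header;
# NetFlow records include the TOS byte, so a collector filtering on this
# value would see "confidential-tagged" conversations leaving the edge.
s.close()
```

Of course this only marks traffic from applications that cooperate, which is exactly the policy-enforcement problem described above.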

This of course does not take into account making copies onto removable media.. that's another issue... but it would be a start... probably impinging on DRM [Digital Rights Management] but not *really* as it's targeted for a corporate environment only.. and it would be a label / watermark.. not an endpoint restriction.. ( though it could be, I am mainly referring to the network gateways / edge though... ! ) but it lends itself to being auditable and the concept of the "Shrinking Perimeter" being popularised by Dan Geer http://www.verdasys.com/site/content/whitepapers.html

Most of the time companies drop keyword searches for the term "company confidential", or take a copy of encrypted emails for future use. This does not address dns, http, ftp etc... ftp access is not always granted, but http(s) is, either through proxies or direct. Maybe we should give up and not try to control the data leaving the network.. just audit it and focus on employee visibility and compliance? At what point does the complexity, entropy and technology allowing access to information really become manageable, controllable and auditable by humans anyway?
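The "just audit it" fallback is about the simplest thing there is: flag outbound text that matches confidentiality markers rather than trying to block it. A minimal sketch, with marker phrases chosen for illustration:

```python
# Minimal sketch of the keyword-audit fallback mentioned above: flag outbound
# text matching confidentiality markers for compliance review, don't block it.
import re

MARKERS = re.compile(r"company\s+confidential|internal\s+use\s+only", re.I)

def audit(message):
    """Return True if the message should be logged for compliance review."""
    return bool(MARKERS.search(message))

audit("Attached: roadmap.xls -- COMPANY  CONFIDENTIAL")  # flagged
audit("Lunch at noon?")                                   # passes
```

Which of course catches nothing that is encrypted, renamed or pushed over https - hence the question of whether human-scale auditability is achievable at all.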

Work and Personal

So I'd like to address 2 topics and what's going on with me right now, both somewhat technologically impacted ( and then of course some interesting links etc.. ):

Personal:

I am having great fun right now with a mixture of PodCasting and the content @ Zencast.org . Free Buddhist classes for the masses, who said Podcasting wouldn't catch on. Today I sat in the sun on Manly beach for 2 hours learning and meditating :)

Work:

With no IT Security Strategy, comprehensive policy, budget, resources and incorrect internal reporting chains.... and an outsourcer trying to drive the client's Information Security Policy and Information Security Management System; the emphasis has to be on initially enumerating information assets and classifying them as part of the company's risk profile / attack surface before engaging in anything else. This unfortunately means, in the absence of any current snapshot of information / physical assets or full knowledge of business processes, an independent audit is needed to achieve a baseline.. with subsequent scans / audits building upon this... with special focus paid to the outsourcing interface and contractual obligations on all parties. ( ...including the other outsourced services / interfaces from other companies / organisations.. )

It also denotes the need for a base level strategy and methodology. The most effective framework in information security right now is a subset of the Parkerian Hexad http://www.answers.com/Parkerian%20Hexad ( C.I.A. / Confidentiality, Integrity and Availability ) ratings and also the OODA loop http://www.answers.com/ooda%20loop developed by John Boyd for gathering Intelligence and then Execution in Information Warfare.

General News:

I'd also like to mention a recently given speech at Blackhat by Michael Lynn, an ex-ISS security researcher, because many see it as a huge threat.... basically apart from the DNS root servers, everyone seems to forget about the routers(tm); with Cisco's monopoly running IOS on most backbone infrastructure, why own 1000's of hosts.. when you can own the network? Ask yourself what else is ubiquitous... remember the SNMP issues and what about BGP or goofin' with the common implementations of the TCP/IP stack out there?

Some really cool people I admire in the Industry ( you gotta be known when you're in Wikipedia/Answers.com? ):

Rob Thomas http://www.cymru.com/
Dan Kaminsky http://www.doxpara.com/
Dan Geer http://www.answers.com/topic/dan-geer
Bruce Schneier http://www.answers.com/bruce%20schneier
Paul Graham http://www.answers.com/Paul%20graham

Some cool Penetration Testing / Information Security Consulting companies:

Security-Assessment http://www.security-assessment.com/
Corsaire http://www.corsaire.com/
NGS http://www.ngssoftware.com/

Information Security Testing Methodologies:

OSSTMM http://www.isecom.org/osstmm/
OWASP http://www.owasp.org/index.jsp

Back to the concept of network visualisation and graphing, I have updated the link lists below:

Sunday, July 10, 2005

Building blocks and giant's shoulders....

Newish stuff.... and references that are always good:


3G Related:



Risk related definitions:

Risk Management http://www.answers.com/risk%20management
Risk Assessment http://www.answers.com/topic/risk-assessment

Law as it relates to IT:

http://www.groklaw.net/


Internet Modelling / Risk Modelling:

NetworkViz http://networkviz.sourceforge.net/
CAIDA http://www.caida.org/
OpenQVIS http://openqvis.sourceforge.net/
Opte Project http://www.opte.org/
LGL http://bioinformatics.icmb.utexas.edu/lgl/

( java'ish )

JASPVI http://lab.verat.net/Jaspvi/ ( very cool ASN mapping )
Tom Sawyer http://www.tomsawyer.com/home/index.php
yFiles http://www.yworks.com/
( http://www.yworks.com/en/products_yed_about.htm )

GINY http://csbi.sourceforge.net/
JUNG http://jung.sourceforge.net/

Piccolo (2d) http://www.cs.umd.edu/hcil/piccolo/


OSS Routing Daemons:

ZEBRA http://www.zebra.org/
OpenBGPd http://www.openbgpd.org/
OpenOSPFd ( coming ) http://www.openbgpd.org/


BGP/RADB/Whois type stuff:

http://bgp.potaroo.net/
http://www.dnsstuff.com/
http://www.traceroute.org/
http://www.bgp4.as/tools

More Netflow tools / info:

Netflow info http://www.cisco.com/warp/public/cc/pd/iosw/ioft/neflct/tech/napps_wp.htm
Extreme Happy Netflow Tool http://ehnt.sourceforge.net/


Quick NOS ( Network Operating System ) Emulation:

http://www.dcs.napier.ac.uk/~bill/emulators.html


DNS:

RDNS Project http://www.ripe.net/rs/reverse/rdns-project/

Cisco OSS Related:

COSI http://cosi-nms.sourceforge.net/


Local Internet Registries AU:

http://www.ripe.net/membership/indices/AU.html

Wednesday, June 22, 2005

Mind Mapping

Been looking for a java based cross platform alternative to TheBrain http://www.thebrain.com/ for some time and just found this, FreeMind http://freemind.sourceforge.net/

Also here is a curses based 'hierarchical notebook' called HNB http://hnb.sourceforge.net/ .. enjoy!

Monday, June 13, 2005

Standards, standards, standards... open?

Sunday, May 29, 2005

Longevity = Portability, Security, Mass acceptance?

OK, so I'm basically wondering why I am frustrated ( spiritually! )... one possible answer is I haven't *created* anything in a long time. I add value to projects and script now and again in relation to work, but most of what I do is related to 'Risk Management' and 'Information Security' from a process, network and application / system standpoint... thus it's mainly advice, recommendations and some design and architecture ( this bit does involve 'creating' usually... )

I haven't done any art, cartooning, flash, web pages, video etc in a long, long time ( last was probably http://indigo.ie/~nodecity )... I have sort of decided to go back to programming [ something I used to hate in University... http://www.cs.ucd.ie/ ] but am finding more uses for it these days... usually however, I 'script' with perl for some quick and dirty stuff.. e.g. text parsing + regexp and bolting together other apps and scanning scripts to automate real time network reports etc...
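That quick-and-dirty parse-and-summarise pattern translates straight to Python too. A minimal sketch — the firewall log format and field names below are invented purely for illustration, not taken from any real device:

```python
import re

# Hypothetical syslog-style firewall lines; the format is made up
# to illustrate the regexp-driven report scripting described above.
LOG = """\
Jun 13 10:01:02 fw1 DROP src=10.0.0.5 dst=192.168.1.9 dport=22
Jun 13 10:01:05 fw1 DROP src=10.0.0.7 dst=192.168.1.9 dport=23
Jun 13 10:01:09 fw1 ACCEPT src=10.0.0.5 dst=192.168.1.10 dport=80
"""

LINE = re.compile(r"(?P<action>DROP|ACCEPT) src=(?P<src>\S+) .* dport=(?P<port>\d+)")

def summarise(text):
    """Count dropped packets per source address."""
    drops = {}
    for line in text.splitlines():
        m = LINE.search(line)
        if m and m.group("action") == "DROP":
            src = m.group("src")
            drops[src] = drops.get(src, 0) + 1
    return drops

print(summarise(LOG))  # one tally per 'DROP' source address
```

Bolt a loop like that onto the output of your scanners and you have the bones of a real-time network report.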

Firstly, what struck me was that if I was going to create something I should get the most 'bang for my buck': it should be cross-platform, interoperable, open source/standards and have a great deal of flexibility ( from GUI development to low level access to memory etc. [ speed and efficiency must be taken into account here also.. ] ). It should also be beautiful, concise, intuitive and easy to hack on - easy to prototype on... after reading Paul Graham's take in 'The Python Paradox' http://www.paulgraham.com/pypar.html and Eric S. Raymond's take in 'Why Python' http://www.linuxjournal.com/article/3882 I decided to put some time and effort into the Python language. I first had to do some bits and pieces on my new Mac Mini to get Python playing happily with an extra toolkit called Tkinter ( Tk Interface ) to allow for some GUI programming...

Secondly, I started thinking about OS choice again... perhaps NetBSD or my beloved OpenBSD would make more sense? [Again depends on function || hardware and / or also desktop (FreeBSD) / server / intranet / internet / extranet / ] Maybe go back to Fedora Core? Try out Solaris 10 ?

Then perhaps standardise on a window manager like twm ( as it comes with X ), FVWM, or go for something lightweight and extensible like Fluxbox; shell-wise sh / bash, but what about rc, anyone? ( Guess that breaks the 'lowest common denominator' thrust? ) .....
I guess when you come full circle you have to really look at the title of the post.... I am looking for longevity and a perceivable 'Return on Investment' on the time and energy I am going to invest both professionally and personally.. and subsequently many factors [ some external / market related ] come in to play...

Anyway, for your viewing pleasure and mine, some interesting links for posterity:

- Python http://www.python.org/
- DivX based Python video tutorials http://ourmedia.org/node/11134 and Dive Into Python http://diveintopython.org/ not forgetting O'Reilly's http://python.oreilly.com/
- Tkinter http://www.pythonware.com/library/tkinter/introduction/ and http://wiki.python.org/moin/TkInter
- DJB stuff.. http://cr.yp.to/ focus especially on his software... qmail, djbdns, daemontools and ucspi-tcp

Note: I also recently switched to Camino http://www.caminobrowser.org/ as my browser [on Mac OSX] as Firefox 1.0.4 kept crashing!

Note: Camino now seems to be grumpy with Blogger http://www.blogger.com/ :( , back to Safari http://www.apple.com/safari/ which is not fully supported by Blogger either? Double DOH! Anyone wanna' run three browsers?

Thursday, May 26, 2005

Lands I have visited....

Some very interesting Internet research sites and handy stuff you may not have seen before (some *very* techy and some not! ) :

Wednesday, May 25, 2005

Make me better... "on the shoulders of giants"..

It struck me that most blogs I read only have X number of postings on the front page - and as with Google these days, it's very rare to go past the initial front page. This doesn't quite hold true if you have been a long term reader of a blog or get updates from Bloglet http://www.bloglet.com/ , but the point being 'less' is 'more'... quality over quantity per se..

I hereby set myself the challenge to keep this blog at one page, almost like a Wiki... but it's still a blog OK? This means reduced graphics, shorter explanations and more links to let you guys go 'walkabout', to read around the edges -> as most things have been said before anyway...

So I have some filters on my Gmail namely 'Efficiency / Productivity' and 'Reading List', and I thought I'd share some of them with you..

Efficiency and Productivity



Reading and Listening List


Fun Stuff

Monday, May 23, 2005

Here's a thought.... or two.... or three...

  • learn Chinese ( Mandarin / Cantonese )
  • play squash competitively
  • learn to properly defend yourself
  • take a night class in something interesting
  • your body is a nutrient /drug filter, only put nutrients and good drugs in to it ! http://moodfoods.com/
  • spend time developing your mind and soul
  • as you are the center of your universe, learn about yourself
  • want less, expect more
  • be patient, sometimes doing nothing is something
  • do not watch random tv, specific channels or programs only, dl programs...
  • read the classics
  • look for the good in people, if you can't find any... move on...

Monday, April 25, 2005

It's all been said before...

"..and he wanted to decide whether life was worth living. He did not know that this was the question in his mind. He did not think of dying. He thought only that he wished to find joy and reason and meaning in life - and that none had been offered to him anywhere.

He had not liked the things taught to him in college. He had been taught a great deal about social responsibility, about a life of service and self-sacrifice. Everybody had said it was beautiful and inspiring. Only he had not felt inspired. He had felt nothing at all."

Part IV, Chapter: Howard Roark, The Fountainhead ( Ayn Rand, 1943 )

Sunday, April 10, 2005

Some more stuffing....

Anyway, if you need an answer go to Answers.com http://www.answers.com/ or perhaps desire some fun techy news about flying wind farms, then go to Wired http://www.wired.com/wired/ magazine.

I like Blade Servers http://www.answers.com/blade%20server but also would recommend Google's http://news.com.com/Googles+secret+of+success+Dealing+with+failure/2100-1032_3-5596811.html approach; especially what I like to call the 3-way data rule... if you could call it that?
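The '3-way data rule' is my own shorthand, not Google's terminology, but the idea — keep every datum on three independent machines and let a majority vote survive one failure — can be sketched in a few lines. All the class and method names below are mine, invented for illustration:

```python
class TripleStore:
    """Toy key-value store illustrating three-way replication:
    every write lands on three replicas, every read takes a
    majority vote, so one corrupted or dead replica is tolerated."""

    def __init__(self):
        self.replicas = [{}, {}, {}]  # three independent "machines"

    def put(self, key, value):
        for r in self.replicas:
            r[key] = value

    def get(self, key):
        votes = {}
        for r in self.replicas:
            v = r.get(key)
            votes[v] = votes.get(v, 0) + 1
        # majority wins: two good copies out-vote one bad one
        return max(votes, key=votes.get)

store = TripleStore()
store.put("config", "v1")
store.replicas[1]["config"] = "garbage"  # simulate one failed node
print(store.get("config"))  # majority of replicas still say "v1"
```

Cheap commodity boxes plus redundancy in software rather than expensive fault-tolerant hardware — that is the approach the article above describes.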

Rackable http://www.rackable.com/ are well worth a read about too.....

Another thing I am keeping an eye on is how to establish the ultimately flexible home entertainment system. There are many, many things to consider - not limited to, but including protocols, ports and of course cost. A good audit trail has been started over at PVRBLOG http://www.pvrblog.com/

Sunday, April 03, 2005

Relationships...

Well I am back to an actual solid and easily used OS. I have played with, used and worked on Windows, Linux and the *BSDs - and then last week I bought an Apple Mac Mini http://www.apple.com/macmini/ running OS X http://www.apple.com/macosx/ ( Panther 10.3.8 ), built on top of Darwin

All I can say is it's shiny and I get a BSD underlying framework and a bash style shell when I need it.... here is a screenshot of my desktop with some Konfabulator http://www.konfabulator.com/ widgets ( which incidentally you can also get for Windows XP http://www.microsoft.com/windowsxp/default.mspx ! )


Note: The Mac OS X framework is still hugely more developed for everyday users than the original FreeBSD 4.4 base on which it draws, as it features some pretty cool stuff for users and developers alike; Cocoa, Quartz, Aqua and a Mach kernel etc.

Have a look over here for more information http://developer.apple.com/documentation/MacOSX/


I also spent some time getting Shinobi playable on MacMAME and some other games on Snes9x, threw in the Cisco VPN client and Citrix client for work.. while simultaneously making sure I had Xcode installed and subsequently DarwinPorts + Fink.... oh yeah, don't forget Desktop Switcher, Onyx, Azureus [BitTorrent], MacTheRipper, VLC, MPlayer OSX, Firefox, Audacity, HenWen and Cyberduck, and I'm sure I forgot some other stuff I threw on... any comments or suggestions are welcome.... Panther; maybe pay for DiskWarrior, Konfabulator, LiteSwitchX, KeyCue, Alarm Clock Pro...

SIP slowly but surely....

With the proliferation of VOIP http://www.answers.com/voice%20over%20ip and the growing popularity of Asterisk http://www.asterisk.org/ as a cheap, reliable alternative to ye big olde' PBX's http://www.answers.com/pbx/ I thought I'd mention some fun stuff like:
Note: of course we must not forget that end-to-end QOS http://www.answers.com/qos/ is still not viable between most Telco's http://www.answers.com/telcos/

( similar to the interconnectivity problems with MPLS http://www.answers.com/mpls/ in heterogeneous networks... although I guess GMPLS is a start http://www.mplsrc.com/faq3.shtml#GMPLS )
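Under all the acronyms, SIP itself is refreshingly plain: a request is just structured text over UDP or TCP ( RFC 3261 ). A minimal sketch of an OPTIONS "ping" — the addresses, tag, branch and Call-ID below are invented for illustration:

```python
def sip_options(target="sip:bob@example.com", source="sip:alice@example.com"):
    """Build a minimal SIP OPTIONS request as a string (RFC 3261 style).
    Hosts, tag, branch and Call-ID are placeholder values."""
    lines = [
        f"OPTIONS {target} SIP/2.0",
        "Via: SIP/2.0/UDP 192.0.2.1:5060;branch=z9hG4bK776asdhds",
        "Max-Forwards: 70",
        f"From: <{source}>;tag=49583",
        f"To: <{target}>",
        "Call-ID: 1234567890@192.0.2.1",
        "CSeq: 1 OPTIONS",
        "Content-Length: 0",
    ]
    # SIP, like HTTP, terminates lines with CRLF and the headers with a blank line
    return "\r\n".join(lines) + "\r\n\r\n"

print(sip_options().splitlines()[0])
```

Seeing the wire format helps demystify what an Asterisk box is actually shuttling around between endpoints.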

On Safari...

I have a fair few books, many from O'Reilly http://www.oreilly.com/ and Cisco Press http://www.ciscopress.com/ , and they are a pain to lug around and access on a regular basis, so now I try to do a few smarter things with my library when possible:

Centralised Social Bookmarks

Del.icio.us http://del.icio.us/ is a site on the oul' Interweb http://www.answers.com/interweb . It's a centralised way to manage your bookmarks and socially share them via member statistics, with access via RSS http://www.answers.com/rss .. pretty neat, as I was using Gmail http://www.gmail.com/ as a sorta' deposit/resource for these kinds of bookmarks...
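The RSS angle is what makes it scriptable: a bookmark feed is just XML you can pull apart with the standard library. A minimal sketch — the feed snippet below is invented, shaped roughly like an RSS 2.0 bookmark feed, not copied from del.icio.us:

```python
import xml.etree.ElementTree as ET

# A made-up RSS 2.0 fragment in the general shape a bookmark
# feed might take -- items invented for illustration.
FEED = """<rss version="2.0"><channel>
  <title>bookmarks</title>
  <item><title>FreeMind</title><link>http://freemind.sourceforge.net/</link></item>
  <item><title>CAIDA</title><link>http://www.caida.org/</link></item>
</channel></rss>"""

def bookmarks(feed_xml):
    """Return (title, link) pairs from an RSS 2.0 feed string."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in bookmarks(FEED):
    print(title, link)
```

Point the same function at a real feed fetched over HTTP and your bookmarks become greppable, diffable data rather than something locked in a browser.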

So why not?

Well I guess the reason I'm starting this is for some kind of record of the next while...
Tired of complexity and stupidity... this will be my quest to find the shortest distance between two points, have others help me to find it - and in the process try and 'pass-it-on' as it were...