Disclaimer: If you view everything from a technical perspective, be aware that in some places the terminology here is used loosely, in an attempt to reinvent its usage...
Sometime in late 1998 I wrote about the nature of the internet bringing issues like standards, performance and security to the fore (not that these weren't relevant before; they were simply more topical then). I asked whether the “metadirectory” and its subsequent evolution would pave the way to finding realistic answers to these concerns.
A “metadirectory” is a software tool which provides a single point of administration for multiple proprietary network operating systems, directory structures and application-based directory services. Ideally, it should provide flexible interoperability between diverse systems on any network. A limitation (or advantage, depending on use) of network-specific directories is their application- and vendor-specific functionality, which limits the user's access and reach within a network (intranet) and across the internet. The metadirectory is ideally poised to facilitate interconnectivity and to set standards for the evolution of the web.
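The “single point of administration” idea above rests on joining entries from several source directories into one unified view. A minimal sketch of that join is below; the directory names, attributes and schema are invented for illustration and do not reflect any vendor's actual product.

```python
# Minimal sketch of the "join" at the heart of a metadirectory:
# merging user entries from two hypothetical directory sources
# (a network OS directory and an application directory) into one
# unified view, keyed on a shared identifier.

def merge_directories(nos_dir, app_dir, key="uid"):
    """Combine entries from two directories into a single view.

    Later sources fill in attributes the earlier ones lack; a real
    metadirectory would also apply per-attribute precedence rules
    and write changes back to each connected source.
    """
    unified = {}
    for source in (nos_dir, app_dir):
        for entry in source:
            uid = entry[key]
            unified.setdefault(uid, {}).update(entry)
    return unified

# Two toy directory exports with one overlapping user.
nos_dir = [{"uid": "asmith", "full_name": "A. Smith", "dept": "Finance"}]
app_dir = [{"uid": "asmith", "email": "asmith@example.com"},
           {"uid": "bjones", "email": "bjones@example.com"}]

view = merge_directories(nos_dir, app_dir)
print(view["asmith"])  # attributes drawn from both sources
```

The interesting design decisions in a real product are exactly what this sketch glosses over: which source "owns" each attribute, and how changes flow back out to the connected directories.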
The distributed nature of the internet has encouraged diversity in terms of platforms. Thus, in seeking the uniformity and standards needed to deliver performance with true financial and business impact, we have come a long way: witness the popular acceptance of Unix in the form of Linux and open source projects, and Microsoft's focus on security.
Hypothetically speaking, the development and deployment of the ideal metadirectory would encourage ISPs to add services which make the virtual market even more effective and attractive in terms of value. Addressing security and bandwidth concerns then takes on more focussed dimensions. The progression towards real-time response requirements (99.999% availability) has made standards formulation imperative in one form or another. The growth of niche markets has aided the deviation from monopolistic tendencies, although centralisation and distributed miniaturisation will have to co-exist and evolve depending on end users' budgets and, of course, on ROI against existing investments.
Industry leaders such as IBM, Microsoft, HP, Sun, Oracle and Cisco have been actively working on the metadirectory, along with a host of lesser known and now defunct brands. This reflects the opportunities created in bridging multiple platforms, whether through M&As or by further developing interoperability. The move into an era of seamless integration, with OSI becoming reality, is evident as the playing field evens out with innovation; we have also been reminded that the size of organisations matters in the capitalistic world of business.
DENs (directory enabled networks) could be considered the “core logic” of a “metadirectory” in a sense (although the term itself was coined later), if we say that metadata is the building block of a metadirectory just as 0s and 1s are the building blocks of machine code. None of these terms are new or novel concepts; we have simply arrived at an age of better presented and marketed products and technologies, and legacy skills are being replaced by bandwidth-intensive, GUI-driven requirements. I cannot, however, imagine that the underlying logic would be very different, hence the relevance and importance of legacy-based skills.
IDS (Internet Directory Server) enables internet, intranet and on-line service providers to offer directory-based services. Examples being:
and many more such developments.
There are slight classification differentiators in the five examples mentioned above, pointing out aspects of IP (both intellectual property and internet protocol, which allows IPv6 to be elaborated on), copyright law and on-line privacy (which not many people believe exists), and raising the question of its relevance and its subsequent utilisation and purpose. Yes, ethical and moral dilemmas do emerge, and the question of how these can be managed towards a constructive and positive outcome, as the field progresses to accommodate innovation and capitalism, somehow makes things interesting.
Metacrawler and Dogpile are search engines which query other search engines such as Google and MSN, and are hence called meta-search engines, although the race to outperform each other is blurring the lines of demarcation. Directories of information hosted by specific organisations (which could be classified into various functional identities including business, government and autonomous bodies) are referenced by search engines and by meta-search engines, depending on the user's choice of engine, just as we have a preference for a particular brand of car, or tea, or sugar for that matter; except that we pay neither the content providers nor for the multifarious software tools available for free on the internet. We, as end users, pay ISPs for an internet connection, and it is only fair that we expect quality and fair service, as judged by the user's awareness of technology and its impact when held to most, if not all, acceptable standards.
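Conceptually, what a meta-search engine does is fan a query out to several underlying engines, then merge and de-duplicate their ranked results. The sketch below simulates that with stand-in functions rather than real search APIs; the engine names and result URLs are purely illustrative.

```python
# Toy model of meta-search: fan out a query to several "engines"
# (stand-in functions here, not real search APIs) and merge their
# ranked result lists round-robin, dropping duplicates.

def engine_a(query):
    return ["example.com/metadirectory", "example.org/ldap"]

def engine_b(query):
    return ["example.org/ldap", "example.net/den"]

def meta_search(query, engines):
    """Interleave each engine's results by rank, de-duplicated."""
    results = [engine(query) for engine in engines]
    seen, merged = set(), []
    # Take rank-0 results from every engine, then rank-1, and so on.
    for rank in range(max(len(r) for r in results)):
        for r in results:
            if rank < len(r) and r[rank] not in seen:
                seen.add(r[rank])
                merged.append(r[rank])
    return merged

print(meta_search("metadirectory", [engine_a, engine_b]))
# → ['example.com/metadirectory', 'example.org/ldap', 'example.net/den']
```

Real meta-search engines weight and re-rank rather than simply interleave, but the fan-out-and-merge shape is the same.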
Metadirectories reference metadata, if it is open to an online search (as per permissions granted by the managers of the information), from another virtual POP (point of presence). The storage devices (hardware and software) which store these reference tables, in clearly defined structures provided by manufacturers of hard drives, operating systems and application software, both distributed and consolidated, are getting inexpensive and more effective in terms of capacity utilisation. Storage providers such as IBM, Microsoft, Sun, Oracle, Apple, EMC and Hitachi have brilliant offerings.

I recall the '90s, and NCR, Netscape, SGI, Tektronix, Tandem, Zenith and many more such players. Those were the good days of hardware innovation. Mobile phones were the size of briefcases, and the commercialisation of military equipment was starting to become more obvious. We started to hear of AT&T recovering from its long battle with the Federal Government, just as IBM was in turnaround mode. Things are starting to look a little blurred nowadays (a sign of progress, I guess), and the people are still around with their experiences and views on the topic, both subjective and objective. We might choose to rationalise, to the point of profit and entrepreneurial leadership, the existence of big business and its evolution for the benefit of the future, or to generate a critique of every facet and detail of this framework and deal with it as cognitively as possible, because an emotional viewpoint is often either misunderstood or considered a skew from a cognitive perspective. We realise that rationalisation has its limits relative to the situation.
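The permission condition above (metadata is returned only "as per permissions granted by the managers of the information") can be illustrated with a small gatekeeping lookup. Everything here, from the store layout to the field names, is a hypothetical simplification for the sake of the example.

```python
# Hypothetical illustration of permission-gated metadata lookup:
# a metadirectory returns a reference only when the information's
# manager has made it public or the requester is that manager.

metadata_store = {
    "catalogue/products": {"owner": "acme", "public": True,
                           "entry": "product catalogue index"},
    "hr/employees": {"owner": "acme", "public": False,
                     "entry": "employee records index"},
}

def lookup(path, requester):
    record = metadata_store.get(path)
    if record is None:
        return None
    # Honour the permissions granted by the information's manager.
    if record["public"] or requester == record["owner"]:
        return record["entry"]
    return None  # access not granted

print(lookup("catalogue/products", "anyone"))  # visible to all
print(lookup("hr/employees", "anyone"))        # withheld: None
```

A production system would of course carry richer access-control lists and audit trails, but the principle of the manager-granted gate is the same.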
Sometime in 2000-01, when I was employed by IBM, the CIO of one of the larger automobile spares manufacturers pointed out to me that the IC&T industry is in its infancy because of the level of control “the Vendor” exerts over its channels, and the rather symbiotic dependency the end customer has on the producer of the product itself. He clarified that the automobile industry has achieved a level of maturity which the IC&T industry will take at least another 50 years to emulate. He essentially meant that distributors and dealers take on far more responsibility for the end user than anyone in the IC&T industry does. This got me thinking about reworking this paper, which I originally drafted in late 1998.
“Datawarehousing and e-commerce” have been at the forefront, and we see them forming the foundations of a new era of miniaturisation hastened by nanotechnology, VVLSI (very very large scale integration) and the need for intelligent architectural frameworks. Defining this will encompass the laws that surround “directory-enabled networks”, which will facilitate the “data-information-knowledge-intellectual property” paradigm. Prioritising qualitative information would be a little different from collating quantitative information; however, synergistic tendencies obviously emerge as better-integrated performance metrics make a significant impact in the board room. A few years ago the major challenge was making the CIO an equal member of the board room; at least in the top 500 corporations, that is perhaps no longer an issue.
Sometime in 1995, I recall reading a newspaper article which mentioned that silicon wafers on chip would reach threshold capacities at 0.16 micron, and that beyond that, at the time, no suitable alloy had been found to conduct heat on-board while allowing the normal functioning of the other components on the chipset. I thought for a moment, sometime in 1999, that Dell had come up with an ideal solution when they used copper on their microprocessor, although at that time I wasn't able to find out much more. More recently, when discussing the notion of the implementation of the “Marshall Plan”, with its possible ramification being US involvement in Iraq and Afghanistan, one of my friends showed me a web clip which boasted of Intel's ability to wirelessly recharge mobile phones. I was told that this was just the tip of the iceberg in terms of the DoD's ability to come up with solutions based on the classified “intellectual property” of functional prototypes created by people such as Nikola Tesla way back. I am not qualified enough at this stage to embark on a historical perspective of stolen careers or patents, but all the noise surrounding such controversial topics is hair-raising to say the least. An expert on international “privacy” law as it relates to the ubiquitous nature of the internet might be in a better position to comment.