These days we are surrounded by people calling themselves Information Architects. Hell, yeah, I call myself one! So where did that term come from? Quite some years ago (1997) I happened to be working on a project with John Thackara in Amsterdam, bringing together people from two different fields: knowledge management and (interface) design. From the world of design we had quite an odd bunch of people: io360 from NY, Perspecta from Cambridge, Mass., PlumbDesign, ... those people got me thinking about what we now call IA. I asked one of them what his favourite book on (interface) design was; it happened to be Information Architecture by Richard Saul Wurman. Today we would call Richard's ideas information design, but still, they come quite close to the IA of today. As Christina Wodtke put it quite rightly: Pretty much before there was a web, before Jakob was going to war with design, before all that hoo-haw... there was Richard Saul Wurman saying that someone should design information in a way people could use it, and he called this person an Information Architect.
Tuesday, February 28, 2006
Monday, February 27, 2006
Semantic technologies encode meanings separately from data and content files, and separately from application code. This enables machines as well as people to understand, share and reason with them at execution time. With semantic technologies, adding, changing and implementing new relationships, or interconnecting programs in a different way, can be as simple as changing the external model that these programs share.

With information technologies, on the other hand, meanings and relationships must be predefined and "hard-wired" into data formats and application code at design time. This means that when something changes, or we want to exchange information we hadn't exchanged previously, or two programs need to interoperate in a new way, humans must get involved. Off-line, the parties must define and communicate the knowledge needed to make the change, recode the data structures and program logic to accommodate it, and then apply these changes to the database and the application. Then, and only then, can they implement the changes.

Semantic technologies are "meaning-centered." They include tools for auto-recognition of topics and concepts, information and meaning extraction, and categorization. Given a question, semantic technologies can directly search topics, concepts and associations that span a vast number of sources. The results are fast, relevant, and comprehensive. Plus, semantic technologies can deliver answers, not just lists of sources.

Information technologies are data-, page-, and document-centered. They can only directly search these primary sources, by browsing, by word or number indices, or with statistical categorization. Precision and recall are more limited, and information technologies only return lists of pages, documents, and files to consult.

Semantic technologies organize meanings using taxonomies, ontologies and knowledgebases.
These are relatively easy to modify for new concepts, relationships, properties, constraints and instances. Semantic technologies integrate data, content, applications, and processes via a shared ontology, which minimizes the cost and effort of development and maintenance.

Information technologies organize meanings using flat files (simple schemas), relational data models (RDBMS), and object-oriented models (OODBMS). Database structures are relatively rigid and difficult to modify for new concepts and relationships. Integrating data and processes typically requires point-to-point interfaces and connectors that are costly to develop and maintain, since the required knowledge must be hard-coded in each connection rather than shared via a common metamodel.

Semantic technologies reason via associations, logic, constraints, rules, conditions, and axioms that are represented in the ontology, separately from application code. This declarative structure allows reasoning in multiple directions: the same knowledgebase can answer questions about how, why, and what-if, as well as give factual responses. Semantic technologies also allow the development of programs that can "learn" (infer and create new knowledge), simulate and test, and adapt their behavior based on experience.

Information technologies reason via fixed algorithms embedded in application code. They give us situation awareness: they answer questions about what, where, when, and how much. Algorithms are preprogrammed behaviors, like instinct; they perform a rote task. If anything is learned, people must update the logic off-line to create a new version of the program.

Semantic technologies use ontologies to auto-discover and provision services and functionality (e.g., semantic web services, semantic grid services, etc.).
They use ontologies to link applications into composites that deliver a comprehensive (e.g., virtual, 360-degree) view of situations, with all data and information in context. By representing meanings in a language- and media-neutral form, semantic technologies can auto-generate text, graphics, drawings, documents, and natural-language dialogs. Similarly, they can automatically personalize, customize, and generate multiple versions of communications from the same knowledgebase. Semantic technologies enable "autonomics": systems with self-knowledge that can self-configure, self-optimize, self-protect, self-heal, and self-manage. They provide the foundation for developing new categories of services and products that can know, learn, and reason as humans do.

Information technologies require humans to manually discover and implement data and application connections and interfaces. Alternatively, humans must search to find data and information, and then put it into the right context for decision-making. Information technologies use computers as "electronic pencils" with which humans author and develop content, visuals, and media formats.
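The contrast between an external, declarative model and logic hard-wired into application code can be illustrated with a tiny sketch. The plain-Python "triple store" below and its single transitive "is_a" rule are purely illustrative assumptions of mine, not any particular semantic-web toolkit:

```python
# Meanings live in an external, editable set of subject-predicate-object
# statements -- not in the application code. All names here are made up.
facts = {
    ("964", "is_a", "911"),
    ("911", "is_a", "Porsche"),
    ("Porsche", "is_a", "Car"),
    ("Quaero", "is_a", "SearchEngine"),
}

def infer_transitive(facts, predicate):
    """Apply one declarative rule -- `predicate` is transitive -- until
    no new statements can be derived (the transitive closure)."""
    closure = set(facts)
    changed = True
    while changed:
        changed = False
        derived = {(a, predicate, c)
                   for (a, p, b) in closure if p == predicate
                   for (b2, q, c) in closure if q == predicate and b2 == b}
        if not derived <= closure:
            closure |= derived
            changed = True
    return closure

kb = infer_transitive(facts, "is_a")

# The same knowledgebase answers in several directions:
print(("964", "is_a", "Car") in kb)                 # forward: True
print(sorted(a for (a, _, c) in kb if c == "Car"))  # backward: ['911', '964', 'Porsche']

# Changing meaning means changing data, not recoding the program:
facts.add(("SearchEngine", "is_a", "Software"))
print(("Quaero", "is_a", "Software") in infer_transitive(facts, "is_a"))  # True
```

Real systems express the same idea with RDF triples and OWL or rule languages, but the division of labour is the same: the inference routine is generic, and all of the domain knowledge sits in the externally editable fact set.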
The top level of the Dewey Decimal Classification (DDC) system is an example of how chunking information along a single dimension can be an effective way to communicate a coherent narrative about data. In the graph, I've laid out the DDC along the "Foof factor" dimension, where Foof is a cross between Froofy and Poofy. The distribution of topic areas tells us the following story about the contents of libraries: the bulk of writing lies in the middle of the curve, in the soft sciences and humanities, while there is considerably less writing at the ends of the spectrum: the hard sciences and the arts. This makes sense, since you could say that the primary by-product of the soft sciences and the humanities is expository (explanatory) writing, whereas the hard sciences and the arts are more concerned with creating "things" as opposed to writing about things: i.e. theorems, technologies, or works of art.
Saturday, February 25, 2006
Friday, February 24, 2006
Thomas L. Friedman, New York Times "Foreign Affairs" columnist and author of "The World Is Flat": "When scholars write the history of the world twenty years from now, and they come to the chapter 'Y2K to March 2004,' what will they say was the most crucial development? The attacks on the World Trade Center on 9/11 and the Iraq war? Or the convergence of technology and events that allowed India, China, and so many other countries to become part of the global supply chain for services and manufacturing, creating an explosion of wealth in the middle classes of the world's two biggest nations, giving them a huge new stake in the success of globalization? And with this 'flattening' of the globe, which requires us to run faster in order to stay in place, has the world gotten too small and too fast for human beings and their political systems to adjust in a stable manner?"
Posted by Timo Kouwenhoven at Friday, February 24, 2006
Thursday, February 23, 2006
According to SpotLightingNews, Bertelsmann will probably lead the multimedia search engine project, called Quaero. Project Quaero is considered to be France's and Germany's response to Google and Yahoo, meant to lift Europe to the research and development status of the United States and Japan. France's contribution (EUR 150 million) will come from the Agency for Industrial Innovation, while Thomson and the French National Centre for Scientific Research will lead the French part of Quaero. Germany's participation is still unclear, though Angela Merkel's friend Heinrich von Pierer, Siemens chairman, is Germany's coordinator in the multimedia search engine project, which is intended to provide navigation through the billions of audio, video, text, and image files found on the world wide web. Bertelsmann's data processing subsidiary, Empolis, will sign up this week, though officially Empolis said they were only studying the Quaero project.
Until now, the following organisations/companies have been mentioned as active members of Quaero, the French/German search engine project for digital cultural heritage (in random order):
The BIRTH Television Archive is an innovative Web portal providing uniform access to digitised audiovisual material. Major European broadcast archives and specialised ICT companies joined forces to set up the basic infrastructure. Distributed content from various sources can be accessed from one central access point. Apart from moving image material, the BIRTH Television Archive also provides access to digitised programme schedules, stills, articles and much more. Particular attention is given to providing language-independent search possibilities and to offering the option to compare the different development paths in several countries across Europe. (source: DigiCult Newsletter, issue 10, Oct. 2005, article by Johan Oomen of the Netherlands Institute for Sound and Vision).
Posted by Timo Kouwenhoven at Thursday, February 23, 2006
Geographic Categories: An Ontological Investigation: "Categories are an essential aspect of human cognition. Geographic categories have received little study, yet they are important to geographic information systems and spatial data transfer as well as to our understanding of geographic cognition in general. Deductive studies have indicated that geographic objects have important ontological features distinct from those of objects encountered at table-top scales (Smith and Mark 1998). This NSF-funded project is developing a formal ontology for geographic entities and categories, based on rigorous empirical research using human subjects. Parallel studies will be conducted in several languages and regions, so that the resulting ontology will be multilingual." (source: David M. Mark and Barry Smith, National Center for Geographic Information and Analysis)
BBC - iMP: "iMP is an application in development offering UK viewers the chance to catch up on TV and radio programmes they may have missed for up to seven days after they have been broadcast, using the internet to legally download programmes to their home computers. iMP uses peer to peer distribution technology (P2P) to legally distribute these programmes. Seven days after the programme transmission date the programme file expires (using Digital Rights Management - DRM - software) and users will no longer be able to watch it. DRM also prevents users emailing the files to other computer users or sharing it via disc."
Freeband Communication: "Increasingly, software is being designed for networked computer devices (nodes, terminals) that permit other computer devices to utilize locally available storage space, communication bandwidth, processing capacity, and sometimes even hardware components. The advantage of sharing such resources with other nodes on the network is that limitations of individual nodes can be overcome by collaboration with others, or that access can be gained to (multimedia) information that may exist on other nodes. Sharing of computation and communication resources is particularly important for power-, bandwidth-, and cost-constrained networked devices such as hand-held terminals, mobiles, and PDAs." (source: FreeBand, project I-Share, Technical University Delft).
Tuesday, February 21, 2006
Upon closer inspection, Quaero seems to have a website of its own: www.quaero.org. Due to the rather large amount of interest after Chirac's speech, the site temporarily closed down. Currently it redirects to the site of dfag, a site about French/German economic collaboration. Furthermore, Quaero seems to have taken its first hit: Deutsche Telekom has changed its position in the project and now wants to remain only an observing member, according to silicon.de and the Süddeutsche Zeitung on 27-01-06. An important commercial member leaving at the very start doesn't sound like a good thing, does it? The French seem to have made a prestige project out of it and want to speed it up at all costs; the Germans want to take smaller steps. The main reason for speeding up is the supposed lack of search engine knowledge in Europe, compared to the US. Is that actually true? Do we lack that kind of knowledge just because Google was not a European idea? In my opinion the degree of proficiency regarding multimedia retrieval in Europe is quite high! Just take a look at projects like MultiMedian, Birth of TV, DigiCult, The European Library, ISLA, Físchlár TV, HMI, the Image & Video Processing group, ... just to name a few.
Monday, February 20, 2006
In a speech last year laying out his 2006 agenda, Chirac spoke to those concerns, saying: "We must take up the challenge posed by the American giants Google and Yahoo. For that, we will launch a European search engine, Quaero."
Quaero, which means "I seek" in Latin, still faces several hurdles, including scrutiny of its public funding by the European Commission and uncertainty in Germany, where no single company has taken the lead and a coalition government elected in November has yet to publicly endorse the project. Organizers are also fighting some skeptics who maintain that Quaero could waste taxpayers' money in academic research that produces no commercial benefit. With Quaero, the French and Germans are hoping to build expertise in the technologies that are shaping the distribution of information and entertainment. The project aims to develop next-generation leadership in search technology, software for managing copyrights and digital ownership and what one document called "cultural-heritage management."
Some observers suspect this last category is a reaction to separate plans by Google, Microsoft and Amazon.com to catalogue, digitize and index the world's books, many of which are still under copyright protection. French and German publishers have objected to the projects, and a separate European scanning effort is under way. Who's participating so far:
- Heinrich von Pierer, a former Siemens chief executive who is an adviser to the newly elected chancellor, Angela Merkel, is leading the private effort in Germany,
- Jean-Louis Beffa, chairman of Saint-Gobain, the French glass and ceramics group, is leading the French side.
- Both national phone companies, Deutsche Telekom and France Télécom, are members.
- Empolis (part of Bertelsmann) will probably join.
- LTU Technologies (a Paris-based image search technology specialist) is part of the project.
- RWTH Aachen University is contributing speech recognition and language translation technology to the project.
Compared by some participants to an Airbus-style cooperative effort to increase European standing, Quaero has also been met with skepticism by some industry experts who fear the program would be costly and unwieldy to administer and would produce no tangible commercial advances. February 15th is mentioned as the deadline for all proposals regarding Quaero. Today there is still no news, no website, no nothing. Searching on quaero gives hardly any results at all. Let alone some practical details:
- Try typing q u a e r o on your keyboard, then type g o o g l e... maybe it's easier on an AZERTY keyboard?
- Next take a look at www.quaero.com - this domain probably needs to be bought, or will it become a .EU domain?
I've been creative today. Cool huh? I have had quite an extensive break from reality and came up with this brain dump. It's an overview of trends in today's internet developments. Sorry I can't provide a larger picture... you're not allowed to see the contents.
Sunday, February 19, 2006
Although it's kind of an ego thing, getting published makes me feel good. Without trying to be too full of myself, I'm sharing the publication of an interview with my as yet unknown readers. The interview took place in October 2005, conducted by a freelance publicist for a magazine called Intellectueel Kapitaal (Intellectual Capital), IKmagazine in short, the Dutch 'ik' meaning 'I' or 'me' in English. The magazine targets professionals working in the information and knowledge management sector. The interview discusses a project at the Dutch Ministry of Transportation, Public Works and Water Management, in which several project members are given the opportunity to tell about their part in the project. I was involved in the project for about a year and designed a taxonomy for navigation purposes on their intranet (an information and knowledge sharing space). Furthermore, I benchmark-tested their search engine and facilitated various working groups that brought together the content for the taxonomy. Coincidentally, the consultant who designed the taxonomy for their internet site is my former colleague and roommate, who has since started his own business. This makes the article even more worthy of archiving in my own library!
The magazine
The ministry
The former roommate's company
Me
Saturday, February 18, 2006
For years this particular car has been my dream car. No no, it's not the fastest or most expensive car, it's not even the latest model. It's the 964 Porsche from the 911 series. This car was produced from 1989 to 1994, first presented as the Carrera 911 in 1989 with all-wheel drive (Carrera 4). A new naturally aspirated engine called the M64 was used for 964 models, which displaced 3.6 litres and produced 247 bhp (184 kW) @ 6100 rpm. Obviously a flat-6 boxer motor. In 1992, Porsche produced a super-lightweight, rear-wheel-drive-only version of the 964 dubbed Carrera RS for the European market. It was based on Porsche's 911 "Carrera Cup" race car and harkened back to the 2.8 and 3.0 RS and RSR models. It featured a revised version of the standard engine, titled M64/03 internally, with an increased power output of 260 bhp (191 kW). The package included a track-oriented suspension system with lower ride height, a stripped-out interior devoid of power windows and power seats, rear seats, air conditioning, sound deadening or a stereo system, and new racing-bucket front seats. A later ultra-low-production version of the RS featuring a 300 bhp 3.8 litre version of the M64 motor was also sold briefly in Europe (Carrera RS 3.8).
Posted by Timo Kouwenhoven at Saturday, February 18, 2006
Friday, February 17, 2006
Some time ago we met the son-in-law of Baron de Driessen in Paris while shopping at the 'poshy' grocery shop in Les Galleries Lafayette. He was promoting the 2004 red wine from Château la Colombière. This traditional Fronton wine is the best expression of the Négrette grape on its gravelly terroir in Villaudric. An expressive nose of prune, black fruits (blackberry), violet and truffle gives a supple, round wine, well balanced between fruit and acidity. The finish is long, minty and spicy, with soft tannins. The Château la Colombière is a harmonious wine which pairs with game, white meat, roasts, traditional French cuisine or exotic and spicy cuisine. It is only available at the château itself or at Galleries Lafayette. When you're in Paris anyway, take a detour to the old Opera Garnier and buy a box of wine just around the corner at Lafayette. It's detour-worthy!
AOC COTES du FRONTONNAIS: located 25 km from Toulouse between the Tarn and the Garonne rivers, on gravelly, dry soil. The multiple planted grape varieties give this wine an elegant character: Négrette 50%, Malbec 30%, Syrah 10%, Cabernet Franc 10%. The CHATEAU MARGUERITE should be served just below room temperature along with meats and cheese. Kept in an ageing cellar, it could age 2 or 3 years. This Fronton costs less than 4 euros and goes very well with southern European tomato/herbes de Provence or Italian tomato/basil dishes. The Négrette grape gives the Fronton wines their distinguishing flavour. By the way, 'serve at room temperature' doesn't mean the temperature of your average heated room today. It refers to room temperature a century ago; the saying just never changed, but room temperatures did! 18° Celsius is usually good for red wines. At higher temperatures wines tend to get quite an alcoholic nose, which is obviously not favourable.