A prototype for the spatial display and gestural navigation of large hierarchical datasets
Even with the advent of new digital technologies, the traditional book still stands out prominently among devices that allow us to find, index and store information in a relatively compact and handy space. This data storage and retrieval system is in fact so successful that it has been in use for centuries and that its handling and internal workings have become second nature to us.
In one way or another, even most digital information systems (for example websites or PDA and cellphone interfaces) make use of the characteristics and paradigms that are typical of the analog book. For instance, even though a Google search return has no physical representation, we speak of a "page". The pages are numbered. We read them from upper left to lower right, scanning one search item after the other. We "flip" them back and forth via left- and right-pointing arrows, and, just as with the book, we even get a short interruption in the flow of information when we move from one page to the next.

While this form of data representation has arguably many advantages (convention being among the strongest), it also brings some issues with it. Firstly, the eye is generally much faster than the hand. To stay with the Google example: it usually takes the eye only a fraction of a second to scan a "page" for relevant bits of information, yet it takes comparably much longer to move the mouse to the "next" arrow, click it and wait for the next page to load. Secondly, digital representations of large datasets such as Google search returns rarely visualize how MUCH overall information the user is confronted with. A book communicates this through its thickness before we even read a single sentence in it. We get an immediate sense of differentiation when putting a book with 100 pages next to one with 1000. Depending on our attitude, a very thick and heavy book could stand for a "wealth of information", or it could make us think: "Oh my God! Do I really have to read all of that?". No such visual cue is given to us by Google's search return number.
Finally, as books (and models which rely on the book paradigm) are by design read in a linear fashion (front to back, upper left to lower right), they do not lend themselves naturally to highly cross-referenced and nested information. One prime example of the book's shortcomings in accommodating this type of information is Austrian philosopher Ludwig Wittgenstein's famous Tractatus logico-philosophicus (subtitled Philosophical Remarks), first published in 1921 as part of the journal Annalen der Naturphilosophie, and later in various book formats. This text, which despite Wittgenstein's rather laconic and succinct language is quite difficult to decipher, deals foremost with the relation of language and thought, logic and ideas, and tries to establish a set of "true" and incontestable propositions which depend on each other and strive to be totally devoid of any metaphysical subtext.
For our project, however, the interesting part is not so much the content but rather the way Wittgenstein has structured his text: there are seven main propositions (numbered 1 through 7), with various sets of subsidiary references (1.2, 2.1, 2.1.1 etc.). The ordinary book form does not do this text much justice, as the weight and prominence of the different statement levels (as given by the numbers) cannot be visualized due to the inherent linearity of the design. For instance, statement 3, which is on the same "level" as statement 2, might actually appear only pages after point 2, depending on how many sublevel items are grouped under point 2 (2.1, 2.1.1, 2.2 etc.). In other words, the relation of points to each other and their particular location within the hierarchy is lost to the reader sooner rather than later. Of course, the Tractatus stands here only as a model for any text with a comparably nested structure.

As a start we have formulated several requirements and prerequisites this application has to implement in order to be successful:
Please add points as you like. TN 03/2006
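The numbering scheme described above already encodes the hierarchy the prototype would need to display spatially: the number of dot-separated parts of a proposition number gives its depth, and dropping the last part gives its parent. As a minimal sketch of that idea (the function name and sample numbers are illustrative, not part of any actual implementation):

```python
def parse_outline(numbers):
    """Map Tractatus-style numbers (e.g. "2.1.1") to (depth, parent) pairs."""
    tree = {}
    for num in numbers:
        parts = num.split(".")
        depth = len(parts)                     # 1 for "2", 3 for "2.1.1"
        parent = ".".join(parts[:-1]) or None  # "2.1" for "2.1.1", None at top level
        tree[num] = (depth, parent)
    return tree

props = ["1", "2", "2.1", "2.1.1", "2.2", "3"]
tree = parse_outline(props)

# "2.1.1" sits two levels below the top, under "2.1":
assert tree["2.1.1"] == (3, "2.1")
# "3" is on the same level as "2", no matter how many
# sub-statements separate them in the linear book order:
assert tree["3"][0] == tree["2"][0] == 1
```

A display built on such a structure can weight and position each statement by its level, rather than by its position in the linear reading order.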