Saturday, July 17, 2004
Recursive, Wide, and Loopy
Puzzled monkeys reveal key language step by Gaia Vince
"The key cognitive step that allowed humans to become the only animals using language may have been identified, scientists say.
A new study on monkeys found that while they are able to understand basic rules about word patterns, they are not able to follow more complex rules that underpin the crucial next stage of language structure. For example, the monkeys could master simple word structures, analogous to realising that "the" and "a" are always followed by another word. But they were unable to grasp phrase patterns analogous to "if... then..." constructions. This grammatical step, upon which all human languages depend, may be "the critical bottleneck of cognition that we had to go through in order to develop and use language", says Harvard University's Marc Hauser, who carried out the study with fellow psychologist Tecumseh Fitch, at the University of St Andrews, Scotland. "Perhaps the constraint on the evolution of language was a rule problem," Hauser told New Scientist.
Fitch and Hauser carried out two aural tests on cotton-top tamarin monkeys in which sequences of one-syllable words were called out by human voices.
In the first test, random words were called out in a strictly alternating pattern of male followed by female voices. The monkeys responded to breaks in the male-female rule, by looking at the loudspeaker. This showed that they were able to recognise the simple rule.
In the next test, the grammatical rule dictated that the male voice could call out one, two or three words, as long as the female voice did the same. This type of slightly more complex pattern is called recursive, as it involves a rule within a rule.
This time, the monkeys were unable to recognise any breaks in the pattern. But twelve human volunteers given the same test had no such difficulty, although most were unable to explain what the rule actually was.
"Recursive ability is uniquely human and affects more than just our language, but most of our behaviour," says renowned primate language expert David Premack, who wrote an article accompanying the study published in Science. "For example, in a classroom we often see child A watch child B watch child C watch the teacher. But in chimps, we see chimp A watch its mother, chimp B watch its mother, chimp C watch its mother..."
Premack argues that although recursive ability is not absolutely necessary for language -- non-recursive sentences are possible -- being unable to master recursion may have been a stumbling block that prevented monkeys from developing language.
"Monkeys are also not physically capable of speech, they are unable to properly copy actions and they cannot teach -- all of which are skills required for language," he told New Scientist.
Mastery of the underlying rule of recursion is the key to human flexibility, Premack believes, allowing humans to think in the abstract, use metaphors and comprehend concepts such as time. It probably arose as the brain evolved into a more complex organ, but is not located in a single brain region.
However, it is not known whether modern humans are born with the ability to recognise recursive language patterns. More research into recursive ability in humans and their close relatives chimpanzees needs to be carried out, Hauser says."
Journal reference: Science (vol 303, p 377)
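The two rules in the Fitch-Hauser tests correspond to two grammars from formal language theory: a finite-state pattern (AB)^n, which the tamarins could track, and a phrase-structure pattern A^nB^n, which they could not. A minimal sketch, where encoding male syllables as "A" and female syllables as "B" (and the function names) are my own illustrative choices, not from the paper:

```python
# 'A' = a syllable in a male voice, 'B' = a syllable in a female voice.

def follows_alternating(seq: str) -> bool:
    """(AB)^n grammar: male and female voices strictly alternate.
    The tamarins noticed violations of this rule."""
    if len(seq) == 0 or len(seq) % 2 != 0:
        return False
    return all(seq[i] == "A" and seq[i + 1] == "B"
               for i in range(0, len(seq), 2))

def follows_counting(seq: str) -> bool:
    """A^n B^n grammar: n male syllables followed by the same number of
    female ones. Recognizing it means tracking a count -- the step the
    monkeys could not make, while the human volunteers could."""
    n = len(seq) // 2
    return n > 0 and len(seq) == 2 * n and seq == "A" * n + "B" * n

print(follows_alternating("ABAB"), follows_counting("ABAB"))  # True False
print(follows_alternating("AABB"), follows_counting("AABB"))  # False True
```

The first rule needs no memory beyond the previous syllable; the second cannot be recognized by any finite-state machine, which is why it stands in for the "rule within a rule" step.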
Google Search: define:recursive
Self-Processing: 'It would from many a blunder free us'
"With video we can know the difference between how we intend to come across and how we actually do come across. What we put out, what is taken by the tape, is an imitation of our intended image; it is our monkey. A video system enables us to get the monkey off our backs, where we can't see him, out onto the tape, where we can see him. That is the precise way in which we've been making a monkey of ourselves. The monkey has been able to get away with his business because he operates on the other side of the inside/outside barrier. The moebius tape strip snips the barrier between inside/outside. It offers us one continuous (sur)face with nothing to hide. We have the option of taking in our monkey and teaching him our business or letting him go on with his.
Taking in your own outside with video means more than just tripping around the moebius strip in private. One can pass through the barrier of the skin, pass through the pseudo-self to explore the entirety of one's cybernet -- i.e., the nexus of informational processes one is a part of.
[...] In fact, we live in multiple loops. [...]
The cybernetic extension of ourselves possible with video-tape does not mean a reinforcement of the ordinarily understood "self." Total touch with one's cybernet precludes the capitalism of identity ...
Master Charge does not make you master of anything but involves you in an expensive economy of credit information processed by computer, your checking account, TV ads ... and busy telephones. The Master Charge card exploits the illusion of unilateral control over life the West has suffered with. "I am the Captain of my Soul; I am the Master of my Fate." We have yet to understand there is no master self. They are now putting photos on charge cards when they should be mapping the credit system the card involves you in. Video users are prone to the same illusion. It is easy to be zooming in on "self" to the exclusion of environmental or social systems.
Doing feedback for others, one comes to realize the necessity of taping and replaying context. I had the opportunity to do a kind of video meditation ..."
Paul Ryan - Cybernetics of the Sacred (Anchor Press/Doubleday, 1974, pages 30-31)
Michael Ventura - The World Is No Longer the World
"[...] In an interview with The New York Times, [Robert W. Taylor] was asked how the new broadband technologies would impact daily life. He answered: "You'll be able to wear -- an unobtrusive device that will record in full color and sound everything that you see or point your head at, or, depending on how many of them you have, everything that's around you. And share it. Every waking and sleeping moment in your life will be recorded. And you will be able to store and retrieve it and do what you will with it." (He added, almost as an afterthought, "there are obvious implications for privacy that will have to be worked through.") "How will that change the world?" the interviewer quickly asked. "I don't know," Taylor said, "but it will."
Every waking and sleeping moment -- but of course after only one day of life-recording, you would fall hopelessly behind your ability to manage and manipulate your material. There would not be enough waking moments on the following day for viewing the entire record of the first day -- though one could no doubt fast-forward. Still, much of the second day would be images of the person viewing the previous day's images -- mirrors facing mirrors. That's one possibility. Or, instead of letters, you could e-mail hours of your life as you'd lived it -- probably as you're living it! Or two people who'd spent the day (or night) together could run both records simultaneously on a split screen. Or maybe you'd set up a Web site on which, every day, you'd run your life of the previous day (or again, Web site it live!) -- instant replay might become a kind of learning aid to one's sense of identity, proof that you exist in a world that no longer entirely believes in any reality that can't be contained on a screen. Western civilization has already and thoroughly defined "life" as "self-consciousness," "I think therefore I am," but this technology would create an environment in which the recorded consciousness of a life would be defined strictly in terms of what could be seen and heard ..."
Interconnected: two things i've been ... (27 June 2004)
"Visual processing seems to me to have two main tasks. One is to assemble a world that is easily abstracted ... the other ... is to throw information away.
[...] We can learn lessons about how to throw away information. The two perceptual jobs are combined, of course.
[...] Intelligence is distributed over the environment because we throw information away. On the long scale (light from above), and the short scale (you know the time, but you haven't looked at your watch yet). Artificial objects, created interfaces that don't obey distance, or object-hood, or texture: they're either confusing, or, if used right, remarkably useful illusions (television)."
BBC News: How memories build during sleep
"What to do with too much information is the great riddle of our time."
Theodore Zeldin - An Intimate History of Humanity (Minerva, 1995, page 18)
Brain Candy by Floyd Skloot
[...] Early in the book [An Alchemy of Mind], examining how the brain adapts as we learn new information, [Diane] Ackerman says, "We arrive in this world clothed in the loose fabric of a self, which then tailors itself to the world it finds." Later, talking about emotions, she says, "Our ideas may behave, but our emotions are still Pleistocene ..."
The Scotsman - Critique - Whole worlds in his hands
"We are a species of pattern finders. Evolution made us so."
Thomas Dixon - From Passions to Emotions (Cambridge University Press, 2003)
"Today there is a thriving 'emotions industry' to which philosophers, psychologists and neuroscientists are contributing. Yet until two centuries ago 'the emotions' did not exist. In this path-breaking study Thomas Dixon shows how, during the nineteenth century, the emotions came into being as a distinct psychological category, replacing existing categories such as appetites, passions, sentiments and affections. By examining medieval and eighteenth-century theological psychologies and placing Charles Darwin and William James within a broader and more complex nineteenth-century setting, Thomas Dixon argues that this domination by one single descriptive category is not healthy. Overinclusivity of 'the emotions' hampers attempts to argue with any subtlety about the enormous range of mental states and stances of which humans are capable."
The Posthuman Touch: N. Katherine Hayles reviewed by Erik Davis
"[...] Though she recognizes the techno-transcendentalist nightmares tucked inside the computational universe ("a culture inhabited by posthumans who regard their bodies as fashion accessories"), Hayles is open to a future populated with increasingly brainy machines. Refreshingly, Hayles also suggests that the art of embodiment could be well served by some lessons of evolutionary psychology, which many pomo science types write off as an evil blasphemy. In a word, Hayles is willing to give up some of that much-vaunted human control. "The very illusion of control bespeaks a fundamental ignorance about the nature of the emergent process through which consciousness, the organism, and the environment are constituted." So how do we live with creative intelligence and awakened senses in a groundless world beyond our control?"
Interview with Russell Hoban [from Stride Magazine no. 26, 1986]
Rupert Loydell: You often use the theme of something living behind our eyes. Is that something you actually believe or just a literary device?
Russell Hoban: "No, I actually feel it. It is more than a matter of belief -- it just feels that way to me, as if we are inhabited by something that lives with us."
Rupert Loydell: Does that tie in with your interest in shamanism?
Russell Hoban: "I suppose so, in that shamanism is a mode of opening the self, opening the conscience, to forces that we can't ordinarily perceive in ordinary ways. In my writing I am always trying to be more open to things that don't come to our minds in ordinary states, and the way I do it is just by tuning in obsessively to ideas that come to me, by working late at night, by staying with things until I am very tired, and until this stiffness of the mind breaks down and loosens up, and thoughts come in that wouldn't come."
Running Backward Into The Future - Part One
"Marshall McLuhan made the point that the structure of any medium became the content of subsequent media. Thus the gift storytelling gave to writing was the content of the stories. The fact that the cadenced, nuanced, recursive and patterned structure of told stories was lost to the written text was only noticed much later.
Julian Jaynes made the same point with regard to the metaphors used by a culture to articulate the nature of human consciousness; the "landscape" of mind reflecting the topography and technology of each era." Richard Shand
EDGE 3rd Culture: Marc Hauser: Animal Minds
"For the past few years I have been using the theoretical tools from evolutionary biology to ask questions about the design of animal minds. I'm particularly interested in taking the approach that some people have had within evolutionary psychology, and saying look, this whole notion of the environment for evolutionary adaptedness which people have pegged as being associated with the hunter-gatherer period in the Pleistocene, may be true for some aspects of the human mind, but are probably wrong as a date for many other aspects. If we think about how organisms navigate through space, recognize what is an object, enumerate objects in their environment -- those are aspects that are probably shared across a wide variety of animals. Rather than saying that the human mind evolved and was shaped during the Pleistocene, it's more appropriate to ask what things happened in the Pleistocene that would have created a particular signature to the human mind that doesn't exist in other animals."
Are we still evolving? by Gabrielle Walker [from Prospect Magazine July 2004]
[...] "What's special about human beings is that we learn from one another," says Svante Pääbo, a geneticist from the Max Planck Institute in Leipzig. "When we invent something new like cars, we don't wait for evolution to make us better drivers or learn how to cross the street. We have driving schools and we teach our children to look both ways before crossing."
Slashdot: That's Sir Tim to You
"Free was key, says Lee" -- comment by GillBates0
"In this day and age of superfluous patents [slashdot.org] and frivolous lawsuits [slashdot.org], Sir Tim Berners-Lee [w3.org] gently reminds us of the importance of free and selfless contribution [cnn.com] for the betterment of humanity. Speaking at the ceremony for winning the Millennium Technology Prize [technologyawards.org] (as reported earlier on Slashdot [slashdot.org]), he said that he would never have succeeded if he'd tried to charge money for his inventions. The prize committee agreed, citing the importance of Berners-Lee's decision never to commercialize or patent his contributions to the Internet technologies he had developed, and recognizing his revolutionary contribution to humanity's ability to communicate."
Interview with Tim Berners-Lee
Simon Winchester: It has been argued that the rapid development of search techniques on the internet has made conducting research, particularly for school pupils, too easy. Do you think that school children are, perhaps, becoming too reliant on computers?
Tim Berners-Lee: Research isn't about finding lots of information - it is about understanding it: finding the relationships between concepts. The fact that you can get information more rapidly may help with speed, but when your task is to arrange this information in such a way that it personally makes sense to you, then the computer is a very useful tool. I don't know how reliant on computers you all are at Emanuel. Do you all have laptops and wireless networking? I think we should all be careful to spend as much time using the other parts of your brain as you do using a computer. Take a break and do a little calligraphy to let your thoughts settle. Keep a musical instrument within reach of your computer at home. [...]
Simon Winchester: Do you see the internet and the book in competition with each other? Especially in regard to young people?
Tim Berners-Lee: I do see the internet and TV in competition with each other, and I hope the internet will become such a fun, creative thing that it takes time away from the numbing effect of television. I don't think it will replace the book. First of all, the novel is a fundamentally important genre of communication whether you read it on paper or screen. Secondly, it will be a long time before a computer peripheral can compete with the warm feel of a wedge of paper, which can be taken and read anywhere, and within which each word has a physical position.
The Walter J. Ong Project
"Ong's work is often presented alongside the postmodern and deconstruction theories of Claude Lévi-Strauss, Jacques Derrida, Michel Foucault, Hélène Cixous, and others. His own work in orality and literacy shows deconstruction to be unnecessary: if you consider language to be fundamentally spoken, as language originally is, it does not consist of signs, but of events. Sound, including the spoken word, is an event. It takes time. The concept of "sign," by contrast, derives primarily not from the world of events, but from the world of vision. A sign can be physically carried around, an event cannot: it simply happens. Words are events."
Sky News -- WWW.TIM.GETS.GONG.UK
"Born in East Sheen, south-west London, in 1955, [Sir Tim Berners-Lee] was the eldest child of two mathematicians renowned within the computer industry for their work on Britain's first commercial computer, the Ferranti Mark I.
He studied at the Emanuel School in Wandsworth and later read physics at the Queen's College, Oxford, where he was banned from using the university's computer after being caught hacking.
Sir Tim later built his own computer, using an old TV set, a Motorola microprocessor and a soldering iron."
XMLMania.com - Tim Berners-Lee, Inventor of the World Wide Web, Knighted by Her Majesty Queen Elizabeth II
"[...] While working in 1980 as a consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland, Sir Timothy wrote his own private program for storing information using the kind of random associations the brain makes. The "Enquire" program, which was never published, formed the conceptual basis for his future development of the Web.
Subsequently he proposed a global hypertext project at CERN in 1989, and by December 1990, the program "WorldWideWeb" became the first successful demonstration of Web clients and servers working over the Internet. All of his code was made available free on the Internet at large in the summer of 1991.
A London native, Sir Timothy graduated with a degree in physics from Queen's College at Oxford University, England in 1976. While there he built his first computer with a soldering iron, TTL gates, an M6800 processor and an old television."
Review: Mind Wide Open by Steven Johnson
"Neuroscientists now view the brain as an orchestra made up of "dozens of players". Forget the idea of the brain as a unitary supercomputer and think instead of an assemblage of different modules and chemicals (or "molecules of emotion"), each specialised for a different task. They are not always easy bedfellows." P.D. Smith
Brain cells become more discriminating when they work together
"Teamwork is just as important in your brain as it is on the playing field: A new study published online on April 19 by the Proceedings of the National Academy of Sciences reports that groups of brain cells can substantially improve their ability to discriminate between different orientations of simple visual patterns by synchronizing their electrical activity.
The paper, "Cooperative synchronized assemblies enhance orientation discrimination," by Vanderbilt professor of biomedical engineering A. B. Bonds with graduate students Jason Samonds and Heather A. Brown and research associate John D. Allison provides some of the first solid evidence that the exact timing of the tiny electrical spikes produced by neurons plays an important role in brain functioning. Since the discovery of alpha waves in 1929, experts have known that neurons in different parts of the brain periodically coordinate their activity with their neighbors. Despite a variety of theories, however, scientists have not been able to determine whether this "neuronal synchrony" has a functional role or if it is just a by-product of the brain's electrical activity ..."
Evolution of the yeast protein interaction network
Hong Qin, Henry H. S. Lu, Wei B. Wu and Wen-Hsiung Li (Proceedings of the National Academy of Sciences USA, 28 October 2003; 100 (22): 12820–12824)
"[...] A key question in the evolution of biological complexity is, how have integrated biological systems evolved? Darwinists proposed natural selection as the driving force of evolution. However, the striking similarities between biological and nonbiological complexities have led to the argument that a set of universal (or ahistorical) rules account for the formation of all complexities. The yeast protein interaction network is an example of a complex biological system and contributes to the complexity at the cellular level. By analyzing the growth pattern and reconstructing the evolutionary path of the yeast protein interaction network, we can address whether or not network growth is contingent on evolutionary history, which is the key disagreement between the Darwinian view and the universality view.
[...] The key disagreement between the Darwinian view and the universality view on the evolution of biological complexity is the role of historical contingency. Undoubtedly, efforts to search for universal rules benefit our understanding on biological complexity. However, by using the yeast protein interaction network as an example, we observed a correlation between network evolution and the universal tree of life. This observation strongly argues that network evolution is not ahistorical, but is, in essence, a string of historical events."
Delays, connection topology, and synchronization of coupled chaotic maps
Fatihcan M. Atay, Jürgen Jost and Andreas Wende (Physical Review Letters 9 April 2004)
Abstract: "We consider networks of coupled maps where the connections between units involve time delays. We show that, similar to the undelayed case, the synchronization of the network depends on the connection topology, characterized by the spectrum of the graph Laplacian. Consequently, scale-free and random networks are capable of synchronizing despite the delayed flow of information, whereas regular networks with nearest-neighbor connections and their small-world variants generally exhibit poor synchronization. On the other hand, connection delays can actually be conducive to synchronization, so that it is possible for the delayed system to synchronize where the undelayed system does not. Furthermore, the delays determine the synchronized dynamics, leading to the emergence of a wide range of new collective behavior which the individual units are incapable of producing in isolation."
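The abstract's claim that synchronization is "characterized by the spectrum of the graph Laplacian" can be made concrete with the standard eigenratio heuristic: the larger the ratio of the largest Laplacian eigenvalue to the smallest nonzero one, the harder the network is to synchronize. A small sketch using closed-form eigenvalues for a nearest-neighbor ring versus a complete graph (the eigenratio index is the usual master-stability shorthand, not a quantity taken from this particular paper):

```python
import math

def ring_laplacian_eigenvalues(n):
    """Laplacian eigenvalues of an n-node ring (nearest-neighbor
    coupling): 2 - 2*cos(2*pi*k/n) for k = 0..n-1."""
    return sorted(2 - 2 * math.cos(2 * math.pi * k / n) for k in range(n))

def complete_laplacian_eigenvalues(n):
    """Laplacian eigenvalues of the complete graph on n nodes:
    0 once, and n with multiplicity n - 1."""
    return [0.0] + [float(n)] * (n - 1)

def eigenratio(eigs):
    """lambda_max / lambda_2: the smaller this ratio, the more easily
    the coupled maps on the network synchronize."""
    return eigs[-1] / eigs[1]

n = 100
print(eigenratio(ring_laplacian_eigenvalues(n)))      # huge (~1e3): poor synchronizer
print(eigenratio(complete_laplacian_eigenvalues(n)))  # 1.0: best possible
```

The ring's smallest nonzero eigenvalue shrinks like 1/n^2, which is why regular nearest-neighbor networks synchronize poorly, while scale-free and random topologies keep the ratio modest.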
Bruce Bower: Grannies give gift of longer lives
The Limits of Knowledge
"Laplace, Leibniz, Descartes, and Kant popularized the idea of the universe as a vast machine. The cosmos was likened to a watch, composed of many parts interacting in predictable ways. Implicit in this analogy was the possibility of predicting all phenomena. The perfect knowledge of Laplace's cosmic intelligence might be a fiction, but it could be approximated as closely as desired.
Today the most common reaction to Laplace's idea, among people who have some feel for modern physics, is to cite quantum uncertainty as its downfall. In the early twentieth century, physicists discovered that nature is nondeterministic at the subatomic scale. Chance enters into any description of quanta, and all attempts to exorcise it have failed. Since there is no way of predicting the behaviour of an electron with certainty, there is no way of predicting the fate of the universe.
The quantum objection is valid, but it is not the most fundamental one. In 1929, two years after Heisenberg formulated the principle of quantum uncertainty, Hungarian physicist Leo Szilard discovered a far more general limitation on empirical knowledge. Szilard's limitations would apply even in a world not subject to quantum uncertainty. In a sense, they go beyond physics and derive instead from the logical premise of observation. For twenty years, Szilard's work was ignored or misunderstood. Then in the late 1940s and early 1950s it became appreciated as a forerunner of the information theory devised by Claude Shannon, John von Neumann, and Warren Weaver.
Information theory shows that Laplace's perfect knowledge is a mirage that will forever recede into the distance. Science is empirical. It is based solely on observations. But observation is a two-edged sword. Information theory claims that every observation obscures at least as much information as it reveals. No observation makes an information profit. Therefore, no amount of observation will ever reveal everything -- or even take us closer to knowing everything.
Complementing this austere side of information theory is insight into how physical laws generate phenomena. Physicists have increasing reason to suppose that the phenomena of the world, from galaxies to human life, are dictated more by the fundamental laws of physics than by some special initial state of the world. In terms of Einstein's riddle, this suggests that God had little meaningful choice once the laws of physics were selected. Information theory is particularly useful in explaining the complexity of the world."
The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge by William Poundstone (Paperback edition: Oxford University Press, 1987, pages 21-22)
"Consider any day saved on which you have danced at least once." Friedrich Nietzsche
'Nature via Nurture' It's Genetic, Sometimes
Michael Ruse: I wonder if many today would really disagree with Ridley's basic claim in "Nature via Nurture" that essentially the nature-versus-nurture, biology-versus-culture, genes-versus-environment dichotomy has broken down and truly is less than useful to invoke. Organisms, and this applies especially to human organisms, are complex systems produced by genes, but very much molded by the experiences they encounter and situations to which they have to respond. "Genes are the mechanisms of experience," in the author's words.
Messages: How do animals communicate?
"[...] If you look at the history of work in animal communication, the greatest strides have been made in species that have specialized vocalizations, like bird song, where people have focused on the songs of a bird, even though birds have many other vocalizations. This is a specialized vocalization, it has dedicated neuro-circuitry, and it has a very specific function -- to attract mates and to guard territories. Because of that specialization, people have been able to quantify the mechanisms and the functions and make a lot of progress in understanding what the birds are talking about.
So the cotton-top tamarin in many ways is very bird-like. They are mainly monogamous. They pair bond for life. In part because of their social organization they have vocalization that's like bird song. There's a long call they give to maintain contact with other individuals, which is also used in inter-territorial interactions."
Essays on Recursion, Difference, Dialectics, Maps and Territories in Celebration of Gregory Bateson's centennial - SEED Editorial
"[...] Bateson stated in more than one text that his ideas were attuned to an epistemological monism, at first a notion of organicism, but after Bateson's embrace of cybernetics in the mid-1940s, an epistemology built around information and the fundamental ideas of cybernetics. These had circularity as their central concern, though as Bateson pointed out, circularity did not mean a precise circle in which events repeat themselves in the same circular path. All living forms reproduce and in doing so re-enter the domain of their forebears. These are recursive events, but in the case of species reproduction, they never step into the precise spot in the same stream twice-over. The arrow of time intervenes. A truly circular path would preclude emergence of new forms, and other forms of change or adaptation which are characteristic sequences of evolution. The passing of time always inflects recursion in human events, and human events must also, in the long run, share this formal characteristic of recursion in biological events. Bateson believed, contra Vico, there was no historical circularity which rolls human history along from barbarism to civilization and then returns human society to its starting point of barbarism once more. In between there is an enormous amount of information continually undergoing contextual change, but there are also fundamental premises, constraints to human understanding that endure, a structure of fundamental premises or 'verities' that give both form to, but permit freedom in, the variety of recursive events.
Thus Bateson indicated that a characteristic form or topology that captures the recursiveness of both biological form and human cultural and historical experience, is that of a spiral, a circularity that rotates in time. Nevertheless, the spiral was always a metaphor, rather than an operational framework in Bateson's epistemology. Instead, Bateson approached recursiveness in terms of the oscillations in heterarchical ordering. The idea of heterarchical (multi-level) order was first developed by the well known cyberneticist, Warren McCulloch. McCulloch suggested in his classic paper on the topology of human nervous nets (McCulloch, 1965: 40-45) that the topology of recursion in nervous nets not only differs between long-term memory and short-term memory but that within the overall recursiveness of neural nets are multiple hierarchies occurring among the synapses of the nervous system connected in recursive reverberation."
Gregory Bateson Centennial: Multiple Versions of the World
Saturday, November 20, 2004 (9:00 AM - 5:00 PM)
University of California at Berkeley, Lawrence Hall of Science
Confirmed Speakers include: Mary Catherine Bateson, Carol Wilder, Peter Harries-Jones, Terrence Deacon, Tyler Volk, Charles Hampden-Turner, Jesper Hoffmeyer & Jay Ogilvy
The Social Brain Conferences: Biology of conflicts and cooperation
Barcelona - July 17-20, 2004
News From Below: We've Been Expecting You
"Not often will we have such an opportunity as this one: to convene so many who've been inspired by Bateson and his illuminating insights into "the pattern which connects." Multiple Versions of the World promises to be a breakthrough conference." Jay Ogilvy
A little fruitful pandemonium (The Austin Chronicle 20 February 2004)
Michael Ventura: There's little grace or dignity in "Kid Auto Races in Venice," but it is six minutes and 10 seconds of weird prophetic poetry -- and mayhem. Later Chaplin had varying memories of how he created the Tramp's costume, trying on things at random in the costume room, but he concluded each version like this, as spoken to Chaplin biographer Robert Payne: "Even then I realized I would have to spend the rest of my life finding more about the creature. For me he was fixed, complete, the moment I looked in the mirror and saw him for the first time, yet even now I don't know all the things there are to be known about him." Chaplin always talked of the Tramp that way: from a distance and with respect. Even with awe, as when he told Payne: "There is death in him, and he is bringing life -- more life. That is his only excuse, his only purpose. That is why people recognized him everywhere. They wanted the ghosts to come and bring them life. It's very strange, isn't it? ... You see, the clown is so close to death that only a knife-edge separates him from it, and sometimes he goes over the border, but he always returns again. So in a way he is spirit -- not real. ... We know he cannot die, and that's the best thing about him. I created him, but I am not him, and yet sometimes our paths cross."
John Seely Brown interviewed by Seth Kahan (10 February 2003)
Seth: It sounds like each side is sifting through the other's "roots," exploring the periphery.
JSB: Yeah, it is an exploration -- each center is in the other's periphery, but it is also a clashing. I call it, the creative collision of craft. That collision taking place in a fabric of trust can go -- as we were saying with storytelling -- huge distances.
Robert W. Taylor's 2004 Draper Prize Acceptance Remarks
"Once upon a time and for many centuries, beginning with the first computer, the abacus, the purpose of computers was to solve arithmetic problems. With electricity, they could solve them faster, and the advent of the integrated circuit made them even faster, more reliable, and a lot cheaper. But they were still only arithmetic engines. Then a remarkable transformation occurred.
Xerox opened its Palo Alto Research Center in 1970, and it grew over time to about 300 people. Today, PARC is known for its innovative computer science research of the 1970s, but computer science was only a small part of its research investment. Most of it went into physics, materials science, and optics. But a few dozen computerists at PARC redefined the nature and purpose of computing, and their research put PARC on the map. The 2004 Draper Prize honors this research.
The four individuals named in this year's prize formed the cadre of that extraordinary group, which today reads like a "Who's Who" in computer science. In the last half of the 1960s, they were graduate students at a handful of universities, where, with support from ARPA, they built the first experimental interactive computer systems. From these, they gained insights into interactive computing that were not available to others. In the 1970s, when they were recruited to PARC, they shared a dream -- that computer systems could be completely redesigned and that this redesign could enable personal interactions with text, pictures, and networking for millions of individuals. The dream promised to encourage creative potential and new forms of communication. The value of connecting people and their interests could dwarf the value of computing only for arithmetic."
A General Theory of Rubbish: Moore Enlightenment
posted by Andrew 7/17/2004 08:08:00 AM