Sunday, August 29, 2004
A Multitude of Signs
San Antonio Current: Multimedia message by M. Solis
Current: Can you elaborate on the concepts of Rhythm Cinema and Rhythm Space?
Spooky: "To me, every time you even look at a building or see roads of a city from above, those are different kinds of visual patterns and rhythms. What I'm doing as a DJ, writer, and artist is thinking about life in our era as kind of exploring all the interrelated patterns that hold the fabric of the everyday world together. A building is a pattern. You can look at the points of structure, like a window, a corridor, a chamber; those are done with certain patterns. If you look at a skyscraper, if you look at a church, all of these are structures, but to me they are also beats. They're rhythms that are holding together a structure.
A DJ set of rhythms and patterns is the same thing as a building. It really doesn't have that much of a difference except for the material. Music is invisible. It's made of software, code. It's made of people playing in a unit. Buildings are made of steel and concrete. I just draw a bridge between the two. It's an urban funk culture.
That's why I feel like the era of the 21st century is all about information overload. That's where I get this idea of Rhythm Cinema because if you're doing multimedia from every direction and how the mind makes sense of that, it's putting it in patterns, putting it in structure."
Rhythm Science by Paul D. Miller [via datacloud]
The conceptual artist Paul Miller, also known as DJ Spooky that Subliminal Kid, delivers a manifesto for rhythm science -- the creation of art from the flow of patterns in sound and culture, "the changing same." Taking the DJ's mix as template, he describes how the artist, navigating the innumerable ways to arrange the mix of cultural ideas and objects that bombard us, uses technology and art to create something new and expressive and endlessly variable. Technology provides the method and model; information on the web, like the elements of a mix, doesn't stay in one place. And technology is the medium, bridging the artist's consciousness and the outside world. Miller constructed his DJ Spooky persona ("spooky" from the eerie sounds of hip-hop, techno, ambient, and the other music that he plays) as a conceptual art project, but then came to see it as the opportunity for "coding a generative syntax for new languages of creativity." For example: "Start with the inspiration of George Herriman's Krazy Kat comic strip. Make a track invoking his absurd landscapes ... What do tons and tons of air pressure moving in the atmosphere sound like? Make music that acts as a metaphor for that kind of immersion or density." Or, for an online "remix" of two works by Marcel Duchamp: "I took a lot of his material written on music and flipped it into a DJ mix of his visual material -- with him rhyming!" Tracing the genealogy of rhythm science, Miller cites sources and influences as varied as Ralph Waldo Emerson ("all minds quote"), Grandmaster Flash, W. E. B. Du Bois, James Joyce, and Eminem. "The story unfolds while the fragments coalesce," he writes.
James Boyle: The Apple of forbidden knowledge
"The Digital Millennium Copyright Act and equivalent laws worldwide were supposed to allow copyright owners to protect their content with state-backed digital fences that it would be illegal to cut. They were not supposed to make interoperability illegal, still less to give device manufacturers a monopoly over tied products, but that is exactly how they are being used. Manufacturers of printers are claiming that generic ink cartridges violate the DMCA. Makers of garage door openers portray generic replacements as "pirates" of their copyrighted codes. [...]
About 20 years ago, a stylish technology company with a clearly superior hardware and software system had to choose whether to make its hardware platform open, and sell more of its superior software, or whether to make it closed, and tie the two tightly together. It chose closed. Its name: Apple. Its market share, now? About 5 per cent. Of course, back then competition was legal. One wishes that the new generation of copyright laws made it clearer that it still is."
CODE : EDGE 42
George Dyson & John Brockman: A Dialogue
JB: Let's go back to ENIAC.
DYSON: OK. So you've got one computer alone that can be very powerful, but when they're in communication they become more powerful. It's the same way that a colony of cells with no nervous system at all can become a starfish or a sponge or something like that just simply by chemical communication.
JB: By communication you're talking about a network such as the Internet?
DYSON: Yes, but you have to have all sorts of other communication to make an organism happen: chemical, hormonal, mechanical. We are still immersed in the metaphor of fifty years ago, the computer as brain, the brain as electrical network, etc. The metaphor we haven't quite got to yet will come from molecular biology, when we start to see the digital universe less as an electrical switching network or giant computer and more as an environment swimming with different levels of code. How these increasingly complex one-dimensional strings of code actually do things, interacting with each other and with the three-dimensional world we live in, has more in common with the code-string and protein-folding world of molecular biology, where molecules interact with each other -- and do things -- by means of templates, rather than by reference to some fault-intolerant system of numerical address.
JB: There is no Internet -- there is only a process. When you stop a process to name it, it becomes dead. What we think of as the Internet is only a measure of its effect.
DYSON: Look at it from the point of view of the code itself, not the end user sitting at a terminal, which is either a synapse to some other coded process, or the means to some formalizable end. In ancient (computer) times code would run, be executed, and be terminated, that was the end of it. On the Internet code can keep moving around; it may escape termination by the local CPU, and when it arrives at a terminal, that doesn't mean it stops ...
The screen-age: Our brains in our laptops - CNN, 2 August 2004
Christine Boese: "My consciousness isn't just split between gray matter and a hard drive or two. Now part of it lives on the Internet and seems to stay there all the time. While I may feel a bit diffuse, mostly I observe changes in what McLuhan called our "sense ratios," like a goldfish changing from one kind of aquarium to another. We adapt. We gain some things, lose others. [...]
College students are the leading edge in adapting to this new goldfish bowl, these new multi-tasking sense ratios. Some of us will hold on to the old ways by our fingernails, afraid of losing a coherent self. Others will plunge into the new collective nerve center, our various selves loosely joined ..."
From Homer to Hip-Hop
Jeet Heer: Drawing on the work of the classicist Eric Havelock, Ong notes that "Plato's entire epistemology was unwittingly a programmed rejection of the old oral, mobile, warm, personally interactive lifeworld of oral culture (represented by the poets, whom he would not allow into his Republic). ... The Platonic ideas are voiceless, immobile, devoid of all warmth, not interactive but isolated, not part of the human lifeworld at all but utterly above and beyond it."
The Guardian - The BBC wizardry set to make waves
Owen Gibson: "The finest technology wizards at the BBC have been working for almost two years on a gizmo called the interactive Media Player (or iMP) that will allow licence-fee payers to watch BBC programmes at a time and place of their choosing. [...]
Central to iMP is the BBC's "anytime, anyplace, anywhere" philosophy. The idea being that you can download shows to a portable device -- be it a mobile phone, laptop computer or one of the so-called "video iPods" starting to emerge. The notion of thousands of people sitting on the train catching up on their previous night's viewing may seem fanciful -- until you look around and notice how many people are already fiddling with their mobiles, watching DVDs on their laptop or listening to their iPod on the way to work ..."
BBC News Online: The digital home takes shape
Darren Waters: Imagine a home in which films and TV programmes can be played on any screen in the house without wires trailing across floors; a home in which smart video recorders copy your favourite shows without being pre-programmed, to play back whenever you wish.
Imagine downloading your favourite movies and TV programmes from the internet in DVD quality, and watching them not on your PC screen but on TV in the comfort of your living room.
Imagine all your family's music stored on one device, but available wherever there are speakers in the house.
You can imagine it, or you could simply live it now - at a price.
Home entertainment devices such as a Sky Plus box, a Windows Media Center PC, an iPod with Airport Express and a wireless network do almost all of the above, but they are expensive gadgets which appeal primarily to the technically-minded.
But in the coming 12 months the market will be hit with a flurry of devices which will make all of the above possible for mainstream audiences.
In a Wireless World, Hearing Is Believing - The Washington Post
Rob Pegoraro: The appeal of a wireless media receiver -- a box plugged into your stereo to play the music saved on your computer -- got a simple demonstration after I recently moved. I had dozens of boxes to open and unpack and needed a soundtrack for the work, but all the CDs were still imprisoned in cardboard.
Fortunately, I had already set up the stereo, the computers and the WiFi access point. All I had to do was plug in two media receivers that I'd been testing, Apple's AirPort Express and Slim Devices' Squeezebox ...
Streaming media: A case for open standards
"With established streaming video standards and low distribution costs, virtually anyone can now become a "TV station" and deliver audio and video of a quality that rivals that of conventional television broadcasters." Rich Mavrogeanes
A digital revolution - The Miami Herald - 28 August 2004
Beatrice E. Garcia: HP sees a technology revolution revolving around digital content and the various devices now used to manage music, photos, and video converging into one. For consumers, this means easy-to-use devices that could eventually be very affordable as competition brings prices down. "HP is determined to lead that revolution," Carly Fiorina said. [...]
The other big HP news is that it will begin selling its own version of the Apple iPod and its new computers will come with Apple's iTunes Music jukebox and music store software preloaded. [...]
HP showed a prototype of a new device, the DJammer, that's being designed for club disc jockeys. HP brought in Gavin O'Connor, known as "DJGAWK1" when he spins, to show it off ... The wireless device allows a DJ "to interact with the music from anywhere in the club," says O'Connor, who can scratch and change the tempo and pitch of the music he's playing as he moves around the room. "It has a large 'wow' factor."
With PC penetration at a peak, Michael McGuire, research director for Gartner Group in San Jose, Calif., says computer manufacturers have to evolve.
They need to develop products that will help people acquire and manage digital content. The next wave in the digital revolution is this "race into the living room," McGuire said ...
New Scientist: iTunes wireless music streaming cracked
Will Knight reports: Apple's wireless streaming technology for iTunes has been cracked to allow it to support non-Apple software platforms.
Norwegian computer programmer Jon Johansen has released a program called JustePort that defeats the encryption used on Apple's Airport Express [...]
Airport Express is a small base station that wirelessly connects a computer to the internet or to a local network. It also has an audio socket that can be used to link a computer to a conventional stereo or pair of speakers. This allows music stored digitally to be played remotely. Until now, however, this feature has only been compatible with Apple computers and an add-on for Apple's iTunes audio software called AirTunes.
Encryption algorithms ...
When iPod is the DJ : Tunes, a Hard Drive and (Just Maybe) a Brain
Rachel Dodes: "Mr. Ng said that the technology behind the Shuffle function has remained the same since the first-generation iPod. He declined to reveal the algorithm used to generate randomness on Shuffle, but said the only reason that an iPod might seem to know a listener's preferences is that the listener, after all, chose the music in the first place."
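Apple has never disclosed the Shuffle algorithm Mr. Ng alludes to, but the textbook way to produce an unbiased random ordering of a playlist is the Fisher-Yates shuffle; a minimal sketch, assuming a simple list of tracks (function and variable names here are illustrative, not Apple's):

```python
import random

def fisher_yates(tracks):
    """Return an unbiased random ordering of a playlist (Fisher-Yates)."""
    shuffled = list(tracks)  # copy so the original playlist is untouched
    for i in range(len(shuffled) - 1, 0, -1):
        j = random.randint(0, i)  # pick uniformly from the not-yet-fixed prefix
        shuffled[i], shuffled[j] = shuffled[j], shuffled[i]
    return shuffled
```

With this scheme every permutation is equally likely, which is precisely why a truly random shuffle can still cluster several songs by the same artist back to back -- the streaks listeners read as the iPod "knowing" their taste are just ordinary chance over a library they chose themselves.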
posted by Andrew 8/29/2004 08:08:00 PM
Monday, August 09, 2004
Migrating the folk process
DrunkenBlog: Convergence Kills [via Joe Katzman & Sven-S. Porst]
"[...] It's a sad truth, but yes, the iPod is going to go away. Everyone knows it; they just don't know when. This isn't dismissing the fact that it's shot out of the gates on a wildly successful run and become to MP3 players what Kleenex is to tissues, but it's eventually going to start losing share in one form or another.
[...] That's why they're so freaked out about what RealNetworks is doing, even though it'd sell iPods. At the end of the day it's not going to be about who is selling what end-play device, it's going to be about who is sitting in the middle. And Apple wants to be that benevolent dictator, parsing DRM-protected content to whatever device you're using at the time. It's also why the deal with Motorola is so significant; Apple can live without you buying an iPod, but if you're going to be buying DRM-protected content, Apple damn sure wants it to be through them.
[...] they're creating a new light-DRM platform that is riding on top of everyone else's platform. iMacs, Windows, mobile phones, everything. Google is also creating a platform riding on the backs of other platforms... except it's based around becoming the access point for all things internet. Apple wants that, but for DRM content."
Vipin V. Nair: Harmony sparks dissent
"Every time we buy a CD, do we really worry about whether it will work on our music systems? We don't. Globally-accepted standards in digital storage of music CDs and DVDs make sure that regardless of the make of our music systems, they will work.
But those who buy music from the Internet don't enjoy such peace of mind yet, since each service provider has his own choice of audio compression format that runs only on a particular music device. So a user is, in a way, tied down to a particular online music store.
Take the instance of the iPod, the most popular portable music device, from Apple Computer. It is the only player that can play songs downloaded from the company's online music store, iTunes. Apple uses a format called Advanced Audio Coding (AAC), fortified by its FairPlay digital rights management (DRM) system, to encrypt songs in the iTunes store.
Now, a new software announced by RealNetworks seeks to do away with the incompatibilities that exist in the world of digital music. RealNetworks claimed recently that its Harmony Technology is the world's 'first DRM translation system' that enables users to play music purchased from anywhere on more than 70 devices, including Apple's iPod.
[...] Apple, the most successful company in the digital music business so far, has sold four million iPods and over 100 million tracks from iTunes. The company recently entered into a deal with mobile phone maker Motorola to make iTunes compatible with Motorola handsets."
Society for the Study of Social Problems - 29.2 Review: Karen N. Werner
Question: What do eyeglasses, reproductive technologies, art forgeries, mimeographs, mannequins, parrots, sex dolls, Siamese twins, wax museums, Doublemint gum advertisements, carpal tunnel syndrome, and camouflage have in common?
Answer: According to Hillel Schwartz, they are all clues to understanding "the culture of the copy," a culture thick with doubling, mimicry, repetition, and simulation.
Real Life Rock Top 10 by Greil Marcus - City Pages - 30 June 2004
4) and 5), PJ Harvey, Uh Huh Her (Island) and Nick Catucci, "Carnal Fission" (Village Voice, June 9)
Harvey rubs, scrapes, drags chairs around the room; sometimes it feels as if her music comes from her guitar applying pressure to her skin rather than her fingers applying pressure to her guitar strings. Each album seems to gravitate toward the point where a certain state of mind and body will flare up into a single image--which will then burn out and disappear, leaving you incapable of remembering what the image was, only that you glimpsed it. Here, you're on the way with the pulse of "Shame," only the second song; you can feel the destination has been reached with the next, "Who the Fuck?" which combines a Lenny Kravitz beat with an extremist, primitivist Sheryl Crow vocal -- an affinity that lets you hear Harvey listening to Crow, lets you hear Harvey hearing something in Crow's voice nobody else hears, maybe including Crow herself.
That's the problem with artists: They know things other people don't. They feel compelled to say what those things are, and to conceal the strangeness and alienation of the act. If there is an "I" in their work, it ceases to refer back to the person writing, painting, singing; the person whose name is on the work has momentarily replaced herself with a made-up person who can say or do anything. This is what makes such a person an artist, and it's why critics who try to reduce an artist's work to her life are cretins. Thus we have Nick Catucci in the Village Voice, assuring his readers that Uh Huh Her is "a break-up album"--"as all save her last have been," he adds, in case you think there might be something out there that doesn't fit into a thimble. Forget that situations everyone goes through might go through Harvey differently than they do through you or me; don't worry that there might be anything here that isn't immediately obvious; after all, Catucci says, she's "an easy read" and "she's got a one-track mind." "We know she's been fucking and fighting, probably in equal measures, and maybe in the same moments." You can almost smell him, can't you?
6) and 7), Patti Smith, "Radio Baghdad," from Trampin' (Columbia) and Michael Kamber, photo accompanying Edward Wong's "Deputy Foreign Minister Is Fatally Shot in Baghdad" (New York Times, June 13)
Smith has been selling death for years -- but now mere husband, brother, friends, and poet comrades take a backseat to a whole city, a whole civilization: civilization itself! That's what was destroyed when the U.S. took Iraq. For Smith it's a chance to gas up the piety boilers, and remind us that we (or, rather, "they," which is us, but not her, unless we accept her vision, in which case we can be her, gazing with sadness and disgust at those who remain "they") destroyed a perfect city, the center of the world, where once walked "the great Caliph." How does it sound? Silly. The Aloha-Elvis wall hanging you could see in the background of Kamber's photo--captioned "American soldiers searched a suspected stronghold of the Mahdi Army, the militia loyal to Moktada al-Sadr"--was infinitely more interesting. Why doesn't Smith write a song about what Elvis was doing there: about a "we" that even she might not be able to make into a "they," unless the "they" included Iraqis, too?
TVTechnology - Net Soup: Love and Theft by Frank Beacham (08.06.03)
"[...] The organized commercial recording industry -- enabled by cheap, salable recording media -- has existed for less than 100 years. Before that people sang and performed for each other. Folk singer Pete Seeger has called the oral tradition of constantly learning and revising songs "the folk process." The constant variations of songs were passed from artist to artist and finally refined to versions that lived on through the ages. Today, lawyers call that copyright infringement.
Seeger's "folk process" is a bigger threat to the music industry than Internet freeloaders seeking a song. In fact, perhaps more than any other media, the Internet's sharing capability has brought a return to this oral tradition of trading and revising words and music. Thus, we witness the harsh fight by large corporations to retain control of the sale and distribution of their recordings.
Those who appreciate Bob Dylan's work flash a knowing smile when confronted with the Wall Street Journal's revelations. "Bob Dylan often walks a fine line between plagiarism and allusion, and therein lies his genius," wrote Geoff McMaster in a recent article on Dylan's work.
In fact, Dr. Stephen Scobie, a Dylan biographer and former University of Alberta professor, noted that another song on Love and Theft -- titled High Water (for Charley Patton) -- included more than a dozen quotations from sources as varied as English nursery rhymes, African-American blues, an obscure 1950s pop song, and even Charlotte Bronte's Jane Eyre. In some instances, whole lines and even couplets are lifted verbatim from the source.
"Dylan takes the whole idea of love and theft very seriously," said Scobie. "He loves the stuff, but also unashamedly steals it. At what point does allusion become quotation or become theft?"
Don't confuse Dylan's art with a historian's work, warned Jon Pareles, a music critic for the New York Times. "Mr. Dylan was not purporting to present original research on the culture of yakuza, the Japanese gangsters. Nor was he setting unbroken stretches of the (Saga) book to music... He was simply doing what he has always done: writing songs that are information collages. Allusions and memories, fragments of dialogue and nuggets of tradition have always been part of Mr. Dylan's songs, all stitched together like crazy quilts."
Sometimes Dylan cites his sources, wrote Pareles, but more often he does not. The music critic groups Dylan with performers such as Woody Guthrie and the Carter Family, who "thought of themselves as part of a folk process, dipping into a shared cultural heritage in ways that speak to the moment."
Pareles muses that the hoopla over Dylan's use of Dr. Saga's book is "a symptom of a growing misunderstanding about culture's ownership and evolution, a misunderstanding that has accelerated as humanity's oral tradition migrates to the Internet. Ideas aren't meant to be carved in stone and left inviolate; they're meant to stimulate the next idea and the next."
Not so, argue America's major media companies, who fear a day when they won't be able to profit from music, movies and even digital television programming. Thus, the conflict between digital information technology -- where the shared cultural heritage becomes more accessible -- and the media company gatekeepers who want to place roadblocks in the way of open access.
WHAT'S FAIR?
At stake in this dispute is the right of "fair use," a legal doctrine that allows limited use of copyrighted material without payment. Also in jeopardy is "public domain," the period after copyright expiration when works can be freely copied and distributed. Both are essential components to artistic freedom.
"The absolutely original artist is an extremely rare and possibly imaginary creature, living in some isolated habitat where no previous works or traditions have left any impression," wrote Pareles. "Like virtually every artist, Mr. Dylan carries on a continuing conversation with the past. He's reacting to all that culture and history offer, not pretending they don't exist. Admiration and iconoclasm, argument and extension, emulation and mockery -- that's how individual artists and the arts themselves evolve."
Personally, I hope to ask Sony's Howard Stringer to expound a little more on the differences between Internet "thieves" and Love and Theft. If that day comes, we'll report back to you."
Plagiarism in Dylan, or a Cultural Collage?
The New York Times - 12 July 2003
Jon Pareles: "An alert Bob Dylan fan was reading Dr. Junichi Saga's "Confessions of a Yakuza" (Kodansha America, 1991) when some familiar phrases jumped out at him. There were a dozen sentences similar to lines from songs on Mr. Dylan's 2001 album, " 'Love and Theft,' " particularly one called "Floater (Too Much to Ask)."
In the book a father is described as being "like a feudal lord," a phrase Mr. Dylan uses. A character in the book says, "I'm not as cool or forgiving as I might have sounded"; Mr. Dylan sings, "I'm not quite as cool or forgiving as I sound." Mr. Dylan has neither confirmed nor denied reading the book or drawing on it; he could not be reached for comment, a Columbia Records spokeswoman said.
The Wall Street Journal reported the probable borrowings ... as front-page news. After recent uproars over historians and journalists who used other researchers' material without attribution, could it be that the great songwriter was now exposed as one more plagiarist?
Not exactly. Mr. Dylan was not purporting to present original research on the culture of yakuza, the Japanese gangsters. Nor was he setting unbroken stretches of the book to music. The 16 verses of "Floater" include plenty of material that is not in "Confessions of a Yakuza," although the song's subtitle and its last line -- "Tears or not, it's too much to ask" -- do directly echo the book. Unlike Led Zeppelin, which thinly disguised Howlin' Wolf's "Killing Floor" as "The Lemon Song" and took credit for writing it, Mr. Dylan wasn't singing anyone else's song as his own.
He was simply doing what he has always done: writing songs that are information collages. Allusions and memories, fragments of dialogue and nuggets of tradition have always been part of Mr. Dylan's songs, all stitched together like crazy quilts.
Sometimes Mr. Dylan cites his sources, as he did in "High Water (for Charley Patton)" from the " 'Love and Theft' " album. But more often he does not. While die-hard fans happily footnote the songs, more casual listeners pick up the atmosphere, sensing that an archaic turn of phrase or a vaguely familiar line may well come from somewhere else. His lyrics are like magpies' nests, full of shiny fragments from parts unknown.
Mr. Dylan's music does the same thing, drawing on the blues, Appalachian songs, Tin Pan Alley, rockabilly, gospel, ragtime and more. "Blowin' in the Wind," his breakthrough song, took its melody from an antislavery spiritual, "No More Auction Block," just as Woody Guthrie had drawn on tunes recorded by the Carter Family. They thought of themselves as part of a folk process, dipping into a shared cultural heritage in ways that speak to the moment.
The hoopla over " 'Love and Theft' " and "Confessions of a Yakuza" is a symptom of a growing misunderstanding about culture's ownership and evolution, a misunderstanding that has accelerated as humanity's oral tradition migrates to the Internet. Ideas aren't meant to be carved in stone and left inviolate; they're meant to stimulate the next idea and the next.
Because information is now copied and transferred more quickly than ever, a panicky reaction has set in among corporations and some artists who fear a time when they won't be able to make a profit selling their information (in the form of music, images, movies, computer software). As the Internet puts a huge shared cultural heritage within reach, they want to collect fees or block access. Amazingly enough, some musicians want to prevent people from casually listening to their music, much less building new tunes on it.
Companies with large copyright holdings are also hoping to whittle away the safe harbor in copyright law called fair use, which allows limited and ambiguously defined amounts of imitation for education, criticism, parody and other purposes. The companies also want to prevent copyrighted works from entering the public domain, where they can be freely copied and distributed. The Supreme Court recently ruled, in Eldred v. Ashcroft, that individual copyrights could extend for 70 years after the life of the creator, or in the case of a corporation, for 95 years. As a result, Mickey Mouse will be kept out of the public domain -- that shared cultural heritage -- until 2024.
The absolutely original artist is an extremely rare and possibly imaginary creature, living in some isolated habitat where no previous works or traditions have left any impression. Like virtually every artist, Mr. Dylan carries on a continuing conversation with the past. He's reacting to all that culture and history offer, not pretending they don't exist. Admiration and iconoclasm, argument and extension, emulation and mockery -- that's how individual artists and the arts themselves evolve. It's a process that is neatly summed up in Mr. Dylan's album title " 'Love and Theft,' " which itself is a quotation from a book on minstrelsy by Eric Lott.
Hip-hop, ever in the vanguard, ran into problems in the mid-1980's when the technique of sampling -- copying and adapting a riff, a beat and sometimes a hook or a whole chorus to build a new track -- was challenged by copyright holders demanding payment even for snippets. Although sampling was just a technological extension of the age-old process of learning through imitation, producers who use samples now pay up instead of trying to set precedents for fair use.
That might be a good idea; a song that recycles a whole melody (like Puff Daddy's productions) calls for different treatment than a song that borrows a few notes from a horn section, and courts are not the best place for aesthetic distinctions. But in practice, it means fewer samples per track, and it can make complex assemblages prohibitively expensive. Mixes heard only in clubs and bootleg recordings are now the outlets for untrammeled sampling experiments. Yet, samples have extended and revived careers for many musicians when listeners went looking for the sources.
Mr. Dylan has apparently sampled "Confessions of a Yakuza," remixing lines from the book into his own fractured tales of romance and mortality on " 'Love and Theft.' " The result, as in many collages and sampled tracks, is a new work that in no way affects the integrity of the existing one and that only draws attention to it.
Dr. Saga has no need to keep his book isolated. He told The Associated Press that he was ecstatic to have inspired such a well-known songwriter. And as news of the Dylan connection surfaced, sales of "Confessions of a Yakuza" jumped ...
Of course, Dr. Saga can't be too possessive about the writing. The book is an oral history, told to him by the yakuza gangster of the title. It's another story that has drifted into humanity's oral tradition. Mr. Dylan's complete lyrics are freely available at www.bobdylan.com. As for the song, if someone asks Mr. Dylan for sampling rights, it would be only fair to grant them."
Interview with Chuck D & Hank Shocklee of Public Enemy by Kembrew McLeod
Stay Free!: What are the origins of sampling in hip-hop?
Chuck D: Sampling basically comes from the fact that rap music is not music. It's rap over music. So vocals were used over records in the very beginning stages of hip-hop in the 70s to the early '80s. In the late 1980s, rappers were recording over live bands who were basically emulating the sounds off of the records. Eventually, you had synthesizers and samplers, which would take sounds that would then get arranged or looped, so rappers can still do their thing over it. The arrangement of sounds taken from recordings came around 1984 to 1989.
[...]
Stay Free!: How did the Bomb Squad [Public Enemy's production team, led by Shocklee] use samplers and other recording technologies to put together the tracks on It Takes a Nation of Millions?
Hank Shocklee: The first thing we would do is the beat, the skeleton of the track. The beat would actually have bits and pieces of samples already in it, but it would only be rhythm sections. Chuck would start writing and trying different ideas to see what worked. Once he got an idea, we would look at it and see where the track was going. Then we would just start adding on whatever it needed, depending on the lyrics. I kind of architected the whole idea. The sound has a look to me, and Public Enemy was all about having a sound that had its own distinct vision. We didn't want to use anything we considered traditional R&B stuff -- bass lines and melodies and chord structures and things of that nature.
[...]
Chuck D: Corporations found that hip-hop music was viable. It sold albums, which was the bread and butter of corporations. Since the corporations owned all the sounds, their lawyers began to search out people who illegally infringed upon their records. All the rap artists were on the big six record companies, so you might have some lawyers from Sony looking at some lawyers from BMG and some lawyers from BMG saying, "Your artist is doing this," so it was a tit for tat that usually made money for the lawyers, garnering money for the company. Very little went to the original artist or the publishing company.
[...]
Stay Free!: As you probably know, some music fans are now sampling and mashing together two or more songs and trading the results online. There's one track by Evolution Control Committee that uses a Herb Alpert instrumental as the backing track for your "By the Time I Get to Arizona." It sounds like you're rapping over a Herb Alpert and the Tijuana Brass song. How do you feel about other people remixing your tracks without permission?
Chuck D: I think my feelings are obvious. I think it's great.
AlterNet: Protest Music by Annalee Newitz (7 July 2004)
"By waging a war of litigation on file sharers and copyright infringers, the Recording Industry Association of America has unwittingly created a new kind of protest art. Mash-ups -- digitally knitted-together compositions made up of two or more popular songs -- are anti-authoritarian folk music for a generation whose "establishment" is represented by corporate intellectual-property owners.
[...] Often called "bastard pop" or simply "bootlegs," mash-ups are as easy to perform as a rip-off of a Bob Dylan tune. Cheap audio software allows anyone with a half-decent computer to convert the act of copyright infringement into something undeniably gorgeous and amusing by turns. Australian masher Dsico -- who has been repeatedly threatened with legal action for his work -- traces the style back to modernist art: "Much as Duchamp once drew a mustache on a copy of the Mona Lisa, bastard pop artists deface mainstream pop music." New York University professor and copyright reformer Siva Vaidhyanathan calls the movement a combination of innovation and infringement, adding, "Some of the greatest innovators of the past 100 years were accused of being infringers."
The very structure of the music itself is a direct response to the conditions under which it's made: lovingly assembled from pop sifted down off P2P networks, a Dsico creation like "Compton Magic" (NWA vs. Olivia Newton-John) seems to echo the mixed-up, black-market cacophony of an eDonkey addict's music collection.
Dodging lawyers' cease and desist orders, mash-up DJs often change their names and move their music from host to host in order to keep serving it up. But they soldier on, sharing tips and litigation horror stories on Brit mash-up site Get Your Bootleg On (gybo.proboards4.com) partly in the hope that one day their efforts will change copyright law. "I would love to see a form of copyright where as long as money isn't changing hands, everything is up for grabs," San Francisco mash-up DJ Adrian says.
Grey Tuesday, a recent mash-up protest organized by anti-RIAA group Downhill Battle, inspired more than 100,000 people to download copies of DJ Danger Mouse's dubiously legal bastard pop creation The Grey Album (a mash of the Beatles' White Album and Jay-Z's Black Album). "There's a public interest served by making this album available," protest organizer Holmes Wilson argues. "If people can't hear works that the copyright regime suppresses, they can't make an informed decision about what these laws should be."
More interesting than Wilson's considered stance are the sometimes-fantastical copyright theories of the DJs, promoters and activists who make up the bootleg community. Without a legal background in how copyright works, mashers feel free to develop a whole range of ideas about why their music is legal or illegal. For example, Adrian told me that as long as he plays mashed-up ASCAP music in an ASCAP-licensed venue, it's OK.
Unfortunately, it's not: Mash-ups are derivative works (a big I.P.-law no-no). Adrian also argued that since he's crediting the artists he mashes and giving away his mixes for free, he isn't hurting anyone. This theory wouldn't hold up in court, but it's far more commonsensical than current I.P. law.
Mash-ups also spawn social mixing that mimics the genre's political agenda: At a recent mash-up event in San Francisco, famous underground hackers mingled with locally known drag queens and wide-eyed indie rockers. And many bootlegs are explicitly designed to create mixes that cross racial or sexual identity lines -- thus, a mash-up might combine a Village People song with something by Public Enemy. A kind of political hopefulness or idealism seems to animate many of these mixes.
As a masher on GYBO recently posted, "Everything is illegal." Under an I.P. regime where artists feel like nothing goes, it seems that everything could. The infringement generation aims to mash up copyright law in pursuit of better music. But it also has a chance to challenge social divisions more profound than the distinctions between hip-hop, rock and electroclash."
Neuronal Resonance Fields, Aoidoi, and Sign Processes
"We must reconstruct, not abandon, an ideal of authenticity in our lives. Whatever we come up with, authenticity can no longer be rooted in singularity, in what the Greeks called the idion, or private person. That would be, in our culture of the copy, idiocy ... The impostors, "evil" twins, puppets, "apes," tricksters, fakes and plagiarists ... may be agents provocateurs to a more coherent, less derelict sense of ourselves. They may call us away from the despair of uniqueness towards more companionate lives."
Hillel Schwartz - The Culture of the Copy: Striking Likenesses, Unreasonable Facsimiles (New York: Zone Books, 1996, page 17)
Sony Walkman - Music to whose ears?
"The social pleasure of sharing music was terminated when people clamped plugs in their ears and tuned into a selfish sound. Music in the Walkman era ceased to connect us one to another. It promoted autism and isolation, with consequences yet untold."
Norman Lebrecht
Building Brandwidth in an Internet Economy
"She walked right up to me and got within my comfort field," Crandall stammered. "I was taken aback. She pulled out the earbuds on her iPod and indicated the jack with her eyes."
Warily unplugging his own earbuds, Crandall gingerly plugged them into the woman's iPod, and was greeted by a rush of techno.
"We listened for about 30 seconds," Crandall said. "No words were exchanged. We nodded and walked off."
The Shortwave And the Calling (3 August 2004 - The Washington Post)
David Segal: "In a cluttered home office in the World's End section of London, Akin Fernandez is trolling the dial of his newly acquired shortwave radio. It's December 1992 and it's late at night, when the city is quiet and the mad-scientist squawks of international broadcasts have an otherworldly tone. Fernandez, the owner and sole employee of an indie music label, is about to trip across a mystery that will take over his life.
Shortwave signals are bouncing, as they always do, around the globe, caroming off a layer of the atmosphere a few hundred miles above the Earth and into antennas all over the world. Fernandez can hear news from Egypt and weather reports from China. But his browsing stops when he tunes in something startling: the mechanized voice of a man, reading out numbers.
No context, no comment, no station identification. Nothing but numbers, over and over, for minutes on end. Then the signals disappear, as if somebody pulled the plug in the studio. And it's not just one station. The more he listens, the more number monologues he hears. [...]
What's with the numbers?
Answering that question, it turns out, would take Fernandez years, and it left him nearly penniless, at least for a while. It also brought him a horde of admirers on another continent, eventually earned him a credit in a Tom Cruise movie and sparked a legal battle with the acclaimed band Wilco.
Fernandez would study numbers stations largely because he couldn't stop even if he tried -- which is to say, he fell into the grip of an obsession. But along the way, by both accident and design, he discovered amid all that static the raw material for a point he likes to make, with characteristic zeal, about the future of rock-and-roll."
New Perspectives Quarterly - Summer 2004 - Neural Darwinism
Gerald Edelman: "The most important thing to understand is that the brain is "context bound." It is not a logical system like a computer that processes only programmed information; it does not produce preordained outcomes like a clock. Rather it is a selectional system that, through pattern recognition, puts things together in always novel ways. It is this selectional repertoire in the brain that makes each individual unique, that accounts for the ability to create poetry and music, that accounts for all the differences that arise from the same biological apparatus -- the body and the brain. There is no singular mapping to create the mind; there is, rather, an unforetold plurality of possibilities. In a logical system, novelty and unforeseen variation are often considered to be noise. In a selectional system such diversity actually provides the opportunity for favorable selection.
Here, Darwin and his effort to explain variance within biological populations through natural selection provided the key idea. In considering the brain, we are talking about a population of hundreds of billions of cells that far exceeds the number of stars in the sky. The number of possible connections these cells can make exceeds the number of particles in the universe."
The Guardian: Nicholas Lezard reviews 'Falling' by Garret Soden
"The path of virtue, said Thomas Browne at the beginning of his Christian Morals, is not only narrow: it's "funambulatory", a tightrope over an abyss. [...]
There are primatologists and anthropologists who suggest that it was a very strongly vested interest in not falling that led to our development of consciousness."
The Poetics of Gardens
Charles W. Moore, William J. Mitchell, & William Turnbull, Jr.
"A garden path can become the thread of a plot, connecting moments and incidents into a narrative. The narrative structure might be a simple chain of events with a beginning, middle, and end. It might be embellished with diversions, digressions, and picaresque twists, be accompanied by parallel ways (subplots), or deceptively fork into blind alleys like the alternative scenarios explored in a detective novel."
Randolph Jordan - The Echopeople: Reflections on the concept of echolocation in Gerry: Part 1
"They struggle to remember what they have done, to retrace their steps within the space of their minds, to create a coherent space out of the multitude of environments the desert has engulfed them with ..."
The Hippocampus as a Cognitive Map
"One wonders whether there will come a breaking point where, as eventually they must, the trails within dissolve to waving grass and the crossroad signs lie twisted and askew on rotting posts. Where, then, will the wanderer turn?"
Loren Eiseley - The Mind as Nature (Harper & Row, 1962, page 32)
Finnegans Wake - Wikipedia, the free encyclopedia
The book begins with the fall of Finnegan, a hod carrier, from a scaffold. At his wake, in keeping with the song "Finnegan's Wake," a fight breaks out, whiskey splashes on Finnegan's corpse, and he rises up again alive. Note how the simple removal of the song's apostrophe emphasizes and universalizes the theme of awakening: At Finnegan's wake, Finnegans wake. (Not only is the "wake" simultaneously Finnegan's funeral and his birth, the beginning of the dream in which he is paradoxically awakened, it is also the turbulence left by his absence, the expanding ripples and rhythm in the wake of his vessel.)
Continuing past the original song, Joyce has Finnegan put back down again ("Now be aisy, good Mr Finnimore, sir. And take your laysure like a god on pension and don't be walking abroad"). Someone else is sailing in to take over the story: Humphrey Chimpden Earwicker, whose initials HCE ("Here Comes Everybody") lend themselves to phrase after phrase throughout the book.
HCE is a foreigner who has taken a native Irish wife, Anna Livia Plurabelle (whose initials ALP as well are found in phrase after phrase), and they settle down to run ..."
posted by Andrew 8/09/2004 08:08:00 AM
Saturday, July 31, 2004
Recursive, Wide, and Loopy 2
MSNBC - How humans got the gift of gab by Kathleen Wren
"[...] The fact that a virtually infinite number of phrases can be nested inside one another gives human language an open-endedness, allowing us to express new ideas. In a commentary that accompanies the Science study, psychologist David Premack has proposed that the flexibility of human grammar may be a central aspect of human intelligence.
Whatever it is about the brain that allows such linguistic flexibility may also be key to the human imagination, according to Premack. Unlike other animals, which specialize in various skills, humans are supremely adaptable, able to learn new tasks and develop new technologies.
"Human intelligence and evolution are the only flexible processes on Earth capable of producing endless solutions to the problems confronted by living creatures," Premack writes."
Keep Talking: That's what makes us human
David Premack (Science 2004 303:318) makes several points including:
[...]
2) The grammar or syntax of human language is certainly unique. Like an onion or Russian doll, it is recursive: One instance of an item is embedded in another instance of the same item. Recursion makes it possible for the words in a sentence to be widely separated and yet dependent on one another. "If-then" is a classic example. In the sentence "If Jack does not turn up the thermostat in his house this winter, then Madge and I are not coming over," "if" and "then" are dependent on each other even though they are separated by a variable number of words. Are animals capable of such recursion? Fitch and Hauser have reported that tamarin monkeys are not capable of recursion. Although the monkeys learned a nonrecursive grammar, they failed to learn a grammar that is recursive. Humans readily learn both.
[...]
5) What are the factors that distinguish human intelligence? A major distinctive feature of human intelligence is flexibility. Animals, by contrast, are specialists. Bees are adept at sending messages through their dances, beavers at building dams, the nuthatch at remembering the location of thousands of caches of acorns it has buried. But each of these species is imprisoned by its adaptation; none can duplicate the achievement of the other. The nuthatch cannot build dams; bees do not have an uncanny memory for hidden caches of food; beavers cannot send messages. Humans, by contrast, could duplicate all these achievements and endlessly more. Why? Is recursive language the key to human flexibility?
6) Human intelligence and evolution are the only flexible processes on Earth capable of producing endless solutions to the problems confronted by living creatures. Did evolution, in producing human intelligence, outstrip itself? ...
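The contrast in point 2 can be made concrete. Fitch and Hauser's non-recursive grammar generated strings of the form (AB)^n, while the recursive one generated A^n B^n, whose nested dependencies must be tracked much like matching an "if" with a distant "then". A minimal sketch in Python (the two recognizers are my own illustration of the string patterns described, not the authors' code):

```python
def matches_ab_n(s):
    # Non-recursive (finite-state) pattern: "AB" repeated, e.g. "ABABAB".
    # Accepting it needs no memory beyond the current pair of symbols.
    return len(s) > 0 and len(s) % 2 == 0 and all(
        s[i] == "A" and s[i + 1] == "B" for i in range(0, len(s), 2))

def matches_a_n_b_n(s):
    # Recursive pattern: n A's followed by n B's, e.g. "AAABBB".
    # Accepting it requires counting how many A's were seen -- the
    # kind of long-distance dependency the tamarins failed to learn.
    n = len(s) // 2
    return len(s) > 0 and len(s) % 2 == 0 and s == "A" * n + "B" * n

print(matches_ab_n("ABAB"), matches_a_n_b_n("ABAB"))  # True False
print(matches_ab_n("AABB"), matches_a_n_b_n("AABB"))  # False True
```

The first pattern can be checked symbol by symbol; the second cannot, which is why it is taken as a (simple) proxy for recursive phrase structure.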
Monkeys Deaf to Complex Communication, Study Says
Stefan Lovgren for National Geographic News (22 January 2004)
"[...] Fitch ... stresses that there is no one "magic bullet" that gives us language. "I don't believe that such magic bullets exist," he said. "Language is a complex mosaic including many important abilities, and any attempt to reduce it to just one will be simplistic and unsatisfactory."
Humble Factors
In his article, Premack suggests that other reasons, apart from an animal's inability to understand complex grammar, explain why they have not evolved languages. He says recursive language, which is one way to achieve more complex phrase structure, is not the key factor to consider.
"There has been too much tendency to think that because animals don't have recursion they don't have language," Premack said. "But the reason why they don't have non-recursive language or any other language is because they lack a whole bunch of simpler things."
He says one reason that animals don't have language is because they don't have voluntary control of sensory-motor systems, specifically voice and face, which are essential for speech and sign.
Another reason, he says, is that animals don't teach the way humans do.
"Although human mothers do not teach children grammar, they definitely teach them words," said Premack. "Humans are the only species that teach. Evolution, being endlessly clever, might produce words that don't require teaching, but until it does, it is not clear how any species other than humans could evolve language."
Animals are also not as flexible as humans. While bees may be able to send messages through dance, humans have dozens of ways of sending messages.
Imitation may be yet another factor. While many species can copy a role model's choice of object or location, they can't copy the motor action. This second level of imitation, Premack maintains, is needed for the evolution of language.
"Recursive language is very powerful and it enables us to talk in the fancy way we do," he said. "But suppose we only had non-recursive language. You could still ask questions, use descriptions, and make requests, only it would not be half as wonderful as the system we have."
Interview with Ursula Goodenough by Jill Neimark
Neimark: If you look at the evolutionary ladder, where do you think the sense of meaning begins? Do organisms other than humans have it?
Goodenough: All life has a kind of seamlessness. All creatures have to be aware of their environment, and there has been an evolution of the capacities needed for detecting increasingly complex stimuli. I have no problem calling this "meaning," since all creatures pick out meaningful facets of their environment. For the first creatures, these facets were physical and mediated by receptor proteins. Sperm and eggs find each other by protein shapes; photosynthetic bacteria find light by protein shapes. The impetus to figure out what's going on is still very much programmed into our highly complex brains.
Neimark: How does meaning in humans differ qualitatively from the rest of life on Earth?
Goodenough: My sense is that in developed human minds, the notion of meaning has expanded beyond what's immediately out there. We're constantly trying to figure out what caused something. That's true of all sorts of brain-based organisms, but perhaps the difference in humans is that if we can't see an obvious cause, we postulate. If you're lying in bed and hear a noise outside, you might imagine it's a burglar or perhaps Prince Charming. The point is, we form hypotheses and draw up scenarios for what that stimulus might mean.
I think this whole need to understand cause expanded early in humans -- we see it in cave paintings. If you are spending time with children, you see that they do this quite early: "What made me, Mommy and Daddy?" "What made Mommy and Daddy?"... That recursive kind of seeking causal explanations for things is part of us.
Manohla Dargis: 'Before Sunrise' sequel makes love worth believing in
"In a sense, "Before Sunset" is a movie about how we create selves just by talking. But it's also, as Jesse suggests at one point, about how we become prisoners of time."
Justice + Beauty = Sublime
Atlantic Unbound - 13 July 2004 -- The acclaimed poet Alice Fulton talks about Cascade Experiment, her new collection of poems, and why art must aim to be "fair" -- in both senses of the word [...]
Sarah Cohen: What exactly is a cascade experiment, and why did you choose the phrase as a title for the selected poems?
Alice Fulton: In science, a cascade experiment is a sort of domino effect, a trip wire, where one small catalyst causes an event and then that event causes the next event, and so forth. Each event changes the next one, so it becomes an avalanche of cause and effect. I called the book Cascade Experiment because when I looked back at the poems, I saw that I couldn't have predicted where each would lead, or the way one book would lead to another. "Cascade Experiment" was originally the title of one of the poems, now called "Shy One," and the idea comes up in another poem with the line "one touch and worlds take place." In general, I always look for a book title that I find inherently interesting. I like book titles that don't give everything away, that won't be understood down to the ground. That mystery is at the bottom of poetry: it's a recursive process that has no end.
Attention acts as visual glue
Speech is special
"Redundancy is implicitly built into language structure ...
Social habits which enable one to distribute the interpretative workload redundantly across many communication channels at once, and embed small completed chunks of sentences within other chunks, lessen the communicative demands placed on short-term memory and articulatory skill.
Conversations today are inevitably embedded in rather ritualized markers for greetings, turn-taking, demonstrating assent or dissent, and indicating objects. It seems reasonable that such language rituals would have been far more prominent during the early phases of brain-language co-evolution, not due to any greater innate predisposition but in response to intense social selection on communicative habits."
Terrence Deacon - The Symbolic Species: The co-evolution of language and the human brain (Penguin, 1998, pages 363-364)
Fractovia and Recursion (Self-engulfing: Recursive)
Rules about rules
"At Oxford in the 1970s, the experimental psychologist Jerome Bruner videotaped toddlers learning to speak. He was struck by the game-like quality of their verbal interactions with adults. The rules of language seemed to him like the rules of tennis or any other game. So far from relying on special brain wiring, language was, he thought, a by-product of flexible intelligence and especially of a general aptitude for making rules -- exhibited also in children at play.
'They very quickly get into the realm of pretend, where they're making real rules and real conventions about fictional things and fictional characters,' Bruner said in 1976. 'They soon make rules about rules themselves -- how to make rules -- which is after all what culture is about. How we do things with words, how we invent appropriate conventional behaviour. And isn't it clever of Nature to have arranged that play, like most other important things in life, doesn't work unless there's some fun to it?'
Those who favoured brains pre-adapted to language could of course assert that rules of games and make-believe are by-products of language skills, rather than the other way around. Chomsky's own view of language evolution was more open-minded than those wanting hard-wired grammar. He did not even insist on natural selection favouring language. It could be a by-product of big brains favoured by evolution for other reasons -- which need not be incompatible with Bruner's fun-loving brains.
In the decades that followed, there was no answer to the chicken-and-egg question of which came first, general cleverness or language. Chomskyan grammar itself evolved into an increasingly abstract system for judging sentences, with less and less connection with real life in a polyglot world. Without abandoning the quest for universal principles, some mainstream linguists therefore went back to Chomsky's starting point. They looked in detail at many real languages, searching for clues to mental and social mechanisms of grammar in the many differences between them, as well as the similarities stressed by Chomsky."
Nigel Calder - Magic Universe: The Oxford Guide to Modern Science
(Oxford University Press, 2003, pages 343-344)
Blog of Collective Intelligence: Emergent learning is figuring itself out
Jay Cross: "Emergent learning implies adaptation to the environment, timeliness, flexibility and space for co-creation. It is the future. We haven't figured it out yet. Or, from the perspective of complexity science, it hasn't figured itself out yet."
Out walking the dogma...
[...] Even if you can explain away the problems of mind-body dualism and the object/subject distinction, even if you can explain away the falsification paradigm and the "fact" that nothing scientific is ever proven to be true (we can only falsify or provide support), empiricism is still shackled with single variable, unidirectional causation. We do not have an adequate scientific model that allows for multiple and/or bidirectional/recursive causation that does not resort to statistics. And as soon as we resort to statistics, we lose the ability to describe with precision the behavior of specific individuals, since statistics is based on populations.
If you're still unconvinced of the fallibility of hard science, take a look at its bleeding edge -- Occam's Razor. If two theories explain a phenomenon equally well, then the simpler of the two is the true explanation.
Now, that's what I call the pinnacle of objectivity.
When it gets down to brass tacks, the most powerful ideas of the hard sciences are not clockwork, mechanistic descriptions of directly perceived empirical phenomena -- the most powerful ideas of science are metaphors.
So, I have no problem believing that sometimes a cigar is just a cigar.
Sometimes, that is.
Bob
Recursion [from encyclopedia.thefreedictionary.com]
Recursion is a way of specifying a process by means of itself. More precisely (and to dispel the appearance of circularity in the definition), "complicated" instances of the process are defined in terms of "simpler" instances, and the "simplest" instances are given explicitly.
Recursion in language
Mathematical linguist Noam Chomsky produced evidence that unlimited extension of a language such as English is possible only by the recursive device of embedding sentences in sentences.
[...] Niels K. Jerne, the 1984 Nobel Prize laureate in Physiology or Medicine, used Chomsky's transformational-generative grammar model to explain the human immune system, equating "components of a generative grammar ... with various features of protein structures." The title of Jerne's Stockholm Nobel lecture was The Generative Grammar of the Immune System.
Here is another, perhaps simpler way to understand recursive processes:
1. Are we done yet? If so, return the results. Without such a termination condition a recursion would go on forever.
2. If not, simplify the problem, solve the simpler problem(s), and assemble the results into a solution for the original problem. Then return that solution.
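The two steps above can be sketched in code. A minimal illustration in Python (the factorial function is my own choice of example, not from the encyclopedia entry):

```python
def factorial(n):
    # Step 1: are we done yet? The "simplest" instance is given explicitly;
    # without this termination condition the recursion would go on forever.
    if n <= 1:
        return 1
    # Step 2: simplify the problem (n - 1), solve that simpler instance,
    # and assemble the result into a solution for the original problem.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```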
A more humorous illustration goes: "In order to understand recursion, one must first understand recursion." Or perhaps more accurate is the following, due to Andrew Plotkin: "If you already know what recursion is, just remember the answer. Otherwise, find someone who is standing closer to Douglas Hofstadter than you are; then ask him or her what recursion is."
"Mind is a pattern perceived by a mind. This is perhaps circular, but it is neither vicious nor paradoxical." Douglas Hofstadter
Metamanda's Weblog: Prelude... Ant Fugue -- Douglas Hofstadter
"Fugues have that interesting property, that each of their voices is a piece of music in itself; and thus a fugue might be thought of as a collection of several distinct pieces of music, all based on one single theme, and all played simultaneously. And it is up to the listener ... to decide whether it should be perceived as a unit, or as a collection of independent parts, all of which harmonize."
Douglas Hofstadter - Prelude... Ant Fugue
Swarm-semiotics: The swarming body by Jesper Hoffmeyer
"[...] From nest building in termites to the dreams and fantasies which imprison human intelligence is a long jump, and I personally don't believe that intelligence can ever be modelled at all in a disembodied medium. It is tempting, nevertheless, to think of intelligence as a swarm-phenomenon, because this would bring us away from the ever returning homunculus problem: that there seems to be nobody - no homunculus - inside our brain who does the thinking, there just is no central processor to control the activities of the mind.
My point is that the swarm in which intelligence manifests itself is exactly that entity we call the body. Biologically speaking, the body can be understood as a swarm of cells and tissues which, unlike the swarms of bees or ants, stick relatively firmly together. However, the swarm of cells constituting a human body is a very different kind of swarm from that of the social insects. The body swarm is not built on ten thousand nearly identical units such as a bee society. Rather it should be seen as a swarm of swarms, i.e., a huge swarm of more or less overlapping swarms of very different kinds. And the minor swarms again are swarm-entities, so that we get a hierarchy of swarms. At all levels these swarms are engaged in distributed problem solving based on an infinitely complicated web of semetic interaction patterns which in the end can only be explained through reference to the actual history of the body system, evolution."
It's a jungle in there
"The mind [is] a network of distinct modules... The brain is much more like an ecosystem than a list of stable personality traits ..."
Steven Johnson - Mind Wide Open (Allen Lane, 2004, page 29)
The Baldwin Effect: A Bibliography
Look Who's Talking
"I remember a conversation with cultural anthropologist Edward T. Hall, who pointed out to me that the most significant, the most critical inventions of man were not those ever considered to be inventions, but those that appeared to be innate and natural."
John Brockman
Early Voices: The Leap to Language by Nicholas Wade
(The New York Times - 15 July 2003)
"[...] Language, as linguists see it, is more than input and output, the heard word and the spoken. It's not even dependent on speech, since its output can be entirely in gestures, as in American Sign Language. The essence of language is words and syntax, each generated by a combinatorial system in the brain.
If there were a single sound for each word, vocabulary would be limited to the number of sounds, probably fewer than 1,000, that could be distinguished from one another. But by generating combinations of arbitrary sound units, a copious number of distinguishable sounds becomes available. Even the average high school student has a vocabulary of 60,000 words.
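The arithmetic behind this point is easy to check. Assuming, purely for illustration, 40 distinguishable sound units and words of at most five units, the space of possible words dwarfs even a large vocabulary:

```python
# Back-of-the-envelope: with p distinguishable sound units and words of
# 1..k units, the number of possible distinct words is p + p^2 + ... + p^k.
# The figures p = 40 and k = 5 are assumptions for illustration only.
p, k = 40, 5
possible_words = sum(p ** n for n in range(1, k + 1))
print(possible_words)  # 105025640 -- far more than a 60,000-word vocabulary
```

With one sound per word the count would stay at p; combination is what makes the space "copious."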
The other combinatorial system is syntax, the hierarchical ordering of words in a sentence to govern their meaning.
Chimpanzees do not seem to possess either of these systems. They can learn a certain number of symbols, up to 400 or so, and will string them together, but rarely in a way that suggests any notion of syntax. This is not because of any poverty of thought. Their conceptual world seems to overlap to some extent with that of people: they can recognize other individuals in their community and keep track of who is dominant to whom. But they lack the system for encoding these thoughts in language.
[...]
Ending the Silence
Linguists Return to Ideas of Origins
[...] Having posited in the early 1970's that the ability to learn the rules of grammar is innate, a proposition fiercely contested by other linguists, Dr. Chomsky might be expected to have shown keen interest in how that innateness evolved. But he has said very little on the subject, a silence that others have interpreted as disdain.
As Dr. Jackendoff, the president of the Linguistic Society of America, writes: "Opponents of Universal Grammar argue that there couldn't be such a thing as Universal Grammar because there is no evolutionary route to arrive at it. Chomsky, in reply, has tended to deny the value of evolutionary argumentation."
But Dr. Chomsky has recently taken a keen interest in the work by Dr. Hauser and his colleague Dr. W. Tecumseh Fitch on communication in animals. [In 2002] the three wrote an article in Science putting forward a set of propositions about the way that language evolved. Based on experimental work by Dr. Hauser and Dr. Fitch, they argue that sound perception and production can be seen in other animals, though they may have been tweaked a little in hominids.
A central element in language is what linguists call recursion, the mind's ability to bud one phrase off another into the syntax of an elaborate sentence. Though recursion is not seen in animals, it could have developed, the authors say, from some other brain system, like the one animals use for navigation.
Constructing a sentence, and going from A to Z through a series of landmarks, could involve a similar series of neural computations. If by some mutation a spare navigation module developed in the brain, it would have been free to take on other functions, like the generation of syntax. "If that piece got integrated with the rest of the cognitive machinery, you are done, you get music, morality, language," Dr. Hauser said.
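Recursion in the sense used here — a phrase embedded inside a phrase of the same kind — can be shown in a few lines. The toy grammar below is invented purely for illustration:

```python
# A toy grammar in which a noun phrase may contain a relative clause
# that itself contains another noun phrase: NP -> "the rat" | "the cat that" NP "feared"
def noun_phrase(depth):
    """Build a noun phrase with `depth` levels of centre-embedding."""
    if depth == 0:
        return "the rat"
    return f"the cat that {noun_phrase(depth - 1)} feared"

print(noun_phrase(0))  # the rat
print(noun_phrase(2))  # the cat that the cat that the rat feared feared
```

One rule, applied inside its own output, buds phrase off phrase without limit — the property the authors single out as absent in other animals.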
The researchers contend that many components of the language faculty exist in other animals and evolved for other reasons, and that it was only in humans that they all were linked. This idea suggests that animals may have more to teach about language than many researchers believe, but it also sounds like a criticism of evolutionary psychologists like Dr. Pinker and Dr. Dunbar, who seek to explain language as a faculty forced into being by specifics of the human lifestyle.
Dr. Chomsky rejects the notion that he has discouraged study of the evolution of language, saying his views on the subject have been widely misinterpreted.
"I have never expressed the slightest objection to work on the evolution of language," he said in an e-mail message. He outlined his views briefly in lectures 25 years ago but left the subject hanging, he said, because not enough was understood. He still believes that it is easy to make up all sorts of situations to explain the evolution of language but hard to determine which ones, if any, make sense.
But because of the importance he attaches to the subject, he returned to it recently in the article with Dr. Hauser and Dr. Fitch. By combining work on speech perception and speech production with a study of the recursive procedure that links them, "the speculations can be turned into a substantive research program," Dr. Chomsky said.
Others see Dr. Chomsky's long silence on evolution as more consequential than he does. "The fact is that Chomsky has had, and continues to have, an outsize influence in linguistics," Dr. Pinker said in an e-mail message. Calling Dr. Chomsky both "undeniably, a brilliant thinker" and "a brilliant debating tactician, who can twist anything to his advantage," Dr. Pinker noted that Dr. Chomsky "has rabid devotees, who hang on his every footnote, and sworn enemies, who say black whenever he says white."
"That doesn't leave much space," Dr. Pinker went on, "for linguists who accept some of his ideas (language as a mental, combinatorial, complex, partly innate system) but not others, like his hostility to evolution or any other explanation of language in terms of its function."
Biologists and linguists have long inhabited different worlds, with linguists taking little interest in evolution, the guiding theory of all biology. But the faculty for language, along with the evidence of how it evolved, is written somewhere in the now decoded human genome, waiting for biologists and linguists to identify the genetic program that generates words and syntax.
Early hominid ears primed for speech: New Scientist - 22 June 2004
"Early humans evolved the anatomy needed to hear each other talk at least 350,000 years ago. This suggests rudimentary form[s] of speech developed early on in our evolution.
The conclusion comes from studies of fossilised skulls discovered in the mountains of Spain. A team of Spanish and US researchers used CT scans to measure the bones and spaces in the outer and middle ears of five specimens ..."
Stone Age Ear for Speech: Ancient finds sound off on roots of language: Science News Online - 26 June 2004
"Using digital enhancements of skull fragments from five prehistoric individuals dating to more than 350,000 years ago, anthropologists argue that these human ancestors probably had hearing similar to that of people today.
Since the ears of social mammals are typically designed to perceive sounds made by fellow species members, the humanlike hearing of these ancient folk probably was accompanied by speech, contend Ignacio Martínez of the University of Alcalá in Spain, and his colleagues ..."
Bruce Bower
Hearing babies babble with hands: BBC News - 14 July 2004
"[...] Most babies make a babbling 'ba, ba, ba' sound at around seven months.
Some scientists say this is merely a motor activity driven largely by the baby's emerging control over the movement of their mouth and jaw.
Others believe it is an attempt to mimic human speech and reflects the baby's innate sensitivity to the rhythm of language.
Dr [Laura-Ann] Petitto has argued that deaf babies who are exposed to sign language learn to babble using their hands in the same way that hearing babies learn to vocally babble with their mouths.
Her latest research ... shows hearing babies exposed to sign language also begin to babble with their hands. [...]
The findings would not be possible unless all babies were born with a sensitivity to specific rhythmic patterns at the heart of human language and the capacity to use them, she said."
New Scientist: Babies babble in sign language too - 15 July 2004
Alison Motluk: "Babies exposed to sign language babble with their hands, even if they are not deaf. The finding supports the idea that human infants have an innate sensitivity to the rhythm of language and engage it however they can, the researchers who made the discovery claim.
Everyone accepts that babies babble as a way to acquire language, but researchers are polarised about its role. One camp says that children learn to adjust the opening and closing of their mouths to make vowels and consonants by mimicking adults, but the sounds are initially without meaning.
The other side argues that babbling is more than just random noise-making. Much of it, they contend, consists of phonetic-syllabic units - the rudimentary forms of language.
Laura-Ann Petitto at Dartmouth College in Hanover, New Hampshire, a leader in this camp, has argued that deaf babies who are exposed to sign language learn to babble using their hands the way hearing babies do with their mouths. [...]
Pattern recognition
Sign-exposed babies produced two distinct types of rhythmic hand activity, a low-frequency type at 1 hertz and a high-frequency one at 2.5 hertz. The speech-exposed babies had only high-frequency moves. There was a "unique rhythmic signature of natural language" to the low-frequency movements. "What is really genetically passed on," Petitto says, "is a sensitivity to patterns."
But Peter MacNeilage, of the University of Texas at Austin, is not persuaded. "She makes a blanket statement that there is an exact correspondence between the structures of speech and sign," he says. "But there is no accepted evidence for this view at the level of phonological structure or in the form of a rhythm common to speech and sign."
Journal reference: Cognition (vol 93, p 43)
Kalevi Kull: A sign is not alive -- a text is [pdf]
(Sign Systems Studies 30.1, 2002)
"Since semiosis is not an action of just one sign, since semiosis involves always a multitude of signs, it is a textual process like translation is."
Notes on 'Aramis' by Bruno Latour
"In the translation model, there is no transportation without transformation."
Bruno Latour - Aramis or The Love of Technology, translated by Catherine Porter (Cambridge: Harvard UP 1996, page 119)
Cybernetic Explanation
"The idea that communication is the creation of redundancy or patterning can be applied to the simplest engineering examples. Let us consider an observer who is watching A send a message to B. The purpose of the transaction (from the point of view of A and B) is to create in B's message pad a sequence of letters identical with the sequence which formerly occurred in A's pad. But from the point of view of the observer this is the creation of redundancy. If he has seen what A had on his pad, he will not get any new information about the message itself from inspecting B's pad.
Evidently, the nature of "meaning," pattern, redundancy, information and the like, depends upon where we sit. In the usual engineers' discussion of a message sent from A to B, it is customary to omit the observer and to say that B received information from A which was measurable in terms of the number of letters transmitted, reduced by such redundancy in the text as might have permitted B to do some guessing. But in a wider universe, i.e., that defined by the point of view of the observer, this no longer appears as a "transmission" of information but rather as a spreading of redundancy. The activities of A and B have combined to make the universe of the observer more predictable, more ordered, and more redundant. We may say that the rules of the "game" played by A and B explain (as "restraints") what would otherwise be a puzzling and improbable coincidence in the observer's universe, namely the conformity between what is written on the two message pads.
To guess, in essence, is to face a cut or slash in the sequence of items and to predict across that slash what items might be on the other side."
Gregory Bateson: Steps to an Ecology of Mind (University of Chicago Press, 2000, pages 412-413)
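Bateson's "guessing across a slash" can be made concrete: an observer who has seen the patterned part of a sequence before the cut can predict what lies on the other side. A minimal sketch (the alternating sequence and the pair-statistics predictor are invented for illustration):

```python
from collections import Counter, defaultdict

def guess_across_cut(seq, cut):
    """Predict the symbol just after `cut` from the pair statistics
    of the part before it -- Bateson's 'guessing across a slash'."""
    history = seq[:cut]
    following = defaultdict(Counter)
    for a, b in zip(history, history[1:]):
        following[a][b] += 1          # count which symbol follows which
    last = history[-1]
    return following[last].most_common(1)[0][0]

patterned = "ABABABABABABABAB"
print(guess_across_cut(patterned, 8))  # 'A' -- the pattern fixes the next symbol
```

For a patterned sequence the prediction succeeds every time; for a random one it would fail half the time. The redundancy — what makes B's pad no news to an observer who has seen A's — is exactly what powers the guess.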
posted by Andrew 7/31/2004 08:08:00 PM
Saturday, July 17, 2004
Recursive, Wide, and Loopy
Puzzled monkeys reveal key language step by Gaia Vince
"The key cognitive step that allowed humans to become the only animals using language may have been identified, scientists say.
A new study on monkeys found that while they are able to understand basic rules about word patterns, they are not able to follow more complex rules that underpin the crucial next stage of language structure. For example, the monkeys could master simple word structures, analogous to realising that "the" and "a" are always followed by another word. But they were unable to grasp phrase patterns analogous to "if... then..." constructions. This grammatical step, upon which all human languages depend, may be "the critical bottleneck of cognition that we had to go through in order to develop and use language", says Harvard University's Marc Hauser, who carried out the study with fellow psychologist Tecumseh Fitch, at the University of St Andrews, Scotland. "Perhaps the constraint on the evolution of language was a rule problem," Hauser told New Scientist.
Fitch and Hauser carried out two aural tests on cotton-top tamarin monkeys in which sequences of one-syllable words were called out by human voices.
In the first test, random words were called out in a strictly alternating pattern of male followed by female voices. The monkeys responded to breaks in the male-female rule, by looking at the loudspeaker. This showed that they were able to recognise the simple rule.
In the next test, the grammatical rule dictated that the male voice could call out one, two or three words, as long as the female voice did the same. This type of slightly more complex pattern is called recursive, as it involves a rule within a rule.
This time, the monkeys were unable to recognise any breaks in the pattern. But twelve human volunteers given the same test had no such difficulty, although most were unable to explain what the rule actually was.
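In the published study the simple rule was a finite-state grammar of the form (AB)ⁿ — strict alternation, as in the first test — while the harder rule was AⁿBⁿ: a run of A-class calls followed by a run of B-class calls of the same length, which formally requires counting rather than a "rule within a rule" in any deeper sense. A sketch of the two checkers (token names are illustrative):

```python
def is_ab_n(tokens):
    """Finite-state rule (AB)^n: male (A) and female (B) voices strictly alternate."""
    return (len(tokens) > 0 and len(tokens) % 2 == 0
            and all(t == "AB"[i % 2] for i, t in enumerate(tokens)))

def is_a_n_b_n(tokens):
    """Phrase-structure rule A^n B^n: n male calls followed by n female calls."""
    n = len(tokens) // 2
    return (len(tokens) == 2 * n and n > 0
            and tokens[:n] == ["A"] * n and tokens[n:] == ["B"] * n)

print(is_ab_n(list("ABAB")), is_a_n_b_n(list("ABAB")))  # True False
print(is_ab_n(list("AABB")), is_a_n_b_n(list("AABB")))  # False True
```

The tamarins detected violations of the first rule but not the second; the human volunteers detected both, mostly without being able to state either rule.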
"Recursive ability is uniquely human and affects more than just our language, but most of our behaviour," says renowned primate language expert David Premack, who wrote an article accompanying the study published in Science. "For example, in a classroom we often see child A watch child B watch child C watch the teacher. But in chimps, we see chimp A watch its mother, chimp B watch its mother, chimp C watch its mother..."
Human flexibility
Premack argues that although recursive ability is not absolutely necessary for language -- non-recursive sentences are possible -- being unable to master recursion may have been a stumbling block that prevented monkeys from developing language.
"Monkeys are also not physically capable of speech, they are unable to properly copy actions and they cannot teach -- all of which are skills required for language," he told New Scientist.
Mastery of the underlying rule of recursion is the key to human flexibility, Premack believes, allowing humans to think in the abstract, use metaphors and comprehend concepts such as time. It probably arose as the brain evolved into a more complex organ, but is not located in a single brain region.
However, it is not known whether modern humans are born with the ability to recognise recursive language patterns. More research into recursive ability in humans and their close relatives chimpanzees needs to be carried out, Hauser says."
Journal reference: Science (vol 303, p 377)
Google Search: define:recursive
Self-Processing: 'It would from many a blunder free us'
"With video we can know the difference between how we intend to come across and how we actually do come across. What we put out, what is taken by the tape, is an imitation of our intended image; it is our monkey. A video system enables us to get the monkey off our backs, where we can't see him, out onto the tape, where we can see him. That is the precise way in which we've been making a monkey of ourselves. The monkey has been able to get away with his business because he operates on the other side of the inside/outside barrier. The moebius tape strip snips the barrier between inside/outside. It offers us one continuous (sur)face with nothing to hide. We have the option of taking in our monkey and teaching him our business or letting him go on with his.
Taking in your own outside with video means more than just tripping around the moebius strip in private. One can pass through the barrier of the skin, pass through the pseudo-self to explore the entirety of one's cybernet -- i.e., the nexus of informational processes one is a part of.
[...] In fact, we live in multiple loops. [...]
The cybernetic extension of ourselves possible with video-tape does not mean a reinforcement of the ordinarily understood "self." Total touch with one's cybernet precludes the capitalism of identity ...
Master Charge does not make you master of anything but involves you in an expensive economy of credit information processed by computer, your checking account, TV ads ... and busy telephones. The Master Charge card exploits the illusion of unilateral control over life the West has suffered with. "I am the Captain of my Soul; I am the Master of my Fate." We have yet to understand there is no master self. They are now putting photos on charge cards when they should be mapping the credit system the card involves you in. Video users are prone to the same illusion. It is easy to be zooming in on "self" to the exclusion of environmental or social systems.
Doing feedback for others, one comes to realize the necessity of taping and replaying context. I had the opportunity to do a kind of video meditation ..."
Paul Ryan - Cybernetics of the Sacred (Anchor Press/Doubleday, 1974, pages 30-31)
Michael Ventura - The World Is No Longer the World
"[...] In an interview with The New York Times, [Robert W. Taylor] was asked how the new broadband technologies would impact daily life. He answered: "You'll be able to wear -- an unobtrusive device that will record in full color and sound everything that you see or point your head at, or, depending on how many of them you have, everything that's around you. And share it. Every waking and sleeping moment in your life will be recorded. And you will be able to store and retrieve it and do what you will with it." (He added, almost as an afterthought, "there are obvious implications for privacy that will have to be worked through.") "How will that change the world?" the interviewer quickly asked. "I don't know," Taylor said, "but it will."
Every waking and sleeping moment -- but of course after only one day of life-recording, you would fall hopelessly behind your ability to manage and manipulate your material. There would not be enough waking moments on the following day for viewing the entire record of the first day -- though one could no doubt fast-forward. Still, much of the second day would be images of the person viewing the previous day's images -- mirrors facing mirrors. That's one possibility. Or, instead of letters, you could e-mail hours of your life as you'd lived it -- probably as you're living it! Or two people who'd spent the day (or night) together could run both records simultaneously on a split screen. Or maybe you'd set up a Web site on which, every day, you'd run your life of the previous day (or again, Web site it live!) -- instant replay might become a kind of learning aid to one's sense of identity, proof that you exist in a world that no longer entirely believes in any reality that can't be contained on a screen. Western civilization has already and thoroughly defined "life" as "self-consciousness," "I think therefore I am," but this technology would create an environment in which the recorded consciousness of a life would be defined strictly in terms of what could be seen and heard ..."
Interconnected: two things i've been ... (27 June 2004)
"Visual processing seems to me to have two main tasks. One is to assemble a world that is easily abstracted ... the other ... is to throw information away.
[...] We can learn lessons about how to throw away information. The two perceptual jobs are combined, of course.
[...] Intelligence is distributed over the environment because we throw information away. On the long scale (light from above), and the short scale (you know the time, but you haven't looked at your watch yet). Artificial objects, created interfaces that don't obey distance, or object-hood, or texture: they're either confusing, or, if used right, remarkably useful illusions (television)."
Matt Webb
BBC News: How memories build during sleep
"What to do with too much information is the great riddle of our time."
Theodore Zeldin - An Intimate History of Humanity (Minerva, 1995, page 18)
Brain Candy by Floyd Skloot
[...] Early in the book [An Alchemy of Mind], examining how the brain adapts as we learn new information, [Diane] Ackerman says, "We arrive in this world clothed in the loose fabric of a self, which then tailors itself to the world it finds." Later, talking about emotions, she says, "Our ideas may behave, but our emotions are still Pleistocene ..."
The Scotsman - Critique - Whole worlds in his hands
"We are a species of pattern finders. Evolution made us so."
Tom Adair
Thomas Dixon - From Passions to Emotions (Cambridge University Press, 2003)
"Today there is a thriving 'emotions industry' to which philosophers, psychologists and neuroscientists are contributing. Yet until two centuries ago 'the emotions' did not exist. In this path-breaking study Thomas Dixon shows how, during the nineteenth century, the emotions came into being as a distinct psychological category, replacing existing categories such as appetites, passions, sentiments and affections. By examining medieval and eighteenth-century theological psychologies and placing Charles Darwin and William James within a broader and more complex nineteenth-century setting, Thomas Dixon argues that this domination by one single descriptive category is not healthy. Overinclusivity of 'the emotions' hampers attempts to argue with any subtlety about the enormous range of mental states and stances of which humans are capable."
The Posthuman Touch: N. Katherine Hayles reviewed by Erik Davis
"[...] Though she recognizes the techno-transcendentalist nightmares tucked inside the computational universe ("a culture inhabited by posthumans who regard their bodies as fashion accessories"), Hayles is open to a future populated with increasingly brainy machines. Refreshingly, Hayles also suggests that the art of embodiment could be well served by some lessons of evolutionary psychology, which many pomo science types write off as an evil blasphemy. In a word, Hayles is willing to give up some of that much-vaunted human control. "The very illusion of control bespeaks a fundamental ignorance about the nature of the emergent process through which consciousness, the organism, and the environment are constituted." So how do we live with creative intelligence and awakened senses in a groundless world beyond our control?"
Interview with Russell Hoban [from Stride Magazine no. 26, 1986]
Rupert Loydell: You often use the theme of something living behind our eyes. Is that something you actually believe or just a literary device?
Russell Hoban: "No, I actually feel it. It is more than a matter of belief -- it just feels that way to me, as if we are inhabited by something that lives with us."
Rupert Loydell: Does that tie in with your interest in shamanism?
Russell Hoban: "I suppose so, in that shamanism is a mode of opening the self, opening the conscience, to forces that we can't ordinarily perceive in ordinary ways. In my writing I am always trying to be more open to things that don't come to our minds in ordinary states, and the way I do it is just by tuning in obsessively to ideas that come to me, by working late at night, by staying with things until I am very tired, and until this stiffness of the mind breaks down and loosens up, and thoughts come in that wouldn't come."
Running Backward Into The Future - Part One
"Marshall McLuhan made the point that the structure of any medium became the content of subsequent media. Thus the gift storytelling gave to writing was the content of the stories. The fact that the cadenced, nuanced, recursive and patterned structure of told stories was lost to the written text was only noticed much later.
Julian Jaynes made the same point with regard to the metaphors used by a culture to articulate the nature of human consciousness; the "landscape" of mind reflecting the topography and technology of each era."
Richard Shand
EDGE 3rd Culture: Marc Hauser: Animal Minds
"For the past few years I have been using the theoretical tools from evolutionary biology to ask questions about the design of animal minds. I'm particularly interested in taking the approach that some people have had within evolutionary psychology, and saying look, this whole notion of the environment for evolutionary adaptedness which people have pegged as being associated with the hunter-gatherer period in the Pleistocene, may be true for some aspects of the human mind, but are probably wrong as a date for many other aspects. If we think about how organisms navigate through space, recognize what is an object, enumerate objects in their environment -- those are aspects that are probably shared across a wide variety of animals. Rather than saying that the human mind evolved and was shaped during the Pleistocene, it's more appropriate to ask what things happened in the Pleistocene that would have created a particular signature to the human mind that doesn't exist in other animals."
Are we still evolving? by Gabrielle Walker [from Prospect Magazine July 2004]
[...] "What's special about human beings is that we learn from one another," says Svante Pääbo, a geneticist from the Max Planck Institute in Leipzig. "When we invent something new like cars, we don't wait for evolution to make us better drivers or learn how to cross the street. We have driving schools and we teach our children to look both ways before crossing."
Slashdot: That's Sir Tim to You
Free was key, says Lee (comment by GillBates0)
"In this day and age of superfluous patents [slashdot.org] and frivolous lawsuits [slashdot.org], Sir Tim Berners-Lee [w3.org] gently reminds us of the importance of free and selfless contribution [cnn.com] for the betterment of humanity. Speaking at the ceremony for winning the Millennium Technology Prize [technologyawards.org] (as reported earlier on Slashdot [slashdot.org]), he said that he would never have succeeded if he'd tried to charge money for his inventions. The prize committee agreed, citing the importance of Berners-Lee's decision never to commercialize or patent his contributions to the Internet technologies he had developed, and recognizing his revolutionary contribution to humanity's ability to communicate."
Interview with Tim Berners-Lee
Simon Winchester: It has been argued that the rapid development of search techniques on the internet has made conducting research, particularly for school pupils, too easy. Do you think that school children are, perhaps, becoming too reliant on computers?
Tim Berners-Lee: Research isn't about finding lots of information - it is about understanding it: finding the relationships between concepts. The fact that you can get information more rapidly may help with speed, but when your task is to arrange this information in such a way that it personally makes sense to you, then the computer is a very useful tool. I don't know how reliant on computers you all are at Emanuel. Do you all have laptops and wireless networking? I think we should all be careful to spend as much time using the other parts of your brain as you do using a computer. Take a break and do a little calligraphy to let your thoughts settle. Keep a musical instrument within reach of your computer at home. [...]
Simon Winchester: Do you see the internet and the book in competition with each other? Especially in regard to young people?
Tim Berners-Lee: I do see the internet and TV in competition with each other, and I hope the internet will become such a fun, creative thing that it takes time away from the numbing effect of television. I don't think it will replace the book. First of all, the novel is a fundamentally important genre of communication whether you read it on paper or screen. Secondly, it will be a long time before a computer peripheral can compete with the warm feel of a wedge of paper, which can be taken and read anywhere, and within which each word has a physical position.
The Walter J. Ong Project
"Ong's work is often presented alongside the postmodern and deconstruction theories of Claude Lévi-Strauss, Jacques Derrida, Michel Foucault, Hélène Cixous, and others. His own work in orality and literacy shows deconstruction to be unnecessary: if you consider language to be fundamentally spoken, as language originally is, it does not consist of signs, but of events. Sound, including the spoken word, is an event. It takes time. The concept of "sign," by contrast, derives primarily not from the world of events, but from the world of vision. A sign can be physically carried around, an event cannot: it simply happens. Words are events."
Sky News -- WWW.TIM.GETS.GONG.UK
"Born in East Sheen, south-west London, in 1955, [Sir Tim Berners-Lee] was the eldest child of two mathematicians renowned within the computer industry for their work on Britain's first commercial computer, the Ferranti Mark I.
He studied at the Emanuel School in Wandsworth and later read physics at the Queen's College, Oxford, where he was banned from using the university's computer after being caught hacking.
Sir Tim later built his own computer, using an old TV set, a Motorola microprocessor and a soldering iron."
XMLMania.com - Tim Berners-Lee, Inventor of the World Wide Web, Knighted by Her Majesty Queen Elizabeth II
"[...] While working in 1980 as a consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland, Sir Timothy wrote his own private program for storing information using the kind of random associations the brain makes. The "Enquire" program, which was never published, formed the conceptual basis for his future development of the Web.
Subsequently he proposed a global hypertext project at CERN in 1989, and by December 1990, the program "WorldWideWeb" became the first successful demonstration of Web clients and servers working over the Internet. All of his code was made available free on the Internet at large in the summer of 1991.
A London native, Sir Timothy graduated with a degree in physics from Queen's College at Oxford University, England in 1976. While there he built his first computer with a soldering iron, TTL gates, an M6800 processor and an old television."
Review: Mind Wide Open by Steven Johnson
"Neuroscientists now view the brain as an orchestra made up of "dozens of players". Forget the idea of the brain as a unitary supercomputer and think instead of an assemblage of different modules and chemicals (or "molecules of emotion"), each specialised for a different task. They are not always easy bedfellows."
P.D. Smith
Brain cells become more discriminating when they work together
"Team work is just as important in your brain as it is on the playing field: A new study published online on April 19 [2004] by the Proceedings of the National Academy of Sciences reports that groups of brain cells can substantially improve their ability to discriminate between different orientations of simple visual patterns by synchronizing their electrical activity.
The paper, "Cooperative synchronized assemblies enhance orientation discrimination," by Vanderbilt professor of biomedical engineering A. B. Bonds with graduate students Jason Samonds and Heather A. Brown and research associate John D. Allison provides some of the first solid evidence that the exact timing of the tiny electrical spikes produced by neurons plays an important role in brain functioning. Since the discovery of alpha waves in 1929, experts have known that neurons in different parts of the brain periodically coordinate their activity with their neighbors. Despite a variety of theories, however, scientists have not been able to determine whether this "neuronal synchrony" has a functional role or if it is just a by-product of the brain's electrical activity ..."
Evolution of the yeast protein interaction network
Hong Qin, Henry H. S. Lu, Wei B. Wu and Wen-Hsiung Li (Proceedings of the National Academy of Sciences USA, 28 October 2003; 100 (22): 12820–12824)
"[...] A key question in the evolution of biological complexity is, how have integrated biological systems evolved? Darwinists proposed natural selection as the driving force of evolution. However, the striking similarities between biological and nonbiological complexities have led to the argument that a set of universal (or ahistorical) rules account for the formation of all complexities. The yeast protein interaction network is an example of a complex biological system and contributes to the complexity at the cellular level. By analyzing the growth pattern and reconstructing the evolutionary path of the yeast protein interaction network, we can address whether or not network growth is contingent on evolutionary history, which is the key disagreement between the Darwinian view and the universality view.
[...] The key disagreement between the Darwinian view and the universality view on the evolution of biological complexity is the role of historical contingency. Undoubtedly, efforts to search for universal rules benefit our understanding on biological complexity. However, by using the yeast protein interaction network as an example, we observed a correlation between network evolution and the universal tree of life. This observation strongly argues that network evolution is not ahistorical, but is, in essence, a string of historical events."
Delays, connection topology, and synchronization of coupled chaotic maps
Fatihcan M. Atay, Jürgen Jost and Andreas Wende (Physical Review Letters 9 April 2004)
Abstract: "We consider networks of coupled maps where the connections between units involve time delays. We show that, similar to the undelayed case, the synchronization of the network depends on the connection topology, characterized by the spectrum of the graph Laplacian. Consequently, scale-free and random networks are capable of synchronizing despite the delayed flow of information, whereas regular networks with nearest-neighbor connections and their small-world variants generally exhibit poor synchronization. On the other hand, connection delays can actually be conducive to synchronization, so that it is possible for the delayed system to synchronize where the undelayed system does not. Furthermore, the delays determine the synchronized dynamics, leading to the emergence of a wide range of new collective behavior which the individual units are incapable of producing in isolation."
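The topology dependence the abstract describes can be sketched numerically, setting delays aside (the paper's actual focus) and using undelayed coupled logistic maps. In this toy setup — network size, coupling strength, and map are all illustrative choices, not the paper's parameters — globally coupled chaotic units synchronize while a nearest-neighbour ring does not:

```python
import random

def f(x):
    """Fully chaotic logistic map."""
    return 4 * x * (1 - x)

def simulate(neighbours, eps=0.8, steps=500, seed=1):
    """Iterate x_i <- (1-eps)*f(x_i) + eps * mean of f over i's neighbours;
    return the final spread max(x) - min(x) across units (0 = synchronized)."""
    rng = random.Random(seed)
    n = len(neighbours)
    x = [rng.random() for _ in range(n)]
    for _ in range(steps):
        fx = [f(v) for v in x]
        x = [(1 - eps) * fx[i]
             + eps * sum(fx[j] for j in neighbours[i]) / len(neighbours[i])
             for i in range(n)]
    return max(x) - min(x)

N = 20
ring = [[(i - 1) % N, (i + 1) % N] for i in range(N)]        # nearest neighbours
allto = [[j for j in range(N) if j != i] for i in range(N)]  # global coupling
print(simulate(allto), simulate(ring))  # tiny spread vs order-one spread
```

The mechanism matches the abstract's framing: whether perturbations transverse to the synchronized state decay is governed by the spectrum of the coupling graph, and the ring's slowly decaying modes let chaos win.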
Bruce Bower: Grannies give gift of longer lives
The Limits of Knowledge
"Laplace, Leibniz, Descartes, and Kant popularized the idea of the universe as a vast machine. The cosmos was likened to a watch, composed of many parts interacting in predictable ways. Implicit in this analogy was the possibility of predicting all phenomena. The perfect knowledge of Laplace's cosmic intelligence might be a fiction, but it could be approximated as closely as desired.
Today the most common reaction to Laplace's idea, among people who have some feel for modern physics, is to cite quantum uncertainty as its downfall. In the early twentieth century, physicists discovered that nature is nondeterministic at the subatomic scale. Chance enters into any description of quanta, and all attempts to exorcise it have failed. Since there is no way of predicting the behaviour of an electron with certainty, there is no way of predicting the fate of the universe.
The quantum objection is valid, but it is not the most fundamental one. In 1929, two years after Heisenberg formulated the principle of quantum uncertainty, Hungarian physicist Leo Szilard discovered a far more general limitation on empirical knowledge. Szilard's limitations would apply even in a world not subject to quantum uncertainty. In a sense, they go beyond physics and derive instead from the logical premise of observation. For twenty years, Szilard's work was ignored or misunderstood. Then in the late 1940s and early 1950s it became appreciated as a forerunner of the information theory devised by Claude Shannon, John von Neumann, and Warren Weaver.
Information theory shows that Laplace's perfect knowledge is a mirage that will forever recede into the distance. Science is empirical. It is based solely on observations. But observation is a two-edged sword. Information theory claims that every observation obscures at least as much information as it reveals. No observation makes an information profit. Therefore, no amount of observation will ever reveal everything -- or even take us closer to knowing everything.
Complementing this austere side of information theory is insight into how physical laws generate phenomena. Physicists have increasing reason to suppose that the phenomena of the world, from galaxies to human life, are dictated more by the fundamental laws of physics than by some special initial state of the world. In terms of Einstein's riddle, this suggests that God had little meaningful choice once the laws of physics were selected. Information theory is particularly useful in explaining the complexity of the world."
The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge by William Poundstone (Paperback edition: Oxford University Press, 1987, pages 21-22)
"Consider any day saved on which you have danced at least once." Friedrich Nietzsche
'Nature via Nurture' It's Genetic, Sometimes
Michael Ruse: I wonder if many today would really disagree with Ridley's basic claim in "Nature via Nurture" that essentially the nature-versus-nurture, biology-versus-culture, genes-versus-environment dichotomy has broken down and truly is less than useful to invoke. Organisms, and this applies especially to human organisms, are complex systems produced by genes, but very much molded by the experiences they encounter and situations to which they have to respond. "Genes are the mechanisms of experience," in the author's words.
Messages: How do animals communicate?
"[...] If you look at the history of work in animal communication, the greatest strides have been made in species that have specialized vocalizations, like bird song, where people have focused on the songs of a bird, even though birds have many other vocalizations. This is a specialized vocalization, it has dedicated neuro-circuitry, and it has a very specific function -- to attract mates and to guard territories. Because of that specialization, people have been able to quantify the mechanisms and the functions and make a lot of progress in understanding what the birds are talking about.
So the cotton-top tamarin in many ways is very bird-like. They are mainly monogamous. They pair bond for life. In part because of their social organization they have vocalization that's like bird song. There's a long call they give to maintain contact with other individuals, which is also used in inter-territorial interactions."
Marc Hauser
Essays on Recursion, Difference, Dialectics, Maps and Territories in Celebration of Gregory Bateson's centennial - SEED Editorial
"[...] Bateson stated in more than one text that his ideas were attuned to an epistemological monism, at first a notion of organicism, but after Bateson's embrace of cybernetics in the mid-1940s, an epistemology built around information and the fundamental ideas of cybernetics. These had circularity as their central concern, though as Bateson pointed out, circularity did not mean a precise circle in which events repeat themselves in the same circular path. All living forms reproduce and in doing so re-enter the domain of their forebears. These are recursive events, but in the case of species reproduction, they never step into the precise spot in the same stream twice-over. The arrow of time intervenes. A truly circular path would preclude emergence of new forms, and other forms of change or adaptation which are characteristic sequences of evolution. The passing of time always inflects recursion in human events, and human events must also, in the long run, share this formal characteristic of recursion in biological events. Bateson believed, contra Vico, there was no historical circularity which rolls human history along from barbarism to civilization and then returns human society to its starting point of barbarism once more. In between there is an enormous amount of information continually undergoing contextual change, but there are also fundamental premises, constraints to human understanding that endure, a structure of fundamental premises or 'verities' that give both form to, but permit freedom in, the variety of recursive events.
Thus Bateson indicated that a characteristic form or topology that captures the recursiveness of both biological form and human cultural and historical experience, is that of a spiral, a circularity that rotates in time. Nevertheless, the spiral was always a metaphor, rather than an operational framework in Bateson's epistemology. Instead, Bateson approached recursiveness in terms of the oscillations in heterarchical ordering. The idea of heterarchical (multi-level) order was first developed by the well known cyberneticist, Warren McCulloch. McCulloch suggested in his classic paper on the topology of human nervous nets (McCulloch, 1965: 40-45) that the topology of recursion in nervous nets not only differs between long-term memory and short-term memory but that within the overall recursiveness of neural nets are multiple hierarchies occurring among the synapses of the nervous system connected in recursive reverberation."
Gregory Bateson Centennial: Multiple Versions of the World
Saturday, November 20, 2004 (9:00 AM - 5:00 PM)
University of California at Berkeley, Lawrence Hall of Science
Confirmed Speakers include: Mary Catherine Bateson, Carol Wilder, Peter Harries-Jones, Terrence Deacon, Tyler Volk, Charles Hampden-Turner, Jesper Hoffmeyer & Jay Ogilvy
The Social Brain Conferences: Biology of conflicts and cooperation
Barcelona - July 17-20, 2004
News From Below: We've Been Expecting You
"Not often will we have such an opportunity as this one: to convene so many who've been inspired by Bateson and his illuminating insights into "the pattern which connects." Multiple Versions of the World promises to be a breakthrough conference." Jay Ogilvy
A little fruitful pandemonium (The Austin Chronicle 20 February 2004)
Michael Ventura: There's little grace or dignity in "Kid Auto Races in Venice," but it is six minutes and 10 seconds of weird prophetic poetry -- and mayhem. Later Chaplin had varying memories of how he created the Tramp's costume, trying on things at random in the costume room, but he concluded each version like this, as spoken to Chaplin biographer Robert Payne: "Even then I realized I would have to spend the rest of my life finding more about the creature. For me he was fixed, complete, the moment I looked in the mirror and saw him for the first time, yet even now I don't know all the things there are to be known about him." Chaplin always talked of the Tramp that way: from a distance and with respect. Even with awe, as when he told Payne: "There is death in him, and he is bringing life -- more life. That is his only excuse, his only purpose. That is why people recognized him everywhere. They wanted the ghosts to come and bring them life. It's very strange, isn't it? ... You see, the clown is so close to death that only a knife-edge separates him from it, and sometimes he goes over the border, but he always returns again. So in a way he is spirit -- not real. ... We know he cannot die, and that's the best thing about him. I created him, but I am not him, and yet sometimes our paths cross."
John Seely Brown interviewed by Seth Kahan (10 February 2003)
Seth: It sounds like each side is sifting through the other's "roots," exploring the periphery.
JSB: Yeah, it is an exploration -- each center is in the other's periphery, but it is also a clashing. I call it the creative collision of craft. That collision taking place in a fabric of trust can go -- as we were saying with storytelling -- huge distances.
Robert W. Taylor's 2004 Draper Prize Acceptance Remarks
"Once upon a time and for many centuries, beginning with the first computer, the abacus, the purpose of computers was to solve arithmetic problems. With electricity, they could solve them faster, and the advent of the integrated circuit made them even faster, more reliable, and a lot cheaper. But they were still only arithmetic engines. Then a remarkable transformation occurred.
Xerox opened its Palo Alto Research Center in 1970, and it grew over time to about 300 people. Today, PARC is known for its innovative computer science research of the 1970s, but computer science was only a small part of its research investment. Most of it went into physics, materials science, and optics. But a few dozen computerists at PARC redefined the nature and purpose of computing, and their research put PARC on the map. The 2004 Draper Prize honors this research.
The four individuals named in this year's prize formed the cadre of that extraordinary group, which today reads like a "Who's Who" in computer science. In the last half of the 1960s, they were graduate students at a handful of universities, where, with support from ARPA, they built the first experimental interactive computer systems. From these, they gained insights into interactive computing that were not available to others. In the 1970s, when they were recruited to PARC, they shared a dream -- that computer systems could be completely redesigned and that this redesign could enable personal interactions with text, pictures, and networking for millions of individuals. The dream promised to encourage creative potential and new forms of communication. The value of connecting people and their interests could dwarf the value of computing only for arithmetic."
A General Theory of Rubbish: Moore Enlightenment
posted by Andrew 7/17/2004 08:08:00 AM