Tuesday, December 22, 2009

"Avatar" - James Cameron Dreams of Electric Actors

This is the first in a series of short posts about James Cameron's new blockbuster Avatar.

Make no mistake about it, out of the gate Avatar is a landmark science fiction film.  People may argue whether it succeeds as a work of story-telling, but no one can deny that Cameron has orchestrated a combination of cutting-edge movie-making techniques to realize a vision of an alien world in a realistic and compelling way that will serve as the standard of comparison for science fiction films for years to come.

In this regard Avatar springs into the world full-blown, much as Stanley Kubrick's 2001: A Space Odyssey did in 1968 - so different from its predecessors that it startles us, and demands we revise our expectations for the medium.  Everything that came before is revealed to have been nothing more than cardboard spaceships sporting sputtering sparklers and bow-legged actors cavorting in foam-rubber monster costumes.

It's important to keep in mind that science fiction films, as much as they aspire to be movies about ideas, first and foremost strive to fabricate believable visions either of the future of our own planet or of the landscapes of others light-years away.  With Avatar Cameron has triumphed by creating a marvelous new world for both his characters and his audience to inhabit.

In addition Avatar will likely establish itself as a watershed in the relentless march of cinema from its reliance on flesh-and-blood actors to the routine use of what may come to be called "synthetic" players.  The melding of live action and CGI (computer generated imagery) through the use of motion-capture technologies will give way to the construction of genuinely autonomous virtual actors, informed, perhaps, by the smoky voice of Marlene Dietrich or the smoking curves of Marilyn Monroe.

So, with Avatar James Cameron has allowed us to glimpse another future world, one in which the film director has become a painter of characters, methodically composing actors from a palette of performers, some living, some dead, and some drawn solely from his or her imagination.

Creative Commons License
"Avatar" - James Cameron Dreams of Electric Actors by Marc Merlin is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Monday, November 30, 2009

The Movie "2012" - Who Lost Tibet?

The world as we know it pretty much comes to an end by the end of Roland Emmerich's new globe-busting blockbuster 2012.  My apologies if this comes as a spoiler, but, honestly, revealing that an Emmerich film finds a way to decimate the population of the planet - flora and fauna - is like revealing that a romantic comedy winds up with its initially mismatched, bickering couple entwined in each other's arms as the closing credits roll.  Not exactly a surprise.

Apparently, such is the power of cataclysmic crustal displacement that no tectonic plate will be left unturned, as an infelicitous planetary alignment and a stampede of mutant solar neutrinos conspire to wipe every nation, save one, from the face of the earth.  One nation is spared this particular fate, but not because it miraculously survives the natural disasters that Emmerich serves up, but because it has met an untimely editorial demise long before 2012 went into production.  That nation is the nation of Tibet.

To watch this film is to be taken by an odd sense of geographical "dislocation", in a very literal sense of that word.

Scenes of Tibet and its people appear early on.  Indeed, images of Tibet, including one of an iconic maroon-and-yellow-robed Tibetan Buddhist monk gazing meditatively as a tsunami sweeps across the mountains of the Tibetan Plateau and a Tibetan monastery is engulfed by waves in the distance, have been used to promote the movie.  It is a Tibetan family whose acts of compassion and heroism deliver the American Curtis family, the featured characters, to a safe haven in the final act of the film, which unfolds on the crucial high ground surrounding Mount Everest, a peak that itself lies near the border of present-day Tibet and within the land claimed by the ancestral Tibetan kingdom.

But the word "Tibet" is never uttered in the film.  It is even absent from the map that is used by the Curtis family to direct their flight across a crumbling continent and a tsunami-riddled ocean to safety.  Their destination is China, only China, a vast, monolithic China.  It is as though, for purposes of 2012, Tibet has become "the province that dare not speak its name".

Now, what may appear, superficially, to be an act of omission is anything but.  The elimination of references to Tibet in the film is the result of a high-level financial calculation, and it is also an illustration of what happens when the standing of a culture perched on the roof of the world runs afoul of the bottom-line of one of the most expensive movies ever made.

The fact of the matter is that the cost for making and marketing 2012 is estimated to exceed a quarter of a billion dollars.  There is no way on God's green earth - or on Roland Emmerich's lava-riven one - for the people and corporations who invested in that film to turn a profit without massive international ticket sales.  Critical participants in that prospective box-office are tens of millions of Chinese movie-goers.  And there is also no way, given the current political climate, that the People's Republic would tolerate the distribution of a mass-market film that placed Tibet or Tibetans anywhere near front and center.

Be that as it may, I must admit to being taken aback that Emmerich's kowtowing in response to actual or anticipated editorial demands by the Chinese authorities would result in the removal of every mention of Tibet from the 2012 screenplay.  Sadly, given the money at stake, some amount of obsequiousness on the part of the director could have been expected, but what Emmerich has done here by tossing Tibet under the bus - or the ark, as the case may be - approaches the Orwellian.

The phrase "memory hole" was devised by George Orwell in Nineteen Eight-four, his classic dystopian novel set in a near-future totalitarian state, as a nickname for the chutes into which potentially damaging or embarrassing political documents - even scraps of paper - were tossed to be incinerated and thus expunged from the historical record.

From all appearances, every allusion to Tibet was excised from the production documents of 2012 by its creators, and the resulting scraps of paper were collected and tossed down a Tibetan memory hole, one made to order for the film.

Admittedly, artists make compromises to realize their most cherished visions.  But what artistic vision was realized by Roland Emmerich in the making of 2012 that was so worthy it demanded he purge the words "Tibet" and "Tibetans" from his movie, inconvenient bits of truth jettisoned in the pursuit of globe-busting box-office receipts?  Unfortunately, the answer is, "none at all."

Creative Commons License
2012 - Who Lost Tibet? by Marc Merlin is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Saturday, November 21, 2009

District 9 - Yes, We Need Another Hero

I've been working on an essay about Neill Blomkamp's "District 9" since late summer and it has morphed into a dissertation.  What follows is a summary - my take on the movie as a first installment in a film saga that re-imagines the "hero's journey" in new and interesting ways.

Cutting-edge science fiction films are expected to be technically ambitious, to employ the latest in computer-generated imagery and bring to life alien creatures, other-worldly environments and marvels of engineering which previously existed only on the printed page or in a director's imagination.

Neill Blomkamp's District 9 succeeds in this regard. And it does so in a way that reminds us that special effects are most - shall we say - effective when, like accomplished supporting actors, having been introduced with modest fanfare, they retreat from the limelight and allow the lead performers to get on with the show, all the while laboring diligently, but unobtrusively, in service of the production.

What one doesn't expect is that a science fiction film introduce us to an intriguingly different kind of hero and launch him on a remarkably different kind of journey. District 9 does just this.

We are accustomed to our cinema heroes being promoted from the ranks of rogues. They may start off as hard-boiled detectives or egoistic renegades, for example, but they are soon revealed to be fundamentally decent, and even to possess the capacity for selfless courage. Such reluctant heroes are good - even noble - men, beaten down by life's tragedy or by lost love, who have cynically turned their backs on the world of good deeds.

This is not at all the case with Wikus Van De Merwe (Sharlto Copley), the more-than-reluctant hero of District 9. We first meet him in a nondescript contemporary office setting where he is an undistinguished middle-level bureaucrat, a cog in a paramilitary corporation that goes by the name Multi-National United (MNU).  (Think Blackwater Worldwide on steroids.)  MNU has been contracted to manage the lives of the million or more extraterrestrial castaways, referred to disparagingly as prawns, who have been marooned in Johannesburg, South Africa since 1989.

Twenty years on, the prawns live in a squalid squatter community, the District 9 of the movie's title, on the outskirts of the city. Disliked and unwelcome, they are about to be forcibly relocated to a new home miles away, which, it turns out, will be little more than an out-of-sight concentration camp. Possessing none of the ambition that it would take to advance on his own, Wikus has been designated by his scheming father-in-law, a high-ranking MNU executive, to lead the prawn eviction.

Although the epithet "little Eichmann" has been recklessly misapplied of late, it is hard to come up with a more fitting characterization for Wikus. Like Adolf Eichmann, the architect of Hitler's final solution, Wikus is not a bad man, at least according to any superficial reckoning. Friendly and self-effacing, he offers a ready, if somewhat nervous, bonhomie to co-workers and strangers alike. Hardly menacing, so desperate is he to avoid confrontation that he comes across as a dithering coward. And, to seal his "good man" bona fides, Wikus is a devoted husband to his beloved wife, Tania (Vanessa Haywood). So besotted is he with her, that the mere mention of her name sends him into a worshipful reverie.

But Wikus is also professionally engaged in a monstrously brutal enterprise, one which asks him to commit morally reprehensible acts as a matter of daily routine. For example, he does not hesitate to threaten the well-being of a prawn child in order to elicit compliance from its father with a relocation order. Similarly, for nominally "hygienic" purposes, Wikus orders the torching of a nursery of prawn larvae, explaining to the camera crew documenting the eviction that this approach to the problem is far more efficient than the cumbersome procedure of destroying the developing infants one-at-a-time.

What distinguishes Wikus, a veritable Eichmann manqué, from the maestro himself is that he is not the author of the oppression that he dispenses. Not much more than a rule-bound MNU functionary, he follows orders and does his job, all the while trying not to draw much attention to himself. In fact, Wikus is  thrown off balance by the promotion that his father-in-law foists upon him. No company man, he would prefer, given the option, to be at home with his adored Tania, basking in the glow of their eternal puppy love.

To put it bluntly, the problem with Wikus is not so much that he is evil, the problem is that he lacks a soul.

The challenge for Blomkamp in District 9 is, therefore, to stir Wikus from his ethical torpor, to awaken him to the reality of the suffering of others, to afford him the opportunity for redemption for his sins, and to launch him on the essential hero's quest: the quest for a deeper understanding of oneself and a more humane appreciation of a wider world.

In this way District 9 announces itself as the first installment of a film saga in which Wikus, an unlikely Ulysses, is forced kicking and screaming into a miserable exile - ironically in plain sight - whence he begins his own odyssey, a desperate struggle to find his way back home and back into the arms of his beloved Tania.

Blomkamp's re-imagining of the beginning of this epic story is both creative and exciting.  I look forward to seeing how Wikus's voyage continues.


Creative Commons License
District 9 - Yes, We Need Another Hero by Marc Merlin is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Sunday, October 25, 2009

Anti-Vaccination - A Real Crime with Bill Maher

You know you're in trouble when Bill Frist makes a fool out of you in an impromptu scientific debate on a nationally-televised talk show, especially if it is your own nationally-televised talk show.

Frist, if you will recall, is a heart surgeon and erstwhile Republican Senate majority leader who, in 2005, damaged his reputation as a doctor, if not a politician, by challenging the accepted diagnosis that Terri Schiavo was in a persistent vegetative state based only on viewing a videotape.  This "learned" opinion was offered in support of federal legislation hastily constructed to prevent the removal of a feeding tube that was keeping that brain-damaged woman alive.  His professional misbehavior in this case will serve for years as an object lesson in the improper application of medical authority.

That said, it appears Bill Frist knows his way around peer-reviewed medical journals and appreciates the significance of the results of well-run clinical trials.  Unfortunately, the same cannot be said of Bill Maher, his opponent in their argument over the safety and effectiveness of the H1N1 (swine) flu vaccine.  It would be an understatement to say that Frist emerged as the victor in this dust-up with Maher on Real Time with Bill Maher; to use the vernacular of YouTube where this clip of their debate has been posted, Maher was thoroughly pwned.  This blog in The New York Times reaches a similar conclusion.

The combination of intellectual dishonesty and scientific ignorance exhibited by Maher in this short exchange is especially disappointing to me, since, prior to my learning about his association with the anti-vaccination movement, I had held him in high regard, both as a comic talent and as a useful instigator of public discussion on controversial issues of the day.  But, by characterizing the government as being categorically untrustworthy and by asserting that vaccinations are intrinsically ineffective, Maher demonstrated a willingness to resort both to the kind of demagoguery popular with right-wing conspiracy theorists and to the kind of misunderstanding of the theory of evolution popular with know-nothing creationists.

So why single out Bill Maher for criticism? After all, there is no shortage of anti-vaccination alarmists, stirring unfounded fears about this important public health matter, although, admittedly, few with the kind of national audience that Maher commands.  What makes Maher a conspicuous target for me is not his opposition to respectable medical research, per se, but the fact that, as a very public atheist, he ordinarily champions the cause for the skeptical examination of the very kind of irrational claims that support his anti-vaccination position.

Maher's atheism has probably become best known as a result of his 2008 movie, Religulous, an entertaining and, at times, thought-provoking road trip through the world of mainstream and fringe religious belief.  The film consists of (mostly) friendly encounters between Maher and God-fearing folk, during which the usually iconoclastic Maher (mostly) sets aside his trademark mocking tone and, instead, engages his opponents with bemused curiosity and a modicum of respect.

I imagine that the relative popular success of Religulous was one reason why Maher was chosen by the Atheist Alliance International to receive the 2009 Richard Dawkins Award at their convention this month.  Yet, given Maher's views on vaccination, how can his selection for this honor by a group that consistently identifies itself with scientific rationalism be explained?

As far as I can tell, this misstep has something to do with a shift of the focus within the atheist community, where championing of the power of reason has been displaced, to some extent, by blanket opposition to religious belief.  The resulting difference of opinion has given rise to a tension among non-believers which was featured in a recent story on NPR's Morning Edition (A Bitter Rift Divides Atheists).  This dispute - not unlike the one that raged between the Mensheviks and the Bolsheviks in the years prior to the Russian Revolution of 1917 * - is primarily one concerning tactics with, on one extreme, the "live-and-let-live" atheists, endorsing an ecumenical form of constructive engagement with believers and, on the other, the "take-no-prisoners" atheists, advocating relentless confrontation brimming with contempt and ridicule.

What is often lost in this internecine squabble is that the fundamental intellectual program of atheism should be based not on opposition to religion, in and of itself, but on opposition to that kind of unreason upon which religion often relies, which can be at times a touchstone for harmless personal observance and at others, the cornerstone of despicable public policy.

Sometimes the purveyors of unreason emerge from within our own ranks.  Or, as Walt Kelly observed in his most memorable Pogo quotation, "we have met the enemy and he is us."

Which brings me back to Bill Maher.

While the "zero-tolerance" atheist commanders have been directing a frontal assault on religious belief in all its forms, an agent of corrosive unreason, namely Maher, not only has been operating openly within their home territory, but, indeed, has been receiving citations for his meritorious service to their cause.  Ironically, it would be hard to identify a religious leader in this country today who represents more of a concrete threat to the health and safety of his fellow Americans than Bill Maher.  Encouraging his viewers, specifically pregnant women, not to receive the H1N1 vaccine is so reckless that it borders on the criminal.

The potential danger of such misguided advice was illustrated in an article from last week's Science Times, Flu Story - A Pregnant Woman's Ordeal, which details the story of Aubrey Opdyke, who, as a result of contracting swine flu in late June, lost her baby, was hospitalized for four months, spent five weeks in a coma, suffered six collapsed lungs and a near-fatal seizure.

Although Maher would likely dismiss Aubrey's story as an ignorable anecdote, as he did Bill Frist's report of the death of an otherwise healthy man in his thirties from an H1N1 infection, the facts of the matter are that her personal tragedy is incontrovertibly linked to H1N1 and that the threat posed by the swine flu virus to pregnant women has been established in carefully scrutinized epidemiological studies.  In all likelihood, had an H1N1 vaccine been available before Aubrey was infected and had she been vaccinated with it, she and her baby would be living happy, healthy lives today.

If the insidious recommendations of Maher and other anti-vaccination crusaders are widely adopted, many more people will become infected with H1N1, some of them will experience a fate similar to Aubrey Opdyke's, and others a fate far worse.

So, now is the time for the atheist community to step up to the plate, thank Bill Maher for his service to the cause of reason in other regards, but remind him, in no uncertain terms, that the fight for rationality is not restricted to defeating dangerous religious beliefs, and that it must also confront so-called scientific claims not grounded in the results of systematic peer-reviewed research, especially claims that jeopardize public health.

* The Mensheviks were thoroughly pwned.

Creative Commons License
Anti-Vaccination - A Real Crime with Bill Maher by Marc Merlin is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Sunday, October 11, 2009

The Nobel Peace Oscar

There's no way I'm going to wade headlong into the contretemps over the awarding of the 2009 Nobel Peace Prize. It hardly makes sense to me to engage in a debate about whether Barack Obama does or doesn't deserve that honor when there appears to be no consistent basis for determining who the winner of the prize should be. Instead of arguing about the worthiness of this or that recipient we should be focusing our questions on what the Nobel Peace Prize is - or should be - about.

Take this quote from the Nobel committee chairman, Thorbjorn Jagland, for starters.
The question we have to ask is who has done the most in the previous year to enhance peace in the world.
Perhaps this is an accurate statement of some committee guideline or other - I don't know - but it strains credulity to suggest that the prize is awarded based primarily on events of the previous year, given that a cursory examination of the list of past peace prize recipients indicates everything to the contrary. Admittedly, the qualification for nomination for the prize is dictated by a submission deadline, and the prize itself is associated with the year of its award, but to confine the "eligibility" based on the calendar year is to make the Nobel Peace Prize resemble the Oscar competition. Certainly some advances in peace are of such moment that they demand almost immediate recognition, but seldom are the implications of diplomatic breakthroughs, for instance, fully realized in such a short period of time.

The comparison of the Nobel Peace Prize Committee and the Academy of Motion Picture Arts and Sciences is apt in other ways. Award of the Oscars is, nominally, based only on the talent and craft of the competitors. But, of course, the politics - of both the films and the actors under consideration - influence the process, and jockeying by contenders for last-minute year-end theatrical release - a brazen acknowledgement of the limited attention span of members of the Academy who vote for the awards - has become an accepted tactic. The peace prize selection seems sometimes to be subject to similar caprice, driven by perceived political opportunity and late-breaking news.

Now, of course, the work of the Nobel Peace Prize Committee is necessarily political, at times even pointedly so. The award of the prize to Burmese opposition leader Aung San Suu Kyi in 1991 was, in part, motivated by the immediate political interest of assuring her safety by drawing international attention to her struggle and the threat posed by the Myanmar State Law and Order Restoration Council (SLORC, a disaster of an acronym, if there ever was one). Using the prize for this kind of humanitarian intervention, though, stands in stark contrast to sending abstract messages of approval for the changing of administrations in the United States, for example.

The Oscars can be forgiven for stepping outside strict guidelines - to the extent they exist - in their selection process. The Academy is, after all, a large association of member artists, and the results of their vote are little more than a collective expression of personal opinions. The Nobel Peace Prize Committee is another kind of beast entirely. It is a small, deliberative body, and we have every right to expect that its choices be based on a well-considered - and clearly stated - philosophy.

To that end, I would recommend that the committee take a careful look at their history of "successful" awards, that is those that have stood the tests of time and repeated scrutiny.

One category, which appears early on in the history of laureates, includes ambassadors and political leaders who, through their bold action and diplomatic prowess, have worked to end ongoing armed conflicts. Anwar Sadat comes to mind in this regard. Then there are the institutional winners, such organizations as the International Committee of the Red Cross or Amnesty International or Doctors without Borders, who have created and sustained non-governmental programs that labor year in and year out, over periods of decades, in the furthering of human rights and human dignity.

But, lest the Nobel Committee lose sight of their mission and hopelessly dilute the "brand" of which they are, in some sense, only temporary custodians, they must remember to turn their spotlight routinely on heroic individuals - not government officials - in the struggle for peace and justice; Martin Luther King, Jr, Albert Schweitzer, Desmond Tutu, Rigoberta Menchu, Nelson Mandela are examples. They constitute the central pillar of the peace prize. Their work - in the face of persistent personal danger and in spite of repeated personal trials and disappointments - reminds us of the fact that the struggle for peace is essentially an individual struggle, one in which we can all aspire to participate.

So, first and foremost, the Nobel Committee should see announcement of the peace prize award as an opportunity to elicit from us, not hair-splitting debate, but admiration and hope. This is, ultimately, what the Nobel Peace Prize is about. Of course, there will always be controversial choices, but people will maintain confidence in the selection process as long as it is not considered arbitrary and, instead, is consistently grounded in recognizing the lesser-sung champions in the struggle for a better world.

Creative Commons License
The Nobel Peace Oscar by Marc Merlin is licensed under a Creative Commons Attribution-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Tuesday, October 6, 2009

Why We Fight - For Science

A couple of nights ago, on the edge of the meadow at Piedmont Park, over a convivial dinner that included an appropriate amount of beer and wine, the conversation turned to science - more precisely to how to promote interest in science to the public at large. What, in conventional circles, would have been an unusual dinner-time topic, was an unexceptional one with this group, since we were members of the Atlanta Science Tavern, and we are given to talk about science every chance we get.

Enamored with science but somewhat blinded by our adoration, we are sometimes puzzled that others don't share our enthusiasm for the object of our affection.  So, when we get together, we often wonder, "how can we encourage our friends to better support and appreciate science?"

A common answer to this question begins with a recitation of the connections between important developments in the history of science and the benefits that have accrued to modern society as a result: the double helix of DNA and cancer-fighting medical diagnosis; quantum physics and high-performance computer chips; genetic engineering and increases in agricultural productivity; Maxwell's theory of electromagnetic waves and near-instantaneous global communications; Newton's orbital mechanics and hurricane tracking. The list goes on and on. It is extraordinarily convincing.

[So as not to whitewash the matter, I readily acknowledge that science has been implicated in its share of failures and catastrophes. On balance, I believe that science comes out ahead in the cost-benefit tally, but some, notably Theodore Kaczynski - better known as the Unabomber - have constructed serious arguments to the contrary. Should, for example, the most dire predictions for global warming be borne out, Ted may well be proven right for his skepticism about technology being an unequivocal force for good, although he should never be excused for the psychotic tactics he used in trying to disrupt its advance.]

Although this kind of utilitarian argument for science is persuasive, I find it, in some respects, disingenuous and, in others, incomplete. It is less than forthright in that it fosters a misconception about why people undertake scientific careers. No doubt there are those who do so motivated primarily by their interest in benefiting mankind, but, in my experience, scientists are, more often than not, driven by an unabashedly self-centered desire to better understand the world in which they live. Public service, although a welcome side effect, is not preeminent among their personal goals. In addition, although arguing for science based on its practical applications may be the strongest hand we have to play in general, the fact of the matter is that many significant fields of scientific research have no chance of bearing technological fruit.

On the morning of the day of the informal Science Tavern dinner, the New York Times had published a front-page article announcing the successful reconstruction of a skeleton nicknamed Ardi, the fossil remains of a 4.4 million-year-old hominid and a member of a likely bipedal species which may turn out to be a direct ancestor of our own.  To say the least, it would be quite a stretch to come up with a justification for supporting such a masterwork in paleoanthropology based on its potential contribution to our practical technological progress.

I am no stranger, personally, to the rather quixotic pursuits that are part and parcel of basic research. As a graduate student in the late 1970s I worked with a group at the Fermi National Accelerator Laboratory (Fermilab) studying neutrino interactions. Neutrinos are subatomic particles notorious for having little or nothing to do with the real world. Cruising at near the speed of light, they would hardly notice planets placed in their path; the Greta Garbo of elementary particles, after all is said and done, they want to be alone. Consequently, they are seldom considered to be of much practical use, although a once-secret patent was issued for the far-fetched scheme of employing neutrinos to communicate with deep-ocean submarines. Why would anyone pay to study them?

Likewise, what is true about neutrino research in particular is true about the enterprise of elementary particle physics more generally; outside the realm of speculative science-fiction, it is hard to imagine how the knowledge revealed in the course of these investigations into the fundamental structure of matter could lead to anything of practical value. But the value, practical or not, of such basic research is a question we cannot avoid. CERN's Large Hadron Collider (LHC), a Europe-based successor to the accelerator at Fermilab, and the most ambitious instrument yet devised to advance our understanding of the submicroscopic workings of the universe, is scheduled to begin full-fledged operation within a year, at a cost of almost $6 billion. How do we begin to justify such an extravagant expense, given that there is no reasonable prospect of deriving practical benefits from the results of the experiments that will be performed there?

A very similar question had been posed to post-war American researchers, during an era when that country, which had placed a high-stakes wager on the success of the Manhattan Project and had won, was eager to fund the research efforts of the generation of scientists who had participated in the development of the atomic bomb. Robert R. Wilson was not only one of the best of that wartime cohort of physicists, he was also a sculptor, an architect, and the driving force behind the development and construction of Fermilab, as well as its director for a number of years, including the brief period that I worked there.

In 1969 Wilson was called before a joint congressional committee on atomic energy to give an accounting as to why the public should continue funding the building of his giant proton accelerator, which, when completed, would measure almost 4 miles in circumference and cost over $250 million, at a time, it should be recalled, when $250 million was a significant line-item in the federal budget.

It was the height of the cold war, and any relationship to military purposes could have been offered by Wilson as an explanation and would have been accepted on the spot. But Wilson, who had been deeply affected by the regret he felt for his work on the atomic bomb and had distanced himself from the defense establishment as a result, did not take this easy way out. Instead, declining to use "national security" as a justification, he said this of his Fermilab project:
It has only to do with the respect with which we regard one another, the dignity of men, our love of culture. It has to do with: Are we good painters, good sculptors, great poets? I mean all the things we really venerate in our country and are patriotic about. It has nothing to do directly with defending our country except to make it worth defending.
Promoting public appreciation of science in this way is much more challenging than appealing to concrete interests based on the expectations of advances, for example, in nutrition or healthcare or transportation or power production or consumer electronics or, in Wilson's case, national defense. But the fact of the matter is that it is the only honest way to argue for public support for many areas of basic research, and often more accurately reflects the motives of those engaged in scientific endeavors. In addition, it serves to reframe the debate about what genuinely constitutes the public interest and expands the conventional definition beyond concrete practical concerns. Ultimately the triumph of our civilization is not only the elevation of our comfort and our security, but also of our culture.

Friday, July 24, 2009

The Movie "Moon" and the Morality of the Scapegoat Bargain


[If] the hypothesis were offered us of a world in which [proposed] utopias should all be outdone, and millions kept permanently happy on the one simple condition that a certain lost soul on the far‑off edge of things should lead a life of lonely torture, what except a specifical and independent sort of emotion can it be which would make us immediately feel, even though an impulse arose within us to clutch at the happiness so offered, how hideous a thing would be its enjoyment when deliberately accepted as the fruit of such a bargain?
- William James, The Moral Philosopher and the Moral Life


In his 1891 address before the Yale Philosophical Club William James pondered the origins of our "moral perceptions" and concluded that some are likely "brain-born" and that their violation elicits an autonomic sense of revulsion in us. He uses as an illustration the scheme above in which one "certain lost soul" is forced to suffer so that, somehow, the rest of humanity might live lives of unimagined happiness.

Of course the notion of the scapegoat, traditionally one punished to purchase the redemption of a community of believers, was not new with James. The scapegoat ritual described in Leviticus has been our prototype for centuries, but the practice of animal and human sacrifice in any one of a number of ancient cultures could serve just as well as an example. What James has done in his taut rendition is to both universalize the scapegoat bargain and simultaneously magnify its moral dimension. The benefits are not restricted to the members of a particular religious group, but extend to all mankind. And the price paid by the victim is not a quick and certain death but a life of unending torture.

James was not the first modern writer to confront the essential moral questions raised by the scapegoat bargain. In his novel, The Brothers Karamazov, published 10 years before James's Yale address, Fyodor Dostoyevsky explores a similar arrangement in a passage concerning the parable of the Grand Inquisitor.
Imagine that you are creating a fabric of human destiny with the object of making men happy in the end, giving them peace and rest at last, but that it was essential and inevitable to torture to death only one tiny creature ... and to found that edifice on its unavenged tears, would you consent to be the architect on those conditions?
Almost one hundred years later, Ursula K. Le Guin, first crediting James and later acknowledging Dostoyevsky as a likely influence, published her short story, The Ones Who Walk Away from Omelas, in which she explores the same moral dilemma within the confines of a small universe of her own devising.*

Although these ancient and contemporary conceptions of the scapegoat bargain differ in many respects, they are similar in that they are all metaphysical in nature. In other words, there is no physical mechanism that connects the suffering of the selected victim with the benefits others derive. In the Old Testament the connection is presumed to originate as part of the covenant between God and his chosen people. In Karamazov it emerges, by fiat, from the very "fabric of human destiny".

Nonetheless, the question could be posed: Might it be possible to reformulate this metaphysical arrangement in plausibly realistic terms? With his science fiction film, Moon, writer-director Duncan Jones has done just this.

The utopia of the near-future world that Jones imagines in Moon is, unlike its 19th-century forerunners, not one predicated on the possibility of human moral perfectibility. Instead it is founded concretely upon the availability of an inexhaustible source of clean energy, an isotope of the element helium, He3, which is used to fuel thermonuclear reactors across the face of the globe. In Moon, limitless, inexpensive, carbon-free electrical power has, it appears, eliminated the contention for resources that has historically been the root of human conflict and, in turn, ushered in a golden age of plenty, for rich and poor nations alike.

Troubling moral complications arise in this brave new energy-rich world because, it turns out, precious He3 must be scraped from the surface of the far side of the Moon - James's "far-off edge of things" - by means of a vast mining operation which, in spite of advances in technology, is not entirely automated. This exquisitely engineered, thoroughly computerized He3 factory has one flaw, and that flaw is that it requires the services of a single human being to keep the extraction bulldozers running smoothly and thus ensure an uninterrupted flow of utopia-sustaining fuel to planet Earth, a quarter of a million miles away.

For Moon this lone individual - James's "certain lost soul" - is astronaut Sam Bell, who, when he is introduced to us, is desperate to conclude his 3-year tour of duty on this lonely lunar outpost.  At first glance Sam, dispirited and disheveled, strikes us as more like a beaten-down refugee than a right-stuff-bearing spaceman. Separation from his wife and young daughter, not to mention utter isolation from other members of his species, has taken an enormous emotional toll on Sam. We pause to wonder how any "modern" corporation could be so morally bankrupt as to contract for the kind of labor that would result, inevitably, in such severe psychological decline. Little do we know that the crimes inflicted by his employer on Sam - and, shall we say, others very much like him - are far worse than we even dare to imagine.

As the startling moral transgressions that underlie the scapegoat bargain in Moon are revealed, we come to appreciate how masterfully Jones and screenwriter Nathan Parker have taken the metaphysical problem outlined by James and Dostoyevsky and created a convincingly naturalistic realization. Not only does Moon succeed in its own right - as a character study and as a suspense-thriller - it also succeeds as an exemplary work of science fiction in that it grabs hold of a profound, but abstract, philosophical question and recasts it as a flesh-and-blood human tale, brought to life by plausible speculation that ventures just beyond the limits defined by our current scientific capabilities.

* See this blog post, The Scapegoat in Fyodor Dostoyevsky, Ursula K. Le Guin, and William James?, by Horace Jeffery Hodges for an insightful discussion of the treatments by these three authors.

Monday, June 1, 2009

Is "Flat" Science "Real" Science?

In his informative - and entertaining - talk at the May meeting of the Atlanta Science Tavern, entitled Artificial evolution: a guide for hobbyists, Ichiro Matsumura, Associate Professor of Biochemistry at Emory University School of Medicine, began by reviewing the historical patterns of general scientific progress and proceeded to focus on his research efforts which attempt to explain how complex biochemical pathways of cells originate and adapt. He concluded his presentation by discussing the emerging community of unorthodox "scientists", well outside the academic and corporate mainstream, who are pursuing experiments similar to his own, but in kitchens and basements, far removed from the luster - and expense - of state-of-the-art university laboratories.

So, in some ways Ichiro's talk was a presentation of his recent discoveries about how complex cellular processes evolve, but in other ways it was a call for science to return, at least in part, to its table-top roots. Apparently, these days an amateur with a few hundred dollars and kitchen counter space to spare can purchase the materials and equipment necessary and, in short order, alter the genetic makeup of commonly available bacteria. For Ichiro this represents a "flattening" of the scientific enterprise, a welcome alternative to the "hierarchical" restrictions of conventional science that require not only professional credentials, but large sums of money, often acquired only after running the exhausting grant application gauntlet of established funding agencies.

I share Ichiro's excitement for the opportunity that these low-cost-of-entry home laboratories have created for more people to become involved in science-oriented hobbies. Like him, I think that, within the constraints demanded by public safety, this kind of experimentation should be encouraged. Also, like Ichiro, I believe that the spread of these do-it-yourself labs is inevitable. With increasing economies of scale, the costs will only drop and, with Internet resources, the essential technical information will only become more available. The genie is out of the bottle, as they say.

Where I believe I disagree with Ichiro has to do with whether this new field of DIY genetic engineering, "flat" as it is, constitutes "real" science and whether the hierarchy problem of modern science is, in some fundamental way, avoidable.

Now, although I am not prepared here to define science in any comprehensive sense, I do think that a case can be made that hobbyism, for lack of a better word, is not science. Another way of stating my position is to say, "a laboratory does not a scientist make." By making this distinction I do not intend to demean hobbyists or, conversely, to put scientists on a pedestal, but to point out what I believe is a critical feature of the scientific enterprise, and that is the obligation to communicate the details and results of one's investigations so that they can be subjected to public scrutiny and, where appropriate, correction, and so that they may also serve as a basis for further investigation.

Consider, for historical comparison, the too-much-maligned alchemists of the Middle Ages. They were hobbyists extraordinaire and, in a very real sense, the proto-scientists who laid the groundwork for the science of experimental chemistry that was to follow. I doubt that they lacked the brains or the temperament to be real scientists. What I do think they lacked, in particular, were a reliable postal system and other ready means to publicize the results of their laboratory work.

The advent of the printing press and the establishment of a network of roads and public services that made possible the routine delivery of mail over long distances addressed these deficiencies, to some extent. But these innovations were not in themselves sufficient to transform hobbyism into the science that we know today. For this to happen, "natural philosophers" who were involved in the publication of books and the exchange of letters had also to form organizations to distribute and discuss the results of their scientific investigations. In this regard, one could argue that the founding of the Royal Society in 1660 marked the beginning of what we would call modern science. It also likely marked the beginning of the kind of hierarchy problem for science that Ichiro referred to in his talk (not to be confused with the hierarchy problem that besets physics today).

I imagine that in the 17th century the number of reports and opinions about scientific matters - even concerning a relatively specialized area of research - far exceeded what any individual could consistently review. In a world awash in scientific findings how does one begin to decide whom to trust without resorting to expert opinion? Although our democratic inclinations tend to imbue us with a reflexive disdain for "elites", we have no choice other than to rely on people whose experience and judgment are widely recognized. Once experts are designated, either, in the 17th century, as celebrated fellows of the Royal Society, or, today, as the anonymous peers who enact the review process characteristic of the contemporary funding and publication of science, a hierarchy is created.

Of course such stratification of scientific authority is not without its perils. It brings to mind the age-old conundrum, captured by the Roman poet Juvenal with the query, "who will guard the guards themselves?" A delicate - and unavoidable - balance must be maintained between what, on one extreme, would result in lifeless orthodoxy and, on the other, in intellectual chaos. This is, in some sense, the sociological challenge of modern science, to effectively filter the vast amount of new information that is generated while not censoring well-considered novel contributions that threaten the established order. It's a tough job, but someone's got to do it.

So, once again, let's hear it for those folks who are enthusiastically working away in their at-home laboratories modifying bacterial genes. Like the alchemists before them, they are involved in a personal process of discovering fascinating new things about the nature of the world. But, until their private investigations become public ones and they engage in the kind of dialog that leads to the dissemination and review of their discoveries, they will remain hobbyists, not scientists. And, like it or not, when they choose to cross the divide which demands that they publish their findings and subject them to the criticism of their peers, gatekeepers will, of necessity, arise to manage the otherwise overwhelming flow of information. The problem of hierarchy will be born anew. There's no way around it.

Monday, May 18, 2009

This I Don't Believe

Let's suppose you had the opportunity to interview a judge who had recently published an opinion on an important criminal case, one in which she had found, for purposes of concreteness, in favor of the defendant. Your questions turn to the matter of the judge's objectivity, and then it is revealed that the judge has had a long-standing prior relationship with the accused.

Pressing the issue you ask whether this relationship may have influenced her decision. "No, not at all," she responds. "On the contrary, my acquaintance with the defendant didn't skew my judgment, it helped to inform my decision."

At this point in the interview you may begin to doubt - not necessarily the judge's personal integrity, since she may, after all, have had no untoward interest with regard to the outcome of the case - but her judicial faculties. Does she understand the concept of objectivity well enough to realize that it requires that she distance herself from her prejudices and, most certainly, not rely on them?

Such a failure to appreciate the meaning of objectivity is illustrated in a recent interview with Barbara Bradley Hagerty, NPR reporter and author of the forthcoming book, Fingerprints of God, by Weekend Edition Sunday host Liane Hansen. By virtue of her authorship Hagerty has positioned herself as a judge of the question, "is spiritual experience real or a delusion?" Although Hansen broaches the issue that Hagerty's upbringing as a Christian Scientist might have affected her analysis, Hagerty proceeds to insist that, in fact, "Christian Science really helped me with my research."

Hagerty goes on to claim that "Christian Science was about 100 years ahead of its time," based on her dubious equation of Mary Baker Eddy's belief in prayer-healing with the emerging field of mind-body studies called psychoneuroimmunology. In this regard, she succeeds, somehow, in demeaning both religion and science. We would all agree that Christian Science is more than simply a theory about emotional health affecting physical well-being (that was hardly breaking news in the 19th century) and, likewise, we would agree that nowhere do contemporary scientific studies of the human brain presuppose supernatural influences on neurological function.

What is, perhaps, more troubling about the interview, having nothing to do with Hagerty's particular take on the religion-science debate, is that it calls into question whether NPR is adhering to its own professional standards. Specifically, the piece opens with the statement:
The golden rule of journalism decrees that reporters take nothing on faith, back up every story with hard evidence, and question everything. NPR's religion correspondent Barbara Bradley Hagerty kept that rule in mind when she decided to explore the science of spirituality.
This is hardly borne out by the exchange between Hansen and Hagerty that follows.

Is it appropriate for NPR to bestow the imprimatur of objectivity on Hagerty's tendentious opinions about religion and science without criticism? It would be one thing if she had simply endeavored to report on the contemporary scientific understanding of the origins of religious experience, but Hagerty goes much further. She concludes in the interview, explicitly, that belief in God is a rational choice. This is a profound, and profoundly contentious, question that should not be presented without challenge.

Indeed, the interview and the 5-part series that it previews, Is This Your Brain on God, could be confused with a promotional campaign for Hagerty's upcoming book. Here NPR's own standing as fair "judge" could be called into question. Is Hagerty's book being featured for its merits or is it, to some extent, receiving the spotlight based on its author's long relationship as a reporter for the news organization? To the extent that Hagerty takes a disputed position on a matter of public importance, isn't it incumbent on NPR to present alternative points-of-view? I'm not sure whether NPR, like the New York Times, has a public editor to consider such concerns, but it would seem that its own journalistic standards would demand such consideration.

Wednesday, April 8, 2009

Cautious Darwin, a Fossil Sneeze, and the Real Octomom

Charles Darwin was a cautious scientist. More than 20 years passed between the conclusion of his voyage on the Beagle, during which he made many of the observations that would impel the development of his theory of natural selection, and the publication of his world-transforming work, "The Origin of Species". Darwin understood the potential trouble posed by the iconoclastic things he had to say, both with respect to his personal life - his wife Emma was a devout Christian - and with respect to his standing in the scientific community.

Although we imagine now that Darwin had "nailed it" when he published "Origin" in 1859, there were significant elements of his nascent theory that were subject to reasonable challenge. For one thing, Darwin had no knowledge of the mechanisms of inheritance upon which his process of natural selection relied. It would be half a dozen years before Gregor Mendel's discoveries involving plant hybridization would be published, and these would go largely unnoticed for another three decades. How, indeed, were "favored traits", so central to Darwin's hypothesis, transmitted from generation to generation? Without anything like a theory of genetics, Darwin hadn't a clue.

To make matters worse, Darwin was painfully aware of the problems presented by what he called "the imperfection of the geological record", enough so that the topic merited a chapter of its own in his book. The absence of fossil evidence for transitional species - so-called missing links - was a vulnerability that Darwin addressed as best he could. Today we can make allowances for the fact that the collection and identification of paleontological specimens was then still in its infancy as a systematic enterprise - a defense that was hardly available to Darwin at the time. Much to the chagrin of creationists, who - contrary to the wealth of discoveries made in the last 150 years - insist that these deficits in the fossil record remain, few missing-links have gone missing. In fact, the contemporary fossil record is rich in finely graduated intermediate forms.

It does turn out that Darwin was wrong about one claim he made in "Origin" with regard to fossilization, one that underscores his cautious expectations. In enumerating the many limitations of geological evidence, he despaired that "no organism wholly soft can be preserved." Shells and bones are all that we could count on, according to Darwin, and even they "can disappear when left on the bottom of the sea".

Meet Keuppia levante, a 95-million-year-old octopus, whose fossilized remains were recently discovered by Dirk Fuchs and fellow researchers from the Freie Universität Berlin and announced in March of this year. Since octopuses consist pretty much of muscle and skin, what isn't eaten almost immediately by scavengers upon their demise decays rapidly into a blob of slime. The odds of finding a fossil octopus have been compared to those of finding a "fossil sneeze". Looking at this Cretaceous-era octopus, so exquisitely preserved in Lebanese limestone, it would appear it is about time - finally - for someone to offer an appropriate "fossil gesundheit".

What is, perhaps, more remarkable, according to Fuchs and his colleagues, is how closely this ancient specimen resembles its present-day descendants. Such invariance in form would have come as a surprise to Darwin, since his conception of evolution by natural selection imagined the accumulation of favored differences - gradually, but inexorably - over time. To him, the probability of a particular species remaining largely unmodified over the eons of geological time would have seemed astronomically small.

(As an aside I'll note that this "error" on Darwin's part serves to remind us that, although his were the first words in the theory of evolution, they are certainly not the last. Unfortunately, there is a tendency in some quarters to equate Darwin's hypotheses in "Origin" with our current understanding of the process of natural selection. Enough confusion exists that some suggest, "let's get rid of Darwinism," declaring to the world, in all deference to the master, that we have, in the last 150 years, moved significantly beyond many of his original proposals. I think they have a point.)

But, although Darwin could possibly be faulted for not anticipating that, by dint of the hard work of 21st century paleontologists such as Dirk Fuchs, fossil sneezes, such as Keuppia levante, would ultimately be unearthed, there is a method available today to evolutionary science to probe the biological past that he could never have anticipated in his wildest dreams - namely, the analysis of the DNA of living creatures.

With this in mind, allow me to introduce Megaleledone setebos or, as I call her, the real octomom.

Thirty million years ago octomom or, more accurately, members of a species quite similar to hers, roamed the waters surrounding Antarctica. Sea ice was, for the first time, massing on the surface, removing fresh water from the Southern Ocean and leaving below it a highly saline environment enriched in oxygen - a fitting octopus habitat. As the climate continued to cool and the Antarctic ice shelf grew, streams of these salty, oxygenated waters flowed northward along the ocean floor carrying with them octomom's sisters and cousins, who would, themselves, become the founding mothers of new, distinct deep-water octopus species in other parts of the globe.

How do we know the details of this octopus "out of Antarctica" saga? Since fossilized octopus remains are so exceedingly rare, scientists couldn't use them to reconstruct this prehistoric exodus, so, absent a fossil record, they turned to the genetic one - the record written in the DNA of the cells of octopuses living today.

Thanks to the work of the Census of Marine Life, a decade-long project begun in 2000 "to assess and explain the diversity, distribution, and abundance of marine life", specimens representing a number of the living species of deep-sea octopuses were collected and delivered to researchers at Queen's University Belfast. There, by analyzing extracted DNA, biologist Jan Strugnell was able to formulate a family tree for these creatures, one which not only demonstrated the extent to which they were related to one another, but also yielded a calculated genetic profile for their common ancestor, who, although 30 million years older, turns out to be a dead ringer, so to speak, for Megaleledone setebos, our very own octomom.
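
For the technically curious, here is a minimal sketch of the general idea behind that kind of analysis. It is not Strugnell's actual method, and the sequences and species names below are invented placeholders rather than real octopus DNA; real studies use far longer sequences, many genes, and more sophisticated statistics. The outline, though, is the same: line up comparable stretches of DNA from living species, measure how much each pair differs, and let those distances determine the branching of a tree whose internal nodes stand in for inferred common ancestors. The sketch assumes the Biopython library is installed.

# A toy distance-based family tree (sketch only; sequences are made up).
from Bio import Phylo
from Bio.Align import MultipleSeqAlignment
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor
from Bio.Seq import Seq
from Bio.SeqRecord import SeqRecord

# Pretend these are aligned stretches of the same gene from four
# present-day deep-sea octopus species.
alignment = MultipleSeqAlignment([
    SeqRecord(Seq("ACGTACGTACGTACGT"), id="species_A"),
    SeqRecord(Seq("ACGTACGAACGTACGT"), id="species_B"),
    SeqRecord(Seq("ACGAACGAACTTACGT"), id="species_C"),
    SeqRecord(Seq("ACGAACGAACTTACCT"), id="species_D"),
])

# Pairwise distance = fraction of aligned sites at which two sequences differ.
distances = DistanceCalculator("identity").get_distance(alignment)

# Neighbor-joining turns the distance matrix into a tree; its internal
# nodes play the role of hypothetical common ancestors.
tree = DistanceTreeConstructor().nj(distances)
Phylo.draw_ascii(tree)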

Darwin's misstep in doubting the possibility of the fossilization of organisms consisting almost entirely of soft tissue is quite understandable. How could he have anticipated the discovery of a fossil sneeze? His concerns about the deficiencies in the geological record have been - and continue to be - addressed and rectified by the diligent work of paleontologists, who, from all appearances, leave no stone unturned. To augment these traditional methods of evolutionary investigation, researchers now have the tools of molecular biology and genetic science at their disposal. These allow them to peer into the deep time of the history of life on earth using DNA from the cells of living organisms - a window into the past that Darwin could not have imagined.

All said, Darwin would, no doubt, be astounded by and proud of what his scientific heirs have made of the simple - and cautious - beginnings of his theory of evolution by natural selection.

Creative Commons License
Cautious Darwin, a Fossil Sneeze, and the Real Octomom by Marc Merlin is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Wednesday, March 18, 2009

An Ode to Darwin's Pigeons and to Wikimedia Commons

"... from so simple a beginning endless forms most beautiful and most wonderful have been, and are being evolved. " With these words Charles Darwin closes his own most beautiful and most wonderful book, "The Origin of Species". But Darwin's case for his theory of natural selection therein has its own simple beginning, and not one having to do with an ancient or exotic organism, but with the species Columba livia, the humble rock-pigeon, that unremarkable denizen of our urban landscape.

I came to appreciate the importance of Columba livia to Darwin's argument only recently, as a result of re-reading "Origin" at the suggestion of Josh Gough, the organizer of the Atlanta Science Tavern Meetup. Although I was eager to re-familiarize myself with the famous text, to be honest, I wasn't looking forward to slogging my way through chapter 1, "Variation under Domestication", again. My interest in biology has always been with its most abstract concerns, so taxonomy - the naming, description and classification of animals and plants - had never held much allure for me. But I had learned in the years since my first reading of "Origin" that Darwin's section on the selection of domestic species, specifically pigeons, was crucial to the presentation of his greater theory. I was determined to understand why this was so, and I wanted to write about it.

This is where Wikimedia Commons came to my rescue.

In their own words, "Wikimedia Commons is a media file repository making available public domain and freely-licensed educational media content (images, sound and video clips) to all." To my good fortune, this repository contained a photo or illustration that was representative of each of the 10 or so domesticated pigeon breeds that Darwin mentions in his book. Their ready availability made this essay possible and the images themselves allowed me to see for the first time why the variation exhibited by these pigeon breeds was so central to Darwin's argument.

Take a moment to look at the following slide show, created with photos from Wikimedia Commons, which I've annotated with descriptions from "Origin", and, unless you are already a student of pigeons, you will be struck, as I was, by the astonishing differences among these breeds. (If you have trouble with the slide show or want to view larger images, please follow this link.)




To understand why the variation resulting from the domestication of pigeons was essential to Darwin's presentation of his theory of natural selection, one must fully appreciate the difficulty that Darwin faced in persuading his readers of the possibility that any ancestral population could generate, over time, individuals quite unlike themselves. How could it be that organisms as different as crocodiles and crocuses are related by common descent? Without recourse to knowledge of the mechanisms responsible for the modification and transmission of biological traits - much less a theory of genetics - why should anyone believe that living things possess the kind of intrinsic malleability that such enormous variation in form would demand?

Cue the pigeons.

These breeds, so different from one another, provided Darwin with compelling evidence that the potential for evolutionary change was, in fact, present in a single species, Columba livia. The implication was that, if such variation could be achieved with pigeons by means of breeding choices over the course of human history, then, perhaps, it was not unreasonable to imagine that the enormous variety of the biological world could have been achieved by means of natural processes over the vast expanse of geological time.

For Darwin to make his case, though, it was not enough for him to demonstrate the striking variation in breeds of pigeon; he had to convince his readers that these breeds were, in fact, breeds of rock-pigeon, and not themselves each a descendant of a different "aboriginal" pigeon species. Darwin does this through several lines of argument. He speculates that it was improbable that "uncivilized man" had undertaken the domestication of so many different pigeon species; he notes the absence of existing wild populations of the hypothetical aboriginal types; he describes how crosses within each of the breeds exhibit markings of the proposed rock-pigeon ancestor; and, anticipating the modern definition of biospecies, he observes that the various pigeon breeds are capable of producing fertile "mongrel offspring". Finally Darwin concludes, "I can feel no doubt that all our domestic breeds have descended from Columba livia."

Using these pigeons, Darwin developed a plan in "Origin" to fend off the challenges faced by any scientific theory based, not upon reproducible experiment, but upon historical evidence. Variation under domestication was the closest thing Darwin had to a laboratory to "test" his ideas. He understood that this "experiment" with pigeons was the unassuming foothold he needed in his book before he could begin his ascent to his bold and comprehensive view of the origin of the range of living organisms. The highly-varied domestication of Columba livia was Darwin's simple beginning, and his theory of evolution by natural selection was its beautiful and wonderful descendant.

Creative Commons License
An Ode to Darwin's Pigeons and to Wikimedia Commons by Marc Merlin is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at thoughtsarise.blogspot.com.

Friday, March 6, 2009

The Physics of Silly Names

Physicists are a silly lot, especially when it comes to naming things. Given some of the humdingers that they have come up with, you might think that they had their very own Ministry of Silly Names (MoSN), something like the Ministry of Silly Walks featured in a 1970 sketch from the "Monty Python's Flying Circus" television show.

Take, for example, the names given to the various types of quarks, those reclusive, fractionally-charged, point-like particles that are the building blocks of protons and neutrons. First, it would have to be conceded that the name quark, itself, is pretty silly - not surprising, given that Murray Gell-Mann lifted it from a line of nonsense in James Joyce's Finnegans Wake ("Three quarks for Muster Mark!"). This silliness is only compounded when one notes that quarks don't come in types, they come in flavors - as though specifying a fundamental characteristic of some of the tiniest bits of matter were akin to ordering an ice cream cone at the neighborhood Baskin-Robbins.

A little dignity is restored by the fact that the two "original" quark flavors have the unremarkable names up and down. To a good approximation, protons are constructed from 2 up quarks, designated, conventionally, by the letter 'u', and one down quark, designated by 'd'; neutrons, likewise, from 2 down quarks and one up quark.
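
A quick bit of bookkeeping shows how that arithmetic works out. The fractional electric charges involved - +2/3 for the up quark and -1/3 for the down quark, in units of the proton charge - are the standard textbook values, not something spelled out above:

\[
Q_{\text{proton}}\,(uud) = \tfrac{2}{3} + \tfrac{2}{3} - \tfrac{1}{3} = +1,
\qquad
Q_{\text{neutron}}\,(udd) = \tfrac{2}{3} - \tfrac{1}{3} - \tfrac{1}{3} = 0.
\]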

The second generation of quark flavors - more massive than up and down, so much so that it took collisions produced by high-energy particle accelerators to create them in abundance - were given the apparently silly names strange ('s') and charm ('c').

Strange, it turned out, was not all that silly a choice. When particles called K-mesons, or kaons, were first created, they were observed to decay into a triplet of garden-variety pions, but they took their own good time doing so. Something "strange" was keeping them from decaying as quickly as had been expected. A special property, strangeness, was proposed to account for the kaon's longevity. Such a property, called a conserved quantity, is difficult, if not impossible, to shed - sort of like a bad cosmological penny. Thus, the kaon is stymied in its routine attempts to self-destruct, and must resort to slow-as-molasses assistance from the feeble weak interaction to get the job done. You see, the weak interaction doesn't hold strangeness in any special regard and would just as soon eradicate it as keep it around - taking, of course, its own sweet time.
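
To make the bookkeeping concrete - the quark assignments here are the standard ones, not anything stated above - a neutral kaon is a bound state of a down quark and a strange antiquark, carrying one unit of strangeness, while the pions it decays into carry none:

\[
K^{0} = (d\bar{s}):\ S = +1, \qquad
\pi^{+}\pi^{-}\pi^{0}:\ S = 0, \qquad
K^{0} \to \pi^{+}\pi^{-}\pi^{0}:\ \Delta S = -1.
\]

The strong and electromagnetic interactions conserve strangeness and so cannot mediate this decay at all; the weak interaction can change it by one unit, which is why the kaon does eventually fall apart - but only at the weak interaction's leisurely pace.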

I wish I could say that there was a similarly dignified story to account for the origin of the designation charm as a quark flavor, but, to be frank, once strange found its way into the particle physics lexicon, a kind of silliness mania took hold. Indeed, the flavors of the third generation of quarks, then yet to be discovered, were christened beauty and truth by some silly researchers. There were fears - smirking hopes in some quarters, actually - that it would just be a matter of time before newspaper headlines would appear proclaiming such things as: "Beauty Revealed by Fermilab Scientists" or "Particle Physicists Seek Truth with New Accelerator". Something had to be done.

As a result, the Ministry of Silly Names put its foot down and a silly-name reformation was launched. When the quark dust settled, charm, old enough to sound quaint, was graciously grandfathered in, but beauty and truth were sent packing, replaced with the names bottom ('b') and top ('t'). How the silly have fallen!

Not going down without a fight, the forces of silliness mounted a rear-guard action, so to speak. When the b quark was first created in particle collisions, it was always produced in conjunction with its antimatter counterpart, the b antiquark. So, given the way matter and antimatter cancel each other out, the particle they formed possessed no net "bottomness". This was just the kind of opening that the silliness resistance needed. The search was on for a so-called B meson, a particle containing a single, unbalanced b antiquark - one that would brazenly show its "bare bottom". It was a last hurrah for quark-name silliness.
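
Again using the standard assignments (not spelled out above), a b quark carries bottomness -1 and a b antiquark +1. The bound pair in which the b quark first showed itself - the Upsilon - therefore has no net bottomness at all, whereas a B meson, with its lone b antiquark, does:

\[
\Upsilon = (b\bar{b}):\ B' = -1 + 1 = 0, \qquad
B^{+} = (u\bar{b}):\ B' = +1.
\]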

Well, silly or not, the sextet of quarks - up, down, strange, charm, bottom, and top - have now all been detected, advancing the cause of the Standard Model of elementary particles, if not the cause of respectable physics names, considerably. This is as close as physicists have come so far to a long-sought theory of everything (ToE), and by that I mean a comprehensive theory of matter and energy and (most of) the forces of nature that has undergone rigorous experimental tests.

There is, though, one important piece still missing from the Standard Model puzzle, and it is called the Higgs particle - an eponym, named for the theoretician Peter Higgs, so hardly a silly name. Unfortunately, because of the pivotal role the Higgs plays in bestowing mass on other particles in the Standard Model, Leon Lederman, who, it turns out, was a co-discoverer of the b quark, nicknamed it the God Particle. No doubt this nickname will be the inspiration for silly headlines when the Higgs is detected, as is likely to be the case in the next couple of years. There's not much that can be done to avoid this embarrassment. Particle nicknames - much like Bush-era financial markets - are pretty much unregulated.

To put things in perspective, and to give the silly physicists their due, let's turn to the book of Genesis for some guidance about the challenge posed by naming things.

In the creation myth of the first chapter, after creating man in His image, God grants him dominion over every living thing. And in the creation tale of chapter 2, immediately after fashioning Adam out of dust,
the Lord God formed every beast of the field, and every fowl of the air; and brought them unto Adam to see what he would call them: and whatsoever Adam called every living creature, that was the name thereof.
How formidable a task for Adam - himself just now created - to be called upon to name things, things not only unfamiliar to him, but things entirely new to the world!

I imagine that Adam struggled and stumbled, yet persevered, and came up with names for the animals, as commanded. No doubt some of these names were fitting and clever, while others were out-and-out silly. But what choice did Adam have? To have dominion over a thing means you have to call it by name.

To the list of attributes that have been used to characterize our species - "thinker", "tool maker", "culture bearer" - perhaps "namer" should be added. Physicists, as scientists, have taken on the task of discovering things entirely new to the world and, with that assignment, they have taken on the responsibility of giving names to the things they discover. We can hardly fault them for struggling and stumbling. We can hardly fault them, now and then, for coming up with silly names.