Monday, March 31, 2008

Go West, Young Man!



Moses and his weary Chosen tromping through the desert toward Canaan, Puritans in moldy ships sailing to the New American Israel, Conestoga-wagoning homesteaders heading to Oklahoma, Dust Bowl "Okies" headed further to California--now supplanted alongside the Napa vines by Mexicans and Ecuadorians. To pack up your life onto the back of your donkey or Datsun in search of greener pastures is a story as old as humanity itself. It is how restless and hungry little Homo sapiens wandered out of Kenya and Ethiopia to colonize every corner of the earth in the first place.


"Darling, the banana harvest has not been kind this year. Time to pack up and head north..."

And yet, economists seem to ignore the phenomenon of migration when computing Gross Domestic Product. Yes, unconstrained labor, like capital, will tend to flow across national boundaries in Ricardian economics. What wages do return to the motherland are counted in the macroeconomic financial account as transfers. But why does the Cuban cease to be a Cuban once he leaves Cuba--when a large portion of his wages returns to Cuba in the form of remittances to family members, when he retains a dominant Cuban identity and culture in exile, and when he is connected to home via overlapping layers of telecommunication?

Under the conventional macroeconomic model, a country becomes less developed when its citizens find a better life elsewhere. Why not instead think of migration as development? Why not measure the Gross Domestic Product per natural? Count the children of a nation no matter where they tread. The Internal Revenue Service of the United States does so implicitly, at least, by charging expatriate American citizens income tax.

Measuring GDP and GNI per natural instead of per resident is the idea proposed by Michael Clemens and Lant Pritchett at the Center for Global Development in their recently published paper Income Per Natural: Measuring Development as if People Mattered More Than Places. According to the two economists, "almost 43 million people live in a group of countries whose income per natural collectively is 50 percent higher than GDP per resident."
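
For the arithmetically inclined, here is a toy sketch of the accounting shift Clemens and Pritchett propose. All the population and income figures below are hypothetical, chosen only to make the mechanics of the measure visible--they are not from the paper.

```python
# Income per natural vs. GDP per resident: a toy illustration.
# All numbers are hypothetical, not taken from the paper.

# Suppose a country has 8 million residents earning an average of
# $2,000/year at home, and 2 million emigrants ("naturals" abroad)
# earning an average of $12,000/year in their host countries.
residents = 8_000_000
resident_income = 2_000          # average annual income at home, USD

emigrants = 2_000_000
emigrant_income = 12_000         # average annual income abroad, USD

# Conventional measure: only people inside the borders count.
gdp_per_resident = resident_income  # = $2,000

# Residence-neutral measure: count every natural, wherever they live.
naturals = residents + emigrants
income_per_natural = (residents * resident_income
                      + emigrants * emigrant_income) / naturals

print(f"GDP per resident:   ${gdp_per_resident:,.0f}")
print(f"Income per natural: ${income_per_natural:,.0f}")
# In this toy case income per natural is double GDP per resident:
# migration shows up as development instead of vanishing from the ledger.
```

With these made-up figures, the same country looks twice as well-off once its emigrants are counted--which is precisely the point of a residence-neutral measure.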

Over a decade between the mid-1980s and mid-1990s, China pulled 130 million people out of poverty, the largest single leap in human welfare in history. This was achieved almost entirely by allowing migration from the impoverished western interior to the bustling coastline cities. The Chinese diaspora continues to form the backbone of the South East Asian economies (and has for centuries), inextricably linked in a tight, informal network with the Middle Kingdom. Singapore, for example, is majority Chinese, with a Sino-Singaporean population of 3.5 million.



There are currently four times as many Lebanese people outside Lebanon as inside (there are two-and-a-half times as many Lebanese in Brazil, in fact, as in Beirut). Young Arab men from all over the Middle East flock to Saudi Arabia to earn the nest egg required for an apartment, satellite television and marriage. Only 20 percent of the population of the "United Arab Emirate" of Dubai, meanwhile, is Arab (the most common colloquial languages are English, Hindi and Urdu).

It goes without saying that migrant labor in the United States is an essential source of welfare for Hispanics across the Americas. In 2006, Latin America hosted a flow of remittances totaling $63 billion (exceeding the combined total of all Foreign Direct Investment and Overseas Development Assistance to the region). For the communities who depend upon these wire transfers, the industry of their countrymen is certainly a palpable factor in their gross national welfare.

Clemens and Pritchett conclude their paper with an intriguing meditation:

The bottom line: migration is one of the most important sources of poverty reduction for a large portion of the developing world. If economic development is defined as rising human well being, then a residence-neutral measure of well-being emphasizes that crossing international borders is not an alternative to economic development, it is economic development.

Wednesday, March 26, 2008

The Art of the Future Uses Lead-Based Paint



"Mao," Andy Warhol, 1972 (the same year as Nixon's famed Beijing visit)

An artist is somebody who produces things that people don't need to have. - Andy Warhol

At the apex of his career, Andy Warhol, that most famous of pop artists, was enlisting the help of dozens of assistants at his studio ("The Factory") to crank out copy after copy of his iconic silk-screen prints. It was the very reproducibility of his work which formed the statement of his oeuvre. He would furiously roll out varied prints--sometimes hundreds--in every different shade of vivid Technicolor, of Hollywood dons and divas, of shrill crime stories and gory tragedies, of the wide-eyed imagery of the American imagination. He would feed the vast maw of celebrity and sensationalism with every imaginable perspective of every glowing little interest to fall before the eyes of a rabid and adoring public.

In fact, it was the interchangeability of the artist himself which Warhol implied with his famous quote: "In the future, everyone will be world-famous for 15 minutes." He had shown the world that even an "ugly Polish queer from the Midwest" (of negligible formal talent) could ensconce himself within the iridescence of New York City and its most beautiful people. Today, the image of Warhol, like his multicolored prints of Marilyn Monroe, has outshone and outlasted its subject.

What do we then make of Shenzhen, in southern China’s Guangdong province? A former fishing village, Shenzhen was chosen as the first of the country’s Special Economic Zones by Deng Xiaoping in 1979. Today, this gushing city of eight million is the fountainhead of the most reproduced art in the world. For less than $100, the global art consumer can commission a Rembrandt, Monet or Jasper Johns--or just an oil-painted copy of their graduation photo. It will be available at the doorstep of their similarly-reproduced colonial, Cape Cod or Tudor-style 4 bedroom, 2.5 bath. A copy of a copy....


Photo Credit: Spiegel Online

According to the UN, bulk US imports of Chinese oil paintings totaled over $64 million in 2006, more than double the figure recorded two years earlier.

Are these millions of stretched canvases the ultimate realization of Warhol's postmodern dream, or are they a travesty of the artistic process? Wretched bastard children of automatons laboring away for 12-hour days in sweatshop "art factories," as if they were sewing Nikes or lead-painted toys?


Photo Credit: Spiegel Online

And what do we make of the buyers of such paintings? Do they consider these works of human hands "art?" Though the aforementioned $64 million worth of reproduced paintings shipped to American living rooms and hotel lobbies seems large, it's dwarfed by the price of a single painting: Jasper Johns's 1959 False Start, which sold for a record $80 million that same year.

Would anyone be willing to spend $80 million on a modern, original Chinese work (i.e. not a copy of Flemish realism or a porcelain vase dating from the Ming Dynasty)?


"What crap, this will never sell, Kim! The client wanted a Mona Lisa with her face on it...and for God's sake, make her look thin, man!"

I have an inkling that the answer will have much to do with the relative socioeconomic power and international standing to which China can lay claim at the point of sale. Perhaps the delicate brushstrokes and graceful minimalism of the Middle Kingdom's indigenous style will command their own dominant cachet in years hence. Perhaps Chinese art will go the way of Japanese auto manufacturing: from cheap, inferior knock-offs of Western designs, onward to internationally acknowledged standards of excellence.

Since fashion is art now, and Chinese is in fashion, I could make a lot of money. -Andy Warhol, 1971 (one year before his "Mao" print series)

Sunday, March 23, 2008

Hee-Haw! Wi-Fi Headin' Out to the Country



Intel has developed a long-distance Wi-Fi platform, Intel® Rural Connectivity, allowing a wireless internet signal of roughly 10 megabits-per-second to hopscotch its way between nodes 60 miles apart. From a city edge, internet connectivity can be projected far out into the countryside cheaply, and with minimal fuss. Each of the transmission towers can run on a mere six watts of electricity, allowing them to be independently solar powered.
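
To get a feel for those numbers, here is a back-of-the-envelope sketch. The 60-mile hops and six-watt towers come from the figures above; the solar panel sizing (peak sun hours, derating factor) and the helper function itself are my own illustrative assumptions, not Intel specifications.

```python
# Rough sizing for a long-distance Wi-Fi relay, using the figures
# from the post (hops up to ~60 miles, ~6 W per tower).
# Panel and sun-hour numbers are illustrative assumptions.
import math

def relay_sketch(distance_miles, hop_miles=60, tower_watts=6,
                 peak_sun_hours=5, panel_derate=0.7):
    """Estimate towers needed and per-tower solar panel wattage."""
    # Each hop covers up to hop_miles; both endpoints need a tower.
    hops = math.ceil(distance_miles / hop_miles)
    towers = hops + 1

    # Energy a tower draws per day, running around the clock.
    wh_per_day = tower_watts * 24  # = 144 Wh at 6 W

    # Panel wattage needed to recover that energy during the
    # available sun, derated for charging losses and weather.
    panel_watts = wh_per_day / (peak_sun_hours * panel_derate)
    return towers, wh_per_day, panel_watts

towers, wh, panel = relay_sketch(distance_miles=150)
print(f"{towers} towers, {wh:.0f} Wh/day each, "
      f"~{panel:.0f} W panel per tower")
```

Under these assumptions, pushing a signal 150 miles into the countryside takes four towers, each drawing 144 Wh a day--small enough that a single modest solar panel per tower covers it, which is what makes the "minimal fuss" claim plausible.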


(Far superior to previously existing Wi-Fi boosting technology)

From the Intel blog article:
One of the research projects connected rural villages in India with the Aravind Eye clinic to provide medical eye exams via the wireless antenna relay system. In Panama, it is bringing the interent (sic) to a remote village in the rain forest.

Click here for a geekier technical analysis from Daily Wireless blog.

Implication: practical and economically feasible internet connectivity for users both in the developing world and in the rural corners of developed nations like the United States, addressing the severe rural/urban broadband divide.

Wednesday, March 12, 2008

More from Generation Distraction

Serendipitously, I just happened upon this piece by The Boston Globe's Carolyn Y. Johnson, which explores the ennui of Generation Distraction, arguing that "boredom is essential for creativity" and that the quick and easy access to stimuli is the mental equivalent of the overabundance of calories that has led to the obesity epidemic. It seems our "mental fatness" is clogging our arteries of innovative endeavor, after all. "The most creative people...are known to have the greatest toleration for long periods of uncertainty and boredom." She summons the patron saint of boredom, Marcel Proust, to elucidate the edifying power of idleness:

"Dispirited after a dreary day with the prospect of a depressing morrow, I raised to my lips a spoonful of the tea in which I had soaked a morsel of the cake," Proust wrote. "And at once the vicissitudes of life had become indifferent to me, its disasters innocuous, its brevity illusory... I had ceased now to feel mediocre, contingent, mortal."


"I'm not bored, I'm profound"

Kathleen Cumiskey, a professor of psychology and women's studies at the College of Staten Island, is quoted echoing my metaphor of drug addiction: "Our society is perpetually anxious, and a way to alleviate the anxiety is to delve into something that's very within our control, pleasurable, and fun. . . . It feels like it has all the makings of addiction." Paradoxically, the more stimuli people receive to alleviate boredom (email, Facebook updates, funny videos), "people do not seem to feel less bored; they simply flee it with more energy, flitting from one activity to the next." Jerome C. Wakefield, a professor of social work at New York University and co-author of The Loss of Sadness, suggests a dosage of boredom shock therapy, to reacquaint the patient with "a comfort level with not being linked in and engaged and stimulated every second."



So, Generation Y, I prescribe that you sit down on a park bench outside for two hours ruminating on the petals of a nearby flower, and call me in the morning.

Or, maybe just read The Affected Provincial's Companion.

Tuesday, March 11, 2008

Talkin’ Bout My Generation



Forget the nomenclature of “Generation Y,” the “Echo Boom,” the “Internet Generation (iGen)” or the “Millennials,” my generation is Generation Distraction. We are those born between the years 1980 and 1990. Reared amongst cell phone chatter, SMS, Facebook wall posts, Twitter updates, AOL instant messages, MySpace messages, blog updates, and 700 channels of glowing digital cable. A vast churning stream of flickering, buzzing, shouting, singing, pinging assaults our eyes and ears at all hours of the day and night. Like newborn victims of fetal alcohol syndrome, we’ve become so bathed in this plenum of media and communication that any abrupt break triggers instant withdrawal. Witness the life-threatening Blackberry blackout last month.

Is there a codeine for the media fiend?

I was close enough to the 1980 end of my generational spectrum to see Generation Distraction become deeply and inextricably linked to a nascent early-internet culture. The World Wide Web inaugurated the “information superhighway” (my, how quaint that term seems today!) in the early 1990s, and the Mosaic browser kicked off the mass consumption of the Internet in 1993. AOL came in 1991; Yahoo, Amazon.com and eBay in 1995; AOL Instant Messenger in 1997; Google in 1998; and MySpace in 2003. “The Facebook” launched in 2004, with my alma mater Georgetown being among the first outlets. In college, Facebook so permeated the social sphere that whole nights out were given purpose through the quest for a funny-awesome-sexy profile picture to be uploaded the following Sunday morning (or early afternoon, rather).

Yes, today, Generation Distraction not only consumes every available hour through various media of telecommunication, but we sort and define our very lives and identities through them.

Who am I? I am Geoffrey Daniel Greene, and I’m friends with (…), I like (…) genre of music, I’ve read the following books (…), my favorite TV show is (…), I support (…) for president, I think this link of (…) is funny, last weekend I was at (…) party—here are the pictures… But, then again, are all these things true in real life? I have, after all, carefully crafted my online persona for certain ends.

“Maybe I hated Ulysses, but I want to seem smart, and I know that cute girl from my literature theory course is watching. Better put up a profile picture of me with two girls, it will make me look desired. I hope my ex isn’t checking my profile these days, she’ll leave another snarky wall post. Better block her. But what if she finds out I blocked her? Oh damn, wasn’t I supposed to be writing my paper?”

Instead of facilitating “social networking,” the social network technologies themselves are the end. You don’t Facebook to socialize, you socialize to Facebook. Everything you do is documented, uploaded, and given instant feedback. The number of friends one tallies; the frequency of messages, posts, and comments; the intangible “cred” garnered through one’s online personality—these are the social currency that we desperately accrue. You are being watched, but not just by The Government. In the sequel to Orwell’s dystopian dream, you are watched not by Big Brother, but by your peers. You voluntarily and desperately submit to constant surveillance and judgment. Thus is Generation Distraction also known by its other avatar, Generation Panopticon.

“Panopticon” refers to English philosopher Jeremy Bentham's 1785 prison design, which allows an observer to observe (“-opticon”) all (“pan-“) prisoners without the prisoners being able to tell whether they are being watched, thereby conveying the "sentiment of an invisible omniscience." In the parallel life of Facebook, you are aware of being watched, but the magic is you can never tell by whom. You are judged and sorted hierarchically, instantly and decisively, but it’s not clear who populates your jury. It’s vitally important to “win,” however. The Economy of Attention--discussed earlier here--values “eye-hours” above all things, and he who garners the greatest share of a discrete amount of attention is the “richest.”


Bentham's “Panopticon” design

Obviously, there is a sizable dark side to Generation Distraction. Ironically, the overdose of communication has suffocated our ability to communicate itself. Flooding our receptors with stimuli has deadened their sensitivity. Generation Y is widely reported by managers to be deficient at workplace communication: deplorable writing, nonexistent spelling, and whimsical notions of grammar. Nor does this end at the page. Face-to-face contact has ironically been rendered rather quaint by the efficiencies and global reach of personal communication technologies. Generation Panopticon is very comfortable with the reciprocal surveillance of watching and being watched—from a distance.



Do an experiment. Sit down several of your favorite Generation Y members in a room…without a television or a computer. See what happens. (Clue: awkward silence and fidgeting immediately set in). On a long enough time scale, your chosen group will begin to whip cell phones out of their pockets like asthma inhalers for a Content fix. Then they will carry on phone conversations in the hallway or Google sports stats on their iPhone, in lieu of facing the terrifying intimacy of a room full of flesh-and-blood people and no screens.


(If aliens visit earth tomorrow, they might assume that the tiny black things we hold tightly over our ears whenever we're walking are protective earmuffs, intended to shield the delicate inner ear from the elements)

The only solution to a sudden and unexpected drought of digital Content is liberal portions of light beer (or weed) to deaden the pain of withdrawal. Even then, what conversations transpire will inevitably veer to Grey’s Anatomy or the last funny YouTube video watched. Again, the plenum of Content has, among my generation, completely replaced actual interpersonal experience with vicarious media representation and disembodied communication.

This inability to converse obviously makes dating difficult, and that practice seems to have subsided as well—with reverberating effects on rates of marriage and childbearing. I wonder too how my generation will handle the challenges of creativity and innovation. If you are under constant reciprocal surveillance, and constantly consumed with the requirements of instant peer validation, how can you truly create something unique and revolutionary? How will you have the space to develop as an individual when you are diluted in a soup of impersonal co-dependency? Will the omnipresent Facebook Panopticon be an even more efficient tool for enforcing conformity than centralized Big Brother? Will this generation inherit an America that has lost its unique character as a nation of kooky basement tinkerers and cultural revolutionaries?

Saturday, March 8, 2008

Why Muslim-Christian Understanding is Stupid

Georgetown University, like many other religiously concerned academies across the nation, has taken up the standard of “interreligious dialogue,” in a well-intentioned effort to heal the theological rift that supposedly divides our world. There is a well-funded Muslim-Christian Understanding program, which offers undergraduates a certificate once they've proven to understand Muslims and Christians sufficiently.

As the standard line goes, ethnic and religious conflict are the new fault lines inherited by the 21st Century, threatening the very success of globalization and harmonious interconnectedness. This threat crops up in the acrimony between the Jews and the Muslims in the Holy Land, the Hindus and the Muslims in Kashmir, or the Christians and the Muslims in the Philippines, Nigeria, Chechnya, Bosnia, Lebanon, and countless other worldwide flashpoints. Furthermore, as evinced by the previous sentence, this “clash of civilizations” usually falls into the template of Muslims vs. (insert any other religion here).

A Manichean struggle exists between two diametrically opposed forces. Such framing is similar to the Cold War framing of free capitalist democracy vs. International Communism. And indeed, in the War on Terror, the forces of Islam have inherited the dark mantle of the Bolsheviks as the new enemies of freedom. This new Enemy has many now-familiar names: al-Qaeda, Hezbollah, Hamas, Islamo-Fascism, etc. Conversely, in the Muslim world, this duality is now embraced in reverse, with the standard of Islam as the Good, and the Other (Zionism, the Great Satan, etc.) as the Evil.

Which brings us back to “Muslim-Christian dialogue.” It is an effort by progressive religious authorities and intellectuals to bridge this bipolar divide. If only the two faiths could understand and empathize with each other, then peace would reign. If only Christendom understood that Jesus is the second most important prophet in Islam. If only the Muslim umma recognized its common pan-Mediterranean history with European Christianity. If only Westerners knew what contributions to the “Western” arts and sciences Medieval Islam made, how the Andalusian scholars preserved the works of Plato and Aristotle to reintroduce Dark Age Europe to its own Greek heritage. If only all three Abrahamic faiths would recognize their common God and patriarchs. Jews and Muslims alike consider Ishmael the father of the Arab people, and his brother Isaac the father of the Jewish people. Could not the Judeo-Christian and Islamic traditions exist side-by-side as brothers—distinct, yet tied by blood?

Yes, they could! Once the Archbishop of Canterbury and the erudite ulama at Cairo’s al-Azhar Madrassa offered up enough “understanding,” coexistence will be magically achieved...

This project is a failure for two reasons. First, and most importantly, it acknowledges and internalizes the fiction of the aforementioned “clash of civilizations.” Notions of difference are socially constructed, and change over time. Religion, much like race, takes on very different identities, depending upon how it is framed. Secondly, it labors under the fallacy that people hate each other for academic theological reasons—reasons that can be reconciled through civil debate and “understanding.”

Kosovo is now the seventh nation wrenched from the ashes of the former Yugoslavia. Less than a generation ago, Kosovars, Serbs, Bosnians, Croats, Slovenians, Montenegrins, and Macedonians simply self-identified as “Yugoslavs.” There was very little religion to be seen amongst Tito’s citizens. Then, in the mid-1990s, with Tito and the Soviet Union a distant memory, they all miraculously became sorted as Muslim “Bosnians,” Catholic “Croats,” and Orthodox “Serbs.” Some politicians rose using ethnic power bases and decided that each newly-coined group needed its own land. Some European diplomats agreed and immediately gave their official recognition to the breakaway nations. Lines were drawn on paper, and labeled with their proper ethno-religious label. In Bosnia would go the Muslims, in Croatia would go the Catholics, and in Serbia would go the Orthodox. But, as it turned out, there were some Serbs in Bosnia, some Bosnians in Croatia, and a bunch of Albanians in Serbian Kosovo. The rest is history. Suffice to say, there weren't many Bosnians shouting "Allahu Akbar" or Serbs with giant Crusader crosses in that particular conflict. The crucible of killing among the southern Slavs was only religious in the nominal sense, and theological understanding will be unlikely to extinguish the still-smoldering landscape.

For over a century under British colonial rule, Pakistanis, Bangladeshis and Indians were “Indians.” Then, miraculously during the 1940s, they became Muslim “Pakistanis” and Hindu “Indians.” Later, in 1971, “East Pakistanis” magically became “Bangladeshis.” Never mind that “Hindu” India had more Muslims than “Muslim” Pakistan (India has the third-largest Muslim population of any nation on earth, after Indonesia and Pakistan). In 1947, a line was drawn on paper, and about two million people were slaughtered in the mad dash to find their way into their proper new religio-national space. Six decades hence, Muslims and Hindus take turns burning whole trains full of innocent people to death. Perhaps the two groups could reach common ground over their mutual love of setting commuters on fire.

Almost all of Iraq’s Christians—and most other religious minorities—have suddenly since 2003 found that their millennia of lineage in the Fertile Crescent doesn’t matter anymore. People who were all Iraqis in 2002 are now Sunni, Shi’a, Christian, Kurdish, etc. Lines are being drawn on paper and in blood, and hundreds of thousands are now casualties of the new scramble for classification and power. Was this a failure of “dialogue?” Did the Sunnis and Shi’a live in relative peace for twelve centuries in Iraq through theological discussion? Or perhaps, was theological debate the very culprit for perceived difference and resulting strife?

In the United States, we think of people with light skin as “white.” At the turn of the century, Irish immigrants were “black.” The term “WASP” (White Anglo-Saxon Protestant) is a carry-over from the age when “Catholic” was a derogatory “racial” category, used pejoratively like “Jew” often still is. Polish people were “Polacks,” Ukrainians were “Bohunks,” Germans were “Krauts,” Italians were “Wops,” Jews were “Kikes,” and Canadians were “Canucks.” Did we heal these rifts by having Catholic-Protestant dialogue? Did the Archbishop of Canterbury and the Pope call a symposium?

The truth of the matter is that most Christians haven’t read the Bible, even those who attend weekly Church. It’s pretty darn long. A large proportion hasn’t skimmed more than a few pages. If asked, they would completely fail to explain the nuances of the Holy Trinity, the degree of divinity and humanity constituting the identity of the Christ, the relative importance of faith versus good works, the relationship between Original Sin and Divine Grace, or the weight given to free will and determinism. Do most Christians know that women should wear a hijab-like veil in church?

The Qur’an is just as long, and perhaps more opaque to the untrained eye. It has no traditional narrative format, parts seem contradictory, and its revelatory maxims often shift according to when they were received by the Prophet (the prohibition against drinking alcohol is only the most famous example). For this reason, the reader can cherry-pick the Qur’an as a work of peace or of war, see Allah as a God of stern punishment or tender grace, and frame their relationship with other faiths as being with the fellow ahl al-kitab (“Peoples of the Book”), or with the arch-enemy kafir (“unbeliever”) who must be converted or slain.

Open the Old Testament and randomly point to a passage. If you do this enough, you will eventually find something very troubling. Perhaps a passage filled with Divinely-sanctioned genocide, incest, rape, or a whole host of other subjects that tend not to make it into sermons at Church or Temple.

So where does a thorough understanding of these texts leave us? Will a rich Lebanese Maronite Christian banker really get along with his unemployed Shi’ite neighbor because they both like Jesus? Will the fact that both trace their religious ancestry to Abraham help the young Hezbollah fighter reach rapprochement with the radical Zionist settler? Will Pat Robertson suddenly change his tone toward the Muslim enemy if he knew that the Bible and the Qur’an both sanction polygamy?

Whenever I mention Pakistan or Pakistanis to my mother, she drifts to the same warm impression she has of a Pakistani co-worker she’s enjoyed working with for years at her college. With a smile, she eagerly relates what intelligence and integrity he has. She’s never been to Pakistan, knows almost nothing about it, knows even less about its majority religion, and this co-worker is just about the only Pakistani she has ever known. But for the rest of her life, perhaps, she’ll associate Pakistan and its sons and daughters with positive notions of “intelligence” and “integrity.” In this way, she’s like most Beiruti shopkeepers, Indian farmers, and Iraqi lawyers. Until all the politicians and academics stepped in to tell them what to think about certain people, their neighbor was simply the man with the charming smile, whose daughter went to school with theirs, and whose wife made great hummus.

This, and not academic dialogues, is how people have understood and related to each other since the beginning. And so it shall be for the future.