Beyond the culture of dependency in Africa


The experience of poor farmers in Kenya is a lesson in the need for an ethical revolution against corruption, says the first African woman to be awarded the Nobel Peace Prize.

At the gathering of the Group of 20 (G20) in London on 2 April 2009, the world's largest economies reiterated their commitment to helping Africa in the midst of the global financial crisis. As a result of the meeting, between $30 and $50 billion in additional grants and loans will be available to African nations through the World Bank and the International Monetary Fund. While I welcome the news that the world financial crisis hasn't pushed Africa off the global agenda, I cannot help but worry whether this latest tranche of funds will be used effectively by recipient governments, or if these resources will truly improve the lives of most Africans.

Here's why. Almost half the population of sub-Saharan Africa lives on less than $1 per day. But while this poverty is at the root of many of the pressing problems Africa faces, so is the powerlessness of the poor. During the course of the last forty to fifty years, most Africans, in large measure because of their leaders' attitudes and policies, have come to believe that they cannot act on their own behalf. Self-determination and personal and collective uplift, values embraced by the great majority of Africans in the period just after independence, have been eroded.

Disempowerment - whether defined in terms of a lack of self-confidence, apathy, fear, or an inability to take charge of one's own life - is perhaps the most unrecognised problem in Africa today. To the disempowered, it seems much easier or even more acceptable to leave one's life in the hands of third parties (governments, aid agencies, and even God) than to try to alleviate one's circumstances through one's own effort.

This "syndrome" is a problem that of course affects far more than Africans, and far more than the poor. Nevertheless, I have found it to be as substantial a bottleneck to development in Africa as inadequate infrastructure or bad governance, and it has added an extra weight to the work of those who want to enable individuals and communities to better their circumstances.

The corruption and graft that have tainted so much of Africa's leadership in the post-independence period are well-known; the misappropriation of funds, outright theft, incompetence, and cronyism that have characterised too many African governments for decades have been often catalogued. What perhaps is less well understood is how, because of a failure of leadership at the top of the social tree, the culture of corruption - and dependency - has too often eaten its way down to the roots. This theme is explored in my book The Challenge for Africa (Pantheon, 2009).

The roots of the problem

How much of a barrier this syndrome is to Africa's development was brought home to me during the five years I served as a member of the Kenyan parliament (2002-07). A single example can make the case.

One day, I was approached by a group of rural farmers who harvested macadamia nuts. These particular farmers sold their nuts into the Japanese market through a Kenyan processor and exporter, who did not appear to be corrupt. The macadamia nuts' wide variety of uses - as seed, food, and fuel - meant that they were receiving a good price in the market. If a Kenyan macadamia nut-farmer's trees were already planted and producing nuts to harvest, there was no reason why he should not have succeeded and become wealthy by rural standards.

Nonetheless, the farmers were unhappy. When we met, they explained that, because there was so much money to be made in the macadamia nuts, their neighbours, also farmers, had begun to steal. Now, macadamia nuts need to be fully ripe to be ready for processing, and they are not fully ripe until they fall to the ground. But some people (the farmers told me) had started shaking the trees before the nuts were ripe, in order to make them fall; others had begun climbing the trees and picking the nuts even before they were ripe enough to be shaken from the tree. In the end, the greed had become so enormous that some individuals had simply crept onto the farmers' land at night, cut down the trees, and hauled them away, so they could harvest every single nut for themselves.

Because the nuts were not ready, the thieves needed - to make best use of their haul - to find ways to make the nuts look ripe. They would, for example, boil them with tea-leaves to change their colour. But when the nuts arrived at a quality-control post in the market outside of Kenya, they were rejected as obviously rotten. The middleman, furious at this interruption in his export-chain and the potential damage done to his reputation by the rotten shipment, told the farmers he wouldn't buy any more macadamia nuts from them in future.

By the time the farmers came to me with their story, they were desperate. The story of how and why they had lost their once lucrative market left me astonished at the avarice and shortsightedness of some members of the community. I indicated that I would try to find another market for their macadamia nuts, though I didn't hold out much hope. "We can work on it", I said, "but it looks as if the goose that was laying the golden eggs has been killed." It was clear, I continued, that it was going to take much more effort to convince a new market (and a new middleman) of these farmers' reliability.

The individuals who came to me were not farmers of the kind familiar in the west - armed with an understanding of agricultural inputs, international markets, and commodity prices. Indeed, these farmers were little different from their neighbours who stole the nuts, in the sense that nearly everyone who lives in rural areas in Africa grows one crop or another on their land, yet often has scant or no information about the product they grow.

Such farmers may have little or no formal education, and may therefore be functionally or actually illiterate. Even if they are able to read or write, they lack access to written materials or the internet to inform themselves about the crops that are their primary source of income; and they may never use or even taste what they harvest, since (as with macadamia nuts) they do not process and add value to what they themselves produce. They get little help from the state; the Kenyan government, for example, has made few efforts to educate the farmer, encourage him to become an advocate for his interests, or empower him in the international marketplace (for example, by forming cooperatives).

I advised the macadamia nut-farmers to form a cooperative and work together to get to the bottom of what had happened - find out who owned the macadamia trees; create a register; then determine who was selling macadamia nuts even though they had no trees growing on their own land. I also urged them to start again and this time to instill a discipline among the growers; in this way, they would produce nuts of sufficient quality so they might ultimately be able to find another vendor who would process the macadamia nuts in their own region. This would, in turn, add value to the nuts - and thus guarantee more earnings - before they were sold to the middleman, who would then sell them for export.

Unfortunately, I was voted out of parliament before I had a chance to help the macadamia farmers further. However, my tenure as an MP was long enough for me to understand what kept this community of farmers poor: in part the farmers' ignorance about what they grew, in part their lack of education, in part the government's failure to support them - but also the community's own failure to understand the consequences of its self-destructive actions. Instead of working together to further the common good of their communities, each person pursued his individual interests - and all lost.

It didn't have to be this way. The macadamia nuts were already getting an excellent price on the market, so this group of farmers could in principle have pooled some of their earnings and made them available as low-interest loans, so that more people could buy trees. This would have meant more macadamia trees, and more of the community sharing in the wealth. True, this would have had to be a long-term strategy, since macadamia trees require time to grow; but it would also have paid dividends within a few years.

However, the thieves wanted the money, and they wanted it fast. So intoxicated were they with the prospect of selling the nuts, they were willing to ruin their prospects for further wealth by cutting down the trees; along the way, they thought nothing of impoverishing their neighbours by making sure that they could neither harvest another crop from a particular tree nor make money again from macadamia nuts, even if they could access the market again. This is how the poor sometimes work against themselves.

An ethical revolution

What happened with the macadamia farmers is a form of corruption. It is no different from a minister demanding a kickback before issuing someone a license to harvest trees in a protected forest. It expresses the same willingness to cheat the system; it flies in the face of commonsense and collective will, and it helps to create a stubborn stereotype of Africa that discourages those who are genuine and compassionate in committing their funds or expertise to helping Africa's peoples. The result is that communities often end up dealing with governments or companies interested mainly in taking advantage of the vacuum created by the culture of corruption to extract as many resources as possible at as low a price as they can.

I'm not so naïve as to believe that personal and collective corruption can ever be wholly eliminated; it will exist as long as there are selfish people and money to be made. But there are concrete measures that governments could take to bring about the needed revolution in ethics, if they were committed to it.

It could start, for example, with an African president or prime minister saying: "We have a problem in our country and as a people. We are cheating and undermining ourselves, and we need to change. For whether it is a policeman extorting a bribe from a bus driver, or a government minister receiving a kickback to license a business, or someone stealing someone else's crops to make a quick penny - we are failing ourselves, our country, those who came before us, and indeed future generations. I want us as a country to work on it. And it will start with me, and I will do my best to value honesty in whatever I do."

This revolution cannot be confined to those at the top of African societies. Even the poorest and least empowered of Africa's citizens need to work to end a culture that tolerates systemic corruption and inefficiency. A critical step is ensuring that poor people are engaged in their own development, and, by extension, in expanding the democratic space that many African societies desperately need. Just as communities ought to mobilise to combat malaria, or HIV/Aids, for instance, so they must work together to fight the scourges of failed leadership, corruption, and moral blindness.

Such communities could ask themselves: "Do we feel marginalised? Are we capable of acting in concert to make sure that our resources are used equitably? Do we recognise the value of belonging to a state? When we are entrusted with positions of leadership, are we committed to enhancing the welfare of our fellow citizens?" These are questions a society must ask if it is to function properly. If they are answered honestly and proactively, they can form the basis of a system of governance that evolves and changes to meet the needs of the people over time.

Because poor people are more likely to be illiterate and ignored, and to feel powerless to act on their own behalf, addressing these questions requires political and economic commitment, as well as patience and persistence - from local, national, and international stakeholders - since change does not occur overnight.

While Africans cannot alter the mistakes and missteps of the past, they can at least try to avoid them in the future. One measure to which I would give priority is for children throughout Africa, from the first grade of primary school through the last year of secondary school, to be taught the values of justice, fairness, and accountability as part of the normal curriculum, so they might grow into the leaders and citizens that Africa needs. Just as technical colleges give new technologies the impetus to produce breakthroughs in computer science and engineering, so advances in leadership and the application of values need a similar institutional push.

I don't believe that the peoples of Africa are more accepting of corruption than those in other nations. Africans can - as history shows many have - rise up and demand an end to inappropriate behavior. However, they want to know that if they stand up or speak out, then many others will do the same - especially their leaders, who should be in the forefront of this revolution in ethics. This is one of the most crucial challenges Africa faces. Meeting it could secure a value far beyond the dollar amount of any current or future development assistance.

(Wangari Maathai is a pioneering environmentalist and founder of the Green Belt Movement. She was a member of the Kenyan parliament, 2002-07. In 2003, she was appointed the country's assistant minister for environment, natural resources and wildlife; in 2004, she became the first African woman to be awarded the Nobel Peace Prize. This article has been reproduced from openDemocracy.net under a Creative Commons licence.)

Saruman at Notre Dame


President Obama's masterful speech was widely applauded -- but what did he actually say?
"Those who listened unwarily to that voice could seldom report the words that they heard; and if they did, they wondered, for little power remained in them. Mostly they remembered only that it was a delight to hear the voice speaking, all that it said seemed wise and reasonable, and desire awoke in them by swift agreement to seem wise themselves. When others spoke they seemed harsh and uncouth by contrast; and if they gainsaid the voice, anger was kindled in the hearts of those under the spell. For some the spell lasted only while the voice spoke to them, and when it spake to another they smiled, as men do who see through a juggler’s trick while others gape at it. For many the sound of the voice alone was enough to hold them enthralled; but for those whom it conquered the spell endured when they were far away, and ever they heard that soft voice whispering and urging them. But none were unmoved; none rejected its pleas and its commands without an effort of mind and will, so long as its master had control of it." ~ “The Speech of Saruman,” J.R.R.Tolkien, The Two Towers

Towards the middle of his May 17th commencement address at Notre Dame, President Barack Obama asked the following questions:

Is it possible for us to join hands in common effort? As citizens of a vibrant and varied democracy, how do we engage in vigorous debate? How does each of us remain firm in our principles, and fight for what we consider right, without demonizing those with just as strongly held convictions on the other side?

Essential and vital questions, these, and the concise and straightforward manner with which he proposed them reveals Obama’s rhetorical brilliance. But Obama did more than propose thought-provoking questions to his Catholic audience; he provided definite answers to these, at least for those in the audience not entirely spellbound. Obama’s answers, along with the philosophical and theological principles they presuppose, were deftly hidden behind his rhetorically honed, magical words; and when they are exposed to the light, they reveal a different incantation than the one that appeared upon the exquisitely polished linguistic surface.

In the middle of the address, Obama recounts the story of a Christian doctor who informed him that he would not be voting for him for President in the upcoming election, due not to Obama’s pro-choice position, but to the uncivil, ideological language in which this position was expressed on his website. Obama then told the audience how he immediately changed the wording, expressing his hope that “we can live with one another in a way that reconciles the beliefs of each with the good of all.” This anecdote, I think, provides an interpretive key to understanding not only the essential point of Obama’s Notre Dame address, but also his entire political project as expressed in his many addresses, writings, and acts since becoming President.

Reconciling the irreconcilable

The anecdote is a microcosm of Obama’s macro-political vision: a multitude of people with irreconcilable religious and moral convictions living together in peace and reconciliation. “Irreconcilable” is not my word, mind you; it’s Obama’s. From the Notre Dame address:

Understand — I do not suggest that the debate surrounding abortion can or should go away. No matter how much we may want to fudge it — indeed, while we know that the views of most Americans on the subject are complex and even contradictory — the fact is that at some level, the views of the two camps are irreconcilable. Each side will continue to make its case to the public with passion and conviction. But surely we can do so without reducing those with differing views to caricature.

Of course, by definition there can be no “reconciliation” between irreconcilable views, but Obama means something entirely different here. In light of the doctor story, what it means to “reconcile the beliefs of each with the good of all,” is not to change or encourage others to change views on an issue, but simply to change the way the view is articulated, so as not to “caricature” any opposing view.

The doctor’s “humble” request for rhetorical civility, and Obama’s ready acquiescence to it, is the model for such reconciliation. “I do not ask at this point that you oppose abortion,” Obama quotes the doctor as saying, “only that you speak about this issue in fair-minded words.”

A question arises here, though: Why would someone who believes abortion to be the deliberate murder of a fully human and innocent person, as the pro-life doctor does, not ask everyone they meet, let alone a President with the most power to see it criminalized, to oppose abortion? That is, why would someone with such a “passionate conviction” judge the “fair-mindedness” of pro-murder language more important than truth, than speaking in such a way as most effectively to stop the killing? We are talking, after all, about a life and death issue here, not one’s view on the estate tax.

Can values be aligned without changing them?

In the speech, Obama urged all Americans to “align our deepest values and commitments to the demands of a new age,” that is, not to change our values and commitments, whether secularist or religious, but merely align them. What this alignment entails must have something to do with the exchange between the doctor and Obama, our models of American virtue.

Allow me to change the anecdote a bit to help discover the connection. The year is 1834, and the issue is slavery, not abortion. There is a law that allows a slave to be killed by his master for any reason whatsoever, and thus thousands of innocent slaves are killed every year. The “pro-life” doctor opposes this law, but his senator advocates it. The doctor, after mystically hearing Obama’s future Notre Dame speech in a prophetic dream, is mesmerized by Obama’s “fair-mindedness,” and recognizes that the “demands of the new age” require that he and every other opponent of the murder of slaves refrain from asking pro-slave-murder persons to change their views, but ask only that they improve their rhetoric. The senator has the same dream, which causes him to recognize that his highest obligation is being fair-minded when he supports the murder of slaves so as not to “caricature” any opposing views.

I think the point is made: if being rhetorically civil were the extent of the required “alignment” for the 19th-century American citizen, we would still have legalized slavery, not to mention the genocide of tens of thousands of African-Americans. Needless to say, there would be no President Obama. Suppose the situation were a President proposing a mass genocide of “less-than-human” Jews. “Okay,” the President assures the doctor, “I’ll be fair-minded and say that they are quite human while we kill them.” One gets the point.

Irony, faith and doubt

I said at the outset that the questions in Obama’s speech at Notre Dame could be mined not only for Obama’s answers, but also for the theological and philosophical principles his answers presuppose. Space does not permit me to treat these in depth; for now, allow me to shed light on what I consider to be the central philosophical/theological reason that Obama would advocate a social and political ideal favoring conversational fairness over truth, and use as his main example what the majority of Americans consider to be a life and death issue. Here is the master key, as it were, that unlocks Obama’s speech:

But remember too, that the ultimate irony of faith is that it necessarily admits doubt... This doubt should not push us away from our faith. But it should humble us. It should temper our passions, and cause us to be wary of self-righteousness. It should compel us to remain open, and curious, and eager to continue the moral and spiritual debate that began for so many of you within the walls of Notre Dame.

I propose this more philosophically and theologically transparent translation:

Whatever “values” and “commitments” we may hold to be true, those that stem from or involve in any way our “faith” must be held with a certain amount of irresolvable doubt—for the “truth” in these sorts of matters can never be known. And this is why we should seek above all to continue, not ever resolve, the “moral and spiritual debate,” whose quite attainable goal is not the truth of any political matter, no matter how life-threatening, but “fair-mindedness.”

I think this interpretation, or something like it, is best able to make sense of why a pro-life Christian doctor revealing his tolerance of the mass-murder of baby-humans in the womb is held up by the President of the United States as a model of civic virtue to a group of graduating Catholic college students. Needless to say, such a relativistic notion of faith and truth is completely irreconcilable with any genuinely religious worldview, and according to Obama, that means over 90 percent of the American people.

What “fair-minded” voices, then, would be permitted to speak in this sort of “vigorous debate”? Would those who refuse to accept its relativistic presuppositions, and who say so plainly, be “caricaturing” their opponents? The kind of debate Obama’s “faith” would “compel” us to undertake is a mockery of debate, for it denigrates the point of any debate, the discovery of truth, and therefore it denigrates the human beings who participate in it, for our greatest desire is to know, love, and act upon the truth.

But with truth eclipsed by “fair-minded” rhetoric as the political summum bonum, what is to prevent the strongest and most ruthless – but, of course, rhetorically “fair-minded” – from exerting power over the weaker? Sure, the pro-life doctors would be speaking quite nicely with all the abortion doctors, while the baby humans are slaughtered in their wombs.

Pace the president of Notre Dame, I, fair-mindedly or perhaps not, decline to participate in Obama’s “renewal” of political life, in solidarity with all the baby humans killed in the past and who will be killed in the future due to the amoral cultural, spiritual, and political climate only exacerbated by Obama’s cleverly cloaked relativism, wherein the weakest and most defenseless are given a not-so-fair-minded silent treatment. Obama asks us not to caricature other American citizens—fine—but let us ask, nay, demand that he not allow them to be murdered.

Dr Thaddeus J. Kozinski is Assistant Professor of Humanities and Trivium at Wyoming Catholic College, in Lander, Wyoming.

It’s still my university


The leadership may have failed her but this Notre Dame student still loves the place.

As a student at the University of Notre Dame, I can safely say that the past few months have been pretty crazy.

On campus we read the articles. We watched the YouTube videos. We drove past the protesters at our gates and we watched three different planes fly over with banners telling university president Fr. Jenkins to change his mind. We signed petitions and discussed the issue in every single class. For the few months after the news of President Obama’s visit was released, the hubbub on campus did not die down. It only gained momentum as the day crept closer.

Only now that the academic year has ended and I’ve reached the quietness of home have I truly been able to reflect upon those extraordinary months. After three years at the university, I’ve learned that rule number one at Notre Dame is this: everyone has an opinion. Rule number two: there is no hiding this opinion—either you proclaim it to the masses or you will have it forced out of you by some student group passing around a petition. The campus was clearly divided—whenever anyone asked, “So what do you think about Obama coming?” it was as though anyone within a ten-foot radius could hear the sharp mental gulp that came before answering, “Well…”

You never really knew how any given person would respond to that question, or how they would react to your opinion, but nearly everyone ventured to speak their mind. It is difficult to estimate the percentages of students for and against Obama’s speaking; both camps were quite vociferous. During the days following the announcement of his invitation, two Facebook groups arose and grew steadily in number, both surpassing 5,000 members.

While all the underclassmen had strong opinions on the matter—freshmen, sophomores, and juniors alike—there were none more affected by the announcement than the senior class itself. It seemed as though graduation would be ruined for everyone involved: those who didn’t want Obama to speak would be upset that he was speaking, and those who did want him there would nevertheless be upset about all the protests surrounding the event.

Because of this, I saw something peculiar happening with the senior class: the majority began to just accept it. While some seniors planned not to attend their own commencement—indeed, about 40 of them instead attended a prayer service held while the commencement was in session—I heard many more declare that they were not going to let this ruin their graduation day, and that, even though they personally opposed Obama, they would be honored at the very least to have the President of the United States speak at their commencement ceremony. I ended up admiring those few seniors who attended commencement, but still protested in their own small ways by wearing white carnations or placing a pro-life symbol on their graduation caps.

However, the overall feeling on campus was that the great majority of the seniors either supported Obama, or were on the fence about the issue, still wanting to hear what he had to say.

I was not present at commencement itself, but I read the speech as soon as it was released to the press and then watched it online. It struck me that these were two completely different experiences. Reading the speech allowed me to focus on his words, and not the way he delivered them; but when I actually watched the event, the reverse was true: I was focused on the man who delivered them. Fact: President Obama is an excellent speaker. Fact: President Obama says the things his audience wants to hear. I had to remind myself to look beyond the words.

Obama makes several claims in his speech regarding how he wants to draft a better conscience clause, reduce the number of unplanned pregnancies (with the supposed overarching goal of reducing abortions), and reconcile with those who have different values and priorities. It is all well and good for him to have made these statements, but in truth, what he has done so far tends towards the opposite.

The effect can be deeply ironic. Take a look at this quote from his speech:

“We must decide how to save God's creation from a changing climate that threatens to destroy it. We must seek peace at a time when there are those who will stop at nothing to do us harm, and when weapons in the hands of a few can destroy the many.”

These sentences in particular jumped out at me when I read the text of the speech for the first time. They could be directly applied to the issue that has been central to this entire controversy: abortion. Indeed, Mr. President, we must decide how to save God’s creation—the millions of unborn babies—from a changing climate—this very government administration—that threatens to destroy it.

The president spoke at Arizona State University earlier this year, but that university did not award him an honorary degree of any sort. Their reasoning? That President Obama had not held his position long enough to have accomplished anything worthy of such an honour. By contrast, Notre Dame—a Catholic institution—has honoured a man whose policies go against one of the most important aspects of the Catholic faith. How is it that Arizona State, a secular institution, managed to do what so many in the past few months have wanted Notre Dame to do as well? Yes, the Catholic faith teaches tolerance for those with differing beliefs, but it does not teach us to deck them with honours.

And yet, while watching that speech, I had a sad glimpse of the reasons why, perhaps, Fr. Jenkins upheld the invitation and the honorary degree, despite the nationwide controversy it caused. Essentially, he gave the pro-Obama seniors the commencement speaker of their dreams; and for those on the fence, a skilled speaker who could deliver words they wanted to hear. A logical move, perhaps, but not the right one for the president of the most prominent Catholic university in the United States—and furthermore, for a Roman Catholic priest.

But what does all this mean to me—as a Catholic, as a Notre Dame student, as a pro-life person? For me—and, I hope, for all the other pro-life Catholic students at this school—the biggest lesson that I have learned in the past few months is about leadership. As a Catholic leader, Fr. Jenkins has failed us. However, the very last thing I could imagine doing is burning my Notre Dame apparel and renouncing any allegiance to my beloved university—something that many upset Notre Dame graduates and alumni have claimed to do.

On the contrary, I have loved the University of Notre Dame since the very first day I set foot on its campus as a high school senior, and that love has only grown over the past three years. At this university I have grown in my faith, and made lasting friendships with people who not only share my beliefs, but have grown along with me.

Notre Dame is full of students and faculty who are not afraid to say they are Catholic, who fill up the basilica every Sunday morning, who flock to the 28 dorm masses on Sunday nights, who line up for confessions, and who never leave our Lady’s Grotto empty even when it is only 20 degrees outside. Notre Dame is full of people active in their faith, and these people—who make up a huge part of the university—cannot and should not be forgotten.

Fr. Jenkins is in charge of our university just as President Obama is in charge of our country, and while we owe these leaders our respect simply because they are our leaders, it does not mean that they are always right, or that we should believe every word they say—no matter how beautifully spoken these words may be.

There’s a take-home message for us pro-life students in this, too, I think. Defending human life and dignity is not merely a war of words, any more than it is a matter of soothing rhetoric. The commencement controversy has forced us to speak up; now we have to show what our words mean in the realities of life, and so convince some of our classmates -- perhaps even Fr Jenkins -- to come down off the fence.

Michelle Romeu has just completed her junior year as an English major at the University of Notre Dame. She is also a volunteer editorial assistant at MercatorNet.

What’s wrong with early marriage anyway?

A cheerful little statistic on marriage from New Zealand: the number of marriages among residents rose last year by 400 -- from 21,500 in 2007 to 21,900 -- and these were all first marriages, which make up around one third of Kiwi marriages overall. We don’t know the ages of the happy couples, but a recent article in the Washington Post -- Say Yes. What Are You Waiting For? -- was an unexpected plug for earlier marriages.

Sociologist Mark Regnerus politely prods one of the sacred cows of population experts, most of whom fall into the zero population growth camp. Regnerus doesn’t go into the anti-natalist roots of the campaign against early marriage, although they certainly bear investigation; rather, he argues on biological, emotional and economic grounds for the benefits of early marriage.

Biologically, it works for women. The longer they delay marriage and family formation the more their “market value” declines; meanwhile, men’s marriageability increases as they accrue more money and maturity. Regnerus got a lot of flak for that comparison (though he was citing another source) in the 296 comments his article generated, but the truth behind it is pretty obvious.

Emotionally -- and this is the most interesting part of his article -- it works for both spouses. Contrary to what some statistics about divorce suggest (and they need careful analysis), “marriages that begin at age 20, 21 or 22 are not nearly so likely to end in divorce as many presume.” And increasing age alone does not make people more mature and ready for marital challenges:

“Marriage actually works best as a formative institution, not an institution you enter once you think you're fully formed. We learn marriage, just as we learn language, and to the teachable, some lessons just come easier earlier in life.”

Economically, marriage remains a Good Thing, if for no other reason than the effect of pooling resources:

“Married people earn more, save more and build more wealth compared with people who are single or cohabiting. (Say what you will about the benefits of cohabitation, it's a categorically less stable arrangement, far more prone to division than marriage.) We can combine incomes while reducing expenses such as food, child care, electricity, gas and water usage. Marriage may be bourgeois, but it's also the greenest of all social structures. Michigan State ecologists estimate that the extra households created by divorce cost the nation 73 billion kilowatt hours of electricity and more than 600 billion gallons of water in a year. That's a mighty big carbon footprint created in the name of solitude. Marriage may not make you rich -- that's not its purpose -- but a biblical proverb reveals this nifty side effect: ‘Two are better than one, because they have a good return for their work.’”

Read the whole article -- it’s well worthwhile. There are two others with it -- one from a journalist who married at 25 (so young!) and one from fellow sociologist Andrew Cherlin, who doesn’t like Regnerus’ argument much and seems happy with his own “market” argument for delayed marriage and small families: “Children are costly to educate and contribute little to the family economy.” ~ Washington Post, Apr 26, 27

Posted by: Carolyn Moynihan

Sexperts only a text away from curious teens

Adolescent health experts in the United States think they have made a great leap forward in sex education. Since the vast majority of teenagers have cellphones, and since an awful lot of them appear to be sexually active, programmes have been set up in several states to receive and answer questions about sex by text message. The beauty of the scheme is that kids can ask the rudest and the most serious questions about sex without bothering their parents.

The move follows Web-based approaches, including the use of social networking sites. While some of the text programmes are automated, one in North Carolina actually employs nine people on shifts to answer questions ranging from “Why don’t girls like short guys?” to questions about anal intercourse. The Birds and Bees Text Line staffers undertake to answer within 24 hours and may refer the young person to a local service. They have a rule not to advocate abortion.

What parents might see as unwarranted intrusion on their territory, and encouragement of an unwholesome interest in sex, the experts see as a face-saving formula for kids pushing the boundaries of enquiry, if not behaviour:

“Technology reduces the shame and embarrassment,” said Deb Levine, executive director of ISIS, a nonprofit organization that began many technology-based reproductive health programs. “It’s the perceived privacy that people have when they’re typing into a computer or a cellphone. And it’s culturally appropriate for young people: they don’t learn about this from adults lecturing them.”

Oh? And by disinhibiting young people in this way, aren’t Levine and her colleagues hastening the day when teens will go direct to porn sites to feed their curiosity? Pornographers are always a step ahead of the competition and are now using the hugely popular Twitter to promote their sites.

“Parents haven’t complained yet, perhaps because they haven’t seen the exchanges,” reports the New York Times about The Birds and Bees.

“Sally Swanson, a staffer and mother of two teenagers, said if parents did read them, ‘It would highlight how much disconnected information kids are already getting at younger ages than we did.’ The questions can be salacious. The staffers try to answer them all, said Mr. Martin, but discreetly and always urging protection. In offering this service to teenagers, he said, ‘you can’t say, “I’ll be honest except or until.”’ That’s often what happens with parents, he added: ‘when the child brings up something shocking, the parents tend to shut down.’”

Can’t say I blame them; stunned silence would be a very natural and even salutary response to a 14-year-old’s enquiries about certain sexual perversions.

If this sort of programme reduces the number of teenage pregnancies in North Carolina it will be nothing short of a miracle. ~ New York Times, May 3

Posted by: Carolyn Moynihan

Brave heart, flawed man

Not only the demon drink but his love affair with violence has marred Mel Gibson and his work.

The sound of idols smashing as they fall off their pedestals is always painful to the ears of worshipers; the appearance of the hero’s fatal flaw and the playing out of its consequences brings a disillusioned hush to the audience. So it has been with recent news about the personal life of Mel Gibson, pin-up boy for conservatives who have seen him as a defender of faith and family values right in the fickle heart of Hollywood.

Those of us inclined to forgive him everything for the sake of his 28-year marriage, seven children and his heroically daring film, The Passion of the Christ, now have to face the fact that, after a three-year separation, his wife Robyn wants a divorce, and that the traditionalist Catholic and family man has lost no time appearing in public with a woman whom the gossip mills allege he has been consorting with since last summer.

Nobody is perfect, artists notoriously not so. Dostoyevsky was a compulsive gambler. Dante had a wife but wrote only of the love of his life, Beatrice. Dickens kept a mistress. Mel Gibson’s fatal flaw is an addiction to alcohol which brings out not only the comic side of him but the belligerent, the reckless and the suicidal as well. It got him into deep trouble in 2006 when he spouted anti-Jewish sentiments at a (Jewish) law enforcement officer after being stopped for dangerous and drunken driving.

Many predicted then that Gibson’s career was over and that the commercial fate of his next film, Apocalypto, was in doubt. In fact, it was another huge success and was nominated for several awards. But it confirmed a trend in Gibson’s films -- those he directed or produced -- that is more worrying than his weakness for liquor and the nasty things that go with it: a portrayal of violence that borders on sadism and is certainly beyond the call of any artistic or moral purpose.

Gibson may be correct that “a great civilisation is not conquered from without until it has destroyed itself from within,” but what does it do to our civilised instincts to watch scenes of Mayan priests decapitating captives after pulling out their beating hearts, or of captives in an amphitheatre running the gauntlet of javelins, arrows and stones, only to be put to the blade if they survive, or of a huge pit filled with rotting corpses?

In The Patriot, the sadistic mass killings and burnings by the English army are almost matched by the ferocity of the hero’s vengeance. The throat slittings and torture in Braveheart are more memorable than the fabled battle scene. And yes, the prolonged scourging of Christ in The Passion is more than many pious Catholics can bear to watch. Granted, it is not half so cruel and bloody as the original event, but the attempt at realism here is ambiguous in its effects, if not intention.

The violence in these and other Gibson films seems to express more than anything an anger that, starting from a moral standpoint (injustices wrought historically by the British, the corruption of ancient cultures…), sets out both to accuse and to avenge. Nothing can be too extreme in the defence of noble values.

Unfortunately, this is a trap that conservatives, as well as their radical opposites, can easily fall into. George W Bush defended human life and the family, but he also defended capital punishment and torture, and invaded Afghanistan and Iraq. Overlooking these contradictions did not help the cause of morality in the United States.

Nor does the righteous anger that moral conservatives often display towards foes real or imagined. Using vile language about President Barack Obama (and we see some in comments posted on MercatorNet) will do nothing to persuade him or anyone else to change their views on abortion or same-sex marriage; more likely it will do the opposite. Besides, it is simply wrong. Anger is a sin and it ill becomes Christian conservatives.

If identifying enemies is fraught with the moral danger of demonising them, picking heroes carries the risk of prematurely canonising them. Mel Gibson seemed a good pick -- and who knows how heroic his moral struggle has been? -- but the demon drink and his love affair with violence might have warned us some time ago not to invest too much moral capital in his virtues.

Gibson once said in a throwaway line, “I’m somewhere between Howard Stern and St Francis of Assisi on the scale of morality.” If at present he has more in common with the “shock jock” radio host, there is still a chance to scale the heights of St Francis. Only he will have to leave a lot of that blood and gore behind. Amongst other things.

Carolyn Moynihan is deputy editor of MercatorNet.

Hype or right?

Did the media overplay swine flu or perform a valuable public service?

Well, if there is a modern-day proof that the world entered swine flu hysteria, it is that the social media site Twitter was registering 10,000 tweets or posts per hour regarding the porcine influenza. Much of the information on Twitter and other social media sites, according to experts, is gossip and speculation. I don’t think that should bother us; it sounds like the chatter at a bar. In some ways, that is what these social media are: online bars where people can chat while drinking beer at home.

Yet a report in the Sydney Morning Herald on April 30th opined that “Aiding the swine flu scammers is the persistent rumours and fear-mongering that is spreading across social networking sites such as Twitter, on which swine flu has been one of the hottest topics this week.” You see, it seems some people are trying to cash in on the buzz about swine flu with promises they can help you avoid getting it.

While that kind of behaviour is hardly ethical, you’ll excuse me for stating the obvious here, but caveat emptor. If you are taking medical advice from a website that amounts to a personal blog scrunched into 140 characters, you deserve what you get; which likely won’t involve swine flu but rather a giant sucking sound as some online scammer empties your wallet in exchange for a “swine flu prevention kit.”

Almost as persistent as the stories about swine flu have been the stories that the media is overplaying the severity of this flu outbreak. The Guardian ran several pieces over a number of days at the end of April about how much the media was overdoing this story. “Mad journalism disease is now raging through the media,” declared Simon Jenkins. Frankly, I think the media is hyping the fact that they think the media is hyping this up. Does that make sense to you? It doesn’t to me either, but then again neither does the media navel gazing we’ve been subjected to. Oh look, now I’m joining in.

Simon Tisdall, also at The Guardian, seems more concerned with the fact that while we chat, or at least were chatting, endlessly about swine flu, tragedies such as the wars in the Congo and Somalia and the oppression in Burma and in China were being ignored. Apparently Mr. Tisdall never heard the famous line from the actor and comedian Mel Brooks: “Tragedy is when I cut my finger, comedy is when you fall into an open sewer and die.” That’s Mr. Brooks’ rather harsh way of saying that we really care most about what happens to us. In the early days of the swine flu scare, the possibility of a pandemic virus replicating the Spanish Flu of 1918 was what concerned many people; just look at all those tweets on Twitter, the posts online and the emails from your long lost cousin.

I’ve been making light of some of the reaction to a very serious issue because some of the reaction has been downright silly. Far from over-hyping, I think doctors and journalists (for the most part) played this one right. Yes, there were some trumped up stories, some less than stellar coverage, but on the whole, I think we got this one right.

When an official with the World Health Organization announces that they are moving to a “Phase 5” of pandemic alert, that is news; news that needs to be passed on to the public. With 25 countries now reporting 2,500 confirmed infections, I wonder what the reaction would have been from the public, or the men at The Guardian, had the media chosen to dismiss the flu outbreak in Mexico as “nothing to worry about.”

I’ve spent several days at the Health Canada headquarters in Ottawa for the regular press briefings from our Canadian Health Minister, Leona Aglukkaq, and Chief Medical Officer of Health, Dr. David Butler-Jones. When I asked Canada’s top doctor whether we in the media were making too much of this flu outbreak and were just scaring people, he said no. While the good doctor took some exception to the TV images of people wearing masks (mostly because he says they won’t protect you), Dr. Butler-Jones said the interaction with the media was positive.

We’ve had some important pieces of information come out of those briefings and similar ones conducted by public health authorities around the world. As sales of, and inquiries about, Tamiflu and other anti-viral drugs grew, Dr. Butler-Jones and his colleagues were able to tell the public, through the media, that the use of Tamiflu now by the general population would do more harm than good.

There was the ability to have real medical experts explain that pandemic, in its proper medical usage, meant that an infection was widespread geographically and had no bearing on the severity of the disease. We could calm people’s fears by noting that, while this virus was new, unknown and therefore to be feared, thousands around us die from regular flu each year.

With death and infection rates seemingly higher in Mexico, the question arose, is this the same virus? Does Mexico have a more virulent strain? When Canadian scientists announced on May 6th that they had mapped the genetic code of the virus taken from samples gathered in Mexico, Nova Scotia and British Columbia, they were able to confirm that it was the same virus, reacting differently in local populations.

My favourite pieces of advice from those briefings were the kindergarten-like recommendations to wash your hands regularly and to sneeze into your arm, not your hands; simple steps that we all know and yet all forget. It cannot be known whether any cases of flu were prevented by these simple steps put forth by health officials and repeated ad nauseam by folks like me in the media. Perhaps, without the media attention, there would have been more infections, more deaths; this “over-hyped” outbreak would have been much worse.

Brian Lilley is the Ottawa Bureau Chief for 1010 CFRB radio in Toronto and CJAD 800 in Montreal. He is also Associate Editor of MercatorNet.

Torture: do ends justify means?

Former Vice-President Dick Cheney says that torture saved American lives. Is that a good enough reason for authorising it?

With the disclosure of confidential memoranda discussing the permissible limits of aggressive interrogation of terrorist suspects, the Obama administration has thrust the issue of torture back into the public domain.

In response, former Vice President Dick Cheney proposed releasing further government documents that, he claims, will reveal the fruits of these techniques, which he deems to have been successful in safeguarding homeland security.

As I understand Mr Cheney, he seems to be arguing that even if the subject practices were something like torture, and they prevented a serious terrorist attack on the United States, this would be justification for having used them. While I have no particular objection to releasing this additional material, I do not find this line of moral reasoning compelling at all.

The former Vice President is arguing, essentially, that the ends justify the means.

Others have taken up this argument with gusto from the opposite perspective. This past week National Public Radio ran a story on interrogation techniques that worked and those that did not. It compared Army interrogation techniques to the CIA’s harsher approach and found the former to be more effective. I have no idea who has the better of this argument as a factual matter. But, again, this line of reasoning begs the fundamental moral question as to the licit or illicit nature of "enhanced interrogation". I certainly hope that the morally benign approach is more effective, but it really is irrelevant to the question of the moral status of torture per se.

I have long resisted the idea that American officials would sanction torture for any reason. As a youngster I perceived that it was always the Gestapo or the KGB who did such things. I also recall the debate in France over torture during the Algerian War which tore that country apart and, at least to some degree, contributed to its withdrawal from Algeria.

That "the ends never justify the means" is one of those foundational principles drilled into any person who has had a morally serious education, evidently an increasingly rare thing in the United States these days. Certainly circumstances can lead to a sympathetic or indulgent attitude to any given situation or person utilizing intrinsically evil practices; but that is a long way from justification, approval or the setting of basic government policy.

The toughest case for me was the bombing of Hiroshima and Nagasaki. My father was stationed in Europe, waiting to ship out to the Pacific, when those bombings occurred. Many defend these actions to this day on the grounds that the bloodshed avoided by not having to invade Japan was well worth the cost.

Again, while sympathizing with the agony of the decision facing President Truman, I never could reconcile myself to the mass destruction of civilian, noncombatant populations, no matter how militarized Japanese society may have been at the time. I simply could not reconcile such actions with any version of the Just War Theory dating back to St Augustine. It was simply beyond the pale.

On reviewing the list of interrogation tactics and governing practices sanctioned by the US Justice Department back in 2002, I find waterboarding to be the practice that clearly qualifies as torture, for me and for most people.

For several years I have harbored an abhorrence of waterboarding, having heard from Viet Nam veterans of its use during that war. More recently, Christopher Hitchens wrote a piece in Vanity Fair in which he tells of submitting to waterboarding himself to gain an existential insight into the practice. Hitchens is adamant in maintaining that waterboarding does not simulate drowning. It is drowning. I think he has that right.

Considering slapping, "walling", sleep deprivation and other tactics, tough practices all, I could not say for sure that such techniques amounted to torture in an objective sense of the term -- until I read how these techniques, as well as waterboarding, were used multiple times over a prolonged period. The practices relied on the fear and uncertainty of the subject on the receiving end of an extended barrage of such practices. They fell short, say, of branding or electrocution or other horrendous things that human beings have done to each other. Still, I would be extremely upset if an American soldier were subjected to similar treatment at the hands of the enemy. What’s sauce for the goose is sauce for the gander.

Waterboarding seemed to me to comport with what most human beings perceive or understand to be "torture". Reasonable people will disagree on many of the other techniques, depending on their severity and intensity of application; but waterboarding is the real thing.

Without getting into the legal complexity of how we as a country define torture, one can say that most citizens presume it to be not a mere municipal matter but one of moral substance.

Which brings us back to ends and means. Saving a city from a large-scale terrorist attack is a good thing. However, does it justify serious, inherently immoral or intrinsically evil means to achieve that end? Cheney and others who support him on this issue seem to think so without saying so. They fail to make explicit their view that the ends justify the means given their focus on showing the benefit, or lack of benefit, of "enhanced interrogation" in terms of successfully avoiding an attack.

I, for one, would have preferred hearing the former Vice President engage the question of what is, or is not, "torture" rather than whether or not it serves a utilitarian function. Assuming that the ends can redeem an immoral means is dangerous, a slippery moral slope which can cause otherwise sane people to justify the most horrendous practices. Why stop with torturing the subject at hand? What if you could lay hands on a terrorist’s wife and children? Would it be all right to subject them to abuse, mistreatment, torture or even death to bring pressure to bear on a recalcitrant suspect? Where, as they say, does it all end?

There is hardly an inhumane or immoral act which cannot be justified for some supposedly greater good: carpet bombing of cities, euthanasia, abortion, abridgement of civil liberties, lying under oath. In this sense torture is no different. You may argue about what is or is not torture, but you cannot justify the thing itself without abandoning the fundamental principles of a just and moral social order.

G. Tracy Mehan, III, served at the US EPA in the administrations of both Presidents Bush. He is a consultant in northern Virginia and an adjunct professor at George Mason University School of Law.

Global warming: been there, done that

Australian geologist Ian Plimer says that the planet has warmed and cooled many times before. And humans aren't to blame.


The group which shared the 2007 Nobel Peace Prize with Al Gore, the Intergovernmental Panel on Climate Change, says that it is "very likely", ie, 90 percent sure, that global warming is due to increased greenhouse gas emissions generated by man. But "very likely" still leaves room for some uncertainty, doesn’t it?

So when I looked at the cover story of last week’s Nature, I thought that I might see coverage of that 10 percent of unexplained observations and alternative hypotheses. To my surprise, there was none. Instead, there was a windy editorial, "Time to act", which says that the challenge of winding back global warming seems all but insurmountable. This was accompanied by articles headlined, "A burden beyond bearing", "Too much of a bad thing", "Warming caused by cumulative carbon emissions towards the trillionth tonne", and "The worst-case scenario".

It was more like a goosepimpling special feature on asteroid collisions in the London Sun or the New York Post than the world’s leading science journal. But respectable tabloids always tuck in a brief comment from a sceptic. Nature had none. Did its editors have no misgivings at all about the righteousness of their cause?

I confess to being a complete ignoramus about global warming and climate change. But I do like to read both sides of the story. And when experts insist that there is only one side and that I should sign on the dotted line without reading the fine print, I feel suspicious. Even ignoramuses have rights, you know.

That’s why I welcomed the chance this week to interview Australian geologist Ian Plimer about his latest book, Heaven and Earth: Global Warming, the Missing Science. Plimer, a professor at the University of Adelaide, is Australia’s best-known geologist. His book has created quite a stir in the media. Leading journalists have lumped him together with anti-Semitic nutters as a climate change "denialist", and colleagues are shredding his claims in the letters pages.

The vehemence of erstwhile friends suggests that something other than scientific truth is at stake here. And if there is, Plimer is well-qualified to ferret it out. He is a sceptic who not long ago wrote a book attacking creationist science with the provocative title Telling Lies for God. He can’t abide humbug and scientific politicking -- and this is precisely what he claims is the matter with the IPCC’s prediction that we are all going to fry unless we radically reduce our reliance upon fossil fuels.

In fact, in his view, environmentalism is a religion filling a spiritual vacuum in modern life. He writes:

Both environmentalism and fundamentalist religions foster a sense of moral superiority in the believer. They create a sense of guilt. Our wickedness has damaged our inheritance and, although it is almost too late, immediate reform can transform the future.

It was a happy coincidence that Nature’s splash on climate change coincided with our conversation. He jabbed a stubby finger at the name of Stephen Schneider, the author of "The worst-case scenario". This article envisages hundreds of millions fleeing from cities flooded by a 10-metre rise in the sea level and the extinction of half of known plant and animal species. "An interesting chap," he says. "In the 70s Schneider was telling us we were all going to die due to global cooling. Now he tells us we're all going to die due to global warming."

I am an agnostic about global warming, but as a voter I want to be able to interrogate experts who make decisions that affect my future. Plimer’s book, a 500-page brick with 2,311 footnotes, is just what is needed to assess a scientific "consensus" which is seldom explained, justified -- or questioned. It's long and detailed, but not obscure, with chapters on the history of the earth's climate and on how the sun, geology, ice, water and air each influence the climate. Just what you need to throw hardball questions at true believers.

In Australia the book is selling like hotcakes and it will be published in the US and the UK soon. A German-language edition is on the way. The release date was serendipitous: the exact moment when politicians and taxpayers are shaking empty piggybanks and wondering if they can afford climate mitigation schemes.

If you are an ignoramus like me, the credibility of global warming is supported by a few inconvenient truths. What sticks in my mind are these: temperatures have been rising steadily throughout the 20th century; islands in the Pacific are sinking as the seas rise; the Arctic ice pack is shrinking; and industrial activity is the main source of CO2.

Well, it turns out that none of the above is unambiguously true.

About rising temperatures: even though industrialisation began to add CO2 to the atmosphere in the early 19th century, the earth cooled down between 1940 and 1976, warmed from 1976 to 1998, and has been cooling down since 1998. I'll let the experts quibble over the details. What is clear about the record is that the record is not clear.

Sinking islands. One factor I never thought of is that the surface of the earth rises and falls. More than 150 years ago Charles Darwin showed that coral atolls grow on top of sinking volcanoes. As for Tuvalu, the Pacific island nation which is in danger of sinking under the waves, the land beneath it is subsiding and the local ecology was trashed when US Marines quarried the coral for a World War II airstrip. Its problems are real, but not necessarily due to global warming.

Temperatures in the Arctic rise and fall mysteriously. The Arctic was considerably warmer between 1920 and 1940. Temperatures have risen in recent years, but on August 11, 2008, the area of the ice pack was 30 percent greater than a year before.

And industrial activity is a very, very, very minor factor in generating CO2. Plimer points out that the atmosphere only contains 0.001 percent of the total carbon in the top few kilometres of the planet. Furthermore, I was fascinated to learn that earthquakes and volcanoes are a major source of CO2. About 85 percent of the world's volcanoes are under the sea. Their CO2 emissions and warming effects were not included in the IPCC reports, he says.

But can't anthropogenic CO2 push us past the "tipping point"? "Tipping points are a non-scientific myth," snorts Plimer. And indeed, the first I ever heard of tipping points was in a best-seller by Malcolm Gladwell. If climate science is scavenging in the rubbish bins of pop sociology for explanations, you really do have to ask some questions.

Plimer makes two simple and challenging points. First, climate is always changing. In the past, the earth has been both much colder and much warmer than it is today. These changes are exceedingly difficult to understand, let alone to attribute to particular causes. Second, the sun is the single greatest cause of fluctuations in the heat of the earth. Very small changes in solar output have a profound effect upon temperatures. It is the sun, he maintains, not CO2, that is the single greatest agent of climate change.

As he wrote in a recent newspaper article:

In the past, climate change has never been driven by CO2. Why should it be now driven by CO2 when the atmospheric CO2 content is low? The main greenhouse gas has always been water vapour. Once there is natural global warming, then CO2 in the atmosphere increases. CO2 is plant food, it is not a pollutant and it is misleading non-scientific spin to talk of carbon pollution. If we had carbon pollution, the skies would be black with fine particles of carbon. We couldn't see or breathe.

What about criticism from colleagues? Plimer isn't worried. "You can count the number of scientists who are critical of me on a sawmiller's hand," he told me, adding that nearly all geologists agree with him. I sensed a certain professional scorn for anaemic nerds who massage computer models of climate under fluorescent lights instead of getting sweaty and sunburned fossicking for strange rocks.

"The reason I put this book out," he says, "is to start a debate. The fact that I've now flushed out a few scientists to criticise me in public is wonderful because we've never had a [scientific] debate. Consensus is a word of politics; it's not a word of science."

Plimer, a man who has spent much of his life in outback mining towns, complains that many of his colleagues are smug elitists. "The reason this book has been a publishing sensation is that a lot of scientists in the media have treated their reading audience with absolute disdain," he says. "They've spoken down to them; they've been arrogant. The average punter might not have the education that you and I have, but he is not stupid. He knows there's a smell, even if he can't tell where the smell is coming from."

Without a lot more study, I don’t feel competent to judge whether Al Gore or Ian Plimer is correct about the urgency of global warming. Elitists may be insufferable, but often they’re right.

But contrarians are not always wrong. Remember Barry Marshall, another rough-hewn Aussie? Just a pain-in-the-proverbial guy with a crackpot theory -- some nonsense about bacteria causing stomach ulcers, not spices and stress. His colleagues thought he was a quack. Drug companies sneered. You know what? He was right. In 2005 he won the Nobel Prize in medicine.

By: Michael Cook, editor of MercatorNet. Heaven and Earth is published by Connor Court.

Ethics in a time of swine flu

Even if the latest health scare fizzes, we still need to prepare for a global calamity.

Nothing inspires horror more than a plague. The ancient Greek historian Thucydides gives a horrifying description of Athens in the grip of the epidemic which killed Pericles and perhaps one-third of the population. In more recent times, even Liberty City in the video game Grand Theft Auto appears to be plague-stricken. When bubonic plague hit the Indian city of Surat in 1994, 300,000 fled, spreading chaos throughout India.

Just as earthquake aficionados speak of the "Big One" in southern California, doctors fear that a "Big One" will decimate the planet. A 2006 study in the British medical journal The Lancet estimated that the next global influenza pandemic would kill 62 million people, with 96% of those deaths occurring in low-income and middle-income settings. Displaced populations, such as refugees, would be especially at risk.

Is swine flu the Big One? Almost certainly not, as it seems much milder than first feared. Nonetheless, thanks to air travel, the viral disease has spread with incredible rapidity and cases have been reported around the world -- a sobering reminder of how vulnerable we are. Public health experts recall the 1918-1919 flu pandemic, which killed at least 40 million people around the world, and possibly as many as 100 million.

Epidemics present governments, hospitals and doctors with a number of tough ethical dilemmas. Here are some which have emerged from the latest global scare:

• Vaccines may need to be rationed. The global capacity for manufacturing vaccines is limited. Even with the latest technology, according to The Economist, drug companies could only make enough to cover 10 percent of the world’s population. Only nine countries, most of them in Europe, have enough capacity to supply even their own population. What about people in the developing world? The last time there was a swine flu outbreak in the US, back in 1976, the government even banned the export of the vaccine.

• Will there be enough ventilators to help seriously ill patients breathe? If hospitals are inundated they may not have enough machines or doctors qualified to run them. It may be necessary to remove patients who won’t survive to save patients who probably will. And if two patients have an equal chance of survival, who gets the machine? The younger one? The non-smoker?

Guidelines worked out last year in New York State recommend that decisions like these should be made by "triage officers" rather than the doctors and nurses who are treating the patients. Otherwise the stress could be corrosive. They also recommended excluding people who have bad hearts, metastatic cancer with a poor prognosis, or end-stage organ failure.

• Governments are capable of using public health as a smokescreen for repression. The Egyptian government, for instance, has ordered the destruction of all 300,000 pigs in the country. It did this even though no Egyptians were ill and there was no proof that the disease was transmitted by pigs. But pig owners in the largely Muslim country are overwhelmingly Coptic Christians. According to the New York Times, the Christians see this as another expression of Muslim hostility.

• The weightiest decision will be whether to forcibly isolate and quarantine suspected cases. During the 1918 epidemic, Americans tended to stay home rather than risk infecting others. But will a less conformist generation be as cooperative? "The outbreak of severe acute respiratory syndrome (SARS) in southeast Asia and the deliberate release of anthrax in the USA have shown that modern societies do not take kindly to outbreaks of infectious disease and they are prone to panic very quickly," a scientist observed after the SARS epidemic in the journal Lancet Infectious Diseases.

At the moment, according to guidelines from the US Centers for Disease Control and Prevention, mandatory quarantine measures for Americans kick in only when the case fatality rate rises above 1%. But this means that a million people would have to die before public health authorities are given police powers. That seems absurd.

• Should healthcare workers be forced to care for patients? During the 2003 SARS epidemic, a number of healthcare workers were infected. The Italian doctor who identified the virus died. Do they have a right to withhold their services to protect their own lives and their own families? Or should they soldier on?

• How harshly should other measures be enforced? Should people be arrested or fined for not wearing masks, for not closing schools, or for not accepting social distancing measures? Even The Lancet this week was wagging the finger at sniffling readers. "Every member of the public has a part to play in limiting the risk of a full-blown pandemic. Vigilance, and not alarm, is needed, with readiness to self-isolate oneself at home if an influenza-like illness develops."

Governments will have to balance competing values – autonomy, civil liberties, transparency, due process and capacity to harm. There is a great danger of harsh and heavy-handed application of preventative measures by over-zealous bureaucracies. Remember Operation Clean Sweep in Outbreak, the 1994 Hollywood thriller about an Ebola-like virus? Gung-ho military types nearly obliterate an infected town with a huge fireball to save the rest of America.

• A related problem is stigmatisation. Mexicans have been typecast as “Typhoid Marys” and subjected to humiliating treatment, even by other Latin American countries. The worst incidents happened in China, where dozens of healthy people with Mexican passports were detained and isolated. Cars from Mexico City were reportedly stoned in the state of Guerrero by locals who feared they were carrying the contagion. The disease may not even have originated in Mexico; California and Pakistan have been mentioned in the media as possible sources. It’s human nature, of course, to link diseases with foreigners, and governments need to combat it.

• How much money should be spent on creating a flu vaccine? With limited manufacturing capacity, would the money be better spent on combatting the normal winter flu -- which also kills people -- or on designing a new vaccine which might not work?

• And – not the most pressing issue, to be sure -- what is the name of the disease? The Deputy Health Minister of Israel has announced that swine influenza A (H1N1) will be called "Mexico Flu" rather than swine flu as pigs are not kosher. Names are always political.

Epidemics bring out the worst – and the best – in people. Our time to confront "the Big One" might not have arrived, but we need to prepare our ethics as well as our vaccines.

By: Michael Cook, editor of MercatorNet

The lady is not for decoration

America's leading Catholic woman declines to 'balance' Notre Dame's decision to honour a pro-abortion president.

The University of Notre Dame is used to controversy, and Fr. John Jenkins, like some past presidents of this prestigious Catholic institution, is used to managing it, if not attracting it. His powers of attraction were spectacularly on display during the March 2008 “Vagina Monologues” fiasco, when he allowed vulgarity disguised as art to be played out on campus. His logic? It’s part of the university’s mission “to provide a forum in which multiple viewpoints are debated in reasoned and respectful exchange -- always in dialogue with faith and the Catholic tradition…”

Exactly one year later, he announced with great pleasure that his invitation to President Barack Obama to deliver the 2009 commencement address had been accepted, and that the university would award him its doctorate of laws. His timing was well-calculated. Only after securing Obama’s acceptance did Jenkins let the local bishop know that he’d gone around him to extend that honor in the first place. And a few months before that, he had already secured considerable insurance by naming Mary Ann Glendon to receive the esteemed Laetare Medal at the same ceremony.

All was in place. Yes, a controversy would erupt, no doubt. Some bishops and Catholic faithful, a great many of them Notre Dame students and alumni, would protest. But after the 2008 election that divided Catholics down the middle, Jenkins would showcase a star from one side of the divide, and a luminary from the other. He would be charged with scandal, but he would wash the left hand with the right. Everyone was watching, and he was controlling the picture.

Until last week. After some soul-searching, Mary Ann Glendon said “No” after all.

“The significance of Glendon’s refusal is enormous,” writes Fr. Raymond de Souza in the National Catholic Register. “The most accomplished Catholic laywoman in America -- former ambassador of the United States to the Holy See and current president of the Pontifical Academy of Social Sciences -- has refused to accept Notre Dame’s highest honor. It is a signal moment for the Catholic Church in the United States. It is a signal moment for the Church’s public witness. It may even be a signal moment for Notre Dame.”

It’s certainly a call to accountability. Glendon issued her letter to Jenkins directly and, preserving its integrity, released it to the public. It blazed through the internet immediately.

“Dear Fr. Jenkins,

When you informed me in December 2008 that I had been selected to receive Notre Dame’s Laetare Medal, I was profoundly moved.”

Note the timing, she says from the start. Throughout her letter, Glendon was gracious and unambiguous. She recalled her gratitude that the university had counted her 1996 commencement address among its most memorable. Then she informed Jenkins that his follow-up call in March, telling her about Obama’s role and his honorary law degree, had necessitated a re-writing of the speech she was already planning for the ceremony.

Glendon was once Obama’s professor at Harvard. They no doubt have a healthy respect for each other. But this was about much more than feelings and goodwill.

“First, as a longtime consultant to the U.S. Conference of Catholic Bishops, I could not help but be dismayed by the news that Notre Dame also planned to award the president an honorary degree. This, as you must know, was in disregard of the U.S. bishops’ express request of 2004 that Catholic institutions ‘should not honor those who act in defiance of our fundamental moral principles’ and that such persons ‘should not be given awards, honors or platforms which would suggest support for their actions’. That request, which in no way seeks to control or interfere with an institution’s freedom to invite and engage in serious debate with whomever it wishes, seems to me so reasonable that I am at a loss to understand why a Catholic university should disrespect it."

And yet, Jenkins had attempted to explain it away, saying the bishops only meant that errant Catholics were not to be honored or given a platform by Catholic universities, thus limiting the bishops' teaching to a religious opinion rather than a natural truth to be held by all.

Glendon continued:

"Then I learned that ‘talking points’ issued by Notre Dame in response to widespread criticism of its decision included two statements implying that my acceptance speech would somehow balance the event:

President Obama won’t be doing all the talking. Mary Ann Glendon, the former U.S. ambassador to the Vatican, will be speaking as the recipient of the Laetare Medal.

We think having the president come to Notre Dame, see our graduates, meet our leaders, and hear a talk from Mary Ann Glendon is a good thing for the president and for the causes we care about.

“A commencement, however, is supposed to be a joyous day for the graduates and their families. It is not the right place, nor is a brief acceptance speech the right vehicle, for engagement with the very serious problems raised by Notre Dame’s decision -- in disregard of the settled position of the U.S. bishops -- to honor a prominent and uncompromising opponent of the Church’s position on issues involving fundamental principles of justice."

The White House press office issued a swift response: “President Obama is disappointed by former Ambassador Mary Ann Glendon’s decision, but he looks forward to delivering an inclusive and respectful speech at the Notre Dame graduation, a school with a rich history of fostering the exchange of ideas.”

It was a smokescreen, saying nothing about substantive points Glendon had made, while perpetuating the false idea that this is somehow going to be an “exchange of ideas”. And it implied that the call for a Catholic university not to confer upon a pro-abortion president a doctorate of laws is somehow not “inclusive”.

But neither Obama nor Jenkins has controlled the message since her letter appeared.

“It is to Father Jenkins’s shame that he tried to use Glendon,” writes de Souza. “It is to her great credit that she refused to be used.” Jenkins has invited Judge John Noonan, a former Laetare Medal honoree, to deliver an address in an attempt to fill the void left by Glendon’s absence. The Ninth Circuit Court of Appeals judge will suffice for some people, most notably Fr. Jenkins. Credit him with this: Notre Dame will not award the Laetare Medal this year.

After all, it has already been awarded, and the honour is Mary Ann Glendon’s. She has shown that the witness of a faithful servant stands higher than the office of a powerful institution: the noble dignity of a humble person.

By: Sheila Gribben Liaugminas, an Emmy Award-winning journalist who reported for Time magazine for more than 20 years. She blogs at InforumBlog.com and on MercatorNet.