Book reviews – War on jihad

Preston Jones
The Next City
June 21, 1998


The Decline of Eastern Christianity under Islam: From Jihad to Dhimmitude

by Bat Ye’or (Fairleigh Dickinson University Press, 1996. 522 pages) US$19.95

NEWSWEEK MAGAZINE’S MARCH 16, 1998, issue portrays America’s second-generation, modern young Muslims as the future of the Islamic faith. “In El Cerrito, California, Shahed Amanullah knows it’s time to pray, not by a muezzin’s call from a mosque minaret,” writes Carla Power, “but because his PowerMac has chimed. A verse from the Koran hangs by his futon. Near the bookcases — lined with copies of Wired magazine and Jack Kerouac novels — lies a red Arabian prayer rug. There’s a plastic compass sewn into the carpet, its needle pointing toward Mecca.”

Further on, Power tells us that America’s Muslims — who number between two and six million (no firm statistics exist) — are “taking on stereotypes” and the “status quo.” That is, they are bothered that Islam in its entirety is associated in the popular mind with the terrorist likes of Hamas and Algerian and Egyptian extremists. “By going back to the basic texts,” Power writes, these young Americanized Muslims are “rediscovering an Islam founded on tolerance, social justice, and human rights.”

It would be interesting, indeed, to see how one could go back to 7th-century Arabia and find a faith rooted in philosophical goods — “tolerance, social justice, and human rights” — that were not propounded until the early modern period, and then only in Western Europe. But putting that aside, these American Muslims’ desire to put a human, and humane, face on Islam is admirable, for Islam does get an unfortunate, if understandable, amount of bad press. Yet it should also be said that unless their enterprise is based in reality, unless it is rooted in facts, it will not stand. Which brings us to Bat Ye’or’s The Decline of Eastern Christianity under Islam, whose project is getting at the facts of Islam’s historical relationship with the Christians and, to a lesser extent, the Jews of the Middle East, Eastern Europe, and Northern Africa from the 7th to the 20th centuries.

A Jew born in Egypt, Bat Ye’or has been a British citizen since 1959 and currently resides in France. All of her books, including this one, were written and first published in French. And while she apparently does not relish controversy — she notes more than once that she bears no animus toward Islam — neither does she sacrifice what she thinks is historical truth for the sake of niceness.

In his foreword to this book, Jacques Ellul, who until his recent death was a prominent French legal theorist, theologian, and culture critic, notes that due to political sensibilities (and, one would think, a fear of being thought sympathetic to France’s far-right, anti-immigrant National Front), scholars have been reluctant to speak and write seriously about jihad. Scholars who grappled with jihad usually emphasized its explicitly spiritual components.

In the spiritual sense, jihad is a struggle that each Muslim believer “has to wage against his own evil inclinations and his tendency to disbelief,” and this, writes Ellul, is something with which believers in most religions can identify. But jihad means more, too: From the 7th century to the present one, jihad has often meant literal warfare. “The world, as Bat Ye’or brilliantly shows, is divided into two regions: the dar al-Islam and the dar al-harb; in other words, the ‘domain of Islam’ and ‘the domain of war,’” Ellul continues. “[In Islam the] world is no longer divided into nations, peoples, and tribes. Rather, they are located en bloc in the world of war, where war is the only possible relationship with the outside world. The earth belongs to Allah, and all its inhabitants must acknowledge this reality; to achieve this goal there is but one method: war.” Ellul notes that the Koran does provide for peace with the dar al-harb; in many circumstances, of course, it is best not to wage war. “But this changes nothing,” Ellul writes: “war remains an institution, which means that it must resume as soon as circumstances permit.”

That war is near the heart of Islam should, at least from a historical perspective, not come as a surprise. Muhammad, Islam’s prophet, was himself a military commander; and as Bat Ye’or makes clear, Islam was born in a culture wracked by violence. So while Islam took much of its ethical teaching from the two biblical religions (Judaism and Christianity), the customs of the nomadic tribes of Arabia’s Hijaz, Islam’s birthplace, conditioned Islam’s interactions with non-Muslims. Thus Bat Ye’or writes that when Islamized Bedouins raided the towns of Babylonia (in present-day Iraq) and Syria in the early 7th century, Christians who lived in those places perceived these raids as “no more than the usual predatory activities. But they were mistaken — this was jihad.” This was also the beginning of the astonishing spread of Islam from Arabia to the borders of China in the East and, in the West, to the gates of Vienna, where Islam was checked by Western forces in 1683.

The bulk of this book is dedicated to explaining what happened to Christians and Jews or, in Islamic parlance, to the “Peoples of the Book,” that is, the dhimmis who were subjugated by Muslim rulers. Originally the dhimma was a “protection pact” granted by Muhammad to the Peoples of the Book he had conquered. But before long, this protection became outright oppression. “The dhimma required the humiliation of the dhimmis, who were accused of falsifying the Bible by deletions, distortions, and omissions of the prophecies heralding Muhammad’s mission,” Bat Ye’or writes. “Their persistence in error, regarded as the mark of a diabolical nature, condemned them to degradation.”

Thus does Bat Ye’or seek to modify the conventional wisdom of most general world history textbooks, namely, that Christians and Jews who lived in lands conquered by Muslims from the 7th century onward have for the most part enjoyed relative peace and freedom. “During 13 centuries and on three continents the dhimmi peoples are presented [in most textbooks and by most scholars] as having uniformly and indefinitely enjoyed a status of benevolent tolerance,” Bat Ye’or writes. “Bursts of fanaticism and waves of persecution, when they are not obfuscated, are interpreted as exceptional situations, often attributable to the victims themselves or to foreign [i.e., European] provocation.”

Bat Ye’or does not accept this view, of course, in part because it flies in the face of human experience. “This puerile interpretation of dhimmi life — resembling idealized illustrations — endows Islam with an exceptional aura,” she notes. “This collective paradisiacal condition, which, allegedly, would have encompassed 13 centuries for millions of individuals, has never in fact been experienced by any people, at any period, anywhere in the world — because it is unfortunately incompatible with the human condition.”

In The Decline of Eastern Christianity under Islam, Bat Ye’or also dispenses with the widely accepted claim that Islamic civilization produced great intellectual and political achievements. While that civilization saw many such achievements, she writes, few originated in Islam, and most derived from the learning of the dhimmis. “Jews, Christians, and Zoroastrians . . . taught their oppressors, with the patience of centuries,” she writes. Indeed, from these dhimmis, Muslims learned “the subtle skills of governing empires, the need for law and order, the management of finances, the administration of town and countryside, the rules of taxation rather than those of pillage, the sciences, philosophy, literature and the arts, the organization and transmission of knowledge — in short, the rudiments and foundations of civilization.”

Bat Ye’or does not minimize Islamic civilization — she calls it “vast, rich, [and] complex” — nor does she hide the discoveries of her considerable study. And readers who take the time to peruse the 175 pages of documentation appended to her narrative might find it difficult to argue with her.

Bat Ye’or is equally forthright in assessing the present world — a world in which jihad, in theory and practice, is alive and well. “The Islamist movement makes no secret of its intentions to convert the West,” she observes.

“Its propaganda, published in booklets sold in all European Islamic centres for the last 30 years, sets out its aims and the methods to achieve them. They include proselytism, conversion, marriage with local women, and, above all, immigration. Remembering that Muslims always began as a minority in the conquered countries . . . before becoming a majority, the ideologists of this movement regard Islamic settlement in Europe, the United States, and elsewhere as a chance for Islam.”

Which brings us to this question: Will critically minded North Americans engage Bat Ye’or’s assertions with the seriousness they deserve? I have put this project to my university students, two of whom are Muslims, as follows: There are some brutal passages in the Hebrew Scriptures (in, for example, the Book of Judges), but there are also parts of the same Scriptures (e.g., the Book of Jonah) that counteract the brutal ones. Given this, and combined with what we know of the practice of Judaism throughout the world, one would be hard pressed to prove that violence is near the heart of Judaism. The same can be said of Buddhism and Christianity. Of course, evils have been, are being, and will doubtless be perpetrated by Buddhists and Christians; but only the ignorant or the shallow would say that violence is near the heart of either of these two faiths. Can the same thing be said of Islam?

Bat Ye’or does not seem to think so. Neither is Harvard’s Samuel P. Huntington (The Clash of Civilizations and the Remaking of World Order, 1996) optimistic that Islam can live in relative peace with the dar al-harb (the “domain of war” where Islamic law does not rule). “Even more than Christianity, Islam is an absolutist faith,” he writes. “It merges religion and politics and draws a sharp line between those in the dar al-Islam and the dar al-harb. As a result, Confucians, Buddhists, Hindus, Western Christians, and Orthodox Christians have less difficulty adapting to, and living with, each other than any of them has in adapting to and living with Muslims.” And the rather vitriolic response Richard John Neuhaus, editor-in-chief of First Things, a journal of religion and public life, received from American Islamic organizations after he favorably reviewed Bat Ye’or in October 1997 does little to suggest that she or Huntington is wrong (for a report on the response to Neuhaus see the February 1998 issue of First Things or http://www.firstthings.com/ftissues/ft9802/public.html#Islamic).

But it is still too early to come to firm conclusions about the ability of the dar al-Islam to live at peace with the dar al-harb. My students have not yet reported back to me. In the meantime, we should not avoid tough questions simply because asking them is unpleasant.

To comment, write to PrestonJones@nextcity.com


Response to Preston Jones’s review

Ibrahim Hooper, national communications director, Council on American-Islamic Relations, Washington, D.C., responds: September 28, 1998

Peace

Of course, no one ever saw this letter because First Things refused to print it.

First Things editor owes an apology to Muslims
by Ibrahim Hooper

In his October “Public Square” editorial (“The Approaching Century of Religion”), Richard John Neuhaus left little doubt as to his negative opinion of Islam. While it is common for Islamophobic writers to cast Islam as the “other,” it is quite rare that such views are stated so explicitly. Mr. Neuhaus says clearly: “The chief other is Islam.”

To drive that point home, Mr. Neuhaus uses the convenient journalistic cover of a book review; in this case an examination of The Decline of Eastern Christianity under Islam: From Jihad to Dhimmitude by Bat Ye’or.

It is Bat Ye’or’s, and clearly Mr. Neuhaus’s, contention that Islam was, is, and always will be a threat to “Judeo-Christian” civilization. In this worldview, Islam is a “challenge” that the West is “afraid to understand.” Periods of relative inter-civilizational peace and stability are a “momentary pause” in the permanent jihad against the “infidels.”

Mr. Neuhaus uses the term jihad quite liberally yet fails to offer a definition. Jihad does not mean “holy war.” It means to strive, struggle, and exert effort. It is a central and broad Islamic concept that includes struggle against evil inclinations within oneself, struggle to improve the quality of life in society, struggle on the battlefield in self-defense (e.g., having a standing army for national defense), or fighting against tyranny or oppression. There is no such thing as holy war in Islam, as some careless translators may imply. “Holy war” is rather a loaded medieval concept that did not arise from within the Muslim community.

What is a Muslim to make of claims such as: “Islam’s origins in the customs and values of the Arab Bedouins and of nomadic tribes have left it with the jihad as the only way of relating to the non-Islamic world”? Or what about his description of the Middle East as “a world still steeped in the Arab and Bedouin mindset of the Prophet”? Or, even worse: “Islam’s spectacular spread was brought about by brutal military conquest, rapine, spoliation, and slavery . . .”?

Is this mere ignorance, or does it rise, as I believe, to the level of ethnic and religious hate mongering? How does Mr. Neuhaus explain the following verse from the Quran, Islam’s revealed text: “Those who believe (in the Quran), and those who follow the Jewish (scriptures), and the Christians and the Sabians — Any who believe in God and the Last Day, and work righteousness, shall have their reward with their Lord; on them shall be no fear, nor shall they grieve.” (Chapter 2, verse 62)

Now perhaps, being a Christian who accepted Islam, I was not given the proper Islamic playbook. I may have somehow remained unaware that I am supposed to be in a perpetual state of warfare with my “infidel” relatives. (It is interesting that since accepting Islam many years ago, I have never heard a Muslim utter the Hollywood B-movie word infidel. Perhaps Mr. Neuhaus is misusing the Arabic word kafir, or “one who rejects faith.”) Given his obvious biases, Mr. Neuhaus will not be impressed with the Quranic verse, “Let there be no compulsion in religion.” (Chapter 2, verse 256)

As with other Islamophobes, many of whom are mentioned or quoted in the article, Mr. Neuhaus tosses off the usual disclaimer of not being anti-Muslim. Yet one wonders how anyone who puts Islamic civilization in quotation marks, remarks on “how little that is admired in Islamic civilization is original,” or asserts that “the classical heritage that was presumably preserved by Islam was in fact rescued from Islam by those who fled its oppression,” can avoid such a label.

At this point it would normally be appropriate to mention a long list of contributions Muslims and Islamic culture have made to human civilization throughout the past 14 centuries. I might mention the astrolabe, the poetry of Rumi, the astronomical discoveries of Al-Biruni, al-Kindi’s and al-Farabi’s attempts to establish harmony between faith and science, the Bayt al-Hikma (House of Wisdom) in Baghdad, the founding of Al-Azhar University, the observatory at Jaipur, Ali bin Isa’s treatise on ophthalmology, and the mathematical concepts of Al-Khwarizmi (the word “algebra” is derived from his book Kitab al-Jabr wa’l-Muqabala). This list does not even touch on Islamic art, architecture, medical discoveries, geographical studies, and a host of other contributions to world history and civilization. But of course, since Mr. Neuhaus believes all of these things were really “rescued” from Islam, there is no point in even mentioning them.

Mr. Neuhaus’ religious blinders apparently allow such convolutions of logic and reason as, “Bat Ye’or is at pains not to appear anti-Islamic . . . But the story she tells speaks for itself.” Or this: “However tortured the historical relationships between Christians and Jews, each community is identified by the same biblical narrative . . . Not so with Islam.”

Has Mr. Neuhaus ever picked up a Quran? Apparently not, judging from these bizarre statements. Does he really not know that Muslims believe in and revere Abraham, Moses, Mary, Isaac, Ismail, Solomon, Jesus, David, Aaron, Noah, and many other figures from what he arrogantly and exclusively labels the “Judeo-Christian” tradition?

Perhaps all this quoting of chapter and verse is ultimately pointless to Mr. Neuhaus. He states: “I believe Bat Ye’or and others are right to caution us against delusions; for instance, the delusion that a Muslim-Christian dialogue can be constructed on a basis more or less equivalent to the Jewish-Christian dialogue.”

Here we go again. Islam is somehow uniquely unqualified for inclusion in “our” discussions. This is the Islam of medieval polemics and the Crusades, all boiling oil and scimitars. How sad that a prestigious journal would publish such one-dimensional drivel.

As Professor Malcolm Barber wrote on the subject of the Crusader mentality, “The point is that Islam had to be presented as the enemy. Consequently, Muslim belief had to be disapproved or mocked, and Muslim social behavior distorted and denigrated.” (“How the West Saw Medieval Islam,” History Today, May 1, 1997)

And what does the Catholic Church (Mr. Neuhaus is after all also “Father Neuhaus”) have to say about Islam and the “delusions” of dialogue? In “Vatican Council II: The Conciliar and Post Conciliar Documents” (1992 Edition), we find: “The Church has also a high regard for the Muslims. They worship God . . . They strive to submit themselves . . . just as Abraham submitted himself to God’s plan, to whose faith Muslims eagerly link their own . . . Over the centuries many quarrels and dissensions have arisen between Christians and Muslims. The sacred Council now pleads with all to forget the past, and urges that a sincere effort be made to achieve mutual understanding . . .” Apparently Father Neuhaus is out of touch with church teachings.

Mr. Neuhaus’ views on immigration and the American Muslim community are truly gut churning. He states: “The biggest problem in sight is Islam. People like Ellul and Bat Ye’or (and obviously Mr. Neuhaus) worry about the low-level jihad of Islamic immigration in Europe . . . and about the establishment of Islam in Bosnia.” One wonders whether Mr. Neuhaus would have agreed with those who saw Catholic immigration to the U.S. as a low-level invasion, or whether he would defend the “Christian” Serbs who sought to rid Bosnia of its “Muslim problem.” A chill goes up the spine reading such words.

According to Mr. Neuhaus, the problem of Islam in America is not yet at a critical stage. He states: “The situation in the U.S. is very different. There are probably no more than two million Muslims in this country [the real figure is more than double that], and half of them are native-born blacks . . . at present Muslims here pose no threat to the Judeo-Christian identity of the culture . . .”

Now what are we to make of this statement? Are “native-born blacks” more Islamically docile and therefore no threat to “Judeo-Christian” American civilization? Or are African-Americans so low in Mr. Neuhaus’ political hierarchy that they are hardly worth mentioning? Does Mr. Neuhaus advocate restrictions on immigration from Muslim areas of the world?

Mr. Neuhaus’ casual dismissal of African-Americans and American Muslims is both insulting and inaccurate. American Muslims have seen tremendous growth and development in the past 30 years. The number of mosques and Islamic centers now approaches 2,000 nationwide. Muslims have initiated drug eradication campaigns in the inner cities, participated in disaster relief efforts in the Midwest (including sending donations and volunteers to Oklahoma City after the bombing of the Murrah Federal Building) and are currently engaged in voter registration drives and grass-roots political organizing around the country.

Muslims are also starting to stand up to the discrimination and bias they face daily in the workplace, in schools and in the media. The Council on American-Islamic Relations (CAIR), the organization I represent, began documenting anti-Muslim incidents in the wake of the Oklahoma City attack, when Muslims were unfairly linked to that crime. In the first few days after the bombing, CAIR recorded more than 200 incidents of harassment, threats and actual assault. One woman even lost her near-term baby when terrorized by unknown assailants who attacked her house.

CAIR’s 1997 report on the status of American Muslim civil rights detailed a three-fold increase in such incidents. Hysterical and inaccurate commentary has been shown to be a major causal factor in this trend toward stereotyping and scapegoating Muslims.

Mr. Neuhaus concludes by saying that he has tried unsuccessfully (what a surprise) to reach out to Muslims in the past. “As an institute and a journal, we have over the years tried to engage Muslims in the conversations of which we are part . . . It is an embarrassment that . . . the Muslim participation is almost nonexistent.” Then he explains why no suitable Muslim articles have been accepted: “Muslim authors . . . are typically so defensive, or so belligerent, or so self-serving — or all three at once — that they would only compound misunderstandings.”

The stark racism, xenophobia, and bigotry that this statement, and the other statements outlined above, expose should have leapt off the page at any reasonable editor.

At the beginning of this article, I called Mr. Neuhaus an Islamophobe. Let us now review, based on the evidence of his own words, whether or not he deserves that title.

According to a report published by the Runnymede Trust in England, there are seven features of Islamophobic discourse. Does Mr. Neuhaus exhibit these traits?

1) Muslim culture seen as monolithic and unchanging. — Check.
2) Claims that Muslim cultures are wholly different from other cultures. — Check.
3) Islam perceived as implacably threatening. — Double check.
4) Claims that Islam’s adherents use their faith mainly for political or military advantage. — Check.
5) Muslim criticism of Western cultures and societies rejected out of hand. — Check.
6) Fear of Islam mixed with racist hostility to immigration. — Triple check.
7) Islamophobia assumed to be natural and unproblematic. — Naturally.

Mr. Neuhaus owes an apology to the Muslim community, to his superiors in the Catholic Church, and to his readers.


Saturday Night big prize winner

Chad Skelton
Globe and Mail
May 23, 1998

Toronto – The two heavyweights of Canadian magazine journalism, Saturday Night and Toronto Life, came first and second at last night’s National Magazine Awards. But Vancouver Magazine was the success story of the evening, coming in third overall and being named Magazine of the Year.

Saturday Night won four gold medals and four silver at the 21st annual awards, followed by Toronto Life with three gold and three silver. Vancouver Magazine won three gold and one silver.

The success of Vancouver Magazine was even more surprising because it received its four awards with only seven nominations, compared to 37 nominations for Saturday Night, 25 for the Quebec biweekly newsmagazine L’actualité and 21 for Toronto Life.

L’actualité came in fourth with two gold and three silver and the technology and culture magazine Shift was fifth with two gold and two silver. Cottage Life, Maclean’s and This Magazine won three awards each. En Route, Air Canada’s in-flight magazine, won two gold, and new magazines Elm Street and The Next City each won one gold medal.

The President’s Medal, a “best-of-show award” recognizing the best article of the year, went to Beyond Sensible, by Carsten Stroud in the Financial Post Magazine – a piece about the joy of riding the new Cadillac Catera (the article also won the gold for humour). The award for best new magazine went to Chirp, a magazine for preschoolers from the publishers of Chickadee and Owl. And the Alexander Ross Award for Best New Magazine Writer went to Nancy Baron for her articles in The Georgia Straight.

Allen Abel was a double winner last night with two gold, one in travel journalism for This Night Will Never Come Again, an article in Saturday Night about New Year’s celebrations in Hong Kong before the takeover, and another for sports writing in Saturday Night for Once They Were Rookies, about how the lives of three minor-league outfielders took very different paths.

The night’s awards were not limited to career journalists. Dr. Jacalyn Duffin, a haematologist, won a silver award in the Science, Health and Medicine category for her Saturday Night article Medical Miracle, in which she described testifying at a Catholic tribunal for the sainthood of Marguerite d’Youville about the mysterious remission of a leukemia patient.

Winners associated with The Globe and Mail include: a gold for sports columnist Stephen Brunt for a profile of former boxer George Chuvalo in Toronto Life, a silver for arts writer Robert Everett-Green for a piece of fiction in Queen’s Quarterly, and a silver for Globe arts columnist Robert Fulford for an essay in Queen’s Quarterly. Fulford also won the Foundation Award for Outstanding Achievement, which recognizes the exceptional contribution of one individual to Canadian magazines. The now-defunct Globe magazine Fashion + Design won a gold in still-life photography.

Magazines winning one gold and one silver were: Financial Post Magazine, Outdoor Canada and Toronto Life Fashion. Winning one gold each were: Canadian Business Technology, Canadian Geographic, Elm Street, Fashion + Design, The Next City, The New Quarterly and Prism International. Equinox and the Queen’s Quarterly won two silvers each. And one silver was won by Canadian House & Home, Chatelaine, Elle Québec, Gardening Life, The Malahat Review and Toronto Life Gardens.

In all, 27 magazines won over 60 awards worth roughly $64,000 in prize money. The awards gala at the Sheraton Centre in downtown Toronto was hosted by CBC Newsworld celebrity Pamela Wallin. The National Magazine Awards Foundation received a record 1,930 entries this year, of which 250 articles were nominated.


Federal-Provincial/Territorial Agreements

Citizenship and Immigration Canada
May 23, 1998

Annex E: Family Class Sponsorship. Canada and British Columbia wish to establish a framework for exchanging information and for undertaking collaborative measures to reduce the incidence and cost of family class sponsorship default.



Discussion group on Dying Well (part 2 of 2)

Harry van Bommel
The Next City
March 21, 1998

Letters

1. Ian Welsh, Toronto, responds: April 21, 1998

2. Harry van Bommel replies


Ian Welsh, Toronto, responds: April 21, 1998

I read your article on palliative care and euthanasia with great interest. I am sure you will receive many responses along the lines of the one I am giving. To wit: hospitals do not give adequate pain care. I spent three months in hospital about three years ago. I initially went in for ulcerative colitis, an inflammatory disease of the bowels. The first two weeks were hell. I was given Tylenol Three pills to control the pain. By the end of the first week I was refusing them because they had no effect at all, and I don’t believe in taking medicine if it does no good. I complained, I stifled my screams, I made myself walk to the bathroom every fifteen minutes despite the fact that each time was agony. Towards the end I told the nurses that I needed more pain killers. I would have asked the doctors, but I hardly ever saw them since my main doctor had gone on holiday and the colleague he had left in charge was rarely seen. At the end of those two weeks I came to a day where I was so delirious with pain that it was becoming its own anaesthetic. I tried to stop the nurses from touching or moving me in any way. One nurse became concerned and forced some doctors to look at me. They gave me fifteen units of morphine IM, a very powerful dose (IM means injected into a muscle), but it had no effect at all. They then rushed me to surgery and discovered that I had appendicitis as well as ulcerative colitis and an infected IV site. The appendix would have burst in a couple days, surely killing me in the state I was in at the time.

Pain relief was better for the next couple of weeks, then my surgeon returned from his holiday and decided to take me off my painkillers. I told him that there had been other complications. He didn’t listen. Two weeks were enough. There then ensued what I have called the “Great Painkiller War.” My back doctor kept putting me on pain killers, and my surgeon kept taking me off them. I never knew whether I would have enough pain relief or not. At the same time I was suffering from extreme and untreatable nausea, and I couldn’t move due to inflammation in my back. I couldn’t even turn over in bed. The surgeon thought I was faking it, the back doctor knew otherwise and the nurses were divided in opinion. This sort of jockeying went on for almost a month as my condition swayed back and forth due to repeated infections. (I had been on immunosuppressants far too long, leaving me with almost no resistance to infection.)

The “Great Painkiller War” was finally ended by my original doctor (who had handed me over, since my problems were no longer in his area of expertise). I was complaining of symptoms for which no cause had yet been found. He happened to visit and I finally cracked and started crying, telling him what was going on. He immediately wrote in my file that he was in charge of my pain medication, that he was to be called, no matter the time of day or night, if the medications were changed, and that he would change them back immediately. He put himself on the line for me since if I had been faking it he would have lost great face (a couple days later a test came back showing a problem which perfectly matched the symptoms I had described), and I will always be more grateful than I can say. That doctor could ask me for anything and I would not refuse him.

So much for the story. Some comments:

1) It is not my experience that any level of pain can be controlled. Some pain seems to be beyond narcotics.

2) I have been on every anti-nausea medicine my doctor knew of (he is a gastroenterologist). None of them worked. I don’t know that really severe nausea can be controlled short of unconsciousness.

3) There is no limit to how much pain one can feel. Every time I thought it couldn’t get worse I was proved wrong.

4) Despite that, the key thing about anti-pain medicines is not that they stop you from feeling pain — beyond a certain level they don’t — but rather that they make you not care about it very much.

5) Mental pain is worse than physical. I never used to believe this but some psychotic episodes caused by the medicinal steroids I was on quickly proved it to me.

6) There is a strong moral disapproval among many health professionals, including nurses, of using narcotic painkillers. Some nurses expressed this disapproval by taking their own sweet time in delivering pain medication. Perversely, of course, this leads patients to ask for the medication before they need it, anticipating the delay, which makes the nurses think they really are addicted.

7) The fact that only doctors can prescribe pain medication is a problem. Nurses are very reluctant to disturb doctors, and doctors other than the ones in charge of a patient are very reluctant to prescribe pain killers. Interns, junior and unsure of themselves, who are on call at night, are particularly unwilling to. I have waited for hours, suffering unbelievably, because my condition had changed for the worse and my doctors could not be found. Sometimes nurses acted on their own initiative (to my gratitude) but nurses are very reluctant to do so. They can get in a great deal of trouble.

8) I was on morphine or Demerol, injected or given intravenously, for two and a half months while in the hospital. After leaving I was on powerful morphine pills for about a month. Stopping using them cost me a few nights of sleep when I was about a month out of the hospital. Narcotics soften things but I didn’t find stopping using them all that difficult — especially compared to what the attitude of some doctors and nurses had led me to think. If, however, they had been withdrawn prematurely, I am sure the withdrawal would have been hell.

9) I never asked to be killed since I didn’t want to wind up being treated like a nutcase and I didn’t want to disappoint my main doctor (who saved me from much of the pain, as I described) who was a “fight to the end” sort of guy. I certainly wished for it. I even sincerely prayed for it. I’ve never prayed so hard for anything.

10) The worst thing with pain is, as you say, loss of control. The other thing is uncertainty. A known quantity of pain can be dealt with. When you never know how much pain you are going to be in from day to day, or even hour to hour . . .

11) About six months later, in a new city, which I had gone to in order to spend my convalescence with my friends, I went to find a new doctor. The first doctor I saw took one look at the reference letters I had to show him (from my various doctors), saw the types of medication I had been on, and told me that under no circumstances would he ever put me on any narcotic painkillers. I was so angry I was trembling. The bastard was saying that to “save” me from hypothetical addiction he was willing to let me suffer any level of pain. I would wager money that he had never experienced real pain himself. I walked out immediately. I will never have a doctor who does not meet the criteria of trusting me and being willing to control my pain and suffering if it is humanly possible.


Harry van Bommel replies

You are right that I have heard, and will continue to hear, far too many stories like yours. Whether from my own parents, grandfather, and friends with serious illnesses, or from stories told to me by people who read my work or talk to me after a presentation, the conclusions I hear are similar to the ones you draw. I am not a pain management expert, but I have talked to many. What I have learned is that acute pain and chronic pain are different from terminal pain. There seems to be little literature on how to treat chronic pain effectively. Acute pain seems easier to treat because of its relatively short duration (e.g. after surgery). Terminal pain, likewise, is often easier to treat than chronic pain because its causes are easier to determine (e.g. in end stages of cancer) and the palliative philosophy of pain management gets beyond the myths of addiction, tolerance, and hallucinations that still prevent so many physicians and nurses from believing patients.

You are right that finding the right doctor makes all the difference in the world. You are also right that uncertainty is one of the most debilitating factors in controlling pain when pain management is not consistently done well. Until recently, terminal patients were given pain medication on PRN (as-needed) orders, which meant they had to experience pain before getting more medication. All this system did was ensure that patients anxiously waited for pain to return before anything was done about it. It was inhumane, in my opinion. Good pain control means getting the right drug, in the right amount, in the right way (usually orally or by suppository), and at the right time (i.e. regularly, without pain having to return first). Patients are often given extra medication by their bed in case of “breakthrough” pain so that they do not have to wait for a new medication order. This method is much more civilized and humane. It has not been translated into chronic pain fields as much as it obviously should be. There is far too little public and political demand for better pain management of all types of pain and not enough research into improving pain management for all types of pain. It is not “sexy” enough a research project because it does not deal with “curing” something. It is ironic that almost every Canadian suffers pain, swallows some ineffective medication, and continues (until the pain is overwhelming) to assume that this is how it should be.

I hope that you, other readers, other pain sufferers, and I can change some of that by being much more vocal in our demands for improvements in pain and symptom control. You cannot heal easily when you suffer overwhelming pain or uncontrolled symptoms like nausea. I don’t have to have a medical degree to figure that out.

Lastly, you are also correct, in my opinion, that mental pain is worse than physical. In my article I mention that about three to five per cent of end-stage cancer pain is very difficult to treat because of emotional or spiritual pain (i.e. mental pain). Sometimes induced sleep for short periods provides the mind enough relief to deal better with all types of pain. It is an imperfect answer for now, but certainly better than unrelieved, overwhelming pain.

Things in medicine only change (e.g. maternity care) when patients demand change; not when doctors initiate change. We need more stories of failure and more stories of how success has been achieved. The difficulty with this issue is that patients with chronic or terminal pain are least likely to stand up for themselves or use any of their remaining emotional energy to make the issue more public and political.



Discussion group on Dying Well (part 1 of 2)

Harry van Bommel
The Next City
March 21, 1998

“WE NEED TO SCHEDULE A TIME FOR OUR NEXT MEETING,” says the chair of the department. “How is May 11th at ten o’clock?”

“Oh, I’m sorry Johan,” says Peter. “That’s the day my father is scheduled to die. Can we do it later in the week?”

This kind of conversation does not occur in Canada today. But it does occur regularly in the Netherlands, where euthanasia and assisted suicide have become the stuff of everyday life. Dutch criminal law does not prosecute physicians for taking a patient’s life, let alone for helping a patient take his own. In the process, Dutch attitudes to life have been transformed. A close friend can, in good conscience, encourage someone he cares for to end it all. A beleaguered mother, unable to juggle her responsibilities to her children and tend to her ailing father, can without guilt sit him down and explain the logic in ending life sooner rather than later. Once seen as spiritual and sacred, a life now more resembles a commodity to be cold-bloodedly analyzed, valued, and, if found wanting, discarded. Euthanasia and assisted suicide now account for more than one Dutch death in 40.

The transformation did not occur overnight. The members of the Royal Dutch Medical Association came to permit euthanasia only after much agonizing and only with strict guidelines to prevent abuse. At first, only competent people facing imminent death from a terminal illness and voluntarily asking for euthanasia were eligible. But the Dutch soon found themselves on a very slippery moral slope. Once euthanasia became thinkable in principle, its practice accelerated. Soon, the death didn’t need to be imminent; then, the illness didn’t need to be terminal. Over the past two decades, euthanasia has become an answer for people who are chronically ill, clinically depressed, or anorexic, and even for people who have not requested death at all. Chillingly, death has become a doctor-prescribed cure for any number of afflictions.

To breach the next Dutch frontier — euthanizing healthy elders — early this decade the Dutch Euthanasia Society produced “Through with Life,” a television docudrama about Mrs. van den Berg, a lonely woman in her 70s living in a seniors building. Her husband, daughter, and friends had died before her, and her only consolation, apart from Moniek, a young nursing aide who had befriended her, was biweekly visits from her son. She felt unproductive, her life bereft of meaning. She felt her body slowly deteriorating. She had no interest in the future other than to end it.

At first, Moniek and the son both opposed euthanasia for her, partly from guilt that their companionship was not enough to give her a will to live. But a counsellor at the seniors home convinced them that Mrs. van den Berg’s request was reasonable given her age and loneliness. In the documentary’s moving finale, she receives the lethal dose amid soft visuals and inspiring music that heighten sympathy for her and her right to die.

The docudrama and subsequent attempts to sway public opinion were not quite moving enough: For now, healthy people may not be legally euthanized. But as euthanasia proponents continue to bombard the aging Dutch population with images of rational suicides, public rejection of euthanasia for lonely or unhappy people is likely to fade. The Dutch combine a utilitarian view of human worth with a libertarian belief that individuals have the right to death upon request. But though this combination is most potent in the Netherlands, other western countries don’t lag far behind. The logic of euthanasia becomes especially compelling when our institutions create living hells for vulnerable people. In our compassion, we offer them death as a way out.

Canada, with the world’s highest rate of institutionalized seniors, is ripe for euthanasia arguments. Many seniors rightfully fear the neglect, abuse, and — worst of all — the stamp of worthlessness that brands residents of nursing homes and other long-term care facilities. At a home for the aged in Hamilton, Ontario, 74 per cent of women signed an advance directive refusing life-saving surgery or other major interventions should they become curably ill.

At the other end of life, Western society and, as our agents, physicians routinely define the worth of our children. When parents want children and when society values them, we spare little money, effort, and community spirit to care for them, regardless of the odds, as the septuplets born in the United States last year attest. If, however, the children are unwanted and are seen as burdensome to society, we often allow them to die “for their own good.” As of September, Britain’s Royal College of Paediatrics and Child Health began issuing guidelines advising doctors to pull the plug on children who would survive treatment but with severe mental or physical handicaps. Parents who want treatment for their children are now refused care. Yet the definition of severe is arbitrary; a reliable prognosis of a severe infant disability usually cannot be made. Many parents of people living, working, and raising families today were told to institutionalize their babies decades ago. Medical practitioners have not become any more qualified to predict an infant’s capabilities in later life.

The young and the old are not the only at-risk groups. Because Canada is less homogeneous than the Netherlands, where 96 per cent of citizens come from Dutch stock, we have greater degrees of disparity and more vulnerable people relative to our most valued citizens: people who are poor, people who speak English poorly or not at all, people of color, and people with disabilities receive a poorer quality of medical care and die younger. Peggy McDonough of the department of sociology at Toronto’s York University reviewed studies on income and mortality and concluded that consistently lower income levels predicted increased odds of dying by as much as 30 per cent. If Canada were to legalize euthanasia, our many vulnerable people would become even more vulnerable, raising the prospect of euthanasia rates higher than in the Netherlands.

Real and imagined financial pressures may also exacerbate demand for euthanasia. As we’ve become the Me Generation, increasingly committed to instant gratification and individual freedoms, we’ve lost our commitment to family and community, often seeing them as impediments to personal fulfilment. With cutbacks in health care and public acceptance of rationing medical services, our elders, those with few family supports, the poor, and the unassertive will be expected to do the “right” thing and request death rather than costly care. Daniel Callahan, executive director of the Hastings Center, one of the continent’s leading centres for bioethical research, promotes this very ethic: He calls for age-based rationing of scarce health care resources to reduce health care costs. After a “natural lifespan” of 70 to 80 years — “one in which life’s possibilities have on the whole been achieved and after which death may be understood as a sad, but nonetheless, relatively acceptable event” — he argues against interventions to prolong life.

Managed care organizations would also jump aboard this efficiency bandwagon. In Oregon, which recently legalized assisted suicide, managed care organizations have offered to pay for death rather than long-term care since “the least costly treatment for any illness is lethal medication,” explained Walter Dellinger, acting solicitor general of the United States, in oral presentations opposing euthanasia and assisted suicide to the Supreme Court last year. Efficiency arguments may also dictate dying to save the life of another. To obtain organs, physicians have often surreptitiously shortened the lives of patients with life-threatening conditions to benefit patients with better prognoses. Capitalizing on this kind of calculus, Dr. Jack Kevorkian, who has helped over a dozen people (not all terminally ill) to commit suicide, encourages his patients to donate their organs to any clinic willing to accept them.

ONE STRAIN OF EUTHANASIA SPRINGS FROM the eugenics movement and shares many of its motivations and impulses. Started in western academic-medical circles at the turn of the century, eugenics encouraged the breeding of a purer race to represent the middle-to-upper class educated best of the majority white population. At country fairs in North America, eugenics booths idealized the healthy, blonde, blue-eyed population. Poor, uneducated children, regardless of skin and eye color, were targeted as feebleminded and, therefore, a risk to the rest of the white race. Alberta’s Eugenics Board — formed to protect society from the vices of the feebleminded — approved 4,728 sterilizations of people labelled mentally defective between 1929 and 1972, among them unknowing boys and girls confined to the Red Deer Provincial Training School.

Eugenics soon expanded to include euthanasia. In Europe and especially Germany, which took euthanasia the furthest, euthanasia thrived in the 1920s “within the culture of medicine, modern intellectualism, academicism, and scientism. The program began not because it was German, or even Nazi, but because it was a phenomenon of western science in general,” explained Wolf Wolfensberger, professor of special education at Syracuse University. “The explicit basis for euthanasia in Germany was described [in 1920], 13 years before Hitler came to power and 20 years before the Nazi euthanasia programs actually began.”

First, German institutions euthanized patients who had severe physical or mental handicaps, including developmental disabilities, mental disorders, tuberculosis, chronic illness, cerebral palsy, and epilepsy. Then, with the quick and easy success of the early phase of the program and with the presence of a death-making apparatus, authorities prescribed euthanasia for people who had lost bladder control and other less severely afflicted people; for dwarfs and others who were physically atypical but not necessarily impaired; for those suspected of genetic and racial taints; and for gypsies and others devalued entirely for their social identities. In time, authorities categorized people with behavior problems, odd-shaped ears, very dark eyes, hair, or complexion into these groups. The killing of the Jews evolved out of the desensitization, legitimization, personnel preparation, and equipment development associated with the killing of handicapped people.

The eugenics movement survived the Second World War — below the radar screen — and remains a hot debate, as shown by controversy over recent books like The Bell Curve and studies from the likes of Philippe Rushton, professor of psychology at the University of Western Ontario, which argue that human intelligence stems partly from race. In 1972, Episcopal theologian and bioethicist Joseph Fletcher, whose work is a staple in university ethics courses, argued that only people with all 15 of his “indicators of personhood” — among them minimal intelligence, self-awareness, self-control, sense of time, curiosity, and willingness to accept change — should be allowed to live.

Today, eugenics takes a backseat in the euthanasia movement. Euthanasia champions — many of whom oppose eugenics — promote euthanasia as a basic human right. Derek Humphry, co-founder of the Hemlock Society in the U.S. and a world leader in the quest to legalize euthanasia, presents the movement’s goal in Dying with Dignity: “The essays in this book trace the steps toward the ultimate personal liberty: the right to die in the manner, at the time, and by the means that a competent adult wishes.” The movement’s strategy is to begin with legislation allowing living wills and advance directives; then, to modify or adopt regulations that, bit by bit, use the legislation to allow assisted suicide or euthanasia on the voluntary request of competent adults with a terminal illness.

In Canada, Humphry’s views have been winning important adherents: In 1993, the Supreme Court voted by the slimmest of margins, 5 to 4, against Sue Rodriguez, the 42-year-old B.C. woman with ALS, or Lou Gehrig’s disease, who had requested physician-assisted suicide, with Chief Justice Lamer among the dissenters. A person should be able to ask the courts for euthanasia, he argued, thereby minimizing abuse of vulnerable people — the concern underlying the criminal code law against assisted suicide.

At presentations made during Senate committee hearings on euthanasia and assisted suicide in 1994 and 1995, many authorities endorsed euthanasia, and various experts argued for a third category of murder, namely, third-degree murder, where “the killing was not exploitive or malicious, but . . . with the perception of the best interests, and on the request, of the person killed.” In the killing of Tracy Latimer, a child with cerebral palsy, the Canadian public and the Saskatchewan judge decided that a mercy killer need not have the consent of the person killed, and shifted their collective compassion from Tracy to the father who killed her.

The public has swung behind these experts: Polls show that about three-quarters of Canadians support the concept of people having the right to doctor-assisted suicide if they suffer unbearable pain.

EVERYONE WHO BELIEVES IN LEGALIZING EUTHANASIA — bioethicists, politicians, the Royal Dutch Medical Association, even Derek Humphry — wants safeguards to protect vulnerable people. Yet workable safeguards are delusions. Take the chief safeguard, informed consent: the requirement that people request euthanasia voluntarily and repeatedly.

Physicians have told me that, simply by wearing their white lab coats and presenting information in a favorable light, they can get patients to voluntarily consent to what amount to torturous treatments with little hope of success. Patients in desperate straits are emotionally vulnerable and without their normal defences. Physicians taught to practise euthanasia in medical school will have a professional bias to use their skills. How would they answer the common question: “If you were in my shoes, Doctor, what would you do?” No law can safeguard how physicians present information in their confidential conversations with patients.

Informed consent also implies that physicians and other care-givers have the time and training to counsel and help patients understand end-of-life decisions. One only has to examine the health care systems across Canada today to understand that such time is rarely available and rarely paid for by the government.

The safeguards are designed to let people change their minds on a dime, as if this can be effortlessly done in the real world. For example, an Ottawa professor in her late 50s dying of a blood disorder decided to forgo more blood transfusions and end her life within weeks. Soon after her decision, her husband had a stroke. Not wanting to leave him without care, she asked for her blood transfusions to resume. Her physicians turned her down, saying she couldn’t keep changing her mind. Thanks to her position in the community and friendship with the university hospital administrator, she successfully fought the decision, allowing her to help her husband over the initial period of his convalescence before her death. What would have happened had this person been a cleaning lady at the university?

Another real-world situation: Imagine a woman scheduled to die next week, with relatives flying in from Manitoba and Newfoundland to be by her side. Suddenly she feels better, and though knowing the improvement can’t last long, she begins to think in terms of hanging on long enough to attend her grandson’s graduation in three months. What do you expect will be her choice? How often might a family “allow” a loved one to change a scheduled death?

Those who’d have the courts decide euthanasia to minimize abuse forget that the courts are chronically backlogged. Inevitably, a court-based euthanasia process will be streamlined and bureaucratized into a rubber stamp, just as the courts have resorted to plea bargaining and other administrative shortcuts. In the Netherlands — the only western country to permit euthanasia — doctors do not report a majority — 59 per cent — of euthanasia and assisted suicide cases to Crown attorneys, as required by the Royal Dutch Medical Association and the courts. Dutch doctors, according to a major survey, justified flagrant disregard of reporting procedures due to the “burden” of paperwork. Another Dutch study found that in almost one-quarter of euthanasia and assisted suicide cases, the physicians ended life without the patient’s explicit, concurrent request. If Canada had Dutch death rates, we would have about 6,000 deaths per year by euthanasia, 1,800 people dying without their consent, and 1,000 assisted suicides. That’s five Tracy Latimers a day, and almost three Sue Rodriguezes a day.

The Dutch studies nevertheless endorse euthanasia because neither voluntary euthanasia, assisted suicide, nor unrequested euthanasia increased dramatically from 1990 to 1995, the time period they analyzed. They downplay the significance of so many unreported cases and aren’t disturbed by the death toll of people who did not explicitly request euthanasia. They rationalize the disregard for informed consent by suggesting that doctors had previously killed people surreptitiously without their requests, and that the Dutch euthanasia laws have only brought the practice to light. The studies pooh-pooh the inappropriate deaths as an inevitable cost that must be borne in reaping the greater benefits of legalizing euthanasia. In the Netherlands, physicians brought to court on charges of murder stemming from euthanasia cases are chastised for not following the guidelines but not imprisoned or prevented from practising medicine. The legal and health care systems — despite the official, elaborate safeguards in place — are entirely indifferent to unreported cases of euthanasia.

Canadians have no reason to believe that our crowded courts or beleaguered health care bureaucracy would be any more virtuous or diligent than the Netherlands’. Even before the recent countrywide health care cutbacks, Canadians’ trust of our physicians had been steadily declining. As put by Dr. Kathleen Foley, an international pain management authority from the Sloan Kettering Cancer Center, at the 1994 International Congress on Care of the Terminally Ill in Montreal: “If you do not trust your doctors to care for you when you are dying, why would you trust them to kill you?” Foley described how the euthanasia movement rose to counter the failure of modern health care in meeting the needs of the dying. Rather than relieve suffering, extreme medical efforts often caused unnecessary pain. Foley also described another, simultaneous reaction to extreme medicine: the hospice movement, with its interest in alleviating pain through palliative care.

MY INTEREST IN PALLIATIVE CARE — and, from it, my opposition to euthanasia — stems from caring first for my mother in Canada, then for my grandfather in the Netherlands, and lastly for my father back in Canada, at home, until they died. I learned many things that help make me a better friend, husband, and father. That said, I would not have chosen to care for my mother had professionals offered to provide her with hospice care at home. I was 24 years old, a recent York University graduate studying French at Laval University in Quebec City. I had no experience caring for any sick person, let alone for someone who was dying. Preparing for the death of loved ones, and for life without them, was enough of an emotional roller coaster without also having to master pain and symptom control. However, I didn't have that choice, and I learned to provide care for my parents and grandfather. I also learned that such experiences bring much laughter, joy, love, and intimacy. I wish everyone could share that part of the experience. Those moments are life-defining and life-fulfilling.

In caring for my loved ones, one issue came up over and over again — the need of the dying for a sense of control amid the uncontrollable events leading up to death. A sense of control is all-important. It is about maintaining our roles as parents, grandparents, workers, church members, neighbors. It is about continuing to make the decisions in our own lives.

My grandfather had his cancerous stomach removed when he was 91. The surgery was a success; his recovery was not. At death's door on Christmas Eve 1980, he rallied and held on into the following summer, until his body couldn't continue. No longer hungry or even thirsty, he asked us to stop preparing his liquid diet; he just wanted water with some sugar added. Near the end of his life, he felt eating had become a charade; yet control remained important. Just one day after switching to sugared water, to our bewilderment, he asked for a ham sandwich. Once he got his sandwich he took one bite, smiled broadly, and said, "I just wanted to see if you would give it to me. I don't want any more, thank you." He continued to drink his sugared water and died a few weeks later.

A sense of control is important throughout a person's illness. When my father first experienced seizures and breathing difficulties, he knew that he wasn't well, but he didn't know the extent of his illness. After the doctors completed their tests and reached a diagnosis, I told them I wanted to be the one to tell him he was dying. When I went to him, I said: "The test results are in, and I have done some research on your condition. Would you like me to tell you what I found out?"

My father was now in charge. He could hear what I had to say, decline to hear the news, or wait. He chose to wait, to give himself time to prepare. Over the next few days he asked simple questions that I was able to answer. He took his time, eventually drawing out all the information I had. Once he had control over the medical information, he was better able to deal emotionally with his emphysema and the tumors in his brain.

Being in charge also requires carrying on with your family responsibilities; it requires refusing to succumb to a self-centred disregard for the needs of those close to you. I once asked my father what was most difficult about dying. "I'm most sad that I cannot be at your wedding in June," he answered. Yet although his energy was low, and although it would have been easier to dwell on dying than on our wedding, he helped us plan the service. We had offered to advance the wedding date and move it east from British Columbia to let him participate, but he wouldn't hear of it, since most of my fiancée's family lived in the west. As a wedding gift, he gave my fiancée a necklace my mother wore only on important occasions — he felt it was important that her prospective father-in-law provide something very personal and precious, and he beamed with love in doing so. He also arranged for me to receive a wedding card — containing all his hopes and dreams for his only son — the night before my wedding. It must have been very difficult to write, knowing that he would not be at the wedding. His effort, however, let me sob part of my grief away on my wedding eve, making the wedding day itself glorious, filled with love, beauty, and the promise of a wonderful family life ahead. It was an invaluable gift of love that only a parent can bestow.

A sense of control is about self-affirmation, self-respect. It is about being a person. Understandably, many people would rather die than lose their personhood; hence the attraction of euthanasia. We can all identify with the need for self-respect, and we all harbor fears that we might not rise to the challenge when our own time, or that of our family members, arrives. Hence our enormous sympathy for Sue Rodriguez. The thought that she would die a horrible death in front of her family, by choking or failing to breathe, terrified others with ALS, along with their families and friends. More than a few people in the media, in politics, and among the public started conversations with "I'd rather be dead than have ALS."

Dr. Peter Hargreaves is a family physician who has cared for hospice patients in Liverpool for about 12 years and at Lion’s Hospice near London since it opened five years ago. He has been involved in over 2,500 patient deaths, including many patients with ALS. “You can count on one hand the number of patients who have requested euthanasia,” says Hargreaves. “They usually request euthanasia because of uncontrolled symptoms, pain, nausea, vomiting, and they request it on admission because they ‘have had enough — put me out of my misery.’ If you gain their confidence by rapidly controlling those symptoms, they usually do not request euthanasia any more. If someone has wildly out-of-control pain and symptoms, I will actually get them pain-free, through helping them to sleep straight away — give them four to six hours of pain-free sleep. They come around and say, ‘I feel better. Thank you very much. What are we going to do now?’ Then we plan how to control their pain and symptoms. Once you have done that, the request for euthanasia stops. I have never had a sustained request for euthanasia apart from one patient who was psychotically depressed and needed treatment for the psychosis.”

What about the media reports of choking? According to Hargreaves and nurse Sue Watts, day programme manager at Lion’s Hospice, many family members give ALS patients liquids, thinking liquids are easier to swallow. However, because liquids do not trigger a swallowing reflex, they may actually cause a choking episode, leading the carers and the patients alike to believe that they cannot eat or drink anything anymore. “People stop giving themselves nourishment if they do not know better and that is a real pity,” says Watts. “Thickening fluids while making sure there are no solid lumps is most effective. Pureeing solids or softening food is also very effective.”

Concerns over patients who have trouble breathing are also misunderstood, explains Hargreaves. “The breathing — this thing about respiratory arrest — sounds very dramatic, but it is not a dramatic end. You tend to find that the breathing becomes gradually more shallow. Quite often the patient does not realize that the breathing has changed. When you look at them, you think, ‘they’re not breathing very much or very deeply.’ They get used to a smaller respiratory reserve. If they become aware of distressing breathing or they feel they cannot take a deep enough breath, then there are plenty of drugs that we can use to take the edge off that, to remove the fear about not being able to take a deep enough breath. You can deliver those drugs in a variety of means — at home, in the community, or in an inpatient setting. The important thing is, if you are at home, we need a range of drugs available for carers to give or for general practitioners, district nurses, or whomever. We encourage carers to be very much involved in this.”

Britain's ALS association — known there as the Motor Neurone Disease Association — has produced a box it calls the Breathing Space Kit. It has two little drawers, one marked for the visiting doctors and nurses and one for the family carer. The association encourages the family physician to put medication into each drawer and trains the nurse and carer in handling it. Doing so gives everyone a sense of control and helps prevent crisis situations. Families are able, with telephone assistance, to give their loved ones whatever drugs they need.

Dying in unbearable pain is the greatest fear of people with a life-threatening or terminal illness, and it is a leading cause of demand for euthanasia. This fear would subside enormously if medical training simply kept up with medical knowledge. In knowledgeable hands, medicine can keep most people relatively pain-free and alert until they die. For example, in only five per cent of people experiencing pain with advanced cancer will the pain be difficult to manage. This five per cent comprises people whose emotional or spiritual difficulties heighten their pain and make it untreatable with drugs alone. Their best medicine would be support from family, spiritual leaders, social workers, counsellors, or nurses. When all else fails, pain relief could come through short periods of drug-induced sleep.

Once pain is managed, other symptoms like nausea, vomiting, bed sores, and dry mouth are more easily controlled. Palliative care physicians provide such pain and symptom control successfully every day across Canada. Dr. Cicely Saunders, founder of the modern hospice movement, has been teaching for decades that "nobody need die suffering."

WHEN MY MOTHER WAS DYING, she received too little pain medication, and she received it too infrequently because her physicians, her pharmacists, and her family all thought that if we gave her too much, too soon, the drug’s power would soon wear off, leaving nothing to give her later on in the illness. We couldn’t have been more wrong.

Dr. Dorothy Ley, a palliative care pioneer in Canada and the first woman to win the Canadian Medical Association Medal of Service, explained that the "myth that too much pain medication will lead to addiction is just that — myth. When medication is used to combat pain, there is no high associated with its use and, therefore, no addiction." The myths about addiction and drug tolerance were disproved by Ley and many other palliative care doctors in the 1960s and 1970s, yet they persist simply because the message has not hit home. Most physicians, untrained in pain control, remain hesitant to use modern methods, even though these methods are proven effective.

To help people live life fully until they die, their pain must be relieved immediately — “saving” larger doses for possible future pain is pointless when proper pain and symptom control can add months or years to a person’s life. Patients who are alert, comfortable, and able to meaningfully participate in their lives have more energy and hope for the immediate future.

Colleen McArthur was diagnosed with non-Hodgkin’s lymphoma in December 1988. On Boxing Day, her massive spleen was removed, revealing a cancer that had spread into surrounding lymph nodes and to the liver. Chemotherapy began right away, to her distress. “I was shocked to learn about the side effects. They told me my hair would fall out and that I would experience numbness in my fingers and feet. I was a piano teacher, the numbness made me feel very sad,” recalls McArthur.

Her chemotherapy began at St. Boniface Hospital in Winnipeg, but rather than drive the 120 miles into Winnipeg every three weeks, she switched to Morden District General Hospital, near her home, which fortunately had an outreach centre and a palliative care coordinator, Thelma Alexander. “Somehow she [Alexander] must have known about my first visit because there she was offering me a pencil and paper to write questions for my doctor. She was so kind and really made me feel comfortable on a very apprehensive day. This atmosphere of genuine concern and kindness continued right through my months of treatment.”

Alexander and her colleagues use various non-medical skills to help their clients deal with their physical and emotional pain and suffering. For McArthur, the care began with their willingness to simply listen and attend medical appointments with her; later, they helped her set up a cancer support group in her own community. Later still, it included gentle stretching exercises as well as deep breathing, Tai Chi, and long walks (out of the sun because chemotherapy and direct sunlight are not a healthy combination for skin).

Excellent medical attention over the past nine years has kept McArthur relatively pain-free and alert, letting her remain in her community and share in the lives of the people she loves most. But McArthur's experience is rare. Most dying people do not receive such physical, emotional, spiritual, and informational support. Until they do, some lonely, isolated, and suffering people — estimated at three per cent of the population — will continue to request death.

With a concerted effort, Canada could be substantially pain-free in five years — “Pain-Free by 2003” is the refrain of a nascent pain and symptom control campaign that aims to educate the public and the medical establishment that pain and discomforting symptoms need not be the norm for people who are dying. We must all learn to expect the opposite, and to view the existence of patients in pain as a national disgrace. Once we’ve learned to tend to our patients mercifully, once the public recognizes that they and their loved ones need not fear a painful death, the public debate over euthanasia will diminish and perhaps even disappear.

Our world harbors trends that defy logic. On the one hand, we have physicians in Canada, Australia, and Europe developing a suicide pill for terminally ill patients to take on request. On the other, we have the knowledge and skills to make Canada a pain-free zone and the resources to treat vulnerable people (including most people with a terminal illness) with the dignity they require. We have the wherewithal to work together to dramatically improve the situations of people who are dying, chronically ill, or vulnerable. Will we?



Book reviews – Our relentless drive to cooperate

Elizabeth Brubaker
The Next City
March 21, 1998

 

The Origins of Virtue: Human Instincts and the Evolution of Cooperation

by Matt Ridley (Viking, 1997. 304 pages) $32.99

SCANNING THE INDEX OF The Origins of Virtue, a prospective reader will be hard-pressed to guess what the book is about. The index's references are anthropological (from the !Kung's intolerance of hoarding to competitive gift giving among the Wakayse), biological (from B-chromosomes to cell replication), and zoological (from ants' division of labor to brain size in vampire bats). The index refers to economists, political theorists, and historical events. One may well find the array bewildering. What on earth can one make of a book that discusses the International Rice Research Institute, the emperor Justinian, and lie detectors in fewer than 300 pages? What do the Hutterites have to do with honey bees? How will author Matt Ridley manage to pull together elements as disparate as slime mould and Sophist philosophers?

Remarkably, what emerges is a coherent and engaging examination of cooperation in the natural world. Ridley demonstrates an impressive ability to synthesize; he has the imaginative flexibility “to see in the actions of hunter-gatherers distant echoes of the origins of modern markets in financial derivatives.” In layering dozens of dissimilar examples and arguments, he builds nothing less than a theory of human society, concluding that societies will thrive if governed by small-scale, local institutions that promote private ownership (either individual or communal) and protect property rights. Such institutions will permit easy communication, provide accountability mechanisms, and foster social and ecological responsibility.

Ridley wants to know why people behave virtuously. Why do self-interested men and women cooperate to achieve apparently selfless goals? Ridley (who holds a PhD in zoology) looks for the roots of human society in evolutionary biology. He finds collaboration everywhere — at the cellular level, within bodies, and among insects and animals.

Genetic interest explains collaboration within insect groups and most other animal families. Virtually all cooperators are either behaving selfishly (finding safety in numbers, for example) or aiding their families (who share their genetic material). Human beings are the exception to this rule: We are distinguished by our altruism toward groups of genetically unrelated individuals.

But Ridley has no illusions about our altruism. He concludes, after surveying our evolutionary ancestors, primitive peoples, and vastly different contemporary societies, that we cooperate not because we are inherently nice, nor because we are compelled to do so by church or government, but because we have learned, as we have evolved, that cooperation works. Cooperation is a successful evolutionary strategy.

Central to Ridley’s analysis of cooperation is the prisoner’s dilemma. This game theory classic supposes that two prisoners are being questioned separately. Both will eventually go free if both remain silent since the authorities will lack evidence to convict them. But since neither can be assured of the other’s silence, both have an incentive to “defect,” or to give evidence against the other. When players play just one round of a prisoner’s dilemma game, they quite wisely defect. When they play the game repeatedly, however, they learn that selfishness is irrational since defections will be repaid in kind in future games. It pays to cooperate. And it pays to establish a reputation for trustworthiness to encourage others to play the game with them in the future.
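To see why repetition changes the calculus, consider a minimal simulation — our sketch, not Ridley's, using the conventional payoff values (5 for a successful defection, 3 each for mutual cooperation, 1 each for mutual defection, 0 for a betrayed cooperator):

# A toy iterated prisoner's dilemma. "C" = cooperate, "D" = defect.
# Payoffs are the conventional ones, not values taken from the book.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def always_defect(mine, theirs):
    return "D"

def tit_for_tat(mine, theirs):
    # Cooperate first; afterwards, mirror the opponent's last move.
    return theirs[-1] if theirs else "C"

def play(strategy_a, strategy_b, rounds):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(a, b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(always_defect, tit_for_tat, rounds=1))    # (5, 0): defection wins once
print(play(always_defect, tit_for_tat, rounds=200))  # (204, 199): a pyrrhic victory
print(play(tit_for_tat, tit_for_tat, rounds=200))    # (600, 600): cooperation pays

Played once, the defector wins outright. Played repeatedly, two tit-for-tat players prosper together, while the habitual defector ekes out a narrow win over tit-for-tat yet earns barely a third of what the cooperators earn with each other — which is the point about reputation and repeated dealings.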

Ridley sees prisoner's dilemmas wherever individual self-interest conflicts with the public good. Fisheries provide a perfect example: If all fishers were to cooperate and limit their fishing efforts, all would benefit from healthier stocks. But because no fisher can be confident his restraint will be matched by others, each has an incentive to catch as much as he can. Otherwise, the fish he leaves uncaught might well end up in his competitors' nets. The result? Overfishing by all, to the long-term benefit of none. But just as repeated interactions and the establishment of reputations encourage players in prisoner's dilemma games to cooperate, so do they in real-life fisheries. Small-scale, communally managed fisheries, where fishers regularly communicate among themselves, set rules, and retaliate against cheaters, have proven remarkably sustainable.

Ridley argues, however, that some resources won’t be sustainably managed because humans have no innate environmental ethic. In resource use as in other areas, we are governed not by instinctive virtue but by enlightened self-interest. Only resources that can be owned — protected against use by outsiders — will be conserved, since their conservation will benefit the owners. It was because wild game could not be owned that our ancestors “extinguished their way across the planet.” Without the tools to establish property rights, our ancestors lacked the incentives to restrain their resource use. Property rights, Ridley concludes, are “the key to ecological virtue.”

Tragically, governments, oblivious to the essential role of property, have destroyed many traditional systems of sustainable resource use. The replacement of common law property rights with government regulations and the nationalization of forests, rivers, and fisheries have been responsible for much pollution and resource depletion.

Governments are the problem rather than the solution not only in environmental matters but also in social matters. As government intervention has destroyed people’s incentives to preserve resources, so has the heavy hand of the state dulled people’s sense of social responsibility. A system based on authority rather than reciprocity, coercion rather than self-interested development of reputation, and edict rather than communication creates prisoner’s dilemmas writ large and threatens the social instincts that have so successfully evolved in our species.

Given our instinct for cooperation, Ridley is confident that virtue and harmony can make a comeback. The key is to devise institutions that bring out our virtuous instincts and harness our self-interest in productive ways. Given that civic virtue predates both church and state, workable social institutions need not rely on the authority of either. Indeed, they should be as fully devolved as possible. Local institutions that are small enough to encourage communication and generate trust will engender cooperation. Built from the bottom up, they will reflect and promote our virtue.


What would happen if . . . we raised highway speed limits?

Frank Navin and Michael Cain
The Next City
March 21, 1998

 

We asked Frank Navin, professor of civil engineering, and Michael Cain, director of research at Safety by Education (Not Speed Enforcement), to comment.

We would be flouting science. It doesn't take a rocket scientist to realize that the faster you hit a solid object, the more severe your injuries will be. It does, however, take a high level of science to measure the effects of increased highway speeds on our safety. Most modern vehicles are built to allow a belted occupant to survive, with little injury, a 50 km/h head-on impact into a solid object. Although expressways are usually driven at 100 km/h or more, depending on the posted speed limit and the level of police enforcement, engineered roadside objects have protective outer barriers designed to be hit at the highway's design speed — usually within 10 km/h of the posted speed. But the faster a driver is travelling before crashing, the worse the impact. While a 50 km/h impact is equivalent to dropping a car from the top of a two-storey building, a 100 km/h impact is equivalent to dropping it 11 storeys, and a 150 km/h crash to almost 30 storeys.
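The storey comparisons follow from simple physics — a back-of-the-envelope check of ours, not the authors' own calculation. An impact at speed v is like a fall from the height h = v²/2g a car would need to reach that speed, and at roughly three metres per storey the results land in the same range as the figures cited:

G = 9.81  # gravitational acceleration, m/s^2

for kmh in (50, 100, 150):
    v = kmh / 3.6            # convert km/h to m/s
    h = v ** 2 / (2 * G)     # equivalent free-fall height in metres
    print(f"{kmh} km/h ~ {h:.0f} m drop, about {h / 3:.0f} storeys")

# Prints roughly: 50 km/h ~ 10 m (3 storeys), 100 km/h ~ 39 m (13 storeys),
# 150 km/h ~ 88 m (29 storeys) -- close to the figures cited above.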

The number and severity of crashes would increase. Canadian studies show that increased speed increases the chance of crashing, and a Finnish study observed that for every one km/h speed increase, the number of injuries increased by three per cent and the injury costs doubled. These negative effects arise because the driver has less time to observe, decide, and act, and less distance in which to brake and steer.

Actually, we need increased enforcement and reduced speeds. The combination of reasonable speed limits and vigorous police enforcement effectively reduces the number and severity of injuries and fatalities on our highways. Electronic police enforcement by photo radar cameras can be one useful tool. For example, Victoria, Australia, claims that photo radar has reduced its road fatalities by about 11 per cent.

We'd be recognizing the speed at which most motorists already drive. Decades of research around the world shows that speed limits set at the upper end of average traffic speeds are safest for our highways and major roads. Setting limits there reduces variation in speed among vehicles, which has long been identified as a far greater cause of accidents than absolute speed.

There wouldn't be more accidents. Contrary to the aggregated statistics produced by governments, mere speed is rarely the sole contributing factor in a crash; in fact, research shows that two-thirds of speed-related fatalities involve drugs, alcohol, or both. Of the remainder, about two-thirds occur below the speed limit and are related to road conditions. In late 1995, the U.S. Congress allowed states to set their own maximum speed limits. The safety lobby predicted that fatalities would increase by 6,400, or about 15 per cent. Results now confirm that the change in fatalities was a statistically insignificant 0.2 per cent.

Government and insurance companies would stop gouging us. These bodies may appear safety-conscious when they push for lower speed limits, but the desire for revenue underlies their motives. Since motorists frequently face 50 per cent increases in basic insurance premiums for just two speeding tickets, it's not surprising that the Insurance Institute for Highway Safety is the largest group lobbying against raising American speed limits. In fact, insurance companies frequently supply speed enforcement equipment to police departments.

Police could concentrate on important issues rather than on revenue and photo radar. Citizens suffer as police focus more on activities that return revenue (speed enforcement) and less on traffic activities that, while time-intensive to enforce, reduce both accidents and driver frustration (red-light running, failing to yield, and impaired driving).

Michael Cain

Frank Navin
