
After Twenty-Five Years, Is It Time for a Second Look at Harm Reduction? (Part #2)

Forgetting the Centrality of Culture in Drug Use

In 1987, the classic CBC TV series Degrassi Junior High aired an episode about the principal character, Joey, convincing his younger classmates that he was a drug dealer and selling them a random collection of vitamins and painkillers under the pretense that they were psychoactive recreational drugs. Much hilarity ensued as the characters consuming the fake drugs acted out in increasingly disinhibited and bizarre ways, based solely on expectations, learned from watching TV, of how young people on drugs are supposed to act.

The episode was a tribute to a study conducted at the University of Washington in Seattle, whose results were published in Psychology Today in 1981. The psychology department had taken control of a campus pub and lured students into an experiment conducted under false pretenses. The study was marketed to the students as a study of young adult social behaviour but was, in fact, a study of the scale of the placebo effect with respect to intoxicants. While some students were served drinks with the normal alcoholic content, others were served drinks with no alcohol at all. The two groups' behaviour was almost indistinguishable: students became increasingly disinhibited and sexually aggressive the more drinks they downed, irrespective of the amount of alcohol they had actually consumed.

Brigham Young University, which observes stricter anti-substance-use rules than the Mormon Church as a whole, prohibits its students from drinking tea and coffee as well as alcohol, and has succeeded in having the possession of electric kettles banned in every private apartment complex that wishes to rent to its students. Nevertheless, if you visit the campus ice cream parlour late at night on a Friday or Saturday, you find young adults behaving not that differently from the kids at the University of Washington. Because Mormons know how to enjoy a sugar high like no other people.

Expectations of the effects of a substance profoundly condition how people experience that substance and how they behave after using it, especially in social contexts. And the main thing that forms those expectations is the culture in which we are situated.

Once upon a time, the fact that culture, as much as or more than chemistry, conditions the behaviour of intoxicated people was common knowledge, something to be joked about in the 1980s. But with the rise of crack, an admittedly more immediately intoxicating and addictive way of using cocaine than insufflation, society went a different route.
Differences in cocaine use, abuse and addiction patterns between people living in housing projects and people working on Wall Street were increasingly explained chemically rather than socially. Crack was slightly chemically different from regular cocaine and, all other things being equal, did produce a shorter, more intense high. But wealthy cocaine users had been "freebasing" cocaine (i.e., doing crack) for years already, with largely the same social and medical experiences as those snorting it.

Much more importantly, though, the Reagan administration and the first Bush presidency had ideological reasons for suggesting that physical differences, not social differences, accounted for America's diverging cocaine cultures.

The idea that black people were physically different from white people, and that this was mirrored in the (actually quite minor) chemical differences between coca-leaf derivatives, brown-coloured crack rocks and their lighter-coloured cousin, powder cocaine, became the sole explanation for the major differences between the cocaine-using community on Wall Street and the one down the road in Harlem. If we removed culture from how we analyze substance use and abuse, the neoliberal austerity programs ravaging low-income communities, and their cultural impact, could be factored out, as could the massive windfall profits that Reagan's deregulation of the stock market was producing on Wall Street.

We became dumber about drugs in the 1980s and lost a good deal of common knowledge about the profound role of culture in shaping how people feel and what they will do, both to obtain drugs and once under their influence. Ultimately, the effects of a drug on a person are a complex dance of culture, neurology and chemistry. And if we factor any of those three things out, we make bad drug policy.

Some Historical Examples

In the nineteenth century, one of the reasons residential schools seemed like such a good idea to social reformers was the rise of the "drunken Indian" trope. Social reformers saw an epidemic of alcoholism ripping through Indigenous communities, destroying their social fabric, and sought to act in a variety of ways. But the story was more complex.

While it is certainly true that the virgin soil epidemics, colonization and war had traumatized many people and communities, the reality was far more complex than addiction caused by trauma. The places where drunkenness was viewed as a dangerous epidemic sweeping Indigenous communities were all places with no prior history of alcohol use. Epidemics of drunkenness were generally not reported by Spanish colonizers in the Valley of Mexico and the Yucatan Peninsula, where alcohol was already a popular intoxicant.

Instead, epidemics of drunkenness were reported in North America, where the use of intoxicants was based on very different traditions. In these regions, hallucinogenic plants were the preferred intoxicants and, aside from a small number of holy men and women who used them habitually, their use was generally confined to major bacchanalian festivals that took place just a few times a year, like the Huron/Wendat "going out of your head" festival.

These long-term, stable societies were organized around seasonal binge use (often centred on the spring and fall wild mushroom harvesting seasons) in which people used drugs in ways Mikhail Bakhtin associated with carnival. Pre-existing North American drug culture thus combined habitual use by a small subset of the elite with seasonal binges followed by weeks or months of abstinence.

When alcohol arrived, it was initially fitted into these pre-existing consumption patterns before new cultures of drinking grew up. So, when Europeans encountered Indigenous alcohol use, what they witnessed was a village-wide or confederacy-wide episode of binge drinking, a multi-day bender that shut down the economy and society. Rather than seeing this as episodic, Europeans, reasoning from their own drinking cultures, assumed they were watching the horrifying decline of whole societies into drunkenness and abjection.

If there is one sort of drug whose consumption we can see as culturally conditioned, it is the central nervous system stimulants, such as coca, caffeine and amphetamines.

When colonists arrived in Peru, coca's primary use was as an appetite suppressant and energy booster for people engaged in seasonal manual labour building imperial infrastructure. It is not simply that Andean peasants consumed coca at a lower concentration than one finds in cocaine; in the heartland of the Inca Empire, it was seen as the work drug of the peasant class. Its recreational use was off the radar, especially for commoners, because of its strong association with repetitive toil.

And when coca first came on the scene in Victorian North America, its main uses were as an anaesthetic, as a tonic for stomach ailments, and as an energy drink for the health-conscious in beverages such as Coca-Cola. Cocaine's recreational use gradually spread outward from the medical profession as new cultures of elite use slowly, haltingly came into being over decades.

As supplies of coca declined following its criminalization in the 1920s and the destruction of the Dutch cocaine plantations in Indonesia during decolonization, America cast about for new, powerful work drugs to improve concentration and productivity, and adopted the Nazis' strategy of synthesizing methamphetamine and other forms of speed as a cocaine alternative. Shockingly large portions of the American and Japanese populations were amphetamine users during the 1950s, but no recreational culture developed because the drugs were associated with work.

The rise of amphetamines as primarily recreational drugs took place in the 1990s, when youth cultures reinterpreted stimulants as disinhibiting, recreational and hedonistic rather than focusing, study-oriented and ascetic. This happened largely by coincidence, as the mass diagnosis of children with ADHD, treated with prescription amphetamines, coincided with the popularization of MDMA as a party drug. As young people discovered that their study drugs could function as a cheap MDMA alternative, drug cultures shifted. A drug that had, for decades, made people more focused, diligent and self-controlled now made them flighty and hedonistic, without a single atom of the drug itself changing.

An interesting footnote: in Thailand, local methamphetamine is often mixed with caffeine to give it an extra kick, something that seems risible to North Americans, who cannot understand why one would add a work drug to a fun drug.

Providing further examples would belabour the point unnecessarily, but those interested might want to look at the traditional Ethiopian coffee ceremony and the hot chocolate pub scene in early modern Europe.

The Worship of Self-Harm and Self-Harmers

I want to suggest that if we want to see drug-fueled madness, death and misery actually decline, we need to think both about the cultures of drug use within our society and about the cultural views of drugs in society at large.

Our society appears to be going through a secularized version of what was called the Athlete of God phenomenon in early Christianity. Flamboyant self-harmers, whether shooting fentanyl or chopping off their genitals, are, on the one hand, thought of as oracles, special people with special rights who cannot be obstructed on their special path. Yet, on the other hand, they are permitted to live in states of terrible ill health, material deprivation and physical and psychological misery because this is their enlightened choice, a choice so profound that no non-oracle can understand it.

In other words, we have a double consciousness about drug users: on the one hand, they are infallible oracles with a special path to the truth; on the other, they are people whose death and misery we are bending over backwards to facilitate.

We have also effectively made "drug user" a full-time job, a vocation that people can publicly proclaim, receiving the applause and choruses of "You're so brave!" that our mainstream media and community leaders offer anyone engaged in flamboyant acts of self-harm.

I do not have the solution to this problem. The best I can do is describe the problem better. If there is one thing we are learning in the twenty-first century, it is that we need to pay more attention to the writings of Antonio Gramsci and others who remind us that the state is not very powerful by itself, relative to the institutions and culture of the society it seeks to govern. Because I do not believe that, by itself, the state can do much to arrest this tailspin. What we need is a social movement that rejects the progressive turn of worshipping self-harm and self-harmers without returning to the prohibitionist ethos whose failure produced the original harm reduction movement.