
Life Near the Colour Line – Part 1

“The problem of the Twentieth Century is the problem of the colour-line.” – W. E. B. Du Bois, 1900

I guess it’s still the twentieth century because boy is white Anglo America mad about the colour line today. A troubled and somewhat dishonest woman from a family that contains both black and white members has been outed in the international media by her white parents for “impersonating” a black person. They want you and me to know that she has no business being the president of the NAACP’s Spokane chapter because she does not have one drop of black blood. Who gives a shit what the NAACP thinks or whether they want her representing them? We know who gets to draw the colour line; and it sure as shit isn’t politically engaged black folks. As we are all supposed to know, blackness lives in the blood, in ancestry; it is inalterable, a stain upon a lineage passed from one generation to the next.

Of course, that’s only if you don’t live near the colour line.

I have lived near the colour line for most of my life. My black mother and white father didn’t travel to much of the US during their thirteen-year marriage because their relationship and my very existence violated the miscegenation laws that remained on the books in southern states throughout much of their relationship.

That’s why, when I was born, older relatives from both sides of the family breathed a collective sigh of relief: I had “such good skin,” meaning, of course, that I could be mistaken for white if nobody looked too closely at my hair or my physique. Like my great aunt Connie, I had a body that could “pass.” Passing was made easier living in a wealthy neighbourhood in which my mother was mistaken by the casual observer for my nanny or some other sort of domestic servant. (Her dad was afraid of visiting us, in case his visits exposed us.) It was helped too by her pouring her savings into sending me to a private school that had no black students.

But it was there that I was found out. Even before I knew what passing was, my famous black uncle died and I got to be called “nigger” by the school bullies. That went on for a year and a half but then it stopped because the colour line was moving.

When my dad took me to Kenya in 1988 to photograph wildlife, we visited the equator. You know that story about how water runs straight down a bathtub drain without corkscrewing clockwise, as it does in the Northern Hemisphere, or counter-clockwise, as it does in the Southern? It’s supposed to go straight down at the equator. In Meru, on the equator, you can pay to see this. But what they don’t tell you is that you need four burly men holding that detached bathtub running north, running south, chasing the equator as the forces creating it jerk and twist it this way and that.

Here, in the temperate zones, we see the equator as a stable absolute, a fixed point etched into the surface of the earth. But when you’re actually there, it bounces this way and that, like a taut skipping rope being used in a tug of war. If you have to relate to the equator, sometimes you are chasing it carrying a half-full bathtub, with a bunch of confused white tourists in tow jogging to keep up. And sometimes it just rolls over you and there’s nothing you can do about it.

That’s what the colour line did to me. By the time I went to Kenya, there was no longer any such thing as passing. Race was alive and well in America but the age of the one-drop rule was over. Race now lived in your body’s external appearance.

The terms “white trash” and “hillbilly” stopped referring to the white descendants of people whose ancestors had been sold into servitude on tobacco, rice and indigo plantations in the seventeenth century. Now these terms were just class descriptors. With sufficient education and wealth, members of this formerly racialized group could become like the kind of people my great aunt Connie and I were becoming: people who were suddenly on the white side of the colour line.

We were free!

Of course, there were some problems with that. Like all those people who lost their Indian status by getting law degrees or white husbands, I experienced some ambivalence going from being the same race as one half of my relatives to being the same race as the other half. Perhaps that had something to do with having no say about when or whether the colour line rolled over me.

But it did mean that I had new uses in BC’s tiny black community. In the early nineties, some community activists attempted to co-own my leadership of the BC Green Party. They wanted to be able to honour me for being the first black political party leader in BC. Except they couldn’t do that. We settled on “the first leader of African descent.”

I was also useful at that time because the keynote speaker at a major community event had recently given some very problematic advice. This young man was nearly as light-skinned as I. And he had some advice for young black people about how to achieve success for their families: “marry white,” he said, like his dad had, “most people don’t even realize I’m black.” It was at that moment that community activists needed people like me and the editor of our newspaper, The Afro-Carib News, to defy the placement of the colour line and stand with our black family members. And stand proudly.

Many of my friends saw this as a joke, something worthy of their laughter. And it was and is absurd. It was the same kind of absurdity my white tour group and I felt as those burly, sweating men ran back and forth with a metal bathtub in Meru in 1988. It provoked a nervous, desperate laughter as they saw a fixed reference point wriggle and bounce before their eyes, while the sweating Africans chased it in the heat of the day.

I will post a second part soon. Clearly this essay is not done.

No Time for Conservative Defenses of the Broadcast Consortium

I am a conservative. My political allies are conservative. But we don’t know we are. And so we make mistakes. Lots of mistakes.

By “conservative” I mean that my politics are centred on a nostalgic defense of an irreversibly collapsing social order that was already in decline when I came of age politically. I believe in twentieth-century Cold War welfare states with universal social programs, large middle classes and a commitment to social equality.

We New Democrats are the only true conservatives in Canada: cowardly, nostalgic and willfully blind, often too frightened to open our eyes and see that not only is the old, industrial, unionized, universalist Canada collapsing, but that the material and political conditions that enabled it to exist in the first place are no more. Welfare states were creatures of the Cold War, polities whose social contract was necessitated by the Communist Threat. It was necessary for global capital to make a lie of communists’ claim that capitalism magnified inequality, impoverishing, brutalizing and marginalizing the majority, rendering them less secure, physically, materially and socially.

That need has passed. There is no global order challenging capitalism and so its expensive advertising campaign trumpeting its socially just, redistributive nature can be dismantled, either slowly, by Third Way parties or quickly by neoconservative parties.

We true conservatives, as distinct from the radical, triumphant social movements and parties of the far right who have taken on the name “conservative” as a means of obfuscating their agenda of radical social change, know what our job is: slow the dismantling of the welfare state and mitigate the excesses of the market to the extent that the investor class, financial institutions and bond raters permit.

For this reason, we instinctively leap to the defense of any Cold War institution that comes under attack. And so, today, we come to the Broadcast Consortium. Most of my friends and allies in the NDP, Liberal and Green parties are outraged that Stephen Harper is going to boycott the Consortium’s leadership debates in favour of cherry-picking broadcasters and formats that play to his strengths.

Because the Consortium is part of the institutional framework of the Canadian welfare state, we naturally assume that this body exists to safeguard the public trust and maintain our democratic institutions. It does not. The Consortium is just the three Cold War-era Canadian TV networks, two of which are private, for-profit corporations, and the third, the beleaguered CBC, dying of a thousand cuts, its board stacked with Harper appointees.

In other democracies, debates are run by organs of the state, charged with fair and equal election coverage, based on transparent values encoded in law and regulations. If Canada truly had a system of fair election debates based on our democratic values, such debates would be administered by the Canadian Radio-television and Telecommunications Commission (CRTC) or Elections Canada. But that is not the Canadian way. Big communications companies run our election debates based on the needs of their shareholders, not of Canadian voters.

If Canadians on the left really are to snap out of our conservatism and stand for something other than things getting worse slower, these debates are as good a place as any to do so. Instead of defending a Consortium controlled by Bell-Globe and Shaw-Global, the media giants who endorsed Jim Prentice’s re-election bid, let us call for an end to leaders’ debates where big money calls the shots; let’s call for debates under the aegis of Elections Canada’s Elections Advisory Committee or a new committee of the CRTC. Let’s talk not just about defending the older, gentler plutocracy of the Canadian state; let’s call for something better than a corrupt, broken Cold War theory of electoral fairness.

In recent months, New Democrats have begun to shake off our conservatism with our commitment to bold and novel social reforms and new programs like the national child care plan. Let’s keep that up and use our position as the official opposition to set out competing terms to Mr. Harper’s for a national leaders’ debate, rather than simply defend a broken status quo.

Harper’s Unforeseen Doom: the Notley Victory in Historical Context

On its own, the sudden election of the NDP as a majority provincial government in Alberta, vaulting from 10% of the popular vote to 41% and from four seats to fifty-three, is a story with national implications. It goes without saying that this election will have a profound effect on Canadian politics and is likely to have major realigning effects on our federal party system and on federal policy issues.

Usually, I sound a note of caution in response to claims of major national political realignment in response to a single election result. But this time, I am doing the opposite. I want to suggest that the impact of the NDP’s sweeping provincial victory on national politics, and, in particular, on the coming federal election may actually be greater than pundits anticipate.

And that is because of a phenomenon that a Twitter commentator noted when trying to explain the continuation of the NDP surge through the final week of the campaign: “Albertans don’t vote. They stampede.”

Many observers may not yet have noticed a distinctive property of Alberta voters: since entering Confederation as a province in 1905, the province’s parliamentary delegation to Ottawa has been defined by the sitting provincial government. Residents of other provinces unfamiliar with Alberta electoral history might think that I am making an overstatement when I say that the single most important factor in determining what party Alberta voters send to represent them in parliament, over more than a century of elections, is the preferences and alignment of their provincial government. So, for the next few paragraphs, I am going to drown you in statistics.

From 1905 until 1921, Alberta was governed by the Liberal Party. And in the 1908 and 1911 elections, over 50% of Albertans voted Liberal and Liberal MPs won 67% and 75%, respectively, of the province’s federal ridings. But, in 1917, the wartime national unity government of Conservative Prime Minister Robert Borden won a strong majority, as it did throughout English Canada. Then, in 1921, the province was swept by the United Farmers of Alberta, a party strongly associated with the national Progressive Party. The UFA held office until 1935. Continuing the trend of the Liberal era, Progressive/UFA candidates were the majority of the delegations Alberta sent to Ottawa in every election: 1921, 1925, 1926 and 1930.

In 1935, provincial voters swept the United Farmers from office and replaced them with Social Credit. And, predictably, beginning in the 1935 federal election, Albertans sent Social Credit majorities to Ottawa in every election for a generation: 1935, 1940, 1945, 1949, 1953 and 1957. This changed in 1958. The populism of Conservative leader John Diefenbaker captured the imagination of English Canada and permanently changed the character of the Progressive Conservative Party. While Social Credit did recover and become Alberta’s second-largest federal party in the 1962, 1963 and 1965 elections, capturing between 23% and 29% in each, the divided loyalties of Alberta voters reflected the divided loyalties of the Social Credit government, many of whom were more sympathetic to the federal Tories than to the increasingly crankish and anti-Semitic federal Socreds.

In 1971, Albertans threw out Social Credit and elected the Progressive Conservative dynasty that was defeated just tonight. And, in lockstep with their new provincial government, Alberta voters began delivering crushing victories to the PCs federally for a generation. Indeed, the 1972, 1974, 1979, 1984 and 1988 delegations Albertans sent to Ottawa were unanimously Conservative.

This one-party hegemony at the federal level came to an end in 1993 not because Alberta voters began to diverge from their provincial government but because, as in 1958, that government came to be divided between Progressive Conservative loyalists and Reform Party sympathizers, the latter group quickly becoming the majority. As the Reform Party grew and changed at the national level to incorporate former Progressive Conservatives, first as the Canadian Alliance and then as the Conservative Party, it continued to enjoy the confidence both of the Alberta provincial government and Alberta voters who delivered overwhelming majorities of the province’s seats to these parties and, beginning in 2004, to the reconstituted Conservative Party of Canada, which won over 60% of the vote and over 90% of the seats in the province in 2004, 2006, 2008 and 2011.

Which brings us to the present.

What will be keeping Stephen Harper up tonight is this question: will Alberta voters do what they have done every other time they have defeated a provincial governing party and granted a sweeping majority to a new political formation in the past 100 years? Will the new government of Alberta be given what Alberta voters always give their provincial government: a massive delegation of MPs to back them up in Ottawa and send the whole country the message that Alberta has changed?

Let’s hope that, while much can change in Alberta, some political traditions remain strong for the province’s next century in Confederation. Because if this realigning election in Alberta is anything like the province’s other three, 1921, 1935 and 1971, Edmonton-Strathcona MP Linda Duncan will have at least seventeen new Alberta NDP MPs joining her in Ottawa this fall.

Culture and Institutions in Canadian Politics: The Rise of First Minister Autocracy and the Russification of Canadian Political Culture

As time goes on, Canada’s political culture and traditions diverge ever further from those of other parliamentary democracies. While some characterize this as a reflection of global trends or a result of the Americanization of our political ideologies and brands, the increasing autocracy and centralization of our political system is not part of any global trend. Critics of the rise of first minister autocracy, when they don’t blame the US or indict a non-existent global trend, often choose the low-hanging fruit of blaming the shifts in our system on individual bad actors, typically Pierre Trudeau, Jean Chretien and Stephen Harper.

While this comes somewhat nearer the mark, these analyses miss the big picture because they ignore three interdependent variables: (1) the rules political parties adopt for governing themselves, (2) the rules parliament adopts to regulate political parties and (3) the cultural shifts both within parties and among voters generally related to these changes. Based on this emphasis, I date Canada’s divergence from normative global parliamentary practice to the early 1990s.

Three largely unrelated phenomena converged in the lead-up to the 1993 federal election to send Canada on its new course towards first minister autocracy and away from the parliamentary traditions of the rest of the English-speaking world. Ironically, the social movements and ideologies that gave rise to these things were quite similar to movements in other Westminster-descended parliamentary democracies. Canadians did not behave exceptionally or strangely in the early 90s; rather, a set of ad hoc decisions produced a series of unintended consequences that reshaped our political institutions.

First, there was Preston Manning and the Reform Party. Like Ross Perot in the United States, Manning led a broad, incoherent populist coalition of the fringe, absorbing anti-state voters on both the left and right of the political spectrum. While Manning himself hailed from the right of the spectrum, there was nothing insincere in his attempts to incorporate left-populists and left-libertarians into his big anti-government tent. His prescription for reforming Canadian democracy was one that sought to renew rather than revamp the nation’s parliamentary institutions. He proposed to end the gratuitous “whipping” of caucus votes on non-spending measures, favouring a return to the more lax party discipline evident both in the British parliament at Westminster and the US Senate and House, where members could defy their leaders in large numbers on key issues without calling into question the party’s functionality or the leader’s legitimacy.

Manning also proposed a cultural renewal to recover the original intent of the first-past-the-post voting system: that an MP’s job should be to represent the consensus of their geographic community rather than the ideology of their party on the issues of the day. He proposed informal polling, surveys and other new mechanisms to shore up the increasingly untenable view of an MP’s relationship to their constituents. While seemingly nonsensical to the residents of Canada’s multicultural, ideologically diverse cities, Manning’s views made a kind of sense in his and his party’s heartland: rural Alberta, the one-party state in which a clear majority in every riding voted for the same party, shared a colour and religion and had fairly homogeneous views on the issues of the day.

I remain, to this day, committed to the belief that Manning was absolutely sincere in those beliefs. But the first major challenge his leadership faced required that he make a pragmatic exception to these ideas: Doug Collins. Collins, an avowedly white supremacist newspaper columnist whose views went beyond nativist rhetoric to encompass genuine sympathy for the Nazi party, sought the Reform Party nomination in West Vancouver-Capilano. Manning’s own anti-immigrant policies and dog-whistles to politically unrepresented racist constituencies had built just the sort of membership base that was dying to choose the last explicitly white supremacist columnist still writing for a mainstream regional paper as their representative in Ottawa.

Unlike most party leaders, who could show up or send their key provincial lieutenants to a local nomination meeting to twist arms, make calls and dangle rewards to shut down insurgent candidacies, Manning had built an organization that stood for the repudiation of just those kinds of politics. And so he had to use—or, technically, threaten to use—a provision of the new Elections Act, one that granted the leader of a party sweeping powers to overrule local party organizations in the selection of candidates. Whereas candidates had always required a leader’s endorsement, previous versions of the Elections Act were full of undefined grey areas handled by uncodified tradition. In practical terms, the earlier, skeletal versions of the Act had been interpreted to mean that electoral district associations (the members of a party living in a riding) could choose their candidate and the leader had the power to veto.

Because previous acts had not spelled out the sufficient conditions to nominate a candidate but merely some of the necessary conditions, Canadian parliamentary tradition had assumed that an electoral district association’s membership had to sign off on a candidate. With a tradition of mutual veto power, shared between local members and the leader, local associations had been left largely unmolested in their candidate selection practices for more than a century. Sometimes the local party board might choose the candidate behind closed doors but, by the 1920s, this had largely given way to nominating meetings won by whoever signed up the most party members.

Those who remember the 1980s can recall the world before the 1993 election, the world in which if a party leader wanted to parachute in a star candidate, they sent in organizers, bag-men and local heavyweights to corral enough votes for that candidate to win the nomination meeting. Party leaders still got their way most of the time but they had to do so by maintaining majority support for their agenda amongst party members in ridings where they had a preferred candidate.

One can see how powerful our culture is in shaping our perceptions and recollections when we hear party organizers explaining, today, that the sweeping appointive powers our leaders enjoy to hire and fire candidates at will have always been and always will be a feature of the system. Yet these same hacks should be able to remember the 1989 Oak Bay-Gordon Head byelection where local mayor Susan Bryce won the Social Credit Party nomination on a platform of going to the legislature to depose then-premier Bill Vander Zalm. Even though the laws on the books back in the 80s technically permitted the kind of veto power our leaders currently enjoy, our culture was such that vetoing Bryce would have been viewed by the public not as a demonstration of Vander Zalm’s strength as an autocrat but instead as a sign of his weakness and incompetence. Promising to work with and include Ms. Bryce was viewed as less weak than overturning the results of a meeting the premier was too incompetent to control.

But culture is only half the story. Manning, the politician who bottled and sold democratic reform, was not just able to get his way because of his vast credibility on process issues with Canadian voters. He could get his way because the much more detailed Elections Act of 1993, authored by the Mulroney government, spelled out what electoral district associations could and could not do. The inventory of their powers no longer included any ambiguity. Not only had they lost their prior de facto power to select candidates; they had lost their power to admit members too. Now, head office controlled who could join and the leader controlled who ran. Nomination meetings remain part of Canadian political tradition and remain the means by which most parties choose most of their candidates but, twenty-two years ago, they became what they are today: a moribund political tradition that, if things continue as they are, should be a distant memory a generation from today.

While Manning’s personal brand and sticky situation with Collins added political legitimacy to an increasingly autocratic set of Canadian political practices, it was then-Liberal leader Jean Chretien’s response to another party nomination crisis that pushed things further. With two leadership contests five years apart, bitterly fought between opposing party factions, the Liberal Party of Canada was the site of the biggest sign-up drives in the nation. In the Toronto area, both the Chretien faction and the Turner-Martin faction had turned to a different group of unpopular far-right zealots: the anti-abortion movement.

Jim Karygiannis, my personal bet for winner of the 2018 Toronto mayoral race (more on that in a future post), cut his teeth in these battles, which vaulted Tom Wappel and John Nunziata into the national spotlight as anti-abortion MPs. While Liberals for Life, the brand name of the conservative Catholic theocrats, had been unmolested under Turner’s leadership, Chretien put an end to their expanded organizing. While not firing any MPs, he too exercised his newfound candidate appointment powers to reshape Liberal nominating practices. By 1997, Chretien was proudly canceling nomination meetings and appointing candidates by fiat, not just in the hotly-contested Toronto area ridings that had created the pretext for using these powers but in any electoral district where Chretien thought Liberal members might make the wrong choice. Accordingly, the Liberal Party amended its own constitution to enshrine and formalize the powers of the leader to select candidates at will and override local members.

This all was only possible because a new Elections Act had to be placed before the Commons. Mulroney, whose popularity had bottomed out at 14%, was widely viewed as corrupt and it was expected that his party would attempt to win the next election dishonourably. This view was not just the result of the series of corruption scandals that felled minister after minister; it was the result of the “free trade election” of 1988, in which unrestricted ad spending by private corporations in favour of Tory candidates dwarfed the budgets of all three major parties. The idea that Mulroney might seek to buy the next election by nefarious means had to be dispensed with, and a comprehensive and transparent Elections Act was the logical solution.

Jean Chretien, Preston Manning and Brian Mulroney did not agree on much. But each man’s response to a different emergency created new legal frameworks, new institutional frameworks and new cultural norms concerning the relationship between leaders and their caucuses. It is in the 1993 election that we see the origins of Canadians’ present-day view that MPs serve at the pleasure of party leaders, in sharp contradistinction to the British and Australian traditions that understand leaders as serving at the pleasure of their caucus.

The view that leaders serve at the pleasure of their caucus is a venerable one in British parliamentary tradition. Because parties emerged in the Westminster political system out of voting blocs of elected MPs who coalesced gradually, formal party organizations and mass party membership were additions to a pre-existing party system. The first way parties became real in parliaments was through recognition by the Speaker of a group of MPs as a party. This group had to choose from among their number the first officers parties had: leaders, whips and house leaders. Until the twenty-first century, British political parties chose their leaders in caucus meetings; and that is how it is still done in Australia. Whoever has the confidence of the majority of a caucus serves as its leader, at the pleasure of that caucus. Margaret Thatcher was brought down by a non-confidence vote in her caucus; so were the last two Labour Party prime ministers in Australia.

While this remains the way that party leaders are chosen in the US House and Senate too, the bicameral, presidential system of governance down South had to find other ways to select its presidential candidates. In the 1820s, after two generations of dissatisfaction with caucus-driven presidential selection processes, the US Democratic Party became the world’s first mass party to use a system of convention delegates to select party leaders. And about a hundred years later, as a result of cultural exchange, Canada began to follow suit. Of course, by that time, Americans were beginning to tire of the corruption and horse-trading associated with delegated conventions and were beginning their transition to the primary system.

As a measure to counter corruption, American progressives, led by Wisconsin governor and senator Robert La Follette, began to nationalize the process of selecting convention delegates. By shifting the burden of selecting convention delegates from private clubs (because that is all political parties actually are in most English-speaking democracies) that could set arbitrary membership fees to state governments, progressives were able to create a radically more representative system, one that has been crucial in keeping both the best and worst features of American democracy intact. In states with primary systems, parties could no longer charge fees for membership; they could no longer mess with membership rolls; choosing presidential convention delegates and local candidates was a process administered transparently and accessibly by state governments. This slow contagion has spread across the United States, making Americans the most involved of any population on earth in the nomination of candidates for the major parties.

In the mid-90s, most Canadians did not notice the shift that had happened in our laws and culture around the selection of candidates and their relationship to the party leadership. That’s because, as in most countries, few Canadians are members of political parties – scholars estimate this figure at 1%. But this is a four-year rolling average that includes the huge influx of instant members who sign up in the year prior to an election.

But while Canadians as a whole were not especially concerned about the new powers party leaders had gained—especially given that these powers had initially been used exclusively against racists and misogynist religious extremists—party activists were. The tiny subcultural groups of party members worried that their power was being drained away to the benefit of party leaders. And their response, which, like the expanded appointment powers, I supported at the time, seemed eminently reasonable.

“OMOV” was the ugly abbreviation for “one member-one vote,” a series of measures national parties took between 1993 and 2014 to change their leadership selection processes. Long-time party activists who felt disempowered by the draining of authority from their riding associations into the office of the leader saw, as their solution, a more direct and unmediated role in selecting a leader. So, over a generation, one party after another abandoned the old, ethanol-powered, back-slapping smoke-filled convention halls in favour of mail-in, internet or phone voting for leadership candidates in which each party member could directly participate.

This put an additional squeeze on the dwindling authority of party caucuses and riding associations. First, members no longer delegated their authority upwards through their MP or convention delegates to choose a leader; in this important role, local party associations and caucus members were far less relevant.

Second, this created an incongruence between the legitimacy of local candidates and the party leader, especially in rural ridings and places where party members had little evening leisure time. While a substantial majority of party members were involved in choosing their leader by mail or electronically, the same could not be said of local candidates, who continue to be chosen at in-person meetings. Thus, in nearly every riding, in nearly every party, the number of members who participated in choosing the leader vastly exceeds the number who participated in choosing their local candidate.

Third, it eliminated the role of local party organizers and MPs as brokers of voting blocs. In a delegated convention, the horse-trading that occurred as candidates were eliminated in the multi-round voting system used by Canadian parties was typically brokered by local party organizers or MPs. One came to a convention “with” a leadership candidate, member of parliament or local riding president. As that person made new deals and forged alliances, one followed them from one leadership candidate to the next. Without delegated conventions, preferences encoded in advance through instant runoff voting, or live coverage of the convention via social media, replace these individuals’ leadership in directing delegates’ votes.

As both authority and legitimacy have drained out of local MPs, candidates and organizers into the office of the party leader, this has spurred changes in our public discourse and how leaders are evaluated both by party activists and by the general public. But before I get to that, I want to make a distinction between the argument I am making and the one that many opinion leaders have made about the concentration of power in the office of the Prime Minister.

While it is undoubtedly true that there has been a steady presidentialization of the PMO in the forty years since the beginning of Pierre Trudeau’s third mandate, this is a surprisingly unrelated phenomenon. The phenomenon I am describing affects all political parties, irrespective of their proximity to power, through this synergy of law, culture and institution. And, if one looks abroad, it is not the United States or any similar republic whose course we seem to be following.

No. If there is an international comparison to be made, it is that Canadian politics is becoming Russified. Those studying the political culture of Russia over the long term sometimes advance the following theory: because of the ravages of the Mongol invasion, the Black Death and the collapse of Byzantium, we must understand Russian political culture as being forged out of two main traditions, the Byzantine and the Mongol. Both of these systems are what might be termed reverse-feudal. Whether we trace modern Russia to the Byzantine Empire or to the Khanate of the Golden Horde, what we see is a political culture whose accountability structure is not delegated gradually upwards from the micro-local, to the local, to the regional, to the national/imperial, as in societies whose political practices arise out of a Western European medieval past.

Instead, accountability in the Russian system was built through a direct relationship between the peasant class and the Tsar. If a member of the land-owning class was behaving badly, his tenants appealed to the Tsar for relief, including the replacement of the local lord by another member of the gentry class. Because of the lack of a conventional aristocratic class, the authority of local lords was understood to be contingent upon the Tsar, and not the reverse. And this tradition has continued, de facto and, intermittently, de jure, to the present day: regional governors are appointed by the Kremlin to represent the president’s regime to the local region; authority is delegated upward from the people to the autocrat and thence, downward to members of the national duma and regional governments. When political change happens, it is through seizure of national leadership by a new autocrat.

In Russia, therefore, one of the most important features of the autocrat is their ability to control remote local organizations, party officials and members of the duma. Because government departments and regional authorities are imposed from above and not locally accountable, a weak leader who fails to demonstrate that control produces anxiety in the populace. If the autocrat is the only means by which the branches of the state with which one interacts can be made accountable, a weak leader will produce bureaucratic despotism and corruption while a strong leader will produce democratic accountability.

And, as I first noted in 2008, this is becoming a Canadian political value. Stephen Harper’s performance of strength through a hyper-controlled affect and conspicuous micro-management is resonating with Canadians who now see their only point of access to the political system as their selection of a leader.

In 2010, I became the first candidate for a federal NDP nomination to be barred from proceeding to a nomination meeting in my riding. I was barred on the grounds that I had made four Facebook posts on my personal page that the national office crew deemed unflattering to the party. They found the most egregious to be the one in which I raised the party’s conduct during the Gustafsen Lake siege of 1995. Without getting into the merits of my particular case, I wish to note that since that time, the number of candidates for nomination barred from even subjecting themselves to a vote by local members has grown, not just in the NDP but in every party.

This change in party practices arises, in part, from a misunderstanding of the advice Obama organizers have given the Liberals and New Democrats. “Staying on top of social media” has been misinterpreted as the control and management of party officials’ and candidates’ communication to small online communities. But this is, manifestly, not how the Democrats or Republicans work. This is partly because they are hemmed in by a primary system and cannot hire and fire candidates at will but, more importantly, because a US presidential candidate must demonstrate her legitimacy by assembling a large, heterogeneous coalition that is unanimous about very little, legitimacy is associated with showing that your leadership is one of the few things on which your party is in accord.

But even this mistaking of advice is significant. Canada’s new autocratic political culture is doing one of those things that culture does: constraining our ability to imagine a place or time different than the one we inhabit. Just as we forget nomination meetings like Oak Bay’s in 1989, we also misinterpret people from other countries when they try to tell us about how things work there. The Democrats must mean that they obsessively surveil and silence minor party officials, because we cannot imagine that what they are really doing is conducting social media like an orchestra, deftly signaling one section to speed up, to slow down, to play louder or softer, to transform the cacophony of the party base into harmony.

And so it becomes necessary for leaders to demonstrate their authority with increasing severity whenever a candidate goes off-message, publicly sacking people for minor remarks to small audiences. To do otherwise would be to show weakness, in the Russian sense, imperfect control of one’s minor officials who are only accountable through you, the leader.

For this reason, it is only natural that more and more candidates should be personally appointed or removed by the leader, arbitrarily and at will. Party organizations have fallen over themselves to rewrite their constitutions and bylaws so that, even if the Elections Act were amended to limit leaders’ sweeping powers of appointment, as Michael Chong attempted to do through the failed Reform Act, our march to autocracy would continue apace.

The unfortunate reality of modern Canadian politics is that we are approaching the point of no return. Whereas Margaret Thatcher, in the Britain of 1990, could be fired by her caucus, this is only barely possible in today’s Canada. First of all, the position of “party leader” has now been legally severed from any role in the House of Commons. Were Stephen Harper’s caucus to fire him, here is what he could do:

As Prime Minister, Harper could visit the Governor General and ask him to dissolve parliament, a request with which he would be almost certain to comply. Once parliament was dissolved, Harper, as the head of the Conservative Party of Canada, could appoint 338 new, loyal candidates by leader fiat and his current caucus would find themselves running as independents or saddled with the problem of building a new political brand and party infrastructure, new lists, new members, new donors in a matter of weeks. And, because nine in ten Canadians vote based on party platform or party leadership, the overwhelming majority would likely be defeated by newly-minted appointed MPs.

In the 2007 referendum on proportional representation in Ontario, the Toronto Star warned that if we did not stick with first-past-the-post, we would live in a Canada where leaders directly appointed their entire caucuses. Yet the reality is that, without changes in Canadian political culture, this is the direction we are heading. As I observed in 2008, it is the absence of proportional representation that is helping to increase leader-centred autocratic control:

In most countries, the increasing diversity of political opinion has resulted in more proportional voting systems and coalition-building; it has reinforced deliberation and negotiation in politics. These changes have been made not simply out of respect for diversity but out of a growing demand for social and political order in the face of an increasingly diverse and atomized society. Yet Canadians, motivated by the same anxieties, have chosen a different response. We seek to vest power in the person who is most capable of fusing a subset of these atomized groups and individuals back into some kind of unified formation.

In our voting system, the most successful party is one best at reducing the number of choices its potential voters feel that they have. A look at Liberal messaging shows that Jean Chretien became increasingly reliant on his ability to convince potential NDP and Green Party supporters to vote for his party. And despite his antipathy for Chretien, Paul Martin intensified this approach. What we missed during that time was how this change in Liberal tactics helped to change Canadian ideas of what made a legitimate government. As the Liberals lost their capacity to intimidate left-of-centre voters, they lost power. And Canadians learned a lesson: a government’s legitimacy comes not from its ability to appeal to the majority but instead from its ability to control and discipline its own supporters and potential supporters.

Supposedly, Canada’s dwindling newspaper and TV news sector are worried about this. And yet, when a candidate displays any originality or distinctiveness at all, they rush to report that a party has suffered a “bozo eruption.” They begin hounding the leader, demanding to know what he will do about even one of the party’s 338 candidates expressing a difference of opinion from the leader. And if the leader does not immediately and publicly punish or, better yet, summarily fire the candidate, we might as well be reading Russia Today: the leader is portrayed as weak and unable to exercise the kind of control befitting the occupant of 24 Sussex Drive.

Every time another candidate is fired or summarily removed, Canada’s new authoritarian political culture becomes more entrenched; and every time a leader is made to look weak and suffers an electoral setback because they have been portrayed as weak, it also becomes more entrenched. And so, when it comes time to replace that leader, it seems only logical to choose someone more like our current Prime Minister, an unapologetic authoritarian and micro-manager.

Because, increasingly, that is what Canada’s unique, new political culture demands.

Culture and Institutions in Canadian Politics: The Greens and the Absence of “There”

Yesterday, I posted something to Facebook that had the shit shared out of it. Based on the ideas I was wrestling with when I dashed this post off in the space of about 90 seconds and the responses it has provoked in various parts of the interwebs, I thought that maybe it is time for me to do some writing about the ways in which we underestimate and under-think a basic rule of political analysis articulated by York University political scientist Dennis Pilon.

Pilon, a long-time colleague in the electoral reform movement, has posited that the effects of any reform to the voting system will be strongly conditioned by three things: (1) the mathematical and legal language of the legislation; (2) the culture(s) of the people interacting with the voting system; and (3) the institutional structures prevalent in society and the specific structures and cultures of the institutions interacting with the system.

All too often, when we think about what is going on in Canadian politics, we do not pay enough attention to these things. As a result, we often blame a bad turn of events or a disturbing political development on individual bad actors or the regulations that affect people directly through the state’s apparatus, rather than the regulations (both state-created and self-created) of voluntary associations like political parties and social movement groups. We similarly discount culture as both a cause and effect of political events.

So, to kick off this series of articles, which will begin later today, I will reprint the original offending Facebook post:

“As the Green Party’s scheme of focusing only on NDP-held ridings for pickups this year, and passing over prime turf like Vancouver Centre, heats up, many people are saying that the Greens are ‘really conservatives’ or ‘really Liberals’ or ‘really regressive.’

Let me speak from experience: the Greens are not “really” anything. They are the first political formation in North America to come of age in the neoliberal era. As such, they do not function like a political party in the traditional, institutional sense. They are a leader-centric brand, with low membership participation, high turnover and no continuous policy agenda.

‘Green’ is a signifier that functions at a primarily aesthetic, brand-based level. Those trying to understand ‘who’ the Green Party membership is, what the party’s ideology is, etc. are on a wild goose chase. There is nothing to find. A Green Party is, from one moment to the next, an unstable fusion of its leader’s personality, alliances and beliefs and an aesthetically-driven voter base whose theory of political causation is most similar to James Frazer’s original theorization of sympathetic magic.

As for why Greens give the Liberals a free pass and seem hell-bent on taking out a series of NDP incumbents, while there may be some behind-the-scenes agreements with the Liberal Party, I would suggest that such agreements, even if they exist, are post-facto irrelevances. The current leadership of the Greens feel personally betrayed by the NDP for the party’s failure to BE the Green Party. And they are exacting revenge.”

Also, there is some past writing of mine that might also help to inform the series and provide some needed background:

In 2008, I made some comments about the prorogation crisis that are directly germane when examining the way that the law, political party governance and culture have functioned synergistically to condition Canadians’ ideas of legitimacy. They were printed by

On my own site, I’ve offered some commentary on the different ways voters connect cause and effect when they cast votes and the prevalence of sympathetic magical views of voting. Relatedly, I have also talked about the ways in which Green parties are as much novel social movements as they are the occupants of a longstanding role in Christian societies.

This afternoon/evening, I will offer some thoughts on candidate vetting processes and how they fit into a larger cultural and institutional matrix. David Ball’s excellent article on this issue may help to frame my argument.

ISIL, Pancho Villa and the Centipedes: the Politics of Imperial Over-reach and Outrage Porn

In David Cronenberg’s film adaptation of Naked Lunch, our junkie-hero, William Lee, attempts to steal his sleeping co-worker’s tank of pyrethrum insecticide in a desperate attempt to get high. But his co-worker awakens, grabs hold of his tank and begins questioning Lee about his motives for stealing it. “Is that why you tried to lift mine?” he asks Bill, having correctly divined his motives. “That’s unkind, Edwin. ‘Lift’ is unkind. No… I’m doing a job for a friend. You see… it’s the centipedes. Yes, the Centipedes are becoming downright arrogant. They’ve started attacking his children!”

In many ways, this scene is a microcosm of the whole film. It is about the collision between the Freudian uncanny and Americanism. No matter how depleted our hero becomes, no matter how bizarre and alien the landscape in which he finds himself, his responses, whether bewildered or heroic, remain those of an American interloper.

As America begins its next quixotic foray into the lands between the Levant and Persia, graveyard of occidental empires, this seems as good a starting point as any for understanding how it is that the president who came into office to pull American soldiers out of the Middle East will leave office as the commander of yet another American war effort in the region.

As William Lee illustrates, Americans are suckers for outrage porn. Theatrical gestures designed to generate outrage capitalize on one of the most important aspects of the American psyche. Republican apostate Kevin Phillips, creator of Richard Nixon’s Southern Strategy and the associated book The Emerging Republican Majority, argued in his 2006 American Theocracy that one of the most important aspects of imperial culture is the capacity to become offended very easily. Comparing modern US senatorial debates with those of republican Rome, he observed that in order to constantly mobilize expansionist war efforts, an empire’s political consciousness must be extraordinarily sensitive to offense. In the public discourse of the Roman Republic, the state was constantly being threatened and insulted and its citizens abused in foreign lands; the republic’s honour demanded swift and devastating responses to these insults.

But if there was one form of political theatre more capable of mobilizing Rome’s lethal force, it was the public theatre of moral offense. Rome became an empire because of its capacity to continue mobilizing against the rival empire of Carthage for centuries. More than avenging the deaths of Roman soldiers in prior battles or protecting Rome’s expanding Mediterranean trade, what formed the backbone of war propaganda during the Punic Wars was the sacrifice of babies to the Phoenician god Baal. While these sacrifices were not of Roman children, nor especially numerous, they inspired consul after consul to send legion after legion to war in North Africa.

It is in this context that we must understand the genius behind ISIS’s provocation of the United States. ISIS’s burnings and beheadings represent a tiny proportion of the death toll inflicted by the incipient Caliphate; most deaths are like those caused by its adversaries: deaths associated with seizing, sacking and occupying cities, deaths associated with conflicts between fighting men in planes, tanks and body armour. But it is the ritualized, public executions that are galvanizing US opinion to the point where even the Bernie Sanders neo-isolationist presidential bid is becoming equivocal about the war.

Imperialists cannot bear to watch public acts that defy the moral code they understand their hegemony to uphold. Not only is inhumane suffering publicly dramatized and celebrated; so is the flouting of the empire’s power to impose its theory of the good upon the world. Theatrically-staged burnings and beheadings are not merely atrocious in humanitarian terms; they throw down a gauntlet and they shame the empire that will not challenge them. That would be like letting the Centipedes rip a man’s children limb from limb.

To some of my readers, it might seem surprising that Barack Obama would be, in the last full year of his mandate, so susceptible to this kind of challenge. Didn’t he come to power to end American imperialism? Such a view fundamentally misunderstands Obama and his foreign policy agenda. If George W. Bush was America’s Theodosius I, the emperor who ended the empire’s religious pluralism, closed its academies and sent its scholars to work as apologists for a nationalized Christianity, Obama is its Julian the Apostate, a man who understood himself not to be a destroyer of the empire but its restorer. Julian reopened the academies, re-legalized paganism and set Rome back on its pluralistic course. Like Obama, he was a personal beneficiary of the empire at its best, a winner in the pluralistic global order of Kennedy and Johnson, at the noontide not only of imperial power but of its original liberal ideals through the Alliance for Progress and the Second Reconstruction. Men like Kennedy and Johnson were so committed to liberal ideals that not only were they willing to send the army to Saigon; they were willing to send it to Selma.

The Obama presidency’s foreign policy has been about rebuilding the ruins of an empire, in a desperate attempt to prevent it from being overtaken by the harbingers of decline: ignorance, fundamentalism and a disregard for its plebeians. Obama and Hillary Clinton will never receive the credit due them for the Honduran coup, when the first US-backed military junta to take power in Latin America in decades ended the expansion of the Chavista bloc. Similarly, Obama’s policies in Iraq and Afghanistan have been policies of rationalization, resource and expectation management. Obama’s policies are those of a retrenchment project: federal efforts to limit the power of states to enact the New Jim Crow, universalization of basic material rights like health care, reinforcement of the Monroe Doctrine after a decade of neglect and the rebuilding of America’s multilateral military coalition through cooperative efforts with its NATO allies. This is a presidency committed to restoring the empire to the greatness it possessed when it welcomed his Kenyan father into a vast, pluralistic, liberal global order.

Would LBJ or Kennedy have been able to tolerate the theatrical provocations of ISIL? Not on your life; compared to the great presidents of muscular liberalism, Obama is really being quite restrained. But the atrocity porn continues to stream out of Syria and Iraq, making a lie of the Pax Americana.

But behind that humiliation is another, deeper humiliation awaiting the American Empire. This theatre of outrage has been the means by which men have made themselves national and global heroes, the means by which movements and states have rebounded from torpor and irrelevance.

Caliph Ibrahim of the Islamic State of Iraq and the Levant joins a venerable tradition of men who have made themselves and their ideas powerful and relevant on a global scale by mobilizing American outrage against them, a tradition whose hundredth anniversary we will be celebrating on March 9th, 2016.

This tradition is one that the US would do well to remember, were such a thing permitted by the deep structures of American historical memory. In 1916, Pancho Villa’s fortunes in the Mexican Revolution were in decline. After first battling the regime of the German- and US-backed conservative usurper Victoriano Huerta, Villa had come to personally command an army of tens of thousands of Mexicans seeking land reform, democracy and an end to neocolonialism. But infighting had broken out amongst the revolutionaries, and the moderates, led by Venustiano Carranza, had reduced Villa’s army to fewer than five hundred men, hiding out in Mexico’s northern borderlands. It seemed that all was lost for the Villistas until Villa hit upon an idea that would, indeed, turn the war around and inspire tens of thousands of new recruits to flock to his banner.

He staged a quixotic attack on the United States of America. Crossing the border into New Mexico, he attacked an armoury of the Thirteenth Cavalry in the small town of Columbus. While the initial attack took place on the pretext of replenishing his badly-depleted supply of arms, Villa’s inspired choice allowed him to reap far more than he, and the hundred men who joined in the attack, could carry off.

America had been humiliated, by a brown-skinned “bandit” and his handful of followers, its storied southwestern cavalry shown up as unprepared, weak and untrained. There was simply no way America could do anything other than make Villa into the titanic figure he remains in the mythology of the Mexican Revolution.

As we saw eighty-six years later, America has no choice but to respond to the taunts of a guerrilla leader hiding in the hinterland of an arid, mountainous, impoverished state with a population deeply resentful of the US and sympathetic to the guerrillas: send in a column of fifteen thousand men to look for that one guy, to storm around the countryside pissing off the locals, searching dwellings and caves and asking for news of the fugitive in some kind of heavily-accented, ungrammatical version of the local language. These responses never seem to work out that well for the US, even when they end up catching the guy (no thanks to the column of fifteen thousand troops turning over rocks), but they do work out awfully well for the movement that makes the successful provocation.

There is a grandeur and heroism that accrues naturally to the out-gunned ragtag band of guerrilla fighters or terrorists—and Villa was certainly called a terrorist—that dares to challenge the hegemony of a world-spanning imperium. This vast, heroic stature only a massive imperial punitive expedition can confer takes on an existence that mere military defeat cannot erase. Simon bar Kokhba, Pancho Villa: this is the league that Caliph Ibrahim seeks to enter. As has been the case since the Muslim armies first broke out of the Arabian Peninsula and tore Egypt away from its Byzantine overlords, one can only be a true Caliph if so recognized by one’s opposite, by the Pope, or better, by the Byzantine Emperor or Holy Roman Emperor.

Since George W. Bush’s shift in the legitimating discourse of American imperialism from anti-communism to militant Christianity through his appropriation of the power to pronounce national benedictions, as God’s vicegerent on earth, “may God continue to bless America,” the possibility of reconstituting an Islamic Caliphate through American aggression has grown. The more effectively the US can be provoked to send the armies of Christendom to fight ISIL, the more this war will resemble a Crusade, and the more it resembles a Crusade, the clearer it will be for all to see that Ibrahim must really be the Caliph. While Islamic jurists, political leaders and scholars will dispute that claim, the imperial response to these acts of public shaming and provocation will make a far more powerful argument to the contrary.

Whereas groups like Boko Haram focus their terror on striking fear into the people of Nigeria, Niger and Chad through atrocities staged for the benefit of local villagers, ISIL has been far more focused on and effective in its efforts at cultural translation. The kidnappings and murders of West African women and children by Boko Haram are not staged as outrage porn for Americans but as direct threats to West Africans. And in this way, they do not become America’s business.

It turns out that the problem with the Centipedes is not simply that they are attacking children. The problem with them is that they are becoming “downright arrogant;” the attacks on children are a necessary but insufficient condition for vengeful imperial ire; atrocities, acts that are beyond the moral pale of the hegemon, must not merely take place; they must be staged, with defiant pride, shaming the empire into its own quixotic, pride-driven retaliation.

How to Lose a Referendum in BC: Time for YES to Transit to Change Course

How can I tell I am in a referendum campaign in BC? Well, people who are encouraging me to vote “no” cover the waterfront with their positions on the issue before the public. In my forty-two years, I have been an active participant in nearly every referendum campaign my city and province have seen. I fought on the “no” side during the Charlottetown Accord referendum in 1992 and then, in 1996, I organized the “Yes to proportional representation” campaign in Vancouver’s municipal voting systems referendum. I served on the steering committee of the “yes” campaigns for proportional representation in the 2005 and 2009 provincial referenda on the voting system and here I am today, a foot soldier in the “Yes to better transit” campaign by the GetOnBoard BC coalition in the upcoming vote.

And things do not look good. British Columbians love voting “no” to things. Indeed, the only time I have been aligned with a referendum campaign that gained majority support, when 58% voted for proportional representation in 2005, it was because we disguised ourselves as a “no” campaign.

The fact is that when we British Columbians look at a ballot, the option we are really looking for is “FUCK YOU!!!!” If that option isn’t available, we try to find the next best thing. If the word “no” is on the ballot, we generally don’t need to look any further, as evidenced by the crushing 70% victory for the forces campaigning against the Charlottetown Accord in 1992, despite our being out-spent by our opponents by about twenty-five to one.

For all the talk about BC politics having changed since the days of WAC Bennett’s “socialist hordes at the gates!” and Education Minister Tony Brummett’s fist-pounding defense of the last Social Credit budget in 1991, “the people of the world have spoken and they will never, never accept communism!” we really are just the same people, the best market for negative populism in Canada. BC voters will choose whatever political formation most clearly articulates an anti-elite narrative that identifies a villain, his or her allies and their hidden agenda. As we saw in 2013, what we are not up for is “a better British Columbia, one practical step at a time” or in the case of the gong show I am backing this year, “creating a cleaner, safer, smarter transportation system.”

Because BC politics is fundamentally negative in that it is mainly about figuring out who is really in charge so that they can be punished for fucking everything up, British Columbians need to find unity in the political process in other ways. One way we do that is by having a provincial political scene in which 80-90% of voters have aligned themselves with one of only two major provincial parties for the past seventy-eight years. This locates us on big, diverse teams where we can feel like we are working shoulder-to-shoulder with half the province to stop those evil New Democrats/Socreds/BC Liberals/CCFers/Coalitioners from carrying out their hidden agenda to wreck everything.

An even more effective way to unite British Columbians and make them feel a sense of cohesion is a “no” campaign because, in BC, it doesn’t even matter what the ballot says: the result of defeating some hated measure (which is no doubt cloaking some even more dastardly hidden agenda) is that every voter will magically get their preferred alternative to whatever is on offer. That is how, during the Charlottetown Accord debate, the Union of BC Indian Chiefs worked side-by-side with John Cummins’ BC Fisheries Survival Coalition. That is why advocates for mixed-member proportional representation worked shoulder-to-shoulder with supporters of first-past-the-post in 2009 to defeat STV.

The “no” side in a BC referendum does not just succeed in making the perfect the enemy of the good. It succeeds in making everyone’s individual, imaginary perfect the enemy of a single, specific good. That is why people who want the BC government to build twice as much transit for free are working so harmoniously with those who oppose any further government funding of mass transit. That is why people who want transit to be funded only through progressive income and inheritance taxes are working so happily for a campaign run by the Canadian Taxpayers’ Federation. In a “no” campaign, there is an implicit social agreement to enable each individual campaigner’s act of utopian self-deception, as I witnessed in 1992 when Preston Manning activists helped future gadfly and perennial candidate Imtiaz Popat produce a video explaining that a “no” vote was a vote for the restoration of hereditary indigenous rule for all of British Columbia.

When I speak with my fellow New Democrats who are campaigning for the “no” side, I have to listen to them explain how we should not have to pay any new taxes for this transit, how it should all come from general revenue, how it should include a reduction in or abolition of fares, how consumption taxes are regressive and how we should abolish the PST and replace it with higher corporate, income and resource taxes. Of course, they argue, Christy Clark will be made to see that transit should be free, consumption taxes are evil and we all must pay our fair share. To which I can offer only one lame response, “Based on what you know of the BC government and its ideology, do you honestly believe they will build the new transit you want anyway, after you vote ‘no’?”

But if asking people to confront their socially-enabled self-deception ever got anyone anywhere in BC politics, everything would already be different. This, obviously, is no way to get things done. As I have written elsewhere, one of the reasons we view political campaigns as self-sufficient acts, irrespective of their public policy outcomes, is that we have grown so deeply pessimistic about the idea that the result of any vote will make our lives better in any way. People are enthusiastic about the “no” campaign not because they believe it will make their lives better but because it will provide real experiences of social cohesion and victory, rare commodities under present-day neoliberal hegemony.

Instead, I am writing this post to beg my side to retool, to stop and think about how negatively we British Columbians react to a paternalistic, cross-partisan elite consensus that presents some political outcome as good and positive for everyone.

The reality is that the “no” forces really do have a hidden agenda, that their puppet-masters are dangerous, shadowy forces of extreme-right think tanks who will be satisfied with nothing less than the total annihilation of our social safety net and the sacrifice of public health and education to their invisible-handed god. The fact is that climate change deniers, oil companies and Ayn Rand fanatics are sitting back in their plush leather chairs laughing about how easy we are to manipulate for their purposes and what witless dolts the clowns who pass for our local civic elite appear to be.

It’s time to stop all this nonsensical positivity. Identify the villain; expose their friends; explain their hidden agenda. Time is running out.

Happy Tenth Anniversary, Battlestar Galactica

In 2009, I wrote an article on the representation of Mormon cosmology in the Ronald D. Moore re-make of the Battlestar Galactica TV series. I never got around to writing a second version of this article incorporating the revisions recommended to me in workshops and conferences in 2009 and 2010, nor a version incorporating the second half of the final season of the series. I doubt that I will now.

Anyway, with today’s celebration of the tenth anniversary of the series premiere, it seems a good time to revisit the article and pass it on to fans who may have missed it the first time around. So, without further ado, here it is: Battlestar Galactica and Mormon Theology

Oh — and here are the TV interviews I did that helped to inspire the article: here, here and here.


Doctor Who: Man, Monster and Minor – Part II: The Silence, the Rise of the Trauma Monster and the Inward Turn of the Home Front

This article is the second of two on gender dynamics in Doctor Who. The first appears here.

In 2013, I suffered a minor psychological breakdown, triggered by, among other things, the new Doctor Who monster, a race of creatures called “the Silence.” The Silence, likely an homage to Joss Whedon’s the Gentlemen, are creatures with a single power: the ability to make anyone who sees them forget them the moment they look away. The horror of seeing one of the Silence inheres not in witnessing the creature’s hideous visage and diabolical nature but in remembering all the other times you had already seen the Silence and forgotten they were there. Not just “there” but everywhere.

These creatures had been distorting human history since its beginning, silently manipulating the fate of the world for their own diabolical ends. As one explains, “we have ruled your lives since your lives began. You should kill us all on sight but you will never remember we were even here. Your world is ours… we are The Silence.” For how long have they been doing this? someone asks the Doctor: “as long as there’s been something in the corner of your eye, or creaking in your house or breathing under your bed or voices through a wall.”

The Silence are one of the most successful villains of the new Doctor Who, since its resurrection by Russell T. Davies in 2005, an adversary that has sent English children back to their proper viewing perch for the classic series: behind the sofa. While the Daleks, Cybermen and Sontarans, the totalitarianism monsters of the Second World War and the Cold War, have returned, they mainly offer viewers a sense of nostalgia and continuity, not terror. Nor has there been any great effort to update the monsters more adaptable to our contemporary fears of inhuman authority, dehumanization and the annihilation of culture and emotion; there are no new, scarier Autons or Axons to speak Matrix-esque fears of the present day.

I would suggest that this is because our modern risk of cybernetic dehumanization inheres, in part, in our loss of any clear sense of implicit threat as our phones and consoles merge with our bodies, the kind of fear that was narrated more easily a generation ago in David Cronenberg’s eXistenZ. For this reason, such fears are not central to the reinfusion of terror into Doctor Who.

As in the original series, the Doctor must convey a sense of manly heroism relationally and symbolically, by protecting a female companion from danger. Because the main character has been transformed from an asexual being into an ambiguously and ambivalently sexual one, the non-consummated nature of the Doctor’s relationship with his companion conveys, even more strongly, a Victorian restraint-driven manliness. Now, the Doctor is tempted, from time to time, to engage romantically or sexually with his younger female companion. And yet, for some important reason, he must restrain himself from doing so.

In trying to understand why this must be, the show’s queer subtext seems a logical explanation: Russell T. Davies’ Doctor feels fleeting moments of attraction to his female human companion but not enough to actually sustain the rich, romantic, sexual relationship she wants and “deserves” with some more suitable male partner, the Will and Grace “fag hag” dynamic played out episode after episode.

But let us, for a moment, consider how the nature of the Silence and the other popular new monsters in Doctor Who link the unconsummated sexual dynamic to the return of the show’s ability to convey horror. Steven Moffat has struck fear into the heart of a new generation of youngsters (and adults like me!) with the Silence and the Weeping Angels by triggering the fears of contemporary watchers the way the Daleks and Cybermen played on the fear of totalitarianism that existed in audiences of half a century ago.

Like the creature lurking under beds and behind curtains in the current season, the Weeping Angels and the Silence evoke the consciousness of victims of childhood abuse and sexual violence, and the ways in which the resulting trauma plays on the memory of survivors. While the Silence are creatures one forgets every time one looks away, only to recall, with ever-increasing horror all the times one witnessed and forgot, when one sees them again, the Weeping Angels speak to the vigilance that survivors of trauma experience.

Weeping Angels are monstrously strong and lethal creatures that can move only when no one is looking at them. One must never close one’s eyes, never look away, never let the lights go out, never blink, or the Angels will set upon you and tear you limb from limb. For so many victims of childhood sexual violence, this fear of the dangerous world that comes into being when the lights are out has left a residual vigilance, that permanent imprint of trauma that remains sleepless and watchful, hoping to delay the seemingly inevitable reckoning with horror.

Before I met the Silence, I had always found the idea of “repressed” and “recovered” memories hard to understand, hard to believe in. How could something so life-altering and horrifying really be forgotten? How could one go through life never remembering things around which an abused child’s life is organized? But that misses the point of repressed memories—the horror of repressed childhood trauma is not repressed once; it is repressed again and again. And even as the events themselves recede, the horror only grows in power because every time you remember the event again, every time it breaks through repression and localized amnesia, you remember all the other times you saw the monster and repressed it again because you could not bear to gaze upon its visage. The work of repression is constant, repetitive and exhausting; through it, we become unwitting, involuntary accomplices in the conspiracy of silence that surrounds trauma and abuse.

When we hear the voice of the Silence, we hear generations of priests, teachers, parents and relatives whispering those words, “we have ruled your lives since your lives began. You should kill us all on sight. But you will never remember we were even here. Your world is ours… we are The Silence.”

It is in this light that we must understand the unconsummated nature of the Doctor-companion sexual dynamic. The Doctor cannot sleep with his companions—not because he is gay—but because he knows their secret: that they are victims of trauma and abuse, and that he would be exploiting his knowledge of who and what they really are if he did so, much as he might wish to.

It also helps to explain the feature of the series that fans find most aggravating: that nearly every companion, in her childhood, became entangled with the universe-threatening monster the Doctor is fighting. And it is her prior encounter with the trauma-inducing events and creatures that sets her on a path that will, inevitably, intersect with the Doctor’s. Here, our modern Doctor stands in for the charismatic, altruistic future therapist, police officer, social worker or foster parent with whom the traumatized person must confront the foundational evil that has been hanging over her life, a hero bound by ascetic vows never to turn that intimate relationship into a sexual one.

Serial killers, rapists, human traffickers—these are our new demons in popular culture; they have replaced the Nazi war criminals and Soviet agents of half a century ago. They hold that status because they threaten our patriarchy’s minors, our home front; they target “our” women and children, not men. And by interposing oneself between these predators and the women and children of England or America, one becomes a masculine hero, no matter how effete or unmanly one’s body or personality. This gendered, relational position doesn’t just permit the Doctor to be a dandy hero; it gives us Gil Grissom, Spencer Reid and a host of other otherwise-insufficiently masculine men who hunt the monsters who threaten the new home front.

At this point, people who are not me might focus on the ways in which this argument shows Doctor Who to have always been a patriarchal show that subordinates women to men (perhaps aside from the 1979 and 1980 seasons). This can be said of most shows on TV and, frankly, most good ones, not because the television industry is full of misogynists but because we continue to live in a patriarchal society that constantly re-inscribes its gender dynamics in its literary and dramatic production.

What interests me are the ways in which the show operates within these gender dynamics to adumbrate new possibilities for narrating the deeply gendered repression that remains near the heart of our society. I have yet to see any portrayal of repressed memories of abuse more compelling than the Silence, one that engages not just individual trauma but the multigenerational, structural character of abuse and trauma.

When Jack Cram, the radical native sovereigntist lawyer, went mad, he spoke—inaccurately—of our society being run by a conspiracy of pedophiles in our courts, churches and legislatures. There is, of course, no such conspiracy. It is just that our society runs as if there were. When I wrote of the lethal silence that powered southern lynchings, the silence that enables predators like Bill Cosby and Jian Ghomeshi to seek out and assault new victims with impunity, the picture in my mind was of the Silence, as depicted by Steven Moffat: that powerful force, as old as the human race itself, that stops us from telling others what has happened to us, that chokes cries for help in our throats, that seeps into our houses and places of work, stifling our words.

While there is much to criticize about the new Doctor Who, in particular the direction of the show since Davies’ departure, I continue to draw inspiration from it about how to be an ethical man enmeshed in a patriarchal society. Just as the old series taught me how one could be clumsy, eccentric, hard-to-understand, strangely-dressed and yet mysteriously heroic, I choose to draw inspiration from the possibilities the show lays before us. All that is needed to be a man, Doctor Who continues to tell us, is to fight to protect the home front. As the Doctor says of the Silence, “they’ve been running your lives for a very long time now, so keep this straight in your head. We are not fighting an alien invasion, we’re leading a revolution. And today, the battle begins.”

Gender in Doctor Who: Man, Monster and Minor – Part I: The Home Front, Manliness and the Dandy Hero

The first time I quit politics, I gave a concession speech crediting my seven-year career as leader of the BC Green Party to the British science fiction series Dr. Who for “teaching me that a tall, eccentric, clumsy, curly-haired man can, indeed, save the universe.” My valedictory remarks ended with a quotation from the Doctor’s first farewell, in 1964, to his young, female companion/assistant, a traditional feature of the show by the time of its cancellation in 1989.

Although the first of these companions had been the Doctor’s granddaughter, the young, shapely, wide-eyed female companion became the most predictable feature of the cast. Indeed, out of the original series’ 667 episodes, 647 feature such a character. While Dr. Who companions were generally portrayed as assisting the Doctor, it was only at the apex of second-wave feminism (1977-82) that their role did not consist substantially of screaming, being injured, getting captured, being rescued and asking questions in a way that enabled the main character to demonstrate his superior knowledge.

Although there is an argument to be made that Dr. Who companions did more good than harm when it comes to widening the range of possible female roles on television, I don’t think much of it. After all, The Avengers premiered in 1961 and, by the year before Dr. Who began, already featured witty, assertive, female action stars. But, as with my views on racism in J. R. R. Tolkien, the idea that a work of literature beloved by self-identified geeks can be flawed, even chauvinistic, and yet still do and say worthy and important things is unlikely to find unanimous acceptance.

And that is a shame because Dr. Who does have a lot of important things to tell us about gender and sexuality in the twentieth and twenty-first centuries. It was, after all, the crew at Queer as Folk who successfully revived the series in 2005, because the original series could be easily read as having a queer subtext, a subtext that almost spilled over into simple text in the years 1970-74.

When Jon Pertwee played the title character, he portrayed him as a dandyish man with a slight lisp and an over-coiffed silver perm, dressed flamboyantly in a cape, powder-blue frilly shirt and velvet smoking jacket, a man with refined tastes who imported gorgonzola cheese from Italy when exiled to London. And just in case anyone missed what the Doctor was in those years, William Hartnell reprised his original role in the tenth anniversary episode, pronouncing of his two successors, “so, you’re my replacements: a dandy and a clown!”

The “galactic hobo” portrayal of the Doctor, first by Patrick Troughton (1966-69) and then by Tom Baker (1974-81), was more typical of the original series, and the one for which it is better remembered. However, it is worth noting that an explicitly dandyish hero was played by Peter Davison (1981-84), whose era featured two young male companions for the first time since the 1960s, something that would simply have been too queer for Pertwee’s already sexually problematic portrayal.

Still, one would be hard-pressed to find any portrayal of the Doctor in the original series that could be considered conventionally manly. Neither Troughton’s and Baker’s hobos, Pertwee’s and Davison’s dandies nor Hartnell’s and McCoy’s curmudgeonly know-it-alls were heroic in a conventionally masculine sense. They eschewed physical violence, favouring more ambiguously-gendered forms of aggression: deception, self-control, trickery, superior knowledge and the manipulation of their enemies, almost none of which showed a trace of athleticism.

Sure, there were some ways the Doctor’s body was capable of feats impossible for ordinary humans, but those feats were devoid of athleticism; he could hold his breath for minutes at a time, practice obscure martial arts without breaking a sweat and cheat death by “regenerating” into another body. But when it came to feats like mountain-climbing, the show went so far as to lampoon its title character’s lack of athleticism, having him retrieve from his pocket the self-help book Everest in Easy Stages and, upon discovering it to be written mostly in Tibetan, Teach Yourself Tibetan.

This kind of masculine heroism, centred in superior knowledge, self-control and cleverness, had once been the ideal form of “manliness” in the English-speaking world, as compellingly argued by Gail Bederman in Manliness and Civilization. Back when colonizing and “civilizing” the “darker races” was the job, the manly hero of Rudyard Kipling’s world was not unlike the Doctor. The Englishman or American who carried the “white man’s burden” had to, by necessity, distinguish his manliness from the “primitive masculinity” he allegedly opposed. Indeed, the societies whose theories of masculinity were most congruent with this exaltation of restraint were those most lightly colonized by the English and Americans: the Kingdom of Hawaii and India’s Hindu principalities outside the formal British Raj.

But, as the nineteenth century drew to a close, concern over declining birth rates or “race suicide” and the rise of Tarzan comics made effete young men like Teddy Roosevelt re-make their refined manliness into a less restrained, more violent masculinity.

Yet, in the English tradition, the ideal of the dandy hero did not die such a quick death, in part because of conscription. There was not just a cultural need to see the continuation of the dandy hero ideal into the twentieth century, but a politicized national interest. Characters like Biggles, the under-aged gentleman-hero of the RAF, occupied a contradictory and frequently-lampooned role in British pop culture, increasingly incongruent with the appetite-driven, violent masculinity of the likes of Ian Fleming’s James Bond, the heroes who epitomized twentieth-century masculinity.

What allowed Biggles to survive through the 1950s, 60s and 70s, as the Second World War, and attendant conscription, receded from the public mind was the fact that, in the Biggles comics, the war never really ended. Because he was defending the home front, the women and children of England, his masculinity, while quirky, could remain undisputed. Wartime masculinity was more capacious and diverse because it could be clearly unified around the defense of England’s territory and non-combatants against the forces of autocracy or fascism. There was a teleology to masculinity: it was the nature that existed in men that motivated and enabled them to defend the non-combatants at home. If they could not be masculine by virtue of their nature, dandies could be so by what they accomplished: the defense of the home front.

And so we come full circle to Doctor Who, the show that captured a nation’s imagination in 1964 by creating a space-age monster that perfectly symbolized totalitarianism: the Daleks, from whom the Doctor is forced to defend his granddaughter Susan and her teacher, Barbara Wright. Fundamentally, Doctor Who kept the ideal of the dandy hero alive in the same way Biggles did, by demonstrating his masculinity teleologically (by succeeding in his defense of women and children) and relationally (because this could be enacted through his observed protector relationship to a woman and/or child). Because a dandy was no longer intrinsically manly, his masculinity (and hence heroism – this is a patriarchy, after all) had to be telegraphed in this way. The dandy-hero defended the home front (i.e. women, girls and boys) from the totalitarian forces of continental Eurasia (Daleks, Cybermen, Autons, Sontarans, Rutans, Axons, etc.). The asexual relationship between the Doctor and the endless parade of pornographic archetype companions (leather-clad savage, stewardess, micro-skirted professor, ditzy secretary, etc.) served to underline their minority relative to the centuries-old Time Lord and, hence, his role as their protector.

As time wore on and we became a people whose contemporary political and cultural struggles came to revolve more around Stonewall and less around the Holocaust, the queer reading of Doctor Who became a more obvious one. And, for more and more viewers, it became a proto-Will and Grace. The companions came to stand in less for the women and children of wartime England and more as the beard or “fag hag” of the gay, male hero, the asexuality of the relationship conditioned less by the woman’s putative minority and more by the man’s queerness.

But this transition ultimately deprived the show of its underlying dramatic tension. As the memory of fascism receded, audience members were no longer on the edge of their seats, nor, as many fans remember of their younger selves, peering out from behind the sofa, to see if fascism incarnate would succeed in its evil design and land on England’s metaphorical shores. The show, by the 1980s, had become a parody of itself (a parody that I personally loved!), with a cult following of gay men, sexually inept geeks and hard-core sci-fi aficionados.

Without the women of the home front to protect, there could be no compelling dandy hero and hence, no mainstream audience.

It is with this understanding that I will examine how the tension and drama that old Doctor Who gradually lost have somehow been restored in the new series: how a dandy hero, who is queerer than ever, is once again a compelling television character. In the second half, I will suggest that this is, once again, centred in relational gender dynamics and our perceptions of the most sinister threats to the virtue and safety of women and children in Anglo society.