
The oldest weapon

Throughout history, humans have invented an infinite array of tools to perfect the art of waging war. Over time, this has created a variety of weapons to choose from: guns, swords and arrows are the most visible ones. But there were so many others that didn’t quite survive into the modern age – like spears, slings, atlatl and The Terminator.

But what was the first weapon humans used? Trying to answer this question is more complicated than I first thought. There are many red herrings, questions of semantics and a whole heap of (I think) illogical candidates. Plus, it doesn’t help that science has come up with several different candidates at various points in history. This post summarises the history of the history of our weapons, and tries to provide a definitive answer to the question “what did the earliest humans use as weapons?”

Semantics

The first issue with uncovering the forefather of all modern weapons is semantics – or the meaning of certain operative terms. What do we mean by ‘weapon’? Do we mean anything with which you can hurt another person? What about animals and other living beings? What does it mean to cause hurt? Do we mean any kind of hurt, or do we mean physical pain? Moreover, what does it mean to be able to hurt? Does it have to be intentional, or is unintentional use alright? Depending on how you answered the above questions, a stone, a poisonous leaf and a racial slur can all be classified as weapons. But that’s silly – the list of weapons we seek to control has never included rocks or any other suitably dense object. Airplanes still allow you to carry onions, even though they’re highly toxic to cats and dogs. And despite all the progress we’ve made in eradicating racial slurs and epithets, popular discourse has never seen them as ‘weaponised language’, although some sociologists are starting to push for changes in this direction.

So clearly, our definition of weapons is much narrower: a weapon is a tangible device that humans use to inflict physical pain and/or death upon prey, game and other humans.

But here, we run into another complicated term: what do we mean by ‘human’? Do we mean modern humans who have discovered fire, the wheel and agriculture? Or do we mean historical humans who may have had knowledge of these concepts but did not have any means to control them? If we accept that we mean Homo sapiens, how do we then view the weapons that may have predated our species? What about weapons that Homo sapiens may have picked up from other Homo species? In answering this question alone, we run into the full weight of human taxonomy, and all the interesting branches of Homo that we humans derive from. In a previous post, I’d written about emerging research on human migration, which paints a much more colourful and contentious picture of our ancestry than we could have imagined even a decade ago. For example, are Neanderthals a separate species of Homo, or are they merely a subspecies of Homo sapiens that died out before historical times? At one point in prehistory, at least nine species of humans walked the Earth – and now there’s only one. Where did the rest go? Did we kill them all? Or did we absorb some of their genetic makeup into the human pool? These and many other questions remain unanswered to this day – for what it’s worth, I think some of these questions will never truly be answered by science alone. But our inquiry must go on, and we must draw a line somewhere.

The many human species that we know of. Source: ScienceAlert

My definition of ‘human’ is essentially a cop-out: I mean any Homo species that was present on Earth by the time Homo sapiens began migrating out of Africa, and came into contact with Homo sapiens, either through warfare or through interbreeding. So, by this definition, Homo neanderthalensis and Homo floresiensis are both ‘human’, even if this stretches the word to its limit.

Finally, there is the question of ancestry. When we say ‘ancestor of modern human weapons’, what do we mean by ‘ancestor’? Do we mean weapons that have survived to the present day? Or do we mean any weapon that may have evolved into a weapon we can recognise today? What about dead-end weapons that humans may have used at some point, but are no longer seen to be of any value?

For me, the most useful definition of ‘ancestor’ is the one that helps us understand weapon evolution and migration. Dead ends and made-up weapons are of no use here, so I will only consider weapons that are the direct evolutionary forefathers of modern weapons, i.e. bows, swords, spears, catapults, slingshots etc.

Definition

To summarise, this is the definition of “modern human weapon” I will use:

  1. It inflicts physical pain and/or death upon humans and other animals
  2. It was created with the intention of causing pain or death
  3. It was created by any of the 4 to 9 human species present on Earth by the time Homo sapiens began migrating out of Africa
  4. It has survived to the present day either in its original form or through some direct evolutionary descendants

Candidates

With these in mind, there are several candidates for the title of ‘ancestor of modern human weapons’. Some are not as obvious as the others.

Boomerangs

Yes, the boomerang. That same icon of Australian aboriginal culture. While we think of aboriginal people in Australia when we think of boomerangs, we find ancient boomerangs all over the place: from Africa to Europe. Boomerangs with gold tips were even found in Tutankhamun’s tomb, showing that the story of boomerangs may be an ironic tale of the Eurocentric world’s “self-discovery”. Boomerangs are surprisingly old: the oldest boomerangs we know of were found in a cave in southern Poland. Dated to about 23,000 years ago, these boomerangs were made of mammoth tusk and were likely used to hunt small to medium-sized game like deer and boar. Interestingly, the oldest evidence of boomerangs from Australia is from nearly the same time period: about 20,000 years ago.

Paleolithic boomerang from southern Poland. Source: Reddit

Although we tend to think of boomerangs as those wooden things that return to the thrower, returning boomerangs are not the only kind humans have used. In Australia, both types of boomerangs are used to hunt birds and game. A returning boomerang can be thrown above a flock of ducks to simulate a hovering hawk. The frightened birds then fly into nets set up in their flight path or, if they come within range, the hunters can use non-returning boomerangs to bring the birds down.

Other than their use as weapons, boomerangs are also incredibly versatile tools: you can dig holes with them, flint-tipped ones can be used to start fires, weighted boomerangs can be used as hammers and to stun fish underwater, and some Aboriginal communities use them to make music.

The varied uses and the timeline of artefacts from Australia and Poland suggest one of two things: either early humans were already using boomerangs when they moved out of Africa, or boomerangs were invented independently on mainland Eurasia and in Australia. If boomerangs were invented around Europe, what role did Neanderthal communities have in their creation? While that possibility would make for some juicy military history, the timelines just don’t support it: Neanderthals went extinct around 40,000 years ago, and we don’t see any evidence of boomerangs for at least 15,000 years after that. So yes, you can assume that Neanderthals gave humans more than just a chunk of their DNA, but there isn’t any evidence to support it.

Atlatl

You may wonder why this list of prospective candidates does not include the bow and arrow. The bow and arrow is undoubtedly one of humanity’s most important weapons of war – entire empires have risen on the backs of people’s skill with launching sharpened projectiles using a taut string. The Mongols proved for all posterity that agility and mastery of archery are enough to turn a forgotten people into a truly fearsome force. The subsequent invention of crossbows, longbows and later siege instruments only serves to prove the point that archery has been one of the strongest shapers of human civilization.

Some have suggested that bows and arrows predate modern humans, but I can’t find any evidence that this is a popular view among paleoarchaeologists, so it remains an interesting – if highly unconventional – theory. However, we do know what bows and arrows came from: in Africa, we have remains from ~40,000 years ago of a weapon that works on a related principle of storing and releasing energy to propel projectiles. Before there were bows and arrows, there was the “Stone Age Kalashnikov”: the atlatl. The construction of an atlatl is surprisingly simple: all you need is a short handle with a spur at one end, and a long, flexible dart that is pointed at the front and seated against the spur at the back. The atlatl extends the throwing arm like a lever, while the dart flexes during the throw and springs off the spur, flying much faster and farther than a hand-thrown spear.

The atlatl is a curious thing. While the principle makes sense to anybody who’s played with pen refills in school, its construction is almost alien to us. It’s also a humble reminder that ancient people saw the world around them in ways we’d scarcely recognise now. If nothing else, the atlatl pushes our timeline for early weapons to at least 40,000 years ago.
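A rough way to see the lever advantage is with a back-of-envelope sketch (the lengths below are my own illustrative assumptions, not measurements of real atlatls): if the arm and atlatl rotate together at angular speed ω, the dart leaves at roughly the tip speed, so lengthening the effective radius multiplies the launch speed, and ballistic range grows with the square of that speed:

```latex
% Back-of-envelope lever model; r_1, r_2 are illustrative assumptions
v = \omega r, \qquad
\frac{v_2}{v_1} = \frac{r_2}{r_1} \approx \frac{1.3\ \text{m}}{0.7\ \text{m}} \approx 1.9, \qquad
\frac{R_2}{R_1} = \left(\frac{v_2}{v_1}\right)^2 \approx 3.4
```

Real throws are messier – the flex of the dart and the snap of the wrist matter a great deal – but even this crude model hints at why a simple stick of wood or antler could transform a throwing spear.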

Daggers, swords and such

If you thought daggers were the most obvious candidates for early weaponry, you’d be very very wrong. Daggers are short, close-range weapons with at least one sharpened edge. Unlike arrows and spears that really only need a pointed tip, daggers need a sharpened edge, which requires considerably more effort and skill. Moreover, early humans used rocks, wood and things like volcanic glass, which are all brittle and hard to shape into the form of a dagger that needs a sharp edge and a blunt handle that is comfortable to grip. So, daggers really only came into the picture in the Bronze Age around 5000 years ago. Significant as they may be to warfare historically, daggers and swords are very recent inventions in most parts of the world.

Spears

Then we have the boring “pointed stick”: the spear. Spears are good melee weapons, used to maintain distance between the human and the prey (possibly another human) while causing damage. They offer many advantages over simple hand-to-hand combat: you can put some distance between yourself and the other party, thus minimizing injury; you can sharpen one end and use it to bleed the other person, thus reducing the effort needed to bring them down. Also, you can accessorize your pointy stick by tying a sharpened piece of rock to the end.

Spears have a solid paleoarchaeological footprint: there is evidence of humans using spears from as long ago as 400,000 years. No other weapon comes even close to this. Nearly every Stone Age site on every continent shows evidence of spear usage, sometimes tipped with sharpened stone fragments. Paleolithic remains from Europe and Africa are littered with pointed sticks, leading us to believe that they could very well be the oldest weapons we know of. Most compellingly, modern chimpanzees use pointed sticks to hunt bushbabies.

Chimps hunting bushbabies. Source: National Geographic

Is this the answer we have been looking for? Are spears the forefathers of swords, pikes and all other weapons? In my humble opinion, probably not. Spears need you to be very close to the other party before they can be of any use. Unless the prey is a defenseless animal like a fish, rabbit or small deer, it can very easily fight back or run away. Moreover, thrusting spears are of little use against any large mammal – and archaeology has shown that early humans frequently hunted large mammals like mammoths, bison and perhaps even saber-toothed cats. Even hunting in a group, a bunch of 5-foot-tall bipeds with large brains and reduced musculature would struggle to bring down a 12-foot mammoth with thick skin and a prehensile trunk. Clearly, a spear would be of very limited use to early humans.

What they’d need is a throwable spear – something that can be used for melee if necessary, but intended to be thrown. Something about 2-3 feet long, made of easily-available material like wood and tipped with only a perfunctory rock or glass. Something versatile but also easy to make. Something like a javelin.

Javelins

Javelins are a forgotten class of weapons. Javelins were replaced by bows and arrows when archery was “discovered” by Europeans who were repeatedly trounced on the battlefield by armies from Central Asia. Time and time again, the disciplined, regimented armies of Rome would be defeated by “barbarians” with superior archers. This would become a pattern with established armies across the world: the incumbent armies, lulled into a life of stability and safety, invested in ostentatious melee weapons like swords, fancy war horses and warhammers. Invading generals chose instead to shed all weight and invest in nimble ranged weapons that allowed them to attack with force and retreat with speed. The Hindu rulers of northern India were conquered by marauding armies of Muslim generals who relied on improvised siege weapons and horse archers. A similar fate befell the wealthy rulers of West-Central Asia when Genghis Khan adopted similar tactics.

Javelins served the same purpose in prehistoric times. Whereas prey had various means to defend themselves at short range (tusks, trunk, claws, hide, antlers etc.), humans hunted that prey from a distance. Their weapons would have been intended to cause damage over multiple hits. Fossil remains show that early humans on the African savannah hunted this way, using javelins to help them chase an animal to death. This hunting method has been called “persistence hunting”, and evolutionary biologists have used it to explain many features about the human body that seem to be designed to help us run more efficiently and for longer: the Achilles tendon, arched feet, short toes, wide shoulders, etc. I’ll be the first to admit that persistence hunting is a hotly-debated issue in academic circles, and there’s strong evidence on both sides of the debate.

But there are many reasons for why we should suppose that the earliest weapons were indeed javelins. First, the Hadza people of Tanzania. These hunter-gatherers are known to engage in persistence hunting for at least part of the year. Their methods are very similar to what early humans would have employed, and the prey they hunt is mostly the same as well – large animals like the kudu, wildebeest and zebra. The weapons they use are not spears and swords. They use javelins and bows and arrows. Here’s a summary of their technique as captured in Attenborough’s “Life of Mammals”.

Second, people who think that humans must have used spears just because chimpanzees use spears tend to minimize the differences in the type of prey hunted. Early humans hunted in large groups to bring down large mammals. Chimpanzees hunt in small groups, going after small to medium-sized mammals, generally smaller than the chimps themselves. Their prey of choice are colobus monkeys and bushbabies, both of which are largely defenseless against the more aggressive, powerful chimpanzees. Also, humans hunted out on the savannah and in forest clearings, whereas chimps are mostly arboreal hunters that go after other tree-dwelling animals. The weapons you’d use to hunt a fleeing kudu or gazelle are very different from what you’d use against a monkey.

Finally, the earliest spears archaeologists have uncovered are almost certainly javelins. Conard et al. (2020) almost state as much, by showing that most Paleolithic artefacts misclassified as spears would be better labelled as “throwing sticks”. In addition, stone-tipped javelins found in Ethiopia have been dated to around 280,000 years ago, suggesting that these weapons probably predate Homo sapiens, which are known from the fossil record only around 200,000 years ago. In Germany, there is evidence of wooden throwing spears from as far back as 350,000 years ago, well before Homo sapiens evolved.

So there we have it. The mystery has been solved: the earliest human weapons were probably javelins.


Bonus: The Flail

Do you know what a flail is? You know what a flail is. It’s a stick with a spiked ball at the end, attached to a chain or rope. It’s a very common trope in medieval fantasy literature, and a steady fixture in any Hollywood scene showing brutality and torture in early Europe.

The cool thing is, it probably didn’t even exist. There is a whole fascinating article on The Public Medievalist that goes into more detail on why it’s so prevalent in our popular imagination, and debunks the idea that these impractical, unwieldy things ever existed.


The Indian Conservative: Hindu apologism goes mainstream

Jaithirth Rao is an Indian businessman who founded Mphasis, a cookie-cutter IT outsourcing company based in Bangalore, India. In time, his stature as one of India’s aspirational new tech elites gave him space to air his views on politics, history, culture and a range of other social subjects. Rao calls himself a true-blue conservative in the Burkean sense – small government, free markets, traditional family values, continuance over radical change … the whole kit and caboodle. “The Indian Conservative” is a compilation of his various lectures, talks and thoughts on an assortment of issues that Indian conservatives have concerned themselves with. It seeks to put forth an argument that conservatism in India has a long and colourful history that deserves further study. In doing so, Rao tries to elevate the status of conservative figures like Sardar Patel and Dadabhai Naoroji who’ve been given short shrift due to independent India’s wholesale adoption of Nehruvian liberalism.

Before reading the book, I was genuinely curious about the lack of a cohesive picture of the conservative movement in India. Other than recent speculation about how different India would have been if Sardar Patel had been made PM instead of Nehru (a long shot considering the zeitgeist of the time), there’s very little we know of the other side of Nehru’s liberal India. The previous hints I’d seen were through Guha’s books, and even he laments the lack of scholarship on Indian conservatism. So when I came across this book, I picked it up without even checking reviews online. In a way, this turned out to be a good thing because I could start with no prejudgments about the author, content or style, and could appreciate the book for exactly what it was supposed to be – an overview of conservative thought in India, and a case for why it should be studied more intensively.

Boy was I wrong! In this post, I want to do two things: firstly, review the book for what it is, and then talk about all the things that it isn’t – so you can see for yourself the various ways in which Jaithirth Rao missed the mark in entirely avoidable ways. My overall assessment of Rao’s book is mixed – on the one hand, it brings conservative thought to the mainstream and gets us talking about it on an intellectual level and without the baggage of Hindu extremism. Equally, the book fails to deliver on every single claim it makes at the outset: it’s not historically accurate or complete, it never explains what makes Indian conservatism different from its Western cousin, isn’t held up by solid arguments so much as statements of intent, and finally, is too heavily reliant on the author’s 10-mile-high understanding of Indian society.

What follows is an expansion on these two sides of the coin. This post is going to be longer than average (which is already much longer than most blogs) so if you’re liable to get bored, I’d suggest skipping the next section and jumping straight to the second part where I make my case for why Jaithirth Rao’s latest book is only a 4/10, and can be ignored by most people.

What It Is

“The Indian Conservative” considers various spheres of conservative thought, namely political, cultural and social. The book also includes a small chapter about Rao’s own views on aesthetics and education. The chapters on cultural, social and aesthetic spheres cover what it means to be Indian, and how the conservatives of history, legend and imagination have all combined to create the rich, vibrant, multiethnic and multicultural polity we know as India. These chapters are all fairly boring, with very little to stand on other than a smattering of religious texts and some well-intentioned proclamations by leaders.

The really interesting bits are actually all in the first chapter: the political sphere. Here, the author begins with a broad definition of what Indian conservatism is and what its guiding principles are.

Conservatism is a school of philosophy which is not characterized by rigid contours or definitions. It believes that human beings as individuals and as communities have evolved over time, developing laws, institutions, cultures, norms and associations. This evolutionary process undoubtedly contributes to practical utility.

The conservative position is that improvements have to be gradual, and preferably peaceful. Sudden, violent attempts at so-called improvements are viewed with suspicion, because they are likely to backfire, destroy much of the good in the past and the present, and deliver a situation substantially worse than the earlier one.

For those with an interest in political theory, it’s not hard to notice a direct and strong link to Western conservatism – more specifically as a school of thought containing Alexis de Tocqueville, Edmund Burke and Adam Smith. However, Rao reminds us that these ideas are not foreign imports to India. Indeed, if one were to consider the Mahabharatha and Tirukkural to be foundational texts of the Indian civilization, we would see that the Indic civilization itself is a deeply conservative one.

These two texts – one a religious epic and the other a collection of words of wisdom – deal with the three pursuits of humankind: artha (material, political and economic wellbeing), kama (beauty, passion and sensuous pleasures) and dharma (virtue and morality). A fourth pursuit – moksha – is attained when the other three are achieved.

Then, the author makes the link between ancient Indic thought and modern history.

Let us switch gears and consider names associated with modern Indian conservatism, focusing for the time being on the pre-Independence era. The first is Rammohun Roy, who was a political conservative and a supporter of British rule, while being a social and religious reformer – a reformer and not a radical. The second is Bankim Chandra Chatterjee, who can be characterized as almost the founder of Hindu conservatism. […] Bankim and Lajpat Rai along with several others realized that a shared Hindu cultural identity could be the basis of overcoming vertical and horizontal boundaries among Hindus, like caste.

Hinduism, in other words, formed pre-Independence India’s “imagined community” a la Benedict Anderson. This is where Jerry Rao (that’s what the author goes by apparently) brings modern day Hindu nationalism back into the conservative fold. In his analysis, the roots of Hindu nationalism and that of Indian conservatism are one and the same. There may be some merit to this line of thought, but I think there are some gaps in Rao’s reasoning that someone else will have to fill. We’ll pick up this thread later in the post.

To those who might argue that conservatism everywhere is merely reactionary hand-wringing, Rao has a ready response:

The view that conservatives love the old and oppose all change is both simplistic and wrong. Conservatives are most certainly not reactionaries. We only love those parts of the old and inherited that are constructive and creative and not dysfunctional. We are committed to change, which as the Greek philosopher Heraclitus observed, and as the Yajur Veda articulates, is inevitable. We, however, do not believe in jettisoning features of the past that are worth preserving or that we feel are worth cherishing.

While this is a sensible position to take and I personally find it hard to refute, it’s nigh impossible to shake the feeling that much of Rao’s analysis is based on European and American conservatism, with all the Indian bits retrofitted to prove his point. We’ll return to this objection in the next section.

Returning to the question of political conservatism, the author details how the Indian National Congress until the late 1920s saw British rule as a benevolent protector state. Its demands were only for ‘home rule’, on the lines of what the Irish were fighting for. We know that Dadabhai Naoroji’s strongest allies in the British parliament at the time were Irishmen, and even before Naoroji’s time, Raja Rammohun Roy was received in England by the liberal Unitarians. So, almost unwillingly, the author concludes, Indian conservatives ended up in the wrong camp due to the obstinacy of the British Conservative party. He doesn’t seem to consider the possibility that this is just how politics is played and there are no unconditional alliances in the pursuit of power.

[…] even though Rammohun Roy went to England as a very conservative emissary of the impoverished Mughal emperor, he was feted not by the High Church party, but by nonconformists like the Unitarians. Willy-nilly, even conservative Indians ended up being seen as liberal fellow travellers. In the decades that followed, British Tories preferred Indian maharajas to scholars like Naoroji. It was only the Liberal Party which would nominate Naoroji for a parliamentary seat. Gokhale faced the same situation. His only interested audience in England was to be found among liberals.


In the struggle for independence, Rao makes a case for why conservatives largely supported India’s British overlords, and why many chose to fight their own countrymen alongside the colonial powers. His argument is a tried-and-tested one about maintaining continuity, making incremental progress, sticking to available remedies etc. In this regard, he sees Ambedkar, Gokhale and Savarkar as incrementalist heroes who ensured that when India did gain freedom, it would retain much of the old legal and civic structure. The Indian Constitution – despite the devious machinations of socialists and Soviet sympathizers – is, in his telling, thankfully only a minor facelift of the Government of India Act of 1935.

Here, Rao anticipates an objection from the other side: given that in one stroke the Indian Constitution prohibits discrimination on the basis of caste, gender or religion, all so deeply embedded in our history and our country, would it not be more appropriate to call it a revolutionary document, far from being a conservative one? His answer is a firm “maybe”. He argues that under the British, all Indians were treated alike – as chattel to be thrown out of trains when caught travelling in the whites-only carriage. So, Indians had already internalised some of this non-discrimination anyway, and the constitution only ensured that the progress made was not lost at some later time. A supremely weak argument; but a coherent argument nonetheless.

From independence, Rao draws a straight line to the modern-day Modi government, through Partition, the Emergency, 1984 Sikh riots, 1992 Babri riots and the single-term Vajpayee government from 1999-2004. Needless to say, he papers over inconvenient pieces of history. For example, this is what he had to say about the way Advani and the BJP riled up millions of Indians to march to Ayodhya and destroy a centuries-old mosque:

BJP put together a well-crafted national programme in support of the proposed Rama temple. The party organized a motorcade, referred to as a rath yatra, from different parts of the country to Ayodhya. […] The BJP also used the Rama temple movement very intelligently on the caste front. The volunteers in the marches and motorcades came from all castes. Dalit volunteers were specially honoured as layers of foundation stones. The BJP had successfully broken away from the accusations of its critics that it was an upper-caste Brahmin-Bania party.

The denouement of the temple movement came on account of mob violence, which the Uttar Pradesh state government had solemnly assured the Supreme Court would not happen. The inability of the Hindu nationalist forces to control extreme elements remains problematic for conservatives.

And in that one line, he sweeps aside all the many ways that conservative forces – much more than any leftist threat – threaten to pull this nation apart by force. To Jerry Rao, the problem with the Babri demolition wasn’t its complete illegality, or the fact that the Hindu side has no historical claim to that piece of land, or the months of communal provocation by Advani, Uma Bharti and others. No, the problem was that a handful of extreme elements resorted to mob violence, which was not controlled by the Uttar Pradesh government. So really, we’re told, the UP government was at fault.

But regardless, I’m quite aware that this kind of reasoning is not entirely uncommon in Indian political circles, and even in some intellectual quarters. We can excuse Jerry Rao this piece of unoriginal falsehood as just another symptom of the moral bankruptcy that infects modern-day conservatives everywhere. While their forebears were willing to go against king and society to defend individual freedoms and bring about real change, the modern conservative movement increasingly busies itself with engaging in revisionist storytelling and name-calling instead of getting its house in order and taking a stance against extreme elements.

In responding to any and all critique of this kind of reactionary rationality, Rao likes to fall back on the concept of yuga-dharma to illustrate how the nature of Indian conservatism has evolved over time.

[…] Apastamba Sutra of the Yajur Veda, which the historian P.V. Kane dates to the fourth century BCE, talks of Yuga Dharma: the virtue or the ethic that is appropriate to the age. It is Parel’s case that Mahatma Gandhi in his own inimitable way figured out that in the present yuga, it makes sense to walk away from the excessive emphasis on moksha. […] The dharma of Gandhi’s times demanded an active involvement with this world, with his country, with his city.

Modern day conservatives like Jerry Rao fail to consider that in this yuga, yuga-dharma demands that the most conservative thing to do is to stand up against Hindu extremists and defend the Indian way of life from a complete dismemberment from the inside.

In the subsequent sections on cultural, social and aesthetic spheres, Rao has precious little to offer, even when you try very hard to see his point. In the chapter on social issues, Rao offers a tepid objection to the caste system, concluding that it has some limited utility in modern India but that society needs to be reformed to make sure that things like untouchability are not brought back into fashion. On the role of women, Rao acknowledges wholeheartedly that women have been mistreated and marginalized for millennia – an unusually candid admission from a writer who seems to skirt all other issues, no matter how obvious they may be to Indians or outsiders:

The same issue received considerable attention from our detractors like Kipling who argued that Indians did not deserve freedom principally because we were given to oppressing our women and our poor and in fact it was the British who protected these unhappy residents of our fair land

Lost in the chapter on aesthetics is another easily-missed admission of guilt: the mistreatment of Muslims. Rao accepts that Muslims are treated as purely political entities to be herded and cajoled into voting for whichever party represents their interest. He sees much to be achieved to bring them back to the mainstream and open up the floor to debate on social issues affecting Muslims.

Issues connected with Indian Muslims that do not deal with religion are largely seen through a political prism and not a social one. I believe that this is a mistake. Muslims are more than just voters. They have given to the country important legacies in architecture, painting, music, dress, food, landscape gardening, literature and much more.

Mysteriously, however, his thoughts on purdah, the role of women in Islamic society and hot-button issues like triple talaq are never clarified. More importantly, his expression of solidarity with Muslim conservatives is entirely undercut by the fact that this is the only time in the book when the author considers the plight of Muslims. You need to be three-quarters of the way through the book to find an acknowledgment of Muslim contribution to Jerry Rao’s “Indic culture”. This and other substantive issues with the book are the subject of the next section.


What It Is Not

At the outset, Jerry Rao’s book is not an honest retelling of Indian history. It leans too heavily on upper-caste tropes of “centuries of humiliation” under successive Muslim rulers, falls prey to the same trite upper-class arguments about the “benevolent British”, and consistently diminishes the serious differences that have always existed between various schools of thought. Let’s consider the matter of Muslims first.

The Islamic Question

In his entire chapter on Indian conservatives in the political sphere, Rao does not find space to drop a single Muslim name. I can name a few stellar individuals right off the bat: Maulana Azad, Shafaat Ahmed, Sir Muhammed Iqbal, and the indomitable Sir Allah Bakhsh.

The last name may be unfamiliar to some, and in all fairness he deserves a whole post to himself, but here’s the run-down: Allah Bakhsh was the Premier of Sind in British India – up to 1942, a career conservative within the British Raj. An inveterate secularist, he championed a popular movement against the divisive Muslim League. His popularity was so immense that the Muslim League made almost no advances into the province of Sind until his death in 1943. In 1942, Churchill delivered his infamous speech to the British parliament in which he referred to Gandhi’s “Quit India” movement with utter disdain and made some unsavoury remarks about the possibility of granting independence to Indians. Allah Bakhsh made it clear that he’d had it – he renounced his post and fully intended to dedicate the rest of his life to gathering support for a free, secular and united India. The ‘united’ part of his personal manifesto bothered the Muslim League, and every clue points to their involvement in his eventual assassination in 1943.

Was this not relevant to Rao’s case for conservative thought in the country?

Rao might counter my objection by stating that Allah Bakhsh was indeed a conservative for the most part, but that by renouncing his premiership, he also renounced all claims to being part of Indian conservatism. Fair enough. But if one is to buy this argument, why does Naoroji figure so conspicuously in Jerry Rao’s narrative? Naoroji too began as a conservative who thought he could make a difference from within the British parliament. Although he made some progress towards his goal of Indian home rule, he soon realised that the powers in Britain wanted control over India at any cost, and saw the predatory Crown as a leech sucking the Indian body dry. By the time Naoroji died in 1917, he was thoroughly disillusioned with the British ability to govern India and wanted them gone.

Naoroji was as radical as they came in 1917. And yet, Rao has no trouble including him in the political narrative. Wilful omission? Maybe. Double standards? Most definitely.

This exclusion of Muslim individuals isn’t restricted to the Independence movement – Rao ignores all Muslim contributions to Indian political thought, despite the fact that for over 600 years this nation was ruled by Muslim rulers. I want to go easy on the author and assume that he ignored them because they brought new ways of life to this land of Hindus and thereby changed Indian culture. At the risk of being accused of whataboutery, I want to put the following to Rao: if this is the case, why not at least mention Akbar, a man who fought his own zealous family to ensure equal treatment of all citizens regardless of their religious, ethnic or cultural background? For a man so fond of name-dropping, the silence on political changes under Mughal rule is deafening. And on matters of trade and economics, why not mention Sher Shah Suri, the man who made trade so free and fair that in his time a caravan could travel unmolested from Peshawar in modern Pakistan to Chittagong in Bangladesh – a distance of over 2,000 km? Such free movement is still only a distant memory in modern India, where highway robberies are painfully common. As a lover of free markets and open trade, shouldn’t Rao appreciate this unprecedented effort a bit more?

The 16th century Grand Trunk Road, a truly impressive trade route connecting Bengal to the Hindukush

In the end, it is obvious to all but the most intransigent that Jerry Rao’s recounting of Indian political history deliberately omits Muslim names while trying to secure the label ‘Indian conservative’ firmly in the hands of Hindu actors. In case you needed more convincing, here’s how the author summarizes what Indian culture is:

I would argue that “we the people” is meant to be a reference to people with a shared culture, however limited or tenuous that idea may be. We call it Indian culture. The fact that many of its traditional elements have a Hindu touch does not make it an exclusively Hindu culture. The Ramayana and the Mahabharata are doubtless central. But so are the Jataka tales, Jain sutras, Sufi music, the Sikh gurbani, Reverend Beschi’s Tamil epic Thembavani, Abraham Panditar’s Carnatic music compositions on Jesus, Avestan verses, Bene Israel psalms, Santhal chants and so much more.

So it’s everyone on the planet except mainstream Muslims. Good to know, Jerry!

Conservatism and Its Masters

Perhaps the most cringeworthy parts of the book are where Jerry Rao echoes Indian conservatives in his defence of the British Raj as a benevolent, positive addition to Indian history. A century of poverty, strife and gradual resurgence seems to have left him with a doe-eyed view of what the British were actually doing in India. This is how Jerry Rao views the Raj:

The fundamental political dispute that defined the first half of the twentieth century in India had to do with the approach to the Raj. Many conservatives believed that with all its faults, on balance the Raj must be leveraged as a force for the good. […] It is not uncommon to keep running into the view that we were in a sense lucky not to have been colonized by the Portuguese, the Spanish, the Dutch or even the French. The Indian encounter with the Anglo-Saxon has been seen as one that resulted in a refreshing outburst of creativity, which had constructive outcomes.

A “refreshing outburst of creativity”? In what, massacring peaceful protesters?

And yet, Rao does not spare the pre-British Mughals the same generosity; this despite the undeniable fact that everything from food to clothing to our culture itself was made infinitely more colourful by Mughal patronage.

Rao’s claim that the 1950 Indian Constitution must be seen as a conservative document is comical in its absurdity. His whole argument hinges on the Manusmriti, an ancient Indian document that lays out the various rules governing Hindus, codifies the ways in which they may interact with each other and prescribes a very rigid set of roles that individuals of each caste, creed and gender could perform. Many devout Hindus consider this document to have been divinely handed down from God to the sage Manu – thereby making it inviolable and sacred. Most contemporary discourse about “Brahminical orthodoxy” ultimately refers back to this text. Let’s consider the evidence presented before us:

One can argue that the idea of non-discrimination too had an evolutionary history through the Raj. […] The jury is out on whether the Manusmriti was simply an idealized text or if it was practised. But for what it is worth, it did have a measure of social sanction and it did provide for differential punishments for identical crimes committed by persons belonging to different castes. It turns out that the Raj successfully subverted this ideology fairly early in the game.

[…] in the area of gender, the practices of the Raj were not necessarily much behind those prevalent in Britain and America. In the late nineteenth century, the Madras Medical College did admit women. In the early twentieth century, Cornelia Sorabji was not allowed to practise in the Bombay High Court because women were not allowed to practise in English courts at that time. The enhancement of women’s rights can also be seen as a gradual and phased affair, rather than one which was parachuted in by our Constitution.

Some have argued that the grant of universal adult franchise by our Constitution was truly revolutionary. The very chronology by which the political institutions of India evolved from the Regulating Act, Pitt’s India Act, the Charter Acts, Queen Victoria’s Proclamation, the creation of Councils, the Minto-Morley Reforms, the Montagu-Chelmsford Act and the 1935 Government of India Act all the way to our Constitution makes it an evolutionary, gradual, constitutional process. The retention of the key features of the political institutions bequeathed to us by the Raj makes the process a conservative one. The new Constitution did go against doctrines like the Manusmriti. But that process had started long ago.

Much has been written about the status of the Manusmriti in pre-colonial Indian culture, and I don’t want to belabour the point. However, two things need to be noted. First, as historians such as Ram Guha and Shashi Tharoor have pointed out, the Manusmriti was about as relevant to daily affairs as the Bible is to Americans today. Laws existed separately from the rules laid out in the Manusmriti, and it was really the British who gave the text more weight than society ever did. The Gentoo Code that the British adopted in their dealings with Indians marked the first time in centuries that the Manusmriti came to be regarded as anything more than a historical relic. This is not to imply that all pre-colonial Indians were casteless hippies enjoying life freely. No – by codifying these loose and amorphous rules as the basis of all Indian law, the Raj actually cemented the very discrimination that Jerry Rao so gleefully tries to downplay.

Second, if Jerry is fine with the British state because it subverted the provisions in the Manusmriti, one wonders if this is a matter of principle or a convenient factoid the author is exploiting. Supposing a Muslim ruler had done the same thing by imposing a set of rules that applied to Hindus without any regard to their castes, would Rao be equally glad that age-old shackles of caste had been broken by a wise ruler? What if Jerry Rao reads a bit more Indian history and learns that Aurangzeb did exactly this? Would he start singing praises about the great ruler Aurangzeb who ruled over all of India and destroyed the caste system for all eternity? I doubt it very much, and I think this inconsistency proves that for Jerry Rao, the Manusmriti matters purely because the British first legitimized it, and then subverted it. That’s not conservatism; that’s just boot-licking.


Coda

It’s tiring by now to point out that Indian conservatives are without exception drawn from the same mould of upper-caste, upper-class urbanites who seem entirely removed from the rest of India’s “unwashed masses”, all while preaching what the caste system actually is to people whose daily lives are defined by it. Trust me, I hate this dreadfully boring pattern as much as anyone else. And it brings me no small amount of frustration to be saying this of a writer who I thought could make a genuine attempt at wrestling with the vexed issue of conservatism in India. But Rao shows neither the self-awareness nor the honesty required to carry out such a task. In the end, his book is just another in a long line of sad restatements of clichéd elite truisms about India’s glories and its colourful past, and adds nothing to enrich popular discourse. If I’d gone my whole life without reading this book, I don’t see how I would have been poorer by a paisa, an ounce or a thought. I suspect, however, that “The Indian Conservative” will be instructive to liberals looking to rebut Indian conservative arguments. If nothing else, it demonstrates all the reasons why Indian conservatism may be considered at best a hollow intellectual space, and at worst a dangerous normalisation of previously taboo apologisms.

In one word, Jaithirth Rao’s attempt at mapping out the history of conservative thought in India can best be summarized as ‘dishonest’. It papers over many issues in Indian culture purely because the author finds them inconvenient to his narrative that there is a positive thing called “Indian culture”. Where the issues are impossible to ignore, Rao’s ham-fisted arguments only delegitimize the conservative case, even while exposing his less-than-adequate research. Nevertheless, the book is important as an emblem of the growing brazenness with which Hindu apologism is seeping into everything in India. If nothing else, it may be a sign of the books to come.

Categories
Book Reviews

Book review: Bullshit Jobs

We keep inventing jobs because of this false idea that everyone has to be employed at some sort of drudgery because, according to Malthusian-Darwinian theory, he must justify his right to exist

Buckminster Fuller

David Graeber’s ‘Bullshit Jobs’ is a 360-odd page expansion of his earlier essay ‘On The Phenomenon of Bullshit Jobs‘, which set out to make the case that many high-paying, respectable jobs are completely unnecessary and meaningless. I must admit, I wasn’t familiar with Graeber’s essay before reading the book. For the most part, I hadn’t really thought very deeply about the phenomenon, despite knowing many people I can now recognise as being caught in such BS jobs.

When I picked up this book, I was going solely by the title and cover page, which looked interesting and provocative. I thought it was going to be one of those ‘The 4-Hour Workweek’ kinds of pop-management books, where the author makes a bunch of loose, interesting but essentially forgettable claims to support a “cool”, contrarian viewpoint. My experience with such books has generally been that they’re easy reads and good ways to kill time on a flight, though I’ve always hated that their authors take 200+ pages to make a point you could fit on a business card with room to spare.

‘Bullshit Jobs’ is nothing like that. It’s a deceptively simple idea hiding a far more sinister, insidious structure that is becoming increasingly hard for me to ignore.

The first chapter introduces the reader to Graeber’s previous essay, and defines what he considers to be bullshit jobs:

Bullshit jobs are not just jobs that are useless or pernicious; typically, there has to be some degree of pretense and fraud involved as well. The jobholder must feel obliged to pretend that there is, in fact, a good reason why her job exists, even if, privately, she finds such claims ridiculous

The examples he provides include the well-remunerated armies of administrative staff in universities, PR consultants, IT subcontractors, financial-sector workers and so on – jobs that are meaningless because they perform redundant, circular or unnecessary activities whose main effect is to further the existence and creation of more such jobs. In effect, Graeber’s book identifies the modern industries of corporate rent-seeking. These jobs appear in the public sector as well as the private one, and they are increasingly common in the private sector, where firms are shielded from public scrutiny and protected by neoliberal apologists. The apologists’ justification runs like this: in a free market, the market creates needs, which are then filled by jobs; therefore, bullshit jobs cannot be bullshit, because they’re only responding to a market need for such jobs (I will comment on my own assessment of this argument later on).

Moreover, he points out, there’s a distinction between “bullshit jobs” and “shit jobs”:

Bullshit jobs often pay quite well and tend to offer excellent working conditions. They’re just pointless. Shit jobs are usually not at all bullshit; they typically involve work that needs to be done and is clearly of benefit to society; it’s just that the workers who do them are treated badly


The second chapter creates a taxonomy of bullshit jobs – flunkies, goons, duct tapers, box tickers and taskmasters. Flunky jobs exist purely to make someone else look or feel important – like executive assistants for a mid-level manager who wants to feel important, or receptionists for an office that only does business online. Flunkies sometimes end up doing 80-100% of the non-bullshit aspects of their bosses’ jobs. Goons’ jobs have an aggressive element to them, and there’s no reason for them to exist except that other people employ them too – think national armies, lobbyists and telemarketers. Duct tapers are employees whose jobs exist because of flaws, inefficiencies or problems in the organisation; they are there to solve a problem that shouldn’t exist in the first place – Graeber includes a poignant little example of how, throughout history, prominent men have wandered around oblivious to the goings-on around them, leaving their wives, sisters or mothers to clean up after them and negotiate solutions as they arise, thereby “duct-taping” the issues these great men caused. Box tickers are people who exist predominantly to allow an organization to claim to be doing something it isn’t – like diversity consultants, fact-finding commissions, PR people and sustainability consultants. Finally, taskmasters come in two varieties: unnecessary superiors and creators of unnecessary tasks for others.

While the taxonomy is definitely interesting and sometimes insightful, I see many of these BS jobs (especially flunkies, box tickers and goons) as a form of signalling and a type of Müllerian mimicry, where you stand to lose by not indulging in the meaningless activity.

This reminded me of a (possibly apocryphal) story of a Fortune 500 CEO who, when asked why his company was paying him an outrageous amount despite its poor performance, stated that if it didn’t, it would signal to competitors and investors that the company wasn’t serious about improving performance, and other executives would jump ship, thinking it was sinking. Essentially, for any company that seeks to buck this trend of useless jobs, it’s a Catch-22: “yes, a CMO is entirely useless to the day-to-day functioning of the company” (Uber hasn’t had one in over a year), “… but we can’t attract top marketers to our company if they don’t see a neat career pathway leading to the top”.

So, a BS job – if sufficiently common – is meaningful merely because it is widely expected among peers, therefore requiring you to employ a flunky if you are to perform your job properly. Of course, it’s another matter if your job is itself a BS job. And the flunky becomes the flunked.


The third chapter is mostly a treatise on the consequences of the Puritanical “moral confusion” that Graeber discusses in his previous book, on debt. The gist of it is that our assumptions about why humans work, and what motivates them to keep working in meaningless jobs, are completely wrong. We assume that people, given the choice to be a parasite, would definitely take it. Graeber makes the case for the opposite view: that people actually hate boredom and want to be the cause of something meaningful. As Graeber’s interviewees see it, “a human being unable to have a meaningful impact on the world ceases to exist.” I think this is a particularly powerful insight, and it helps explain why most people gravitate to one of two kinds of jobs: BS jobs at important companies, and non-BS jobs at startups that have no real shot at success. We look for purpose even as we chase money, fame and all the other things humans have always chased.

Here, Graeber takes a historical view of the changing nature of what it meant to work, and how work related to other abstractions like time, money, the economy, God and so on. In my opinion, this is the weakest part of the book; it could easily have been trimmed down to a few pages of high-level analysis without going into the details of serfdom and ancient Roman concepts of slavery.

Nevertheless, Graeber teases two factors that could explain why a reasonable person values jobs so highly – even if (and sometimes, especially because) they’re complete BS. One reason is that in modern cities, it’s impossible to have many friends that aren’t somehow related to work. The other, which I found quite interesting, is a concept of “scriptlessness”, where people don’t know how to respond to having a job they think is meaningless simply because there’s no prescribed or socially-acceptable way of dealing with it.

Here again, Graeber circles back to his reasoning that neoliberal economic thinking is at fault. Part of the reason that nobody has noticed this epidemic of BS jobs, he argues, is that people simply refuse to believe that capitalism could produce such results.

I’m sympathetic to this line of argument, because this is what standard economic theory teaches us, and the more educated you are, the less likely you are to question the wisdom of such a fundamental premise of the modern world. It’s become gospel (on the political left as much as the right) to assume that free markets force companies into a relentless pursuit of efficiency, which means any job created in a free market must be necessary, or somehow demanded by market forces.

This kind of circular logic reminds me of the scene from Vice where Dick Cheney argues that the United States cannot be accused of torturing its prisoners:

Cheney: “We believe the Geneva Convention is open to… interpretation.”

Tenet: “What exactly does that mean?”

Addington: “Stress positions, waterboarding, confined spaces, dogs.”

Rumsfeld: “We’re calling it enhanced interrogation.”

Bush: “We’re sure none of this fits under the definition of torture?”

Addington: “The U.S. doesn’t torture.”

Cheney: “Therefore, if the U.S. does it, by definition, it can’t be torture.”


If you imagine David Graeber to be a cat making a meal out of some sorry animal, the first three chapters are where he fusses over the location and licks the carcass clean of hair, feathers and other detritus. The fourth, fifth and sixth chapters are where Graeber really digs his teeth into the meat of the issue and goes to town. And this is also where his own self-assured political and moral leanings begin to come through more clearly.

He begins by tackling the arguments that seek to justify BS jobs, and shows that people who defend such jobs as simply a reflection of market demand and/or government regulation are speaking out of their rear ends.

It’s extremely unlikely that government regulation caused private sector administrative jobs to be created at twice the rate as it did within the government itself. In fact, the only reasonable interpretation of these numbers is precisely the opposite: public universities are ultimately answerable to the public, and hence, under constant political pressure to cut costs and not engage in wasteful expenditures

He points out that historically, BS jobs were confined to a few sectors – most prominently the legal profession, as in Dickens’ Bleak House. They rose in prominence as businessmen and corporations raised the profile of administrative tasks at the expense of actual, physical labour through concepts like “scientific management”, and the resulting emphasis on middle managers and supervisors let corporate overlords usurp power from factory unions, teachers and nurses through the creation of new and entirely pointless professional-managerial positions. Graeber points to the current state of affairs, where banks derive most of their profits from fees and penalties, and car companies make more money from interest on car loans than from actually selling cars.

“Efficiency” has come to mean vesting more and more power to managers, supervisors and other presumed efficiency experts, so that actual producers have almost zero autonomy. At the same time, the ranks and orders of managers seem to reproduce themselves endlessly.

Graeber sees the new system as akin to a revamped, facelifted Feudalism 2.0 that oppresses workers while rewarding overseers and supervisors. He makes his point by examining how feudalism worked historically, what changes made it less prevalent, and how capitalist logic brought it back from the dead. This section is positively riveting, and reads like the impassioned sermon of a fervid missionary. He uses economic data to show that even by libertarians’ own estimation, many of these jobs are a drain on society, destroying more value than their holders are paid. For good measure, he also points out the role of religion, showing how God himself may be held responsible for some of this undue stress on having a job and working your ass off, even if the work is ultimately meaningless:

The Judeo-Christian God created the universe out of nothing. (This in itself is slightly unusual: most Gods work with existing materials.) His latter-day worshippers, and their descendants, have come to think of themselves as cursed to imitate God in this regard

All these factors, Graeber argues, mean that present-day managerial feudalism is maintained by a delicate balance of resentments. Here, he synthesizes his pointed understanding of the problem, in a form he calls the “paradox of modern work”:

  1. Most people’s sense of dignity and self-worth is caught up in working for a living
  2. Most people hate their jobs

The final section discusses some political implications of this trend towards the bullshitization of jobs. Graeber also recommends a policy fix, but only hesitantly and almost unwillingly, leading one to wonder why he bothered at all. Regardless, he leans heavily in favour of universal basic income as a possible cure for the proliferation of bullshit jobs. After all, who would even bother working a BS job if they didn’t have to worry about feeding their kids or covering medical expenses?

In the main, Graeber’s ‘Bullshit Jobs’ is not to be read for its policy proposals, or even for its account of the role of politico-religious institutions in creating the economic structures we’re saddled with today. In my opinion, the book should be read as a critique of the popular discourse surrounding the social role of jobs and what work ought to be. On this front, the book excels. Political theorists can argue about whether this discourse is a product of politics or a driver of political incentives, and religious scholars can debate whether religious scripture condones or militates against it. But for most of the rest of us, the ideas David Graeber provides form a solid bedrock on which to build better relationships with our jobs.

Categories
Book Reviews

Book review: Early Indians

‘Early Indians’ by Tony Joseph has a simple aim: to bring us up to speed on what we know about prehistoric Indians. To do this, he triangulates a variety of sources to make a simple and robust claim: all Indians have a mixture of immigrant ancestries.

The book begins, as any work rooted in scientific concepts should, with definitions: what he means by ‘Indian’, what he means by ‘prehistory’, what kind of data he uses, where we find such data and whose work he draws on. This is where Joseph’s approach stands out from similar ones by authors such as Romila Thapar – Joseph goes beyond archaeological data, drawing also on genetic data in the form of genomes and lineages, as well as insights from linguistics. By setting up these concepts carefully, he ensures that the reader doesn’t need a dictionary or an encyclopedia: every new concept is given sufficient attention before the book turns to how it’s useful and what information can be drawn from it. For example, consider this passage in the first chapter:

When geneticists talk about the first modern humans in India, they mean the first group of modern humans who have successfully left behind a lineage that is still around. But when archaeologists talk about the first modern humans in India, they are talking about the first group of modern humans who could have left behind archaeological evidence that can be examined today, irrespective of whether or not they have a surviving lineage.

This distinction is helpful for the reader to appreciate why multiple sources are necessary, and also buttresses Joseph’s claim that the various fields don’t have to disagree with each other’s findings. It just takes some contextualization for us to appreciate why they may be different.

The first real insight for me was the role of haplogroups (branches in the genetic tree) in decoding ancestry. All humans outside Africa carry lineages that follow from M, N or R haplogroups. While south Asia has all three of these, Europe only has N and R. What does this say? Quite simply, that the first groups of humans to leave Africa followed a route that brought them close to India, where they may have settled before moving to Central Asia, and then making the push towards Europe. But if they were moving across the Sinai and Levant, wouldn’t Europe be closer to them, and thus likely to be settled first? Well, it turns out that the first successful groups of early humans first moved into Asia via the Bab el Mandeb at the southern edge of the Red Sea, crossing over into Yemen and Saudi Arabia. Earlier expeditions through the Levant and Judaea were unsuccessful (likely because of the presence of Neanderthals), and the Red Sea route was very much a viable route in the interglacial period.

This means that very early in our species’ history, South Asia was home to a great majority of humanity – a poignant reflection of today’s reality, where the Indian subcontinent alone accounts for over a fifth of the world’s population. So, in a way, the story of Indians is the story of our species, whether some of us like that idea or not.

The second chapter goes into some detail about the pre-Harappan farming communities in the Indian subcontinent. The site he chooses to focus on is Mehrgarh in Pakistan, a spectacular example of Neolithic civilization in the subcontinent.

The fast-eroding ruins at Mehrgarh, Pakistan

In 7000 BC, the Mehrgarh people, called the ‘First Indians’ by Joseph, had masonry, brick houses, more or less rectilinear walls, fireplaces, red paints and even early domesticated versions of barley, cattle and goats. The most amazing of the remains are the “grave goods”, or things people were buried with: shells, necklaces, headbands and semiprecious stones, moved around through trade networks that reached as far as the Makran coast. There’s even cotton, you guys! And they had dentistry.

This leaves an obvious question: what happened to them? Turns out, they moved all over India. The current genetic makeup of Indians is a mix of two distinct lineages: Ancestral South Indians (ASI) who derive from the First Indians at Mehrgarh and Iranian agriculturalists, and Ancestral North Indians (ANI) who are a mix of First Indians, Iranian agriculturalists and Steppe pastoralists. In other words, north and south Indians really are different people.

The third chapter deals with the jewel in Ancient India’s crown: the Harappan Civilization. One of the oldest and most sophisticated civilizations of its era, it rivalled Uruk for preeminence, stretching over an area of a million square kilometers – modern India, for comparison, covers about three million square kilometers. The people traded with Mesopotamia, lived fairly peaceful lives and had a real appreciation for arts and crafts, despite not building spectacular temples or ziggurats. They also spoke proto-Dravidian, the forebear of modern-day Tamil, Telugu and Kannada. Using linguistics, genetics and archaeology, Joseph shows the unvarnished truth: the mature Harappan Civilization had few rivals in its time, and once it fell, the subcontinent would take close to a millennium to reach that level of advancement again.

Harappan architecture

Harappan remains are now scattered all over India and Pakistan, where they’ve turned into sad mounds of dust and pieces of inconvenient history – in Pakistan because pre-Islamic history has become taboo, and in India because the Harappans predate the ‘Aryans’, who are said to be the people who brought proto-Hinduism, Sanskrit and everything else that a proud Hindu values. In the fourth chapter, on the Aryans, the author’s powerful argumentation steps in to defend scientists from religious and political zealots. He picks apart Harappan culture to show that it contained many elements of what we consider to be Indian culture, and that the Aryans really only brought Sanskrit, horses, a warrior culture with its undue emphasis on violence and ritual sacrifice, and a supreme mastery of metallurgy.

There is substantial evidence that the Indus civilization was pre-Aryan.

The Indus civilization was mainly urban, while the early Vedic society was rural and pastoral. There were no cities in the Vedic period. The Indus seals depict many animals but not the horse. The horse and the chariot with spoke wheels were the defining features of the Aryan-speaking societies. The chariot found at Daimabad in the Deccan, the southernmost Indus settlement, has solid wheels and is drawn by a pair of humped bulls, not oxen. The tiger is often featured on Indus seals and sealings, but the animal is not mentioned in the Rigveda.

All of these go to show that the Aryans were alien to these lands, and could only have been an immigrant (or invading) population. Evidence presented by Joseph shows that the Harappans worshipped or revered some sort of phallic symbol, which we now know as the ubiquitous Shivalingam. But the funny thing is, the Rigveda actually denounces ‘shishnadeva‘, translated as ‘the phallus god’ or ‘phallus worshippers’, a clear allusion to Harappan culture. Archaeological digs also show deliberate destruction of phallic symbols and idols in every Harappan settlement we have.

However, Joseph shows that this disdain for Harappan culture doesn’t last forever. By the time of the Upanishads (500-100 BC), the ‘shishnadeva’ has been co-opted into a religion loosely resembling the Hinduism we know today. In other ways too, the two cultures merge into one: Dravidian words are taken into Sanskrit, and retroflex consonants (consonants that require you to curl your tongue – the very thing that distinguishes Indian accents) become common. Houses built around courtyards, bullock carts still in use across the country, bangles that matter to this day, the continued worship of trees – the peepal in particular – the significance of the water buffalo in some cultures, dice games, chess and even the practice of applying sindoor: all of these are ways we carry on the traditions of the Harappans.


Tony Joseph’s book is a great example of the kind of confident work Indian authors are starting to produce. There’s a resurgent self-assurance in Indian writing, one that doesn’t chase the mass-market appeal of recent years. Gone are the days when the Indian writer was tremulously searching for validation from Western audiences. Joseph writes his book for one audience alone: the curious, anglophile Indian. He wants you to be informed, to be proud, and to be able to stand up and take control of the cultural narrative. The epilogue is a perfect exemplification of this: he makes a compelling argument for how and when the caste system came to be solidified in Indian culture. Joseph argues that the caste system in India did not arrive with the ‘Aryans’. Instead, it fell into place much, much later – about two millennia later.

In bringing history, science and culture together, Tony Joseph is able to convince the educated, cosmopolitan Indian to give up the self-flagellating fatalism we sometimes slip into. He brings history to contemporary India and argues for the value of studying our past in order to rectify its mistakes. He closes the book perfectly with a meditation on caste, endogamy and the role of migrants like the Sakas, Mughals and Parsis in deciding the cultural makeup of modern India.

‘Early Indians’ is more than just a simple narration of historical events, and it’s definitely not an academic piece of writing. It’s the perfect example of what authors are capable of producing when they build neat, rigorous arguments that contextualise history.

Categories
Marketing Philosophy

Bundles

Properties, products and the philosophy of marketing

Do you ever look at things and go “hmm, I wonder what makes this the thing, and I wonder what makes it a thing“? People have, for a long time. The problem of what makes a thing what it is was tackled by people as removed from modern life as Socrates and Plato. Ancient Greek philosophers believed that objects possessed two types of characteristics: essential and accidental, the former being properties that the object cannot do without, and the latter being more or less dispensable. Consider a dagger like this one below.

Look at that beauty! Mughal dagger from the 17th c. Source: Museum of Islamic Art

What makes it a dagger? It’s not the jewels, because there are other daggers without any jewels on them. It’s not the sheath either because storage is only a concern if your dagger is a piece of art or has some sort of personal value to you. Even the handle is not essential since you can stab someone perfectly fine without it. You’d hurt yourself in the process but that only goes to prove its effectiveness as a weapon. So, these are all examples of accidental parts of a dagger. The only essential one is the blade. A dagger that is all blade would still be a dagger.

Well, that’s the crude gist of it at least. The Stanford Encyclopedia of Philosophy has a lot more to say on the matter, and (as usual with these things) I’m not qualified enough to argue about the merits of each philosopher’s interpretation of the essential-accidental distinction. There’s another fascinating branch of philosophy that deals with the idea of substance, and bundle theories abound there as well. Amazing stuff.

That said, it’s not all philosophy today. The idea of products as bundles of characteristics is actually quite useful to a marketer such as myself.

A philosophy of marketing

“What is marketing?” is how every marketing textbook begins, and “what does a marketer do?” is how every single marketing professor has tried to begin a course. It’s clichéd, boring and unhelpful. More importantly, nobody ever gives you a convincing answer. Everybody asks the question, but nobody answers it. I think the reason it cannot be understood within the traditional marketing paradigm is that marketing has never fully admitted that it is not art. Or science. Every few weeks, some normie or other tries to summarise this into a “mix of art and science” style response. Many of them (I’m looking at you, Seth Godin) have even made a career out of blithely stating this truism.

It’s neither. Marketing is neither art nor science. The utility of this way of thinking is that once you realise marketing is separate from these two, you see that – just like art, science, knowledge and language – it needs a philosophy of its own. It’s a category of its own, and in my opinion, it’s more closely related to language than to either art or science. Marketing, like language, works on rules and procedures. Unlike language, however, there’s value both in conformity and in non-conformity. Neither art nor science places the same kind of emphasis on non-conformity and innovation as marketing does. Whereas art and science propagate through replication, marketing propagates only through innovation: nobody would have bothered about digital marketing if there wasn’t a need to explain the difference between new marketing and traditional marketing. Change is essential to the continued survival of the industry. At the same time, marketing cannot function without conformity, discovery and reflection. In this way, it’s once again closer to language and the philosophy of mind.

And so, we see the link to the philosophy of substance. One of the most important foundational axioms of marketing is that products are bundles. It’s such a simple concept, but it is rarely ever taught in marketing courses. Why does everybody see a car differently? Because they all focus on a different feature of it.

Let’s consider the classic VW Beetle ads from the 60s by DDB.

A VW Beetle print ad by DDB

Why did this ad work? Because it focuses your attention on one thing: the iconic shape of the car. Don’t like the shape? No problem, they’ve got you covered.

Another DDB ad for the VW Beetle

In others, the ad focuses on something else: the specs (see below).

A VW Beetle spec-sheet ad

So why does this happen? Because a car – every car, not just the Beetle – has many parts, and thus is many things. A car is a bundle of seats, wheels, engine, roof, trunk, the people inside etc. It’s hard to say which parts are essential to a car since there have been cars without seats, a traditional steering wheel, wheels, engines, roof, etc. Pretty much anything you think may be essential has probably been substituted or removed at some point, and so there’s no use in dwelling on the essential-accidental distinction.

Marketers make money by exploiting the fact that products are bundles. A good salesman sells a car to an old guy by selling the auto-open door and other accessibility features; to a thrillseeker, he stresses the horsepower; to a mother he shows off safety features and counts the number of airbags inside the car.

Here, the more astute might say “oh that’s just sales. Marketers don’t just talk about the function. They sell you on feelings”. I’m coming to that.

Products as bundles of services

Objects as bundles of properties is a fairly old idea. Nobody doubts that a car has many features – that’s why spec sheets exist. But a good marketer sees the feeling and emotion associated with each feature. So, products are more than just bundles of functional parts – they are bundles of emotion.

What gives rise to these emotions? When a Coke ad (see below) shows people feeling refreshed, does that evoke a similar feeling of satiety in you?

A 1973 Coca-Cola ad

Only a naive person says “absolutely!” If watching an ad made you forget hunger, McDonald’s ads could cure world hunger. But they haven’t because ads only remind you of a certain emotion, in order to induce the need for a service. In the case of Coke, it’s the service of quenching your thirst. In the case of a car, it’s the service of transporting you from one point to another. So, the source of all effective marketing is in fact this one simple reductionist axiom:

Products are bundles of services

That’s it. To be an effective marketer, you only need to understand that a product simply performs a bunch of services. It’s a placeholder with no value when it is not performing the exact services that a consumer wants. What is a car? It is any object that transports you, provides social space, shelter during a thunderstorm, saves you from a polar bear (if need be) and lets you and your chosen other get funky when you have nowhere else to go etc.

But, but, you say: that could be any number of things – a train, for example. I agree. A well-designed train can perform nearly every service that a car can. And that’s why younger people no longer feel a strong need to buy a car, a trend that has led to plateauing car sales in Europe. Why bother buying a car when a train does the job? That rationale is why car ads increasingly lean on “prestige” as a selling point. Why are car ads so same-y these days? Because every one of them shows a polished, upper-class guy (and it’s nearly always a guy) driving around and feeling proud. Or they appeal to masculinity. And they end up looking like this:

A Nissan GT-R ad

Yawn. We’ve ended up with boring ads like this because every other service of a car is being performed by something else. The novelty has worn away and marketers are running out of services to market. The only issue is that they don’t see it.

Want to meet someone? Why bother, use Zoom.

Want to look cool? Here’s an iPhone, for a tenth of the price.

Want to be a man? Here’s a gym membership.

Want to show you care about the environment? Here’s a train ticket.

That’s why this is such a powerful concept. And that is why products are bundles of services.

Categories
Society US Politics

America is a (glamorous) “shithole country”

Think back to the last time you thought “damn, I like America”. What was it that caused you to say that? Was it Hollywood? Was it the military victories of the World Wars? Was it Yellowstone? Was it the history of commitment to internationalism and free trade? Or was it the unlikely story of a backwater republic’s rise to power within a century of independence?

For me, it was the election of a black man to the office of President in 2008. When Obama was elected president, I was in high school. At home, the first term of the UPA was nearly done, and the story of India’s development was a conclusio inevitabilis. Politics at home was dry, predictable and repetitive. Like everyone around me, I amused myself with the affairs of the USA. For a whole year, I followed the breathless coverage of his “Yes We Can” campaign, watched all his interviews on talk shows, almost memorised his victory speech and closely followed the first years of his presidency. My mind screamed “America is the best!” and I wanted to move there as soon as I could.

In my heart, though, there was a seed of doubt sown by something I read in an op-ed: America is 13% African-American, and its economy is built on the backs of people of colour; yet, it took the country 230 years to let a black man rise to be Commander in Chief. Why? As years went by, I started noticing cracks in the Great American monolith and the more I knew, the less inclined I was to give America a free pass in world politics.

This post is a long-overdue crystallization of that line of thought. Some of you who read the title may go “well no shit!”, but I’m not trying to preach to the converted. My intention is to reach out to the skeptical, maybe even the unbelievers. I’ll try to lay out a case that doesn’t assume that you hate Trump already, or that you’re a globalist, liberal, SJW, libcuck, libtard… You get the point.

Part 1: What makes a shithole country?

Let’s be honest about one thing: we wouldn’t be here discussing “shitholes” if Trump hadn’t brought that word into popular discourse. His original comment, per The Washington Post, included Haiti, El Salvador and several African countries. Going by the nations covered by Trump’s travel ban, I’m assuming this meant Somalia, Nigeria and several Middle Eastern countries as well.

The WaPo piece includes an explanation by a White House spokesperson:

Certain Washington politicians choose to fight for foreign countries, but President Trump will always fight for the American people . . . Like other nations that have merit-based immigration, President Trump is fighting for permanent solutions that make our country stronger by welcoming those who can contribute to our society, grow our economy and assimilate into our great nation.

Raj Shah (son of immigrants from one such shithole country)

Note that the White House people don’t dispute the substance of the allegation. Trump himself issued this rebuttal:

OK. So poor countries can be considered shithole countries under some conditions. Fair enough. What else do we know about the countries Trump considers bad? There’s a pretty detailed account of what kind of countries Trump doesn’t like in this NYT piece. From there, we can add a few more characteristics of shithole countries:

  • Has high prevalence of AIDS and other deadly diseases
  • Large section of population is homeless or lives in low-security housing

In the past, he has made several unsavoury comments about Mexicans, and from those comments we can also get an idea of what makes such countries so revolting to the American mind.

From all of the above, we get a more complete picture of what makes a country a “shithole”:

  1. High rate of poverty
  2. High incidence of preventable/deadly/communicable disease
  3. Homelessness
  4. Personal violence

To the above, I will also proceed to add a few more features that I think of when I think of the word “shithole”. Feel free to play your own version of a free association game to see what you, your friends and family come up with. Here’s my shortlist of essential shithole characteristics:

  • Institutionalised corruption
  • Political and politically-motivated violence
  • General sense of lawlessness and prejudicial justice
  • Lack of accountability in governance

Depending on where you are in life, you may even think that an absent or weak “social safety net” is one of the conditions of being in a shithole. To me, a social safety net is a paid feature in the freemium game called Life. We can agree to disagree on this one.

A small proviso

In many ways, Trump’s position on immigrants is nothing new. On this and other matters, his is the voice of a silent majority on Capitol Hill and in towns far from the “coastal elite”. He is no great orator; his greatest political gift is that he says the quiet part out loud. A simmering hostility towards immigrants is almost essential to American life. Bush Jr. acted on the same Islamophobic principles during his “war on terror“; Nixon felt the same paternalistic revulsion towards Chilean socialism when he ordered the overthrow of Salvador Allende. Reagan used the “war on drugs” as a dog-whistling tactic to rouse anti-immigrant feelings in middle America even as he pumped the Contras in Nicaragua full of arms, leading to the very refugee crisis that Trump bemoans now. But wait! Before you begin to think of this issue as a Republican construct, let me remind you that the Contras were created almost out of thin air by Jimmy Carter. Let me also remind you of Clinton, the man who turned the immigration system into the violent edifice we see today. And at last, let’s not forget Obama’s immigration track record, which was built around a rotten racist core that demonised immigrants and made humanitarian refugees (created, by the way, by America’s policy of waging endless war) seem like grifters begging for freebies.

So let’s not act all sanctimonious about this: everybody in Washington has always believed something roughly along the same lines.

Part 2: What makes America a shithole?

Hint: It’s not this guy.

Let me be honest about another thing: I’m not the first one to say that America is a shithole country. Right after Trump made his comments about Haiti and El Salvador, a wave of political pundits descended onto liberal magazines like The Atlantic and New Yorker to lay out their reasons for why America is itself a shithole country. Much ink was spilt on the question of exactly who made it this way, with the inevitable conclusion being that, yes, it was Trump’s fault all along.

(There is a one-liner to be made of how people in White Houses shouldn’t be slinging mud, but I can’t find the right words for it.)

For my part, I’ll try to ignore the Trump connection, because I think it’s shortsighted, politically motivated and plain disingenuous to posit that the country was somehow much better earlier and this one guy has driven it into a ditch within the last 3.5 years. Trump does sometimes have a role to play, but largely as an actor within a much broader system. I’ll explore this in part 3. In part 1, I laid out my criteria for what makes a shithole, and now I’ll try to show that America does in fact live up to each one of those criteria.

Poverty and homelessness

What do you think is the poverty rate in America? And what do you think is another country with a similar poverty rate? Whatever you thought, you were wrong. It’s 15%. Think about that: roughly one in seven Americans is below the poverty line. And depending on whom you ask, that’s equivalent to either Lebanon or Indonesia. If you want to see how high the poverty rate can go, here’s a useful heatmap:

Southern states remain the poorest in the U.S.

Among US states, US Census data shows that Mississippi has a poverty rate (19.7%) roughly equivalent to Iraq (according to the World Bank and CIA World Factbook).

But if you want to see grinding poverty, you need to look at the “other” territories of the USA. In many ways, US treatment of its outlying islands is the textbook definition of “stepmotherly”. American Samoa has a poverty rate of 65%, and a per capita income equivalent to Botswana’s. Puerto Rico, despite the odds and decades of neglect, has a median per capita income of $20k – more than Greece, less than Saudi Arabia.

Nearly as egregious is the youth poverty rate: almost one quarter of all American youth live in poverty. Kids are even worse off: the Stanford Center on Poverty and Inequality calls the US a “clear and constant outlier in the child poverty league”. One in five children in the US doesn’t get enough to eat. The UN Special Rapporteur on poverty toured America and concluded that it has some of the most extreme poverty he had seen anywhere in the world. (The introduction to his report can be found here, and the full report here.)

Does that not make it a “poor” country? No wait, you may add, what about New York City and Los Angeles and the beautiful kleptopolis of Seattle, WA? Ah yes, the tale of the American city, where fortunes are made and dreams are realized. But whose dreams exactly? Over half a million Americans have nowhere to go at the end of the day. The three cities of NYC, LA and Seattle have over 150,000 homeless people between them. New York, that quintessential “city of dreams”, has nearly 80,000 homeless people, the majority of whom have been on the streets for over a year.

To my friends who want to pretend like the scores of homeless at your subway station don’t exist: at what point do you stop looking up at those gleaming high-rises and look down at the grime and dirt of the streets?

Disease

Everybody likes a good Bernie joke. Here’s one by Conan O’Brien: Bernie Sanders says his campaign is trying to appeal now to senior citizens. The problem is, every time Bernie says, “Feel the Bern,” the seniors think he’s talking about acid reflux.

Acid reflux, dental implants and hip replacements are great for use in one-liners about old age. But what about obesity? Or random parasitic infections? This widely-quoted paper found that nearly 12 million Americans have an undiagnosed or neglected parasitic infection. In 2017, a study by Baylor University found that in the rural south, all sorts of diseases of extreme poverty continue to thrive. Ever heard of hookworm? Nearly 34% of people tested in Alabama were found to have traces of it.

And it’s not just entirely preventable third-world diseases. Equally appalling are the rates of “diseases of affluence“: diabetes, obesity, asthma, coronary heart disease, cancer, allergies, gout and alcoholism. Despite the misnomer, “diseases of affluence” are not entirely born out of sedentary lifestyles and an excess of comfort. Studies show that more and more, it’s the poorer regions of the world that are being affected by lifestyle changes and unhealthy diets.

Empty calories are often very cheap calories for poorer sectors around the world, so that consumption of processed or dominantly carbohydrate diets with insufficient whole grains, fruits, and vegetables is more common among the poor. In addition, poorer households often are less able to pay for the expensive consequences of these diseases in the middle-aged and elderly (e.g. insulin provision for diabetics, the consequences of heart attack and stroke in the elderly). Ironically the same poorer sectors in poorer parts of the world and even within the United States can simultaneously face the issues of “traditional malnutrition” (i.e undernutrition, insufficient consumption of vitamins, iron, zinc, calories), especially among children and women, as well as diseases of overconsumption of empty calories.

https://serc.carleton.edu/integrate/teaching_materials/food_supply/student_materials/1205

And what does that lead to? Obesity, that’s what. Nearly 42% of all Americans are obese, up from 30% in 2000.

Obesity in the USA. Source: CDC

But eh, you might say, obesity is no big deal. My momma is pretty fat and she rolls around just fine.

What about infant mortality? What about the fact that more children in the US die in the first few hours of their lives than in 50 other countries, many of them considerably poorer and lacking in resources? It’s not just the children: the United States has the worst maternal death rate in the developed world, with black women three times as likely as white women to die in childbirth. Predictably, this is much, much worse in the rural south. The CDC admits that over 60% of maternal deaths are entirely preventable, and even if you take that into account, the US would still rank in the mid-20s worldwide, and in the bottom half among developed countries. And the situation is only getting worse:

The US maternal mortality rate (26.4 deaths per 100,000 live births) far exceeds that of other developed countries. Source: NPR

I don’t want to belabour the point, but there is also this other thing called an “opioid epidemic” merrily sauntering through middle America. But I guess legitimate wars on drugs would be too much for the helpless American populace to handle. Drugs come from Mexico and Colombia, fool! Have you not watched Narcos? Drugs are made in the jungle, and they most definitely are not because of one pharmaceutical company headquartered in Stamford, CT. Even if that were true, not now! Not when there’s this other unseen epidemic that is mysteriously spreading across the country. There are rumours that some people have lost jobs or something, but I don’t know man. It all seems anecdotal to me.

So what kind of care can you expect when you’re sick, pregnant or for some reason need the healthcare system to take care of you?

Unemployment benefits? Maybe. But not gratis.

Mandatory maternity leave? Zilch.

In a pandemic? $1200, take it or leave it.

Special consideration? None.

Living wage? GTFO.

Job security? Nope.

If you happen to be dying, or need intensive care but cannot afford to pay your medical bills, you’re humanely sedated and carefully dumped butt-naked at a bus stop in the freezing cold.

So yes, America ticks the “disease-ridden” and “no social safety net” boxes quite comfortably.

Crime and violence

Even before BLM, most sentient beings knew the perils of living in America: guns, religious fanatics, white supremacists and an absentee healthcare system together mean that moving to America was never the best option you had. To really see the pernicious undercurrent of crime coursing through American veins, you need to look deeper than the shocking (and rightly so) incarceration rates in the US.

Yes, there is a drug issue in the USA. And yes, there is a violent crime issue as well. And obviously, there’s a gun crime issue too. According to some highly intelligent people, the spike in the 60s-80s was caused by lead. Yes, the heavy metal. Not violent leaders or a history of institutionalised racism or gratuitous wars leading to a cult of the soldier. Lead.

Be that as it may. The first and most important thing to know about American violence is that it works very differently from the way crime works in developing countries. In most modern states, there are two categories of violence: interpersonal and state-inflicted. Interpersonal violence is simple: you harm someone else and he harms you back. State-inflicted violence is when people in authority use state apparatus to cause you harm.

In America, interpersonal violence exists everywhere and forms the visible violence that most people talk about when they discuss violence. The south is, predictably, more violent than the north, but not in all kinds of violent crime. Of course, there’s the issue of definition: what is a violent crime, and what is not. As commonly understood, violent crime includes mugging, assault, homicide, rape, hate crime etc. Horrible, but generally there are legal remedies to these. Obviously, the way to deal with a fear of interpersonal violence is to carry some sort of deterrent: pepper spray, guns, bodyguards, body doubles etc.

What most people don’t ever see but always have an uneasy feeling about is the other kind of violence: state-inflicted. The kind of violence that you can’t do anything to deter. This is the kind of violence that people in positions of privilege don’t fully comprehend. Police brutality is the most obvious manifestation of state violence.

In Torture and State Violence in the United States, Robert Pallitto lays out a comprehensive view of the widespread use of violence by state actors to stamp out dissent and cultivate a sense of fearful awe among the American populace. Today, thanks largely to the Black Lives Matter movement, we are all aware of the extent to which police brutality is common. To a person of colour, modern America is scarcely different from a warzone.

For example, black people and people of colour are much more likely to end up in violent interactions with the police, and more than twice as likely to die after being tasered. Even so, being tasered is the best-case scenario if you’re a person of colour: tasers are far less lethal than firearms, and allow policemen to handcuff you without having to bump you over the head with a glorified baseball bat. Deaths in custody and suicides following arrest are commonplace, at several times the rates in the UK, Australia or NZ.

This is taken from a brilliant CNN piece about police crime, and I highly recommend going through the original for more details.

(Sidenote: there’s a nice report from the UK about the inner workings of police violence there. Yes, it’s a different country with vastly different social norms and much less violence of any kind but it’s instructive as to how people actually die, and what sorts of remedies are offered to the victim’s family.)

Let’s say you’re the target of police violence in America. What happens to you then? What can you do to hold them accountable? As any person from a shithole country can tell you, absolutely nothing at all. The technical term in the US is “qualified immunity“, which is basically fancy-people talk for “unless they violated some federal law, you can go fuck yourself”. Supreme Court judges have sided with the police in quashing case after case meant to hold officers accountable for the violence they perpetrate. Nearly every infamous cop accused of violence, brutality and murder has walked away practically scot-free. Most are only suspended for a brief time, and nearly all get to keep their salaries and pensions.

According to this peer-reviewed paper, “the average lifetime odds of being killed by police are about 1 in 2,000 for men and about 1 in 33,000 for women. Risk peaks between the ages of 20 and 35 for all groups. For young men of color, police use of force is among the leading causes of death”. This level of callous disregard for human life is scarcely any different from India – which, I should have mentioned at the top, is most definitely a shithole country – where policemen routinely get away with murder, rape and all manner of torture. Some even become popular icons.

Remember Cops?

Ring a bell, America? Your pop culture is filled with “rogue cops” who don’t care about justice and use it as a means to personal glory. And let’s not forget the glorious dumpster fire that is the show Cops, which makes it seem like every POC is up to something shady, and that if it weren’t for the ever-watching eye of the beat cop who’s armed to the teeth, the entirety of Western civilization would come crashing down.

Corruption

This is the final aspect of America’s shitholery that I’m going to consider. Not because it’s conclusive, but because in nearly every discussion of developing countries like Nigeria, India and Mexico, “corrupt” is used as a sort of dirty word – a smear intended to show how uncivilized these countries are, and a prop to lean on while gloating about how great the West is for having gone beyond cash bribes.

Reuters has consistently reported on how the Supreme Court uses its flawed machinery to shield murderous cops from justice. The untrained, “educated” eye is ready with a defense: courts only act on precedent, and can only act within the bounds of the law.

But the uneducated native of a shithole country (such as myself) can tell you in an instant that there has to be some sort of funny business going on here. No judicial system can uphold the same set of abused and misused laws for 50 years without puncturing some holes in it. Besides, the issue of violent cops has been in the popular mind for at least 30 years now, since the beating of Rodney King in Los Angeles in 1991. According to nearly every independent study, at least 1000 people die at the hands of the police every year. The “lack of precedent” argument doesn’t pass the sniff test.

There are hints that the judiciary in America is indeed extremely corrupt. A recent Reuters report found that thousands of state and local judges across the United States were allowed to keep their positions on the bench after violating judicial ethics rules or breaking laws they pledged to uphold. In the same decade, two Pennsylvania judges were found guilty of sending thousands of minors to juvenile detention in return for cash kickbacks from the detention center operators. Then there’s the now-infamous case of a judge who overturned a billion-dollar lawsuit against an insurance company that had financially supported his appointment to the bench.

A paper from 2009 raised this question of corruption in US courts, suggesting that it may be a seriously underreported issue. The paper found that there are no robust mechanisms in place to prevent and uncover low-level judicial corruption, and estimates that around 3 million bribes are paid each year in the US judicial system. 3 million individual bribes.

All of this means that there is most definitely corruption in America’s courts. And the American public doesn’t know it simply because there’s just no way to know about it. In other words, America, your courts are no better than the banana tribunals of rural Rwanda.

Is that not the definition of being a shithole?

Part 3: The Glitz and The Glamour

The part where Trump makes an appearance

I have one more thing to be honest about: I lied earlier; Trump does matter. He matters because he’s part of the woodwork now, and any discussion of the Trump administration’s actions that ignored the man they’re all cheerfully following to the grave would be foolish.

Let’s begin at the beginning. Trump is a byproduct of America’s shitholery system. The roots of his billions are in unhonoured contracts, low-level kickbacks and relentless exploitation of US insolvency and bankruptcy courts. His hotels materialized only because he struck deals with municipalities and unions. His apartment complexes were built on land previously used for low-rent housing, which he found ways to swallow up – generally by abusing eminent domain. At every step of the way, he used other people’s poverty and misfortune to the benefit of a handful of wealthy people who could afford his properties. When thinking of Trump’s rise to power, the term “klepto-plutocrat” comes to mind.

(Sidenote: Trump’s signature project – the border wall – can only ever come to fruition through a free-wheeling abuse of eminent domain. Vox has a nice short explainer on this topic. In many ways, the Trump story is almost causally linked to the evolving question of where private property rights must be superseded by the need to provide public goods.)

Even as Trump’s projects sank and took whole communities with them, Trump himself stayed above water. This cultivated feeling of personal invulnerability permeates Trumpian thought, and informs every single decision his administration makes. Consequently, the very kind of people who are drawn to Trump are the kind of people who stop at an accident scene to steal wallets and jewelry. There’s no need to name names here, because literally every last one of them is animated by a desire to profit from America’s wretchedness at any cost.

Trump’s worst vice, then, is that he takes his hands off the wheel just so he can claim insurance later. Whereas previous administrations tried to keep the country from descending into anarchy, Trump feeds the flames to try and gain from it. When the Bush administration’s “War on Terror” led to mass Islamophobia and anti-war riots, Bush tried to put out the fire by insisting that he was fair, and made a point of bringing Islamic clerics into political dialogue. When Obama realised that his administration’s actions on immigration reform had led to more border deaths, he saw to the passing of DACA as a token gesture. When Obama’s environmental reforms and Obamacare led to the Republicans flipping the Senate and the House, he went soft on African-American issues and even went so far as to denigrate Black Lives Matter, leading many recent commentators to question his overall position on the matter of black rights.

In nearly every administration before Trump, there was present a self-correcting impulse which kicked in after something major had occurred. Trump, on the other hand, actively makes things worse, like he has done in the ongoing BLM protests. A few months ago, as the COVID cases started to rise, Trump saw it fit to spout conspiracy theories, asking people to go out and not believe the “Chinese hoax”. The administration has used protests to try to conceal its more nefarious dealings: Trump commuted the prison sentence of Roger Stone, the man who helped Trump take the presidency, and who was later convicted of obstruction, witness tampering and perjury. You know where else this happens? You guessed it: in sub-Saharan kleptocracies. The administration has also used the pandemic aid as a political tool by withholding details of who received how much. Would anybody be surprised if Trump himself was found to be skimming off the top? Of course not. That’s just what leaders of shithole countries do. Remember Lula?

The people around him are no different: even as oil companies faced losses and employees lost jobs, Big Oil CEOs reeled in big bonuses. Even as the country is convulsed by COVID-related deaths and job losses, the stock market is at a record high. Even as Florida’s pandemic response has sunk to the same level as India’s, a Florida pastor got rich peddling bleach as a cure. Just across the sea, Cuba’s population is largely free of the virus, and officials worry only about the risk of Floridians infecting Cubans.

Some other countries that have managed to contain the COVID epidemic? Rwanda, Uruguay, Vietnam and Senegal. People from Rwanda are allowed to travel to Italy and other parts of the EU. Guess who can’t? People from China, India and the US of America.

Oh, how the tables have turned.

Categories
Culture

Why I don’t read the news

Most people who know me would probably assume (based on how eager I am to discuss current affairs) that I have a couple of news apps, a newspaper subscription or a twitter feed filled with news content. Three years ago, they would have been right. Not anymore. This is a post about why I don’t read breaking news – and what I do instead. First, a brief overview of my argument against breaking news. That way, if this is the limit to your attention span, I’ve managed to get my point through to you.

Broadly speaking, all breaking news falls victim to one or several fatal (but oh so human) errors:

  1. It is reactionary and status quoist
  2. It is divorced from context, and
  3. It exaggerates threats to your person

Breaking news is the problem

Nearly all news coverage is conservative. Even ultra-liberal outlets like The New Yorker have this assumption at their heart: “the world is originally good and people are flawless; institutions corrupt them”. But newspapers and fortnightlies have to (by their very nature) take a measured, balanced approach to their curation. Breaking news’ original sin is that it focuses on what’s being broken more than what is being built. When you need to put something out every few minutes, you can be sure that a lot of that something is nonsense.

I think we diminish the constructive value of change in societies. 24/7 news coverage tends to warp our understanding further and makes us react to emerging news rather than sit back, take a breath and assess where things are going. The news industry (and those on the periphery like social media sites) profits from creating false narratives out of non-issues. Everywhere around you, there are non-issues being blown out of proportion. Go on Facebook and you’ll see this playing out in real time: your kooky uncle shares some government conspiracy to take away his medical supplies; your distant cousin thinks that China created Coronavirus to break Western dominance and emerge as the undisputed superpower. All of these ideas and narratives have always existed, but breaking news amplifies the impact of such fringe voices by giving them airtime. Whether it’s for casual mockery or serious debate, most “news” isn’t actually news: it’s just plain hand-wringing about change.

There’s a wonderful Quartz article from a couple of years ago that made me realize the stupidity of a 24/7 news cycle:

As news organizations embrace the internet economy, they’re pushed to publish with more immediacy, more emotion and more frequency. They’re speeding up to fill infinite space — and in the process, they’re losing sight of their responsibility to help readers understand their world.

The paradox of this false urgency is that we end up with far more words being written, far more time spent reading, and far less clarity, context and understanding.

Rob Howard, writing for Quartz

In that sense, breaking news and rollercoasters are very alike: they provide excitement and evoke fear of change while ultimately leading us nowhere.

Questions of quality

Another consequence of relentless news coverage is entirely predictable and all too human: people get complacent and fall into patterns of lazy behaviour. Sometimes, this creates stories with absolutely no fact-checking, no context and nothing to hold them up other than some flimsy reporting. In the grand scheme of things, we are still naked bipeds running scared from fearsome saber-toothed monsters of the night, and we use bad reporting to dress up our inherent prejudices and biases in a facade of “journalism”.

Consider the now-infamous case of some Muslim men arrested in India for “links to ISIS”. The story was trumpeted by every newspaper, website and magazine at the time. It was reported that 9 men were nabbed while communicating with each other about taking care while handling some “hazardous” chemicals. Of course, the police victoriously proclaimed that they had stopped a terror attack, and civil society was handed a jump scare, only to realize later that there was nothing to worry about.

The truth?

Though reports spoke of “chemicals”, the only chemical named in all of them was hydrogen peroxide, because one 100 ml bottle was labeled so. Known as a hair bleach and a mouth rinse, hydrogen peroxide was described by the Anti-Terrorism Squad as a chemical “preferred” by the Islamic State to make bombs (DNA, January 24).

Between 2015 and 2017, Europe experienced six bomb blasts in which hydrogen peroxide was indeed used. Three of these explosions were planned by the Islamic State. However, the other chemical used in these blasts was TATP or triacetone triperoxide. There was no mention of triacetone triperoxide being found in the homes of those arrested in Maharashtra.

Jyoti Punwani, writing for Scroll

And that’s the state of news media in the 21st century – first at the scene, quick to label, eager to point fingers, and almost always wrong. And that’s why I try to avoid reacting to news pieces, especially when they’re fresh.

My news diet

Over the years, I’ve figured out a stable cadence at which I’m comfortable absorbing news. I’ve realized that no matter how much I try to avoid it, people will insist on discussing developing issues without having much reliable information about them. This happened with the Iran crisis, early COVID-19 reporting and now again, with the economic impact of COVID. I don’t like to be uninformed, but I don’t really want to be reactionary either. Far too many people I know just regurgitate whatever opinion they get off Joe Rogan, Sam Harris, Hasan Minhaj and John Oliver. I don’t want to be one of those. So, I’ve subscribed to a news briefing by Axios just so I’m aware of big news. Every weekend, I catch up on the week’s major stories through a couple of podcasts and magazines. That’s it. This level of awareness takes me approximately 1 hour per week, and I’m not measurably less informed than any news junkie.

Along the way, I’ve also developed trust in some media outlets – though I don’t trust every outlet equally with every kind of news. Here’s the summary:

  • Headlines: Axios – I used to use the BBC podcast, but it was taking up too much of my time, so I switched to Axios about a year ago
  • Local: Stuff (NZ), The Hindu (India), NYT (US), Axios (US)
  • Culture: The Atlantic, Scroll, Quint, Quartz
  • Business: Livemint (India), NZ Herald (NZ), Wall Street Journal (US), The Economist (World)
  • Politics: WPR (World), The Guardian (UK), NPR (US), FiveThirtyEight (US)
  • Science: Nautilus, ScienceDaily, Undark, New Scientist, HeritageDaily, Phys.org
  • Opinion: BBC, Guardian, Aeon, The Nib, Longreads, Marginal Revolution, UnHerd

I use Google News to catch up if I feel like I need the top headlines. I change the region from NZ to US to UK to India, and by the end of it I generally know enough to get by. Of course, you can make it even simpler by setting up a Feedly – it takes 10 minutes and you’ll be forever grateful you did it.

Categories
Philosophy Society

You Should Be Reading Kierkegaard

A lonely philosopher for socially distanced times

I’m not a pedant, but I find the recent conversation around “social distancing” kind of meaningless. In these times of sickness and death, we use “social distancing” to mean that we, as responsible members of society, will try to stay away from other people in order to prevent the spread of disease. Yes, most of us are probably at a very low risk of contracting COVID-19, but should we go outside to the park, take the bus or go to the supermarket, we may pass it on to someone else who may actually be at a relatively high risk. This physical isolation from the world has been called “social distancing”, even though what we actually mean is “physical distancing” – we aren’t trying to turn into hermits or recluses for the sake of posterity; we’re merely trying to stay away from vulnerable people we don’t really know or see.

But what physical distancing has done to us is clear: we have actually socially isolated ourselves. Doesn’t matter that we didn’t mean it that way or that it was done with the best intentions – we are no longer as social as we used to be. It may just be a phase, and humanity might just get back to its annual orgy in the desert, underfunded public transport and licking donuts willy-nilly.

But while we’re here, we’d be remiss not to take a minute and enjoy the view. And who better to guide us around social isolation than the man, the legendary philosopher, the lonely depresario of the mid-19th century – Søren Kierkegaard. Before I dive into why Kierkegaard is the patron saint of social distancing, let’s take a quick look at his life and times.

Much has been said about Kierkegaard and the school of philosophy he created (existentialism). But here’s what you need to know: he was one of seven children (most of his siblings died before he reached adulthood), he was engaged and then wasn’t, forever regretted said breakup, worried about his place in the world and his relationship with Jesus (he despised Christianity for its many perversions of Jesus’ teachings, and the rampant decadence and corruption of the Church), cried about his strained personal relationships with everyone around him (strained mostly because of Kierkegaard himself), most likely was clinically depressed, most definitely was a keen observer of human nature, and was a very very very very prolific writer of extraordinary ability and insight.

When I get up in the morning, I go right back to bed again. I feel best in the evening the moment I put out the light and pull the feather-bed over my head. I sit up once more, look around the room with indescribable satisfaction, and then good night, down under the feather-bed.

Kierkegaard in Either/Or

He was basically Radiohead in the 1840s:

A heart that’s full up like a landfill
A job that slowly kills you
Bruises that won’t heal
You look so tired, unhappy
Bring down the government
They don’t, they don’t speak for us
I’ll take a quiet life
A handshake of carbon monoxide
With no alarms and no surprises
No alarms and no surprises

Radiohead in No Surprises

What is a poet? An unhappy man who hides deep anguish in his heart, but whose lips are so formed that when the sigh and cry pass through them, it sounds like lovely music…. And people flock around the poet and say: ‘Sing again soon’ – that is, ‘May new sufferings torment your soul but your lips be fashioned as before, for the cry would only frighten us, but the music, that is blissful.

Kierkegaard in Either/Or

How to read Kierkegaard

I’m a believer in the postmodern interpretation of the work as being separate from the author. Yes, the author is dead, but as Kierkegaard said,

The tyrant dies and his rule is over, the martyr dies and his rule begins.

Kierkegaard is a martyr, and his works cannot be appreciated without an understanding of the weird, twisted and self-inflicted social isolation that he lamented to his dying day. So, that’s the first step: find a biography.

I like to think of biographies as intimate introductions to the person being written about. Too many biographies are works of fluff and smoke that seem to be written for the author more than the subject. I don’t care what brand of toothpaste you use or the 17 habits that make you successful. I want to know what animates you, why you do things, what makes you who you are – not the mere chronology of events in your life. I think it’s essential, while reading a biography, to get introduced to the subject, shake their cold (maybe dead) hands and look straight into their eyes and see what they see. The best book for this – for peeking into Kierkegaard’s restless soul – is “Philosopher of the Heart” by Clare Carlisle. Carlisle’s beautiful prose is a perfect companion to Kierkegaard’s own. Kierkegaard’s life almost writes itself, and a lesser writer could easily have mangled its tragic complexity, but I found myself furiously highlighting practically every page of the book. Here’s a paragraph:

Stuck in this crowded stagecoach, Kierkegaard imagines himself towering above his peers – like Simeon Stylites, the fifth-century Syrian saint who lived on top of a pillar, conspicuously devoted to prayer, for more than three decades. People wondered whether he did it out of humility or pride: was he looking down on them from his superior height? Or had he raised himself up like Jesus on his cross, held aloft in all his fragility, willing to be mocked and scorned? Simeon Stylites, the celebrity recluse: the paradox is irresistible; perhaps this should be his next pseudonym?

Clare Carlisle, in “Philosopher of the Heart”

So my suggestion to you is this: go get the book. Given everything that’s going on, you’re probably not doing much else anyway.

Why read Kierkegaard

Do you breathe every now and then? Do you have friends that don’t seem to stick around? Do you somehow find a way to ruin a perfect relationship? Do you hate being alone, but seem addicted to it somehow? Do you sometimes get sad for no real reason? Do you find yourself drifting into thoughts of despair, sadness and casual fantasies of suicide with no desire to actually act on it? Do you see no point in things people do? Do you think social niceties and “courtesies” are empty gestures designed to deceive and fool?

If you said “yes” to any of the above, you have reason to read philosophy. And even more reason to read Kierkegaard, for he understood a fundamental truth that many historians, philosophers and “intellectuals” fail to realize:

It is quite true what philosophy says, that life must be understood backward. But then one forgets the other principle, that it must be lived forward.

Philosophy is meaningless if it doesn’t allow us to live better lives. Life is not a hedge maze with an eventual goal and a patterned, manicured path. It’s a jungle, and sometimes you need a seasoned companion to appreciate the sights and sounds. Kierkegaard is a great guide, especially when you have nobody else around you. He is the lonely person’s best friend, and his thoughts are most accessible in ‘Either/Or’ and ‘Fear and Trembling’ – the first about choices in life, the second about Kierkegaard’s relationship with God, Jesus and himself. Together, these two books contain some of the most powerful pieces of writing I’ve ever come across.


But don’t take it from me – here’s a selection of my favourite quotes by Kierkegaard, in no particular order.

On grief:

My grief is my castle, which like an eagle’s nest is built high up on the mountain peaks among the clouds; nothing can storm it. From it I fly down into reality to seize my prey; but I do not remain down there, I bring it home with me, and this prey is a picture I weave into the tapestries of my palace. There I live as one dead. I immerse everything I have experienced in a baptism of forgetfulness unto an eternal remembrance. Everything finite and accidental is forgotten and erased. Then I sit like an old man, grey-haired and thoughtful, and explain the pictures in a voice as soft as a whisper; and at my side a child sits and listens, although he remembers everything before I tell it.

Either/Or

On happiness:

Happiness is the greatest hiding place for despair.

Either/Or

I have never been joyful, and yet it has always seemed as if joy were my constant companion, as if the buoyant jinn of joy danced around me, invisible to others but not to me, whose eyes shone with delight. Then when I walk past people, happy-go-lucky as a god, and they envy me because of my good fortune, I laugh, for I despise people, and I take my revenge. I have never wished to do anyone an injustice, but I have always made it appear as if anyone who came close to me would be wronged and injured. Then when I hear others praised for their faithfulness, their integrity, I laugh, for I despise people, and I take my revenge. My heart has never been hardened toward anyone, but I have always made it appear, especially when I was touched most deeply, as if my heart were closed and alien to every feeling. Then when I hear others lauded for their good hearts, see them loved for their deep, rich feelings, then I laugh, for I despise people and take my revenge. When I see myself cursed, abhorred, hated for my coldness and heartlessness, then I laugh, then my rage is satisfied. The point is that if the good people could make me be actually in the wrong, make me actually do an injustice-well, then I would have lost.

Either/Or

On life:

I stick my finger into existence and it smells of nothing. Where am I? Who am I?

Either/Or

What is existence for but to be laughed at if men in their twenties have already attained the utmost?

Either/Or

No one comes back from the dead, no one has entered the world without crying; no one is asked when he wishes to enter life, nor when he wishes to leave.

Either/Or

My life is absolutely meaningless. When I consider the different periods into which it falls, it seems like the word Schnur in the dictionary, which means in the first place a string, in the second, a daughter-in-law. The only thing lacking is that the word Schnur should mean in the third place a camel, in the fourth, a dust-brush.

Either/Or

On love:

You love the accidental. A smile from a pretty girl in an interesting situation, a stolen glance, that is what you are hunting for, that is a motif for your aimless fantasy. You who always pride yourself on being an observateur must, in return, put up with becoming an object of observation. Ah, you are a strange fellow, one moment a child, the next an old man; one moment you are thinking most earnestly about the most important scholarly problems, how you will devote your life to them, and the next you are a lovesick fool. 

Either/Or

On philosophers (honestly, it could be said of anything too theoretical or academic):

What philosophers say about actuality is often just as disappointing as it is when one reads on a sign in a second-hand shop: Pressing Done Here. If a person were to bring his clothes to be pressed, he would be duped, for the sign is merely for sale.

Either/Or

On God:

If there were no eternal consciousness in a man, if at the bottom of everything there were only a wild ferment, a power that twisting in dark passions produced everything great or inconsequential; if an unfathomable, insatiable emptiness lay hid beneath everything, what would life be but despair? If it were thus, if there were no sacred bond uniting mankind, if one generation rose up after another like the leaves of the forest, if one generation succeeded the other as the songs of birds in the woods, if the human race passed through the world as a ship through the sea or the wind through the desert, a thoughtless and fruitless whim, if an eternal oblivion always lurked hungrily for its prey and there were no power strong enough to wrest it from its clutches – how empty and devoid of comfort would life be!

Fear and Trembling

On Christianity:

The greatest danger to Christianity is, I contend, not heresies, heterodoxies, not atheists, not profane secularism – no, but the kind of orthodoxy which is cordial drivel, mediocrity served up sweet.

Fear and Trembling

The matter is quite simple. The bible is very easy to understand. But we Christians are a bunch of scheming swindlers. We pretend to be unable to understand it because we know very well that the minute we understand, we are obliged to act accordingly. Take any words in the New Testament and forget everything except pledging yourself to act accordingly. My God, you will say, if I do that my whole life will be ruined. How would I ever get on in the world? Herein lies the real place of Christian scholarship. Christian scholarship is the Church’s prodigious invention to defend itself against the Bible, to ensure that we can continue to be good Christians without the Bible coming too close. Oh, priceless scholarship, what would we do without you? Dreadful it is to fall into the hands of the living God. Yes it is even dreadful to be alone with the New Testament.

Fear and Trembling

On choices:

I see it all perfectly; there are two possible situations — one can either do this or that. My honest opinion and my friendly advice is this: do it or do not do it — you will regret both.

Either/Or

If anyone on the verge of action should judge himself according to the outcome, he would never begin.

Fear and Trembling

On regret after making said choices:

If you marry, you will regret it; if you do not marry, you will also regret it; if you marry or do not marry, you will regret both; Laugh at the world’s follies, you will regret it, weep over them, you will also regret that; laugh at the world’s follies or weep over them, you will regret both; whether you laugh at the world’s follies or weep over them, you will regret both. Believe a woman, you will regret it, believe her not, you will also regret that; believe a woman or believe her not, you will regret both; whether you believe a woman or believe her not, you will regret both. Hang yourself, you will regret it; do not hang yourself, and you will also regret that; hang yourself or do not hang yourself, you will regret both; whether you hang yourself or do not hang yourself, you will regret both. This, gentlemen, is the sum and substance of all philosophy.

Either/Or

On other stuff:

Numbers are the most dangerous of all illusions

Fear and Trembling

What is youth? A dream. What is love? The dream’s content.

Either/Or

My time I divide as follows: the one half I sleep; the other half I dream. I never dream when I sleep; that would be a shame, because to sleep is the height of genius.

Either/Or

Tell me that isn’t some of the most delightful reading you’ve ever done and I won’t believe you.

More than anything else, Kierkegaard reminds us: yes, we are physically distanced from everything else right now, but social distance is something else altogether:

My soul is like the dead sea, over which no bird can fly; when it gets halfway, it sinks down spent to its death and destruction.

Either/Or

You don’t want to be socially distanced.

Categories
Politics Society

2020: The Year Liberalism Dies

Okay, the title is a bit dramatic – but not needlessly so. Liberalism has enjoyed a long and storied run since the end of WW2. But ever since the USSR collapsed in 1991 and the alternative ceased to exist, liberalism has increasingly become the default ideology of any and every public intellectual. However, 2020 is likely to be the beginning of its end. Yes, you can blame COVID-19 for it, but there’s much more to the impending liberal crisis than just a one-off unlucky break.

A problem of definition

Who is a liberal? “Someone who upholds liberal values”. And what are those? Oh, you know – individual liberty, equality before the law, separation of church and state, free markets, a strong state, an independent judiciary, gender equality, gender equity, the welfare state, a social safety net, human rights, freedom of expression, freedom of religious association, environmental consciousness, capitalism, democratic principles, free and fair elections, a republican state, plurality of opinion, separation of powers among the three pillars of government, an elected legislature, world peace, the universal right to the pursuit of happiness… You know. The obvious. Everybody knows what a liberal is.

Yes, political scientists and politicians understand liberalism very differently and in a much more nuanced manner. But most people don’t. And even if we could agree that everybody knows what a liberal is, we would still find that not every liberal is equally liberal. I find that even the liberalism vs progressivism debate is all too often simply a case of distinctions without a difference. The “progressives vs others” friction has always existed within liberalism; COVID is merely giving it space to grow.

Because of the exceedingly vague definition of liberalism, personalities as vastly different as Narendra Modi, Bernie Sanders, Greta Thunberg and Boris Johnson have all happily co-existed under the liberal banner. Or at least, they all found it expedient to call themselves liberal at one time or the other. Liberalism’s original sin is this vagueness of definition. The vagueness was partly intentional: it was useful in WW2 and the Cold War to be able to gather under one banner against a common enemy. But now, as liberalism is increasingly unrivalled, political leaders and thinkers have had to delineate their beliefs and policies more clearly, which has led them down their own ideological “choose your own adventure” where it’s possible to mix and match liberal principles as one sees fit. This state of affairs was always tenuous and liable to fracture at the seams. In 2001 after 9/11 and in 2008-09, during the GFC, we saw the first hints of the breakdown of the liberal tent. In 2020, we will see the end of it.

COVID-19

The novel coronavirus has been a perfect storm of several independent events coming together. It makes sense for me to try to articulate why a simple virus is the reason for the breakdown of a 200-year-old political order.

First, it’s a virus. Bacteria are easy to grow in labs, test things on and kill. Viruses are notoriously hard to study: many do not reproduce under laboratory conditions, and they mutate rapidly, so no two strains are the same. Moreover, the way antivirals are developed is that scientists first identify a protein that they can try to disable. Then, they ensure that this protein is unique to the virus and not a byproduct of ordinary human bodily processes. Only then are candidate drugs tested for safety and efficacy. This process can take years, even decades, and as a result, the economics of developing antivirals is insane. Vaccines are easier to develop, but even they take 18-24 months to bring to market and, even then, are only effective against one strain of one particular virus. A single mutation can make a whole family of drugs irrelevant. Because of this, very few firms bother with antivirals and vaccines.

Second, it disproportionately affects old people. More familiar viral diseases like HIV, flu and Hepatitis are different: they either affect everyone or affect children more. As a result, nobody cared about some old people dying of an unexplained illness, because the logic was “meh, they were going to snuff it soon anyway”. Even now, as I write, young countries like India, New Zealand, Syria and those in the Sahel region have not been affected as badly as older ones like Germany, Italy and Japan. Countries tend to accumulate older people as their institutions improve and development reduces mortality due to pestilence and war. So, developed countries are actually more likely to be hurt by COVID-19. For liberal countries like the US that were used to lecturing underdeveloped nations on things like poverty eradication, cleanliness and education, COVID has come as a rude shock and shown that their institutions back home need fixing first. Isn’t that an inversion!

Third, the symptoms are very common and easy to ignore. When was the last time you went to a doctor just because of a fever or dry cough? Never, that’s when. And old people complaining of difficulty breathing is like fish complaining about being wet. Nobody cared because we’ve seen this before and we’ve all been conditioned to accept that these things happen from time to time.

Finally, it started in China. China doesn’t share information with the rest of the world. We know that. In most cases, that’s fine, because a lot of countries are cagey about transparency with the outside world (think Bhutan, Moldova, Russia, etc.). But with diseases, this keeps the rest of the world in the dark and robs governments of time to act. China’s experience with SARS taught the Chinese state a valuable lesson: if you find a new disease, don’t tell everybody about it; they’re not going to help, and will only use it as an excuse to lecture your people about the harms of eating random animals. And China learnt that lesson very well. Almost too well.

Liberalism at war (with itself)

Crises like this are supposed to bring societies together, and provide an opportunity to bury past differences. But COVID-19 has done the opposite: it has exposed all the ways in which liberalism is at war with itself. A core idea of modern progressivism is the idea of intergenerational warfare: that Boomers saddled the Millennials with a failed state and a bad economy, thereby hurting their chances. So, when COVID comes around, a frequent theme of early response to it was schadenfreude. The youngsters were ecstatic that these pesky oldies were going to kick the bucket because of their own selfish actions decades ago. “You voted to open up healthcare, make it profit-driven and let companies gouge patients while profiteering off death and illness. You deserve this new disease. Suck it, grandpa!”

As time went by, we started seeing people using the economic opportunities presented by COVID to enrich themselves. People started buying up sanitizers, toilet paper and masks and reselling them online. Others used the cheap flights as an excuse to get out of the country and enjoy a holiday they wouldn’t otherwise be able to afford. These “Coronavacations” were the economic reaction of a younger, more progressive generation that knew it was relatively safe. “To hell with global warming. Right now, I’m going to have some fun.”


As the disease began to spread, the first impulse was to shut everything down. First came gatherings and protests, then public transport, then borders, then flights, then even venturing outside for a walk. As the disease took shape and turned into a pandemic, that bright beacon of liberal symbolism – the European Union – began to crumble. It began as a wave of anti-migrant sentiment when Europe closed its borders disallowing refugees from the Middle East. Then, it morphed into something else: Italy’s borders were closed for the first time in decades. Then, it became a widespread mistrust of everything alien – entire cities, villages, states were placed under lockdown. Anything that had a border was shut off from the rest of the world by any means necessary.

The great liberal cause of free public transport suddenly made so much less sense. Do we really want to encourage everybody to travel so freely and spread diseases willy-nilly? A consensus quickly appeared: no, we do not.


As people started to stay at home more and workplaces shut down, environmental activists were delighted: the planet would get a breather. But of course, they couldn’t openly rejoice in the face of this calamity.

Source: NASA

“Maybe we didn’t need so much productive capacity after all?”

“But that’s what the free market had created so it must have been right!”


And then, of course, came the real progressive issues: flexible working arrangements, working from home, parental leave and paid sick leave.

“If we could all have worked from home this easily, why haven’t we been? And now that we have all realized that healthcare is super important, can we please get it now? Thanks.”


But then, if everybody works from home, would that not lead to an increase in domestic violence? What about caring for the elderly? Most of us younger folk were all too happy to let someone else take care of that job because we were away at work. But now that we’re home, are we supposed to work, care for our parents, help our kids with homework, shop for groceries online and still nurture our hobbies? Yeah, right!

And then there’s education. Most progressives want tuition-free education or some equivalent. Classical liberals don’t. The free-market argument rested on examples like Harvard and MIT, and the progressive argument rested on HBCUs, minority welfare and the issues of the urban poor. What does that argument mean in a post-Corona world? Nothing, because everybody’s studying online anyway.


Running in the background was the question of economics: if everybody stays home in fear of the worst, how will the liberal idea of “eternal economic growth” be sustained? Nearly every country affected by the virus is looking into some form of economic stimulus package consisting of a mixture of lowering interest rates and corporate loan waivers. As the breadth and length of this stimulus grows, progressives everywhere are beginning to ask if this is the best way to go about things.

Economic stimulus, infrastructure spending, a living wage and universal basic income are no longer distinctively liberal ideas – they’ve been mainstreamed to the extent of something like a free press or freedom of movement. These are simply not defining features of liberalism anymore.

The coming changes

Clearly, liberalism has many internal battles to figure out before it can move on. So, what will the future of liberalism be? In one word, fragmented. As Tyler Cowen writes in his Bloomberg column (published as I was still writing this piece):

Over the span of less than a week, virtually every major institution in American life has been subject to radical changes to their daily operations, and it is not clear when things will return to normal. Covid-19 may well make a bigger impression on the national consciousness than 9/11 or the financial crisis of 2008.

And he may be right. More than that, it’s going to lead to a further refinement of what it means to be a liberal. Increasingly, it will come to mean nothing at all. By the end of the year, liberalism’s component movements will all break away and find political voices of their own. We have seen this before: disappointment with climate inaction created the space for Green parties around the world, and job losses from globalization led to the resurgence of populism.


That said, here are my wild speculative thoughts on how COVID-19 is likely to reshape politics in the coming years:

  • A tentative rethinking of globally extended supply chains – politics and paranoia will lead to countries deciding to try to manufacture everything by themselves. Self-sufficiency will become the operative word of the new decade
  • As everybody rushes to make their own stuff, expect environmental concerns to take a back seat. Once again, forest cover will begin to recede rapidly in countries like India and China
  • For people in Europe and the rest of the Western world, COVID will always be a “Chinese virus”, spread by globalization and exacerbated by open borders. Expect these to lose their sheen and come under increased attack from populists who use this to further xenophobic politics
  • The end of the Euro project – German and French citizens may rightly feel that COVID spread to their countries from Italy because the Eurocentric visions of their ruling parties prevented them from closing their borders sooner
  • The rise of explicitly feminist politics that prioritize women’s issues over other liberal causes
  • Healthcare will finally become a universally acknowledged right – most of the opponents of Medicare For All in the US were old people. Now, as they realize their vulnerability, expect them to change their stance
  • As health benefits become inevitable, companies looking to keep their costs low will begin to recruit even more men. Thus, feminism’s raison d’être will come full circle
  • Public transportation will just not be anybody’s concern anymore – who wants to advocate for faster disease spread?
  • The erosion of individualism in the Western world – finally, the individual rights project that began with Protestantism and Martin Luther will see itself come to an end as communities everywhere reassert themselves and recluses realize the importance of having someone to talk to, empathise with and help out in times of need

But no matter how society responds to this pandemic, one thing is for certain: liberalism as we know it will not survive 2020.

Categories
US Politics

Bernie’s Coronavirus Problem

Politics is the art of the possible, the attainable – the next best

Otto Von Bismarck

It’s no secret that I – like most other sentient beings – would prefer a progressive candidate to some brain-dead party stooge like Joe Biden. In any other year, the Super Tuesday results notwithstanding, Bernie would have had a very realistic shot at the nomination: there are a few more large, progressive states like Oregon, NY, New Jersey and Pennsylvania, and some newly-progressivized ones like Wisconsin that are yet to vote. In any other year, it would have been possible for Bernie to hold on and hope for the best.

But not in 2020, for 2020 looks to be the Year of the Virus. And the Virus – COVID-19, Coronavirus, SARS 2.0, “Chinese Pneumonia”, whatever you want to call it – is taking its toll on the world’s economy and political institutions. In authoritarian regimes like Iran, Russia and China, the Virus is being credited with weakening trust in the state and opening up an avenue for reforms. In failed and flailing states like Iraq and Afghanistan, and in regions like the Balkans, it is going practically unchecked and is thus sowing the seeds of a future revolution. In Italy, which has fast turned into the largest non-Chinese center of the pandemic, the virus has taken its toll on the newly-elected government, its stability and the entire economic system. It doesn’t help that even without the Virus, Italy was close to a recession and people’s lives were pretty miserable already.

In stable economies, however, the virus is actually leading people to dig in and repose greater faith in their elected democracies. In India, South Korea, Israel and Japan, the growing concern over the virus is leading to mass hysteria and confusion. Savvy governments there have used this to their advantage in two ways: first, by making it seem like the spread of the Virus is a purely chance event; and second, by making themselves the arbiters of what counts as “fake news”, thereby gifting themselves the power to intervene in any misinformation campaign. The latter case is obvious in Singapore, where a recent anti-fake-news measure has been used to gag journalists and bloggers, producing a chilling effect on free speech.

The US is quite different but entirely recognizable. The broken healthcare system in that country is making everything worse, and confounding statements by the CDC, White House and Congress are adding to a sense of widespread confusion and contributing to mass hysteria around an infection that is still (as of this writing) only lethal in 3% of all cases. Statistics aside, there are more political concerns for anybody watching the news: if everybody stays home, what does that mean for the presidential contest that was front and center just a couple of weeks ago?

The worst hit by widespread calls for self-isolation is, of course, Bernie. Bernie’s strength lies in his popular following, and the massive rallies his devoted followers congregate in to celebrate the rise of a social democrat in an age of hyper-capitalism. But the very same rallies that brought him fame and immeasurable relevance in 2020 are now out of the question. Televised debates, in which he has generally done pretty okay, are also unlikely to have a live audience, which means that Bernie can no longer get the crowd all fired up and rooting for him – a tactic he’s used to spectacular effect in the past. More than that, the mass panic and confusion around the Virus is turning Trump into some sort of paternalistic figure, the “savior in the White House” who’s going to save the country. For a beleaguered White House with no real popularity or credibility, mass confusion and a vulnerable populace are creating some semblance of a fan following.

For the most part, Bernie’s loss is Biden’s gain. As the only other horse in the Democratic race, Biden stands to gain from every follower and rally Bernie cannot hold. His centrist position has been thoroughly embraced by every other candidate’s followers. And as far as his campaign managers are concerned, every rally Biden doesn’t have to attend is a rally he can’t fuck up and say something stupid in. And every debate without a live audience is an audience that Bernie can’t steal with his charisma and populist charm. In fact, the Virus is practically Biden’s buddy right now. As VP to Obama – who presided over the last time the country was in the grips of mass panic – Biden’s campaign is getting a boost as undecided people previously attached to Obama now turn their backs on Trump and flock to a familiar, authoritative figure in the face of an unfamiliar threat.

In the end, of course, it all boils down to the popularity of the two candidates – Trump and whoever else the Dems pick. I’m okay with the choice being less than ideal, a Trump v. Other Guy contest, because as Bismarck said, that’s just politics. I’m just sad that the Other Guy won’t be Bernie. Again.