Copyright notice

December 10, 2012 at 12:15 pm | Posted in Uncategorized | Comments Off

Please note that all material on this site is copyright Kathleen Taylor 2014. Feel free to copy any of it, only please state clearly where it came from by including a link to http://www.neurotaylor.com. Thanks.

Passion? Commitment? High drama? I’m a scientist; give me a break

April 15, 2014 at 4:43 pm | Posted in Uncategorized | Leave a comment

Enthusiasm and its close friend excitement are the topic for this post.

<sigh>

There’s a lot of both about these days. The news is full of people being praised for their passion and commitment, from volunteers to athletes to science communicators. The media seethes with thrilling stories of ‘high drama’, from Ukraine to South Africa. Public discourse, it seems, is all about excitement; yet however much we have, we’re constantly being told we ought to have more. And that’s a problem, because excitement is not an unqualified Good Thing.

I knew this already, but if I hadn’t, writing a book on cruelty would have made it clear. Passion, after all, originally had to do with suffering. As for commitment, suicide bombers, war criminals, religious maniacs and other fanatics have tons of it. (Some cruelty is about thoughtlessness and lack of empathy, but not all.) Commitment to their beliefs can drive people to commit horrific atrocities. In the worst crimes, excitement — rather than sadism — is often an additional motive. Being cruel can really get the adrenaline going, and adrenaline’s a powerful drug.

Away from the terrible darkness of cruelty, however, there are more mundane reasons for wishing we could just all calm down a bit when it comes to this societal thirst for continual excitement. One is that, here in the West, we have ageing populations, and older people tend to prefer life to be less of an emotional roller-coaster. (Which adult, honestly, would go back to the agonies of childhood and adolescence, when everything mattered so intensely?) Another is that life’s too full of stuff already; we deal with far more, daily, than our grandparents had to think about.

Both those reasons, of course, may actually be driving the emphasis on excitement, as people long to stay young, or the competition for their attention gets fiercer. It’s also possible that the demand for thrills is a response to our increasingly safe and managed world, and perhaps to the feeling, well expressed by Paul Bernal, that much of that management rather dehumanises the managed.

Whatever the cause, there seems to be a prevalent expectation that excitement is, if not a human right, at least an ideal to which we should all aspire. Information by itself is not enough; it must be garlanded with gimmicks to catch the wandering consumer’s eye.

In my home territory of science, and in the news I sample every day, this is a potentially catastrophic problem. At its heart is a conflict between two domains: factual knowledge, and the media. It’s made worse by private ownership and the profit-motive, but even public sector organisations like the BBC are vulnerable to financial pressures for accountability, value-for-money and ratings. As indeed, increasingly, are public sector scientists.

The problem is simple. Media reporting is about what sells, or what boosts ratings. That constrains what is reported towards short, simple, highly salient (attention-grabbing) material. Since only unusual things are salient, the media is silent about most of everyday experience, and greatly distorts much of the rest. Yet the same media outlets are also presumed to reflect, in some way, how society is, and many people take their views about many things not from direct experience but from what they see on the telly, or read in the papers or online. The result: false beliefs all over the place.

This may seem old-fashioned, but news — including science news — shouldn’t be kowtowing to the impact agenda. It marches, or it should march, under a different flag, because what matters is not impact, but accuracy. Scientific findings aren’t true in the absolute sense in which ‘truth’ is often used; they’re not religious revelations or political convictions. But in the more ordinary sense of carefully trying to reflect reality, they’re far more true than a lot of media output.

For example, it’s irksome to see people talking, enthusiastically, about how exciting science is. Sure, once in a while. The rest of the time they could be hauled up before the Advertising Standards Authority for misdescription. I love science because it’s interesting, but interesting is not the same as constantly exciting (and my tastes are somewhat specialised). Not all science manages to be interesting even to the scientists doing it, let alone the general public. Why should it? It’s not there to entertain, it’s there to figure out how the world works and help us work better in it.

Passion and commitment, so praised elsewhere, are an active danger for scientists. If your beloved theory is too beloved, you may not be able to see its flaws, and if it’s disproved, your commitment to it needs to stop. As for excitement, it’s often the most exciting papers that get the most publicity — and are then retracted. (Scientific misconduct is a problem that rarely makes the mainstream news.) Meanwhile, the boring stuff goes on quietly making progress.

As for the ability to communicate science, to bring out its interest for people who don’t know much about it, IMHO it has less to do with being exciting, and more with the ability to see the world from their point of view and make the topic relevant to them.

Excitement is over-rated. When it comes to science, and to news in general, it may be actively damaging. At the very least, we shouldn’t be letting it take priority over accuracy. Leave entertainment to the entertainers, drama to the dramatists, and commitment to anyone harmless. Science isn’t a cult. It doesn’t need more passion and excitement; if anything it’s got too much already. Passionate believers can lead a field astray and waste vast amounts of funding; thrilling tales can become distorting myths; high drama can distract from accurate research. (The same goes for news.) Better to treat our sciences the way we should be treating other areas of knowledge: with care, doubt, and in-depth investigation. And if that doesn’t sound totally thrilling … well, so what?

Work? What a silly idea

March 26, 2014 at 9:40 pm | Posted in Uncategorized | Leave a comment

A study published in the Journal of Occupational Health Psychology, and making headlines today, suggests that having a mentally demanding job before you retire is associated (in about 4,000 Americans) with “higher levels of cognitive functioning before retirement, and a slower rate of cognitive decline after retirement”. Use it or lose it, in short.

And who wants to lose it? A retirement spent blowing the kids’ inheritance on having fun is one thing; a retirement blighted by stroke or dementia is quite another.

A letter to my local paper last week, meanwhile, remarked on the way we insist on people ‘finding jobs’, even as companies increasingly use technology to replace them. Or to shift the work elsewhere: the forms which would once have been filled in by a secretary are often now completed by customers, online.

Which got me thinking — as I and many before me have thought — about work and the way we organise it.

Frankly, it’s rather silly. The timing’s inept, the concept old-fashioned, and the execution often cruel. For much of history this hasn’t mattered, as there’s been plenty of work to go around; also many people didn’t live long enough to worry about retirement. But things are changing, as the available labour shrinks. And not only shrinks, but shifts towards two extremes: the much-puffed ‘knowledge economy’, and the rest.

  • ‘High-end’ jobs pay relatively well and demand a lot of skills and brainpower (e.g. university teachers and researchers). They also have high workloads and long hours;
  • More manual jobs, which we can’t yet replace with technology, are typically much lower-paid, despite the fact that it’s hard to see how caring for the sick and elderly is less important than teaching kids why Hamlet, or quantum mechanics, matters. Of course, you don’t need extensive training to be a carer.

(We might infer that high pay is a reward for time invested in previous study — except that investment bankers can earn far, far more than university lecturers. Is it then a reward for effort, or physical labour? Tell that to a farmer. For danger? Ask a fireman. For being brilliant and/or irreplaceable? That’s what the most highly-paid often seem to be saying, but there’s very little evidence that they’re right.)

There’s less work to go round, especially for those without the best qualifications. And what work there is doesn’t always pay enough to live on. The private sector is on a win-win here. They can get away with paying low wages because the state will fill the gap. They can hike rents, or drive up house prices, because that’s ‘the free market’. They can make profits by pushing their costs onto others, and still whinge about the tax they have to pay — when much of that tax goes on payments that wouldn’t be needed if they weren’t so utterly focused on making money.

In Britain, we hark back to the days when great companies built houses for their workers, or gave their kids schools. That kind of philanthropy may still go on, but we don’t hear much about it. Instead we hear a lot about companies who seem to live by the ancient Roman principle: “homo homini lupus” (“Man is a wolf to man.”) And they have the cheek to complain about government ‘red tape’! Guys, if you behaved better we wouldn’t need to impose the regulation on you — and on everyone else.

As work becomes scarcer, the rhetoric of its desirability intensifies. You’d think humans lived entirely and only to work. The unemployed are stigmatised, their benefits decried (yet the far more expensive pensions of the elderly are OK, because they earned their rewards). Kids are so indoctrinated with the need to find a job that they spend much of their childhood cramming, agonising over exams, struggling with homework, knowing they have to achieve — at a time when they’re dealing with the massive social pressures of growing up. Small wonder some drop out. People who can’t work feel dreadful guilt. Some who lose their jobs are driven to suicide.

There’s something pitiful about a first-year university English Literature student distraught because she’s “wasted time” reading Wuthering Heights when it wasn’t on her course. Or a seventeen-year-old whose only idea about all the cultural riches available to them is how to get work that will pay them enough to buy lots of stuff. Come to that, there’s something pitiful about a middle-aged adult lying awake at night worrying about how they’ll cope with both a sick parent and the demands of running their own household, while working all the hours their job demands. And there’s certainly much to be pitied in the lonely senior, deprived by retirement of company and stimulation, or the hard-working tax-payer who, as they reach retirement, is diagnosed with some appalling illness, like dementia.

Why do we do it this way? It’s bad for our brains, our health and our happiness. At the time of life when we are most able to enjoy ourselves, some of us are working ridiculous hours while others face empty days. Women lose out if they have kids, especially if they choose not to deposit the sprogs in childcare. Some people aren’t paid enough to live on; others earn far more than any human being could reasonably need. Then we reach retirement age, and suddenly that’s it: we’re pensioned off, our productive days over. Yet creativity doesn’t cut out at sixty-five, nor intelligence shrivel at seventy. A man who turns 65 in the UK can expect to live a further 17.8 years, a woman 20.4 years, according to the Office for National Statistics. That’s a lot of years to write off, especially with an ageing population.

There are many ways in which we could change this mess. Most of them are extremely unlikely to happen, not least because the mess didn’t come about by accident. It suits the people in charge, insofar as anyone’s in charge. Yet it may be worth stating some options anyway, if only because they’re far too radical for serious politics and so aren’t often heard. (I’m not a politician, so I don’t need to be serious.)

  • Make the private sector pay its way. Rent caps (why should taxpayers spend masses on housing benefits so that landlords can get rich?). Tackling tax evasion and business subsidies. Redistribution: in the UK this year, a few lucky bankers collected over £5 million each from Barclays Bank. Why are they worth 50 times what we pay our most senior nursing directors, let alone frontline nurses?
  • Encourage job-sharing, volunteering, hobbies and part-time work. Make it acceptable for people of both sexes to take career breaks in midlife. Pay parents better: bringing up kids is hard work. Defusing the social pressures around work and worklessness with clear financial incentives would do wonders for the nation’s health bill, apart from its other benefits.
  • Rethink education. Currently it’s mostly stuffed into our youngest years, and some of it’s pretty irrelevant to most adults. It should be lifelong, as much a part of our routine as running a bank account. That old canard about brain development ceasing around 18 is nonsense.
  • A living wage and/or minimum income guarantee (discussed here in the US context). Many people who’d like to volunteer can’t afford to; many who can afford to can’t spare the time. Making sure that everyone has a minimum guaranteed income to live on would help with at least the first of these problems, as well as reducing the devastating costs of stress-induced mental illnesses. It would also save on the gigantic benefits bill, not least because it would be a good deal simpler to administer than current systems.
  • High-end jobs like running a university, company or bank may be extremely hard and stressful, but the work itself is not intrinsically unpleasant or dangerous (except insofar as the sedentary lifestyle brings health risks). Jobs which are unpleasant and dangerous should be paid more, or workers given a tax break, to express the nation’s gratitude that we don’t have to do this stuff. And if that study I mentioned is correct, perhaps we should be targeting the financial rewards towards encouraging workers to continue their education.
  • Abolish retirement except on health grounds. If work — in moderation — is so good for us, we shouldn’t be driving people away from it. If there’s less work to go round, we need to be more creative in how we organise it — because there’s plenty to do; it’s just that much of it isn’t currently paid work. Making high-end jobs less demanding and low-end work more interesting, and giving people more life space to do unpaid work, would make retirement look less attractive, as well as providing benefits for workers and society.

Work is bound up with many half-acknowledged ideas: about fairness and reciprocity, status and identity. While there was plenty of it, there wasn’t much need to examine its rationales, and how deep-seated feelings and ways of thinking affect them. But work is changing, and we need to change our ideas about work.

 

The Brokeback effect: homophobia and premature mortality

February 17, 2014 at 12:03 pm | Posted in Uncategorized | 2 Comments

This morning I was late starting work, so I was still doing housework when my attention was hauled back to the final few minutes of the BBC’s Today Programme. For those of you whose lives have so far been free of this 3-hour matutinal ritual, it’s the UK’s daily fix of politics and ‘opinion-forming’, famed for its sometimes aggressive style, and carrying huge clout among the governing classes (though there are plenty of Brits who’ve never heard of it).

A woman described as ‘Commentator Ann Leslie’ (she writes for the Daily Mail) was attacking another woman, who’d been complaining about some men’s magazine putting a child’s doll on the front cover. Or something; I wasn’t really listening. What did make me listen was when they got onto the topic of stereotyping kids’ toys, and one of Today’s presenters, Evan Davis, said quietly, “I like pink”. Leslie remarked, “You would.” The other presenter said, “Steady on!”, and at that point the programme ended.

Evan Davis, you see, is gay.

A few minutes later, I’m logging on and going through the email, among which are tables of journal contents and notifications of interesting new developments in the science world, and I come across this article:

“Structural stigma and all-cause mortality in sexual minority populations”

This is a big US study, linking data from a social survey about levels of anti-gay prejudice to the country’s ‘National Death Index’. And here’s what it found (so says the abstract):

  • Sexual minorities living in communities with high levels of anti-gay prejudice experienced a higher hazard of mortality than those living in low-prejudice communities.
  • This result translates into a shorter life expectancy of approximately 12 years (95% C.I.: 4–20 years) for sexual minorities living in high-prejudice communities.
  • Analysis of specific causes of death revealed that suicide, homicide/violence, and cardiovascular diseases were substantially elevated among sexual minorities in high-prejudice communities.
  • Strikingly, there was an 18-year difference in average age of completed suicide between sexual minorities in the high-prejudice (age 37.5) and low-prejudice (age 55.7) communities.

In other words, living in a community full of people who hate and/or despise you, and are happy to let you know it, is really, really bad for your health, if you’re a member of a sexual minority (or, I’d guess, some other equally controversial outgroup).

In terms of life expectancy, being gay among homophobes is comparable with being poor in a nation of rich people (and one whose citizens are taught to revere money and see poverty as self-inflicted).

When people who express prejudices are challenged, they often play the humour card: “Oh, it was just a joke!” They know that we tend to like someone who makes us laugh, and that we don’t like the sense of exclusion implicit in the statement: “You can’t take a joke.” It’s an old ploy, and often effective; no one wants to be seen as humourless.

But as research is increasingly showing (and this study is one of many), anti-gay prejudice is no joke. It’s also silly, in that whether you’re gay or straight or bisexual is irrelevant to most social situations (like how well you do your job, or even whether you have a sense of humour). So why should we care about a person’s sexuality, unless we’re planning a sexual relationship with them?

Libertarians might argue that free speech is a crucial freedom, but these ‘jokes’ aren’t funny. They’re dangerous. For gay people, poor people, and other minorities too, the cumulative social pressure can be lethal.

Just over thirty years ago, a young Labour politician, Neil Kinnock, made a remarkable speech about what he foresaw if the Conservatives were returned to power.

“I warn you that you will have pain – when healing and relief depend on payment.

“I warn you that you will have ignorance – when talents are untended and wits are wasted, when learning is a privilege and not a right.

“I warn you that you will have poverty – when pensions slip and benefits are whittled away by a Government that won’t pay, in an economy that can’t pay.”

The speech ends:

“I warn you not to be ordinary.
I warn you not to be young.
I warn you not to fall ill.
I warn you not to get old.”

Judging by recent research (three decades of social progress later), you’d still better not be poor or non-heterosexual either.

Forgotten sins: of fashion and morality

January 28, 2014 at 12:37 pm | Posted in Uncategorized | 1 Comment

We tend to think of morality as long-lasting. Think of the Ten Commandments, carved in stone, or the claims of evolutionary psychologists that moral sentiments have ancient origins. Morality is associated with permanence, certainty, and truth. So it’s odd how rapidly our moral opinions can change.

Each of us could plot our own moral profile, a graph showing how much we care about a range of topics, from the environment to immigration. Concern, however, doesn’t necessarily imply understanding of an issue. You can see why: there’s so much to know about nowadays that we have to outsource most of the knowing to others. Yet expressing moral judgements is a common form of social bonding, and there are many situations in which we’re expected to have an opinion about a topic – often by people as ignorant of it as we are.

In part, this might explain why morality can change so fast: as people learn more, they may revise their judgments. However, many moral opinions are driven by fashion and the media, the sources from whom we learn about stuff we can’t be bothered to learn about. These sources don’t exist to impart information, but to sell things – and moral outrage sells; so it is in their interests to foster judgmental fervour. The result is a lot of strong opinions with a shaky factual basis. New facts alone may not suffice to modify these beliefs, but a wider cultural shift, with its changing social, legal and financial incentives, can.

In the UK a few decades ago, many women – and even young teenage girls – ran a gauntlet of sexual comments and touches by men. It was part of life, however unwelcome. Now the culture has changed so much that ageing celebrities are being hauled through the courts, and the media, for acts committed years ago. Current stars, and the public, know it’s no longer acceptable to fondle fourteen-year-olds or force yourself on an unwilling female. They know they may lose their job or even go to prison. This doesn’t mean such acts no longer happen, but they now carry more social risk.

We’ve certainly noticed the moral change in this case; there’s been so much public comment on sexual abuse. Yet there are some features of morality which have slipped out of our culture almost unnoticed. These forgotten sins (and their opposites, since every vice has its complementary virtue) used to be common, but no more. Is it that we simply don’t have the mental capacity to think about the older sins because our heads are so full of new ones, in this world of cyberbullying and online grooming? Or is it that older sins have become unfashionable – partly through their link with previous generations, and partly because they’re too uncomfortable for us complacent moderns to think about?

If so, might it be worth taking another look at them? It’s very hard for someone immersed in a particular social situation – i.e. living as most humans do – to see that situation’s problems. This is why outsiders, though often unwelcome, are so useful. They add new perspectives, whether you’re an organisation thinking of employing more diverse staff or a nation debating some hot-button moral issue.

If you haven’t a helpful outsider handy, another approach is to look at what isn’t in the public conversation. Silences can sometimes speak louder than words, and if no one’s talking about certain kinds of bad behaviour, that may be because it suits today’s society to encourage that behaviour.

As an example, how about vanity? It’s generally considered a form of pride, and we think of it as pride especially concerned with physical appearance. A quick check with Google Books shows that, at least in written words, discussion of vanity is declining, and has been since early in the nineteenth century, as this graph from Google ngrams shows.

Decline in usage of ‘vanity’ and ‘vain’, 1800-2000, in Google’s books database
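
For anyone who’d like to repeat this kind of check programmatically, here’s a minimal Python sketch. It relies on the unofficial JSON endpoint behind the Ngram Viewer, which is undocumented, so the URL, parameter names and corpus label below are assumptions and may have changed.

```python
# Rough sketch: fetch yearly relative frequencies of 'vanity' and 'vain' from
# the Google Books Ngram Viewer. NOTE: this uses the viewer's unofficial,
# undocumented JSON endpoint; parameters and corpus labels may change.
import requests

def ngram_frequencies(phrases, year_start=1800, year_end=2000, corpus="en-2019"):
    """Return {phrase: [relative frequency per year]} from the Ngram Viewer."""
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": ",".join(phrases),
            "year_start": year_start,
            "year_end": year_end,
            "corpus": corpus,   # assumed corpus label; older versions used numeric codes
            "smoothing": 3,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return {series["ngram"]: series["timeseries"] for series in resp.json()}

if __name__ == "__main__":
    for word, series in ngram_frequencies(["vanity", "vain"]).items():
        print(f"{word}: 1800 ~ {series[0]:.2e}, 2000 ~ {series[-1]:.2e}")
```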

Meanwhile, industries concerned with personal appearance are worth billions, and forecast to grow still further. Data from the American Society for Aesthetic Plastic Surgery show that in the decade-and-a-half between 1997 and 2012, surgical cosmetic procedures in the US grew by 75%, to nearly 1.7 million. Procedures not requiring the beautified to be cut open grew by 920%, to over 7.5 million. Apparently, in 2010 Americans spent $33.3 billion on ‘personal care’, which would be enough to reduce world poverty by half … if it had been spent on reducing world poverty.
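
As a rough back-of-envelope check (assuming those percentages are growth relative to the 1997 figures), the quoted numbers imply 1997 starting points of roughly a million surgical and three-quarters of a million non-surgical procedures:

```python
# Back-of-envelope check of the ASAPS growth figures quoted above, assuming
# the percentages are growth relative to the 1997 baseline.

def implied_1997_baseline(count_2012, growth_percent):
    """If count_2012 = baseline * (1 + growth), recover the implied 1997 baseline."""
    return count_2012 / (1 + growth_percent / 100)

surgical_2012 = 1.7e6       # "nearly 1.7 million", up 75% since 1997
nonsurgical_2012 = 7.5e6    # "over 7.5 million", up 920% since 1997

print(f"Implied surgical procedures in 1997:     ~{implied_1997_baseline(surgical_2012, 75):,.0f}")
print(f"Implied non-surgical procedures in 1997: ~{implied_1997_baseline(nonsurgical_2012, 920):,.0f}")
# Roughly 971,000 and 735,000: in fifteen years the non-surgical side went from
# the smaller category to more than four times the size of the surgical one.
```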

It’s as if being ugly is now a worse moral flaw than being conceited. Which is peculiar, because we know from celebrities that where beauty leads, good behaviour doesn’t always follow. Nor are people with facial disfigurement incapable of the highest human virtues, whereas being conceited is associated with bad behaviour. Yet the beauty-is-good stereotype is so prevalent, it’s even featured in an fMRI experiment.

Perhaps that’s why we don’t talk so much about vanity these days. Maybe the moral principles which used to condemn it came up against the market’s demand for profit. Plus, technologies have turned what was once a matter of fate – personal appearance – into a matter of personal choice. (Providing you have the money, of course, but that takes us into yet another minefield, the morality of poverty, and this post has touched enough sore spots for the time being.)

The constant demand for economic growth, the new powers to change our appearance, and the earning potential of playing on human vanities – and anxieties – together make up a powerful set of forces to set against an elderly moral principle. With respect to vanity (and the vain demand a lot of respect), it seems that the clash between market and morals has been won by the market, at least in the West and for now.

Of course, vanity isn’t the only unfashionable vice. I wonder if the market’s winning on the others too.

In praise of coming second

December 9, 2013 at 2:04 pm | Posted in Uncategorized | 1 Comment

Heresy though it may be to admit it in our competitive culture, there’s a lot to be said for being runner-up. Winners may increasingly take it all in financial terms (in the US, income inequality has been rising for decades), but as Christmas should remind us, there’s more to life.

I live near England’s second city, Birmingham, and recently went there for a concert in its fabulous Symphony Hall (see my previous post). Whenever I visit Brum, I’m always conscious of being deliberately unsurprised by how pleasant it is these days. Stereotypes linger, and when I was growing up the place had a pretty poor image. For many people it still does. Besides, it’s the second city – i.e. not the first. Loser!

The prejudice is unfair. Yes, there are parts of Brum where I wouldn’t walk alone in daylight, never mind after dark. But that’s cities for you, and Birmingham’s council has done wonders in recent years to make it an attractive and enjoyable place. Even the oppressive ugliness of New Street rail station is being transformed. Another big project, the splendid new library, was publicly-funded (fat chance of that these days) and, astonishingly, came in under budget, unlike many projects, public and private. Centenary Square, where you’ll find the library and Symphony Hall, is as handsome a space as anything you’ll see in London. And no, I’m not on commission.

Yet Birmingham is widely despised. It’s often ignorance; the national media and government, largely based in London, seem rarely to notice Brum unless someone’s been killed in it (to be fair, this is true of much else outside London). Yet the place has much to offer:

  • concert venues which attract top artists (from Andreas Scholl to Rihanna)
  • excellent sporting, arts, shopping and conference venues
  • fantastic architecture and public spaces
  • fine universities doing world-leading research. Every UK institution is now expected to do this; fewer actually manage it
  • plenty of history, science and technology

Birmingham is, after all, not only an industrial city; it was home to major artists like the pre-Raphaelite Edward Burne-Jones, great Quaker families like the Cadburys, scientists like Joseph Priestley, poets like Louis MacNeice, etc. (there’s a longer list on Wikipedia).

But it’s not London. Birmingham always comes second (if not fourth, behind London, Oxford and Cambridge, the so-called Golden Triangle).

Yet being second has its advantages. House prices, for one. They’ve rocketed in recent decades, but they’re still nowhere near the extortionate levels of the Triangle. The countryside’s easier to reach and the city easier to drive through than London – and nicer; my route into Brum soars through the spectacular Spaghetti Junction. The people seem better-humoured and less self-important, the pace less ruthlessly frenetic, and the London attitude of charging you for every possible thing hasn’t altogether permeated the Midlands. London’s a superlative city and a great place to visit, but I’m glad I don’t have to live there.

As for cities, so I suspect for people. Second-raters tend to be better balanced, pleasanter, less wearing to interact with, less highly-strung.

When I was at university, there seemed to be – at every stage, in every subject – one person singled out for special favour. We called them BMGs (‘brightest mind of their generation’). Some of them, I hope, have lived long and prospered, but many failed to develop the predicted stellar careers. Why? Because, I suspect, of the weight of confidence placed in them. Either they writhed with anxiety at having such great expectations to live up to, or they became insufferably lazy and complacent.

Some BMGs dropped out. Others now have academic jobs, and make their colleagues’ lives hell by not pulling their weight. They’re the nuisances every department suffers, the ones who wriggle out of any work they can (except research), who exploit good will until everyone longs to get rid of them. Less flashy candidates are much more useful in real-life institutions, where too many putative geniuses can be a nightmare for actually getting things done.

Real life doesn’t need a world filled with winners. Society isn’t built for it, whatever the ‘all can have prizes’ brigade would have you believe. And not everyone who doesn’t win is a loser. Some simply opt out of the game. Others, too smart to chase the fantasies of money, status and fame, value longer-term goals, like being useful and having friendships with people. In the long run, a wealth of research now suggests that chasing these goals is far better for human well-being. Winners, in other words, win at a nasty cost.

At Christmas and New Year people tend to review goals and values. Perhaps we should think about whether the constant emphasis on coming first is really as fine a thing as it’s made out to be.

Happy Christmas!

Knowing Music

November 20, 2013 at 2:18 pm | Posted in Uncategorized | 2 Comments
Symphony Hall at night

Today I’m thinking about the different ways we know music.

Last Saturday, I was lucky enough to have tickets for a classical music concert in Birmingham’s Symphony Hall. The event was, I think, the first all-Mozart concert I’ve ever attended, and it was wonderful.

The programme began with the Marriage of Figaro overture, ended with the Requiem, and in between we heard one of Mozart’s loveliest pieces, the Piano Concerto No 21. It’s sometimes called ‘Elvira Madigan’ – not by Wolfgang Amadeus himself, but because it was used in the 1960s film of that name.

I like listening to Western classical music for many reasons, but one is because of the depth and range of experiences involved. The music’s structural complexity and long traditions allow it to tap into many emotions. There’s the sublime simplicity of Mozart, which held the Birmingham audience completely spell-bound. There’s the exhilaration of Sibelius’s Karelia Suite, or the exaltation of Wagner’s great ‘Valhalla’ motif from the Ring cycle. You can get a supernal chill from Bartok (Duke Bluebeard’s Castle), breath-taking flamboyance from the likes of Sarasate (his Zigeunerweisen), heart-breaking grief from Bach (in the St Matthew Passion), or overwhelming awe from Saint-Saens (try the Organ Symphony) or Berlioz (in his Symphonie Fantastique). You can hear the sea in Britten’s Peter Grimes, feel the seduction in Bizet’s Carmen, sense the Shakespearean tumult in Prokofiev’s Romeo and Juliet, and smile at Rimsky-Korsakov’s bumble-bee. And these are only a few examples.

(Other genres, like hip-hop, pop and R&B, are doing something quite different, in which tunes and chords share the limelight not just with rhythms but with images and dance moves. Often their structure is simpler, their emotional range narrower, and their dynamic range set permanently to ‘loud’. Personally, I find the results immensely boring, just as fast food’s dull compared with decent home cooking. But there are times when fast food’s what you want.)

As well as the emotions, how you listen to classical music can vary, often within a single piece. Whether you’re consciously savouring the flow, self-consciously attentive to the structure, letting the feelings wash over you, or even drifting off into other thoughts (i.e. hearing, not listening) – that depends on how well you know the music, your concentration, the performance, and much more besides.

It was the piano concerto that set me thinking about how we listen to, and recognise music. It’s a piece I got to know when I was very young, as my parents had a tape of it, performed by the great Hungarian pianist Geza Anda. I’ve never formally studied it, though, so I don’t know it the way a musician would. And until the concert, I hadn’t heard it for years.

Yet as soon as it began, the gap of time was bridged. The feeling of recognition was like relaxing into a warm bath. I knew instantly what was coming next; I knew every point at which the performance differed from the Anda version, and if the pianist had put a finger wrong I’d have been instantly, wincingly aware of it. I’ve often, hearing something on the radio, known that it wasn’t ‘my’ recording, without being able to say what piece it is or who wrote it. And hearing something live, of course, is quite different from hearing recordings, especially when the acoustics are as fantastic as they are in Symphony Hall.

At two points in the piece, there are cadenzas – show-off moments, basically – where the pianist has a choice of what to play. As soon as the pianist started his first cadenza, I knew it wasn’t the one Geza Anda played, and I felt the internal switch from warm emotional bath to cooler cognition. I became interested in the music, its structure, how each phrase reflected aspects elsewhere in the concerto … in other words, I was listening much more analytically.

The slow central movement switched me back into the fuzzy glow of – not memories, exactly, but the feelings associated with them. It’s surely one of the most beautiful pieces of music ever written (available on YouTube, if you can put up with the preceding ad for something very different). It needs limpid, delicate lightness – think sunlight gleaming through a waterfall – as so much of Mozart’s music does. I was on tenterhooks for the first few notes of the piano’s entry, until I realised he’d basically got it right. Phew!

And yet, as I said, I’ve never played this piece, and hadn’t heard it for years. Music digs deep tracks in the mind, especially in childhood. Works I’ve learned to love as an adult, and listened to much more recently, don’t bring the same intense awareness of details.

Research suggests that, like language, music is easily and naturally picked up in childhood, and that children who don’t encounter it early in life may lose the ability to revel in it later. Yet many schools, and parents, see classical music as too difficult, or an unnecessary luxury (even nowadays, when recordings are cheap and orchestras are working hard at outreach). Anyone classically-trained can easily move into pop music – and many have – but it’s harder to move the other way. Yet for many kids, all they ever hear is what Freddie Mercury (I think it was he!) called ‘Kleenex music’: simple, disposable, forgettable.

People claim that classical music is elitist. (Here in the UK concerts are cheaper than football matches.) I can’t help wondering how much of that response is defensive. Calling something elitist gives you an excuse for not making the effort required to learn more about it. Classical music needs work, certainly, unless you’re young enough to soak it up without effort. So does learning any new skill, but does that mean that anyone with a skill is somehow ‘elitist’?

A child who learns to love classical music has been given a great treasure. He or she will have immense resources to fall back on, in good times or bad. Music engages our brains much more extensively than many other activities. It’s good for us, too, reducing stress markers and promoting that sense of ‘flow’ which is associated with rest and relaxation. It’s “amongst the most rewarding experiences for humans”. And learning to play teaches teamwork and self-discipline, quite apart from being fun to do and a boost to self-esteem.

It’s a real shame that so many children miss out on these life-enhancing joys.

(The performers at the concert were the Orchestra of the Swan, with the City of Birmingham Choir, Anthony Hewitt piano, Rhian Lois soprano, Anna Huntley mezzo soprano, Samuel Boden tenor, Benjamin Cahn baritone, and Adrian Lucas conductor.)

“Revolutionary!” “Controversial!” What media talk about science really means

October 14, 2013 at 12:38 pm | Posted in Uncategorized | 1 Comment

Like science itself, mainstream media reporting of scientific findings can be confusing, not least because ordinary words are given specialised meanings. To help the perplexed, here’s a light-hearted gloss of some of the commonest terms used by media folk to talk about science and scientists.

–––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––

“Expert” – knows more than we do.

“A leading …” – we’ve heard of this person (and we can barely spell their research discipline), so they must be important.

“Professor” – can mean either a) a professor or b) anyone with a doctorate.

“Scientists believe …” – the ones we asked said …

“X says [something controversial]” – and so they did, with a little judicious editing. Whaddya mean, out of context?

“New” – almost certainly not, but if even the researchers aren’t thorough about their literature searches, you can’t expect us media types to know about previous work. We can barely remember what happened yesterday.

“Extraordinary” – it sounded weird to us (but then, so does most of this science stuff).

“Important” – we can see how this might have something to do with the real world.

“Groundbreaking” – it’s in Nature, Science, or PNAS. Or the scientist is female, disabled, or from an ethnic minority.

“Breakthrough” – something useful may possibly result from this at some point in the future.

“Landmark” – every scientist we asked, the press release, and the journal’s editorial all said this might make a difference.

“Revolutionary” – contradicts something said by somebody else.

“Controversial” – we suspect the only researchers who think this are the study’s authors, but hey, it’s eye-catching.

“A study suggests” – even we recognize that this one’s so provisional we need to say so.

“Abstract” – we haven’t a clue what this means, let alone if it’s any practical use.

“Theoretical” – see Abstract.

“Challenging” – we have no idea what they were banging on about.

“Developing” – we’re pretty sure they haven’t either.

“Theory” – anything more than a guess, put forward by a scientist.

“Hypothesis” – an irate scientist complained about how we misused the word theory.

“Anecdote” – a term of abuse used by scientists to complain about the media. We prefer to see anecdotes as baby datapoints.

“X causes Y” – X has been linked to Y by some statistical method. You don’t want to know the details, do you? Good.

“Correlated with” – a patient scientist explained to us that correlation is not causation, so now we can show off.

“X doubles the risk of Y (no baseline risk given)” – trust us, we’re headline-writers. And we’re not telling you what the original risk was because it’s so tiny that you’d realise this is a total non-story.

“X doubles the risk of Y (baseline given)” – we know you can’t make any sense of this statement without knowing what was doubled, so we’ve fed you the number. What we haven’t told you is that there are probably so many other factors causing Y that you don’t need to panic over X, at least until you’ve stopped smoking, changed your diet, done more exercise, moved to somewhere less polluted, and stopped worrying about all the crap in the media.
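
(An aside from the glossary: here is the arithmetic behind those two entries, with purely hypothetical numbers; neither the baseline nor the relative risk below comes from any real study.)

```python
# Why "X doubles the risk of Y" means little without the baseline.
# All numbers here are hypothetical, purely to illustrate the arithmetic.

baseline_risk = 1 / 10_000   # assumed risk of Y without X
relative_risk = 2.0          # "doubles the risk"

risk_with_x = baseline_risk * relative_risk
absolute_increase = risk_with_x - baseline_risk

print(f"Risk without X:    {baseline_risk:.4%}")       # 0.0100%
print(f"Risk with X:       {risk_with_x:.4%}")         # 0.0200%
print(f"Absolute increase: {absolute_increase:.4%}")   # one extra case per 10,000
# A doubled risk can still be a tiny risk: the headline gives you the ratio,
# the baseline tells you whether it matters.
```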

“A gene for X” – this gene produces a protein which may have some small influence on something in the body which eventually has something to do with X.

“Brain regions associated with X” – these brain regions seemed to be doing something when the few people tested in the study were X-ing, so they may have something to do with X. They’ve also been associated with lots of other things, but we like to keep the story clear and simple.

“Brain activity” – some complex statistical measure which some specialised research-folk think may be quite highly correlated with changes in brain cells, and a lot of less specialised folk think may have something to do with the mind, whatever the hell that is.

“Neurologist” – anyone who does anything related to actual brains (i.e. not a psychologist).

“Neuroscientist” – any person working on brains who’s explained to us that they’re not a neurologist.

“Remarkable” – a scientist who shows signs of being successful despite being a woman.

“Brilliant” – this guy’s a better self-publicist than most of his colleagues.

“Maverick” – weird even by scientific standards, and quite likely to be wrong.

“Confident” – probably a bully, and even more likely to be wrong.

“A lone voice” – the probability of wrongness is close to 1.

“Professor X could not be contacted …” – Professor X has had dealings with the media before.

“Fluent communicator” – wow, a scientist who doesn’t just stare at their feet!

“Engaging communicator” – this one smiles!

“Brilliant communicator” – this one can talk and they’re not bad-looking, for a geek.

“Difficult” – we suspect this one has autism.

“Dedicated” – you really chose to spend your career doing that?

“Committed” – it’s ridiculous how seriously you take this stuff.

 

So you want to be a university manager?

October 1, 2013 at 11:12 am | Posted in Uncategorized | Leave a comment

So you want to be a university manager? You’ve come to the right place! This short guide is all you’ll ever need to make a success of your new role as a steersperson of one of our great institutions of learning. Once you’ve worked through the Five Key Areas, and memorised the Six Key Messages, you’ll be ready and raring to set out on your new, exciting career path.

In fact, if you’ve got any other management guides (OMGs), you can chuck them in the garbage can right now. You’re a university manager (UM). The usual rules of management don’t apply to you.

That’s because OMGs are all about how to manage normal people. But being a UM – the privileged state of UMhood, as UMs like to think of it – isn’t about managing normal people. It’s about managing academics.

Key Message 0: Academics are not normal people.

In any management guide, there are five Key Areas you need to think about: Morale, Incentives, Listening, Respect, and Communicating. This guide will help you improve your managerial practice in all five areas. We’ll start with the slippery concept of staff morale.

Key Area 1: Morale

OMGs will tell you that good management needs to focus on staff morale. This is nonsense. Academics pride themselves on being rational thinkers. The scientists among them, especially the economists, will give you a useful tip: they don’t work with anything they can’t measure. Neither should you!

Staff morale is notoriously hard to measure. That’s because it’s touchy-feely, not rational. Even mentioning morale, let alone trying to improve it, will upset your more autistic academics, provoke the cynical ones, and make all of them less likely to read your emails. Besides, most academics are left-wing contrarians who don’t want their morale improved, because then they might have to approve of ‘the System’.

This is why you can’t just measure staff morale by asking the staff.

Important Note:  Student morale is another matter, because students haven’t been at your institution long enough to turn into academics. So student morale can be measured by asking the students. As you know, it is measured, and it matters for funding. Staff morale isn’t, and doesn’t. So if you’re keen on morale, focus on the customers who pay your salary, not the people who take up too much of your time already.

Key Message 1: Leave academics’ morale to academics. They’re smart, aren’t they? They can figure it out.

Key Area 2: Incentives

OMGs say that a good manager needs to reward staff. They often quote research which is supposed to show that people respond better to positive incentives (rewards) than to negative ones (punishments).

Important note:  ‘Reward’ doesn’t necessarily mean money – which is just as well, since your institution probably doesn’t have any. It means social rewards. A social reward can be anything from a smile, an honourable mention at a departmental meeting, or praise in an annual review, right through to rewards that will actually cost you: biscuits for committees, free drinks, or a departmental party.

Ignore the temptation to be nice. Academics are smart and will see through your attempts to conciliate them. Negative incentives are much more effective. Some UMs employ both, but carefully: an initial brief commendation followed by a long list of criticisms. Academics pride themselves on seeing both sides of an argument, and this move will make them feel uncomfortable about actively hating their UM.

Key Message 2: Never praise an academic if you can avoid it.

Key Area 3: Listening

OMGs say that a good manager is one who listens to staff, walking the floors, knocking on doors, inviting open, informal communication.

With academics? Are you joking, OMG? Given half a chance, most of these people would talk for hours on their specialist subject. You don’t have time for that.

Besides, many academics are introverts who are afraid of human contact. They’re also hugely overworked. They’re not going to thank you for coming and bothering them, especially as their natural left-wing cynicism leads them to believe you won’t pay any attention to what you hear.

For the same reasons, there’s no point encouraging social occasions, or, if they do take place, attending them.

Unfortunately, the idea that managers should listen has gained ground in recent years. So a good UM will occasionally schedule ‘listening forums’. If you plan to do this, make sure the setting is formal and that there are plenty of senior academics present. And be careful not to make any clear statements about how – or whether – the information you get will be processed at higher levels. (Promises are hostages.) That way you’ll deter people from making complaints that you might have to do something about.

Key Message 3: Academics don’t want to be listened to. They want to be left alone. If they were that social, they wouldn’t be academics.

Key Area 4: Respect

It’s a favourite OMG mantra: respect your staff.

Why? If they were worth respecting, they wouldn’t be academics. They’d be managers, like you, earning your salary. These people are grunts. They wouldn’t last an hour in the real world.

On the other hand, it is important that they respect you.

Some management theorists insist that respect is gained by soft skills: emotional intelligence, sociability, and so on. For normal people, this may be true, but remember: as a UM, you’re dealing with academics. They’re most at ease with abstractions, so make yourself an abstraction! Your people will respect you more if they hardly ever see you. At the same time, you need to make them feel that a word from you could ruin their future.

UMhood isn’t about soft soap. It’s about power.

Key Message 4: Respect, in academia, should flow one way only: from the bottom to the top.

Key Area 5: Communicating

OMGs will tell you that good communication is essential to successful management. Let’s unpack that a little.

Management jargon is often sneered at by people who say that language should be about communication. These people are idiots. They don’t understand what management jargon is for.

Besides, academics love jargon. For them to complain about your jargon is pure hypocrisy!

Remember, most of your institution’s recent funding increases have been spent on either your salary or your new office, not on hiring more academics. Academics know this. They see the UM as the enemy. They are also ideologically indoctrinated to perceive ‘the System’, and anyone who supports it, as evil.

You can’t sweet-talk these people. As that great management theorist Niccolo Machiavelli said, your only option is to crush them before they crush you.

Management language is about two things: making yourself look powerful, and making yourself look efficient. It’s a weapon in the fight all UMs have to fight, every hour of every day. The language you use is the headlock by which you subdue your staff.

Efficiency is why you’ll hear managers saying ‘actioning’ instead of ‘putting into action’, ‘progressing’ instead of ‘making progress’, and ‘less’ instead of ‘fewer’. Less syllables = more efficient.

Power is why a smart UM will often appear to act inconsistently. If you’re inconsistent, you’re unpredictable, and that makes people uncertain. As they get more anxious, they see you as more powerful.

Important note:  Inconsistency also works with the OMG notion that managers should immerse themselves in details. You can’t possibly get your head round all the details of managing a set of academics, so don’t kill yourself trying. Instead, make sure you master a few details efficiently. That way, you’ll be able to make excellent use of random micromanagement.

A good UM is a master of this art. By micromanaging only a few aspects of your institution’s systems, you can ignore the constraints which, in practice, prevent the kind of changes you demand from actually happening. Then you can criticise the academics for not making those changes. Meanwhile, you display a hopelessly disorganised grasp of other institutional processes. This reminds your staff that life isn’t fair, thus lowering their expectations to realistic levels.

Remember, people who are depressed often slide into learned helplessness, so a depressed academic is likely to be an inert academic. And inert academics are much, much easier to manage.

Key Message 5: Management jargon is there for good reason. Use it well.

Congratulations! You’ve now reached the end of the only guide you’ll ever need to being a university manager.

That’s it. That’s all you need to know.

The path to UMhood lies before you. Go for it!

And remember, with great power comes …

Most OMGs would end that sentence with a boring old cliché: ‘great responsibility’. Not this guide. As a UM, you’ll learn to end it with ‘greater salary’. Enjoy!

A modest proposal for the science media (2)

September 17, 2013 at 11:28 am | Posted in Uncategorized | 1 Comment

(Feel free to take it in the spirit of the great Jonathan Swift’s original.)

What prompted the proposal

Most mornings, I say hello to the Internet. In return, I find a slew of press releases, heralding recent, or even advance, publications in neuroscience, psychology, medicine and health. Most of it’s from just-published studies, dolled up by press officers and spoonfed to the media. And, as the UK’s Astronomer Royal Martin Rees has implied, a lot of it’s tosh.

I tell my students that it’s better to read first-rate science fiction than second-rate science. It’s more stimulating, and no more likely to be wrong.

In my previous post, I discussed two examples of high-profile science. One, a study published in the Journal of the National Cancer Institute, proposed a link between omega-3 fatty acids and prostate cancer. The second, in the prestigious journal PNAS, proposed a link between copper (Cu) and Alzheimer’s disease (AD). They triggered this particular piece of devil’s advocacy, but they’re only two examples of very many, and my beef is not with them, but with the system that produced them.

A modest proposal for reform

No media organisation with more than 10,000 regular readers should be allowed to publish news of any scientific research until one year has passed from the date of first publication, or the study has been successfully replicated.

The omega-3s/cancer and copper/Alzheimer’s studies were widely reported. What did this high profile achieve for the general public?

Many will have missed, ignored, or instantly forgotten the news. Some of the rest, however, may have worried about whether they should change their diet. They may have asked their doctors about it, or wasted time surfing dubious Internet sites. A few may even have used the study as an excuse for not eating more fruit and veg. Others may have thought, crossly, that they wish the bloody scientists would make their minds up, or remarked that you can’t believe anything you read in the media these days.

I’d be willing to bet that very few will have rejoiced at the extent of their new empowerment, thanked the Press for bringing them the truth so quickly, and happily formulated a new, fish- and fruit-free diet in order to live long into healthy old age.

In other words, we may have a slight increase in anxiety, cynicism and distrust of science, but on the positive side we have … what, exactly? The thrill of novelty. Pages filled, buttons pressed, teeny-weeny neurotransmitter hits delivered. Readers fooled into thinking this particular organisation is hovering at the cutting edge and holding the boffins to account. And this benefits the public – how?

It benefits the media. It also benefits the authors and their institutions, given the current insistence of funding bodies on generating ‘impact’. But science, wisely, distrusts new findings and insists on them being repeated before it takes them too seriously. Might not the science-media-reading public benefit from knowing that a similar standard has been applied to what they are being told? We’re already overloaded with data, or at least information. The modest proposal would reduce quantity and boost quality. It should also give journalists more time to do investigative science journalism.

As it is, quite often reporters don’t understand what they’re reading, because they aren’t specialist science journalists, and the reader doesn’t bother glancing past the headline anyway. This isn’t a criticism of either: readers and journalists are busy people, and even experienced scientists can struggle to understand publications in other scientific disciplines.

But scientists are to some extent held to account by the profession’s self-correcting mechanisms, and its longer timescale – though publicity-chasing for ‘impact’ may distort these. In the media, sensational findings are common; reports of how they were discredited are rarer.

Six conditions for publishing science journalism

Obviously the modest proposal would need expansion. For starters, how about the following six conditions to be met before the news is published:

1. At least two independent experts to be consulted about the study’s merits, and their views to be reported on whether the study is mainstream, minority, maverick, nonsense, or dangerous nonsense

a. If both experts class it in the ‘dangerous nonsense’ category, the journalist may consult three more experts
b. If these disagree, the views of all five should be reported
c. If all five concur that it’s dangerous nonsense, the report should be scrapped

2. Similarly, the experts to be asked for their judgement on whether the headline is appropriate, and that judgement either to be taken into account, or at least reported

3. The journalist to have read at least the introduction and discussion sections of the article, not just the press release

4. Potential conflicts of interest to have been checked by the journalist – and not just by reading the authors’ ‘yup, we’re clean’ statement

5. For health studies, a risk assessment to have shown that the putative risk to human health (the ‘culprit’) identified in the study is higher than the risk to health which would follow from giving up the culprit – or, at the least, both numbers to be stated in the report

6. Information about the study to include, at the very least, baseline values as well as the size of the effect/risk/change, the type of study, a link to the article, and the number of participants tested (the sketch below shows why baseline values matter).
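
To make that last condition concrete, here is a minimal sketch in Python, using made-up numbers rather than figures from either the omega-3 or the copper study. It simply shows how a ‘risk raised by 40 per cent’ headline translates into absolute terms once you know the baseline.

# Hypothetical illustration: the same relative risk means very different
# things at different baseline risks. Numbers are invented for the example.

def absolute_risk_change(baseline_risk, relative_risk):
    """Change in absolute risk implied by a relative risk at a given baseline."""
    return baseline_risk * (relative_risk - 1.0)

# A headline 'raises risk by 40%' corresponds to a relative risk of 1.4.
for baseline in (0.001, 0.10):  # 1 in 1,000 versus 1 in 10
    change = absolute_risk_change(baseline, 1.4)
    print(f"baseline {baseline:.3f}: absolute risk rises by {change:.4f} "
          f"({change * 100000:.0f} extra cases per 100,000 people)")

At a baseline of one in a thousand, a 40 per cent rise means about forty extra cases per 100,000 people; at a baseline of one in ten, it means four thousand. Without the baseline, the headline figure tells the reader almost nothing.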

Or, how about a kitemark?

I try to be a realist, and realistically, the modest proposal is never actually going to happen. Instead, it would be great if someone could set up a kitemark, a signal of quality, for science journalism, administered by an independent body. I don’t think it would be fair to apply it wholesale to organisations, but why not to individual pieces, and to major blog posts? If organisations were required to display the percentage of their articles that had received the kitemark, that might encourage them to aim higher.

Come to think of it, why don’t we do that for the whole of journalism, and make the industry fund the jobs required? Better quality and more employment, in one!

A modest proposal for the science media (1)

September 17, 2013 at 11:18 am | Posted in Uncategorized | 5 Comments

(Feel free to take it in the spirit of the great Jonathan Swift’s original.)

What prompted the proposal

Most mornings, I say hello to the Internet. In return, I find a slew of press releases, heralding recent, or even advance, publications in neuroscience, psychology, medicine and health.

Two in particular have caught my eye of late. One, a study published in the Journal of the National Cancer Institute, proposed a link between omega-3 fatty acids and prostate cancer. The second, in the prestigious journal PNAS, proposed a link between copper (Cu) and Alzheimer’s disease (AD). Both were much discussed in the media. They triggered this particular piece of devil’s advocacy, but they’re only two examples of very many, and my beef is not with them, but with the system that produced them.

Everything’s bad for you

As it happens, I do some editorial work for a small research charity, the Institute for Food, Brain and Behaviour (IFBB), and I was recently at a colloquium with some people who’ve spent their careers researching omega-3 fatty acids. Let’s just say that when I heard about the fatty acids/prostate cancer article, I raised a sceptical eyebrow; when they heard, they tore its methods into little tiny pieces. You can read a brief summary of some of the criticisms on the IFBB website.

This isn’t just abstract theorising, or scientists quarrelling among themselves. It matters. I’ve already been told of one elderly man who’s very anxious about whether he should stop taking fish oil supplements. The stress has probably already done him more harm than the omega-3s ever will – but he believes what he reads in the papers, as many people do.

(And yes, a cynic might say he shouldn’t, but that’s close to accepting it’s OK for papers to print stuff that is, when you get down to it, not true.)

The copper/Alzheimer’s study is preliminary work, done in mice and in human cell cultures. What the authors say in the paper is this:

“Whereas the role of environmental factors in the development of the sporadic form of AD is controversial, long-term exposure to higher levels of Cu may contribute to this process, at least in some cases.”

Note the qualifications: ‘may’, ‘in some cases’.

By the time this reached the press release, it had acquired the headline, ‘Copper identified as culprit in Alzheimer’s disease’, a warning advertisement – how to tell if you’re about to get Alzheimer’s – and (because most of us don’t chew on copper pipes, though we may get our drinking water from them) a list of copper-containing foods.

Why? Are we to stop eating “red meats, shellfish, nuts, and many fruits and vegetables”, for fear of dementia? Or perhaps we should merely cut down on these foods? Red meat I can understand; both doctors and environmentalists keep telling us we should be eating less of that. But surely the benefits of fruit and veg outweigh the risk of dementia? – especially given everything else that’s been linked to Alzheimer’s over the years.

The press release’s last two paragraphs, for those who get so far, are more cautious. They read as follows:

However, because metal is essential to so many other functions in the body, the researchers say that these results must be interpreted with caution.

“Copper is an essential metal and it is clear that these effects are due to exposure over a long period of time,” said Deane. “The key will be striking the right balance between too little and too much copper consumption. Right now we cannot say what the right level will be, but diet may ultimately play an important role in regulating this process.”

Indeed. So why isn’t this right at the top of the press release?

To be fair, the press release provides a link to the article, which not all of them do. It’s got quotes from the authors, and it does a good job of explaining. It’s not a bad example of the genre.

And yet …

This is one study. On animals and cells, not people. It has a plausible mechanism, which is more than many nutritional epidemiology studies do, but it seriously needs replication. Not everyone agrees on what causes Alzheimer’s, and no one’s saying copper is the only ‘culprit’, that morally-laden word. (Bad chemical! Stop tormenting those brains!) Possible causes for this awful disease frequently hit the headlines, long before we’ve any idea whether they really are ‘the’, or even ‘a’ culprit. We still can’t do much to help someone with Alzheimer’s.

Likewise, one study hinting at a possible link between omega-3 fatty acids and prostate cancer does not make a theory, let alone a major new truth.

Health reporting implies the impossible

Besides, even if ye ordinary hassled consumer swears off eating anything with either copper or omega-3 fatty acids in it, that doesn’t mean he’ll dodge either Alzheimer’s or cancer. There’s this implicit claim: if you can only eat/behave/think properly, your life will be long and healthy, your old age serene, and your death an easy one. The definition of ‘properly’ varies, but it’s always presented as having the authority of science behind it. This causes problems if the report is also trying to emphasise how new and revolutionary its wonderful findings are, because it makes science look as featherweight as the media.

With the claim comes a nasty moral innuendo: if you’re ill, it’s your fault, your failure. This victim-blaming isn’t just abstract psychologising. In the UK, disabled people are now often labelled as ‘benefit scroungers’, as if their inability to work is either a) a lie, or b) entirely self-inflicted. It isn’t.

Where’s the evidence that a single, perfect lifestyle for health exists? Or, for that matter, that people whose lifestyles aren’t perfect are morally flawed?

And this doesn’t just apply to health issues. The ideas that

• all our problems can be fixed by physicists, chemists, engineers, etc.

• good enough tech will obviate the need for hard work

• if we could only get the systems right we’d be able to erase human wickedness

• you are a failure if you do not know about — or at least, have an opinion on — far, far more than any previous generation of humans

are just as dubious, yet all are implied in the breathless reporting of advances in science, new technologies, and institutional failures.

This isn’t just froth. It matters. These assumptions worm their way into our heads, changing both our behaviour and our attitudes to others. They set up false expectations. When those expectations fail, we blame the people who couldn’t meet them, not the media who spread them. We also torment ourselves with our efforts to achieve an impossible perfection. The result is unnecessary unhappiness.

In my next post, I’ll consider a modest proposal for reform.
