Tags: Copyright, Fair use, Intellectual property, Kathleen Taylor
Please note that all material on this site is copyright Kathleen Taylor 2013. Feel free to copy any of it, only please state clearly where it came from by including a link to http://www.neurotaylor.com. Thanks.
Tags: Birmingham, Brum, competitiveness, goals and values, inequality, winner-takes-all, winners and losers
Heresy though it may be to admit it in our competitive culture, there’s a lot to be said for being runner-up. Winners may increasingly take it all in financial terms (in the US, income inequality has been rising for decades), but as Christmas should remind us, there’s more to life.
I live near England’s second city, Birmingham, and recently went there for a concert in its fabulous Symphony Hall (see my previous post). Whenever I visit Brum, I’m conscious of being deliberately unsurprised by how pleasant it is these days. Stereotypes linger, and when I was growing up the place had a pretty poor image. For many people it still does. Besides, it’s the second city – i.e. not the first. Loser!
The prejudice is unfair. Yes, there are parts of Brum where I wouldn’t walk alone in daylight, never mind after dark. But that’s cities for you, and Birmingham’s council has done wonders in recent years to make it an attractive and enjoyable place. Even the oppressive ugliness of New Street rail station is being transformed. Another big project, the splendid new library, was publicly-funded (fat chance of that these days) and, astonishingly, came in under budget, unlike many projects, public and private. Centenary Square, where you’ll find the library and Symphony Hall, is as handsome a space as anything you’ll see in London. And no, I’m not on commission.
Yet Birmingham is widely despised. It’s often ignorance; the national media and government, largely based in London, seem rarely to notice Brum unless someone’s been killed in it (to be fair, this is true of much else outside London). Yet the place has much to offer:
- concert venues which attract top artists (from Andreas Scholl to Rihanna)
- excellent sporting, arts, shopping and conference venues
- fantastic architecture and public spaces
- fine universities doing world-leading research. Every UK institution is now expected to do this; fewer actually manage it
- plenty of history, science and technology
Birmingham is, after all, not only an industrial city; it was home to major artists like the pre-Raphaelite Edward Burne-Jones, great Quaker families like the Cadburys, scientists like Joseph Priestley, poets like Louis MacNeice, etc. (there’s a longer list on Wikipedia).
But it’s not London. Birmingham always comes second (if not fourth, behind London, Oxford and Cambridge, the so-called Golden Triangle).
Yet being second has its advantages. House prices, for one. They’ve rocketed in recent decades, but they’re still nowhere near the extortionate levels of the Triangle. The countryside’s easier to reach and the city easier to drive through than London – and nicer; my route into Brum soars through the spectacular Spaghetti Junction. The people seem better-humoured and less self-important, the pace less ruthlessly frenetic, and the London attitude of charging you for every possible thing hasn’t altogether permeated the Midlands. London’s a superlative city and a great place to visit, but I’m glad I don’t have to live there.
As for cities, so I suspect for people. Second-raters tend to be better balanced, pleasanter, less wearing to interact with, less highly-strung.
When I was at university, there seemed to be – at every stage, in every subject – one person singled out for special favour. We called them BMGs (‘brightest mind of their generation’). Some of them, I hope, have lived long and prospered, but many failed to develop the predicted stellar careers. Why? Because, I suspect, of the weight of confidence placed in them. Either they writhed with anxiety at having such great expectations to live up to, or they became insufferably lazy and complacent.
Some BMGs dropped out. Others now have academic jobs, and make their colleagues’ lives hell by not pulling their weight. They’re the nuisances every department suffers, the ones who wriggle out of any work they can (except research), who exploit good will until everyone longs to get rid of them. Less flashy candidates are much more useful in real-life institutions, where too many putative geniuses can be a nightmare for actually getting things done.
Real life doesn’t need a world filled with winners. Society isn’t built for it, whatever the ‘all can have prizes’ brigade would have you believe. And not everyone who doesn’t win is a loser. Some simply opt out of the game. Others, too smart to chase the fantasies of money, status and fame, value longer-term goals, like being useful and sustaining friendships. A wealth of research now suggests that, in the long run, pursuing these goals is far better for human well-being. Winners, in other words, win at a nasty cost.
At Christmas and New Year people tend to review goals and values. Perhaps we should think about whether the constant emphasis on coming first is really as fine a thing as it’s made out to be.
Tags: classical music, cognition, emotions, Mozart, music
Today I’m thinking about the different ways we know music.
Last Saturday, I was lucky enough to have tickets for a classical music concert in Birmingham’s Symphony Hall. The event was, I think, the first all-Mozart concert I’ve ever attended, and it was wonderful.
The programme began with the Marriage of Figaro overture, ended with the Requiem, and in between we heard one of Mozart’s loveliest pieces, the Piano Concerto No 21. It’s sometimes called ‘Elvira Madigan’ – a nickname that comes not from Wolfgang Amadeus himself, but from the concerto’s use in the 1960s film of that name.
I like listening to Western classical music for many reasons, but one is the depth and range of experiences involved. The music’s structural complexity and long traditions allow it to tap into many emotions. There’s the sublime simplicity of Mozart, which held the Birmingham audience completely spellbound. There’s the exhilaration of Sibelius’s Karelia Suite, or the exaltation of Wagner’s great ‘Valhalla’ motif from the Ring cycle. You can get a supernal chill from Bartók (Duke Bluebeard’s Castle), breathtaking flamboyance from the likes of Sarasate (his Zigeunerweisen), heart-breaking grief from Bach (in the St Matthew Passion), or overwhelming awe from Saint-Saëns (try the Organ Symphony) or Berlioz (in his Symphonie Fantastique). You can hear the sea in Britten’s Peter Grimes, feel the seduction in Bizet’s Carmen, sense the Shakespearean tumult in Prokofiev’s Romeo and Juliet, and smile at Rimsky-Korsakov’s bumble-bee. And these are only a few examples.
(Other genres, like hip-hop, pop and R&B, are doing something quite different, in which tunes and chords share the limelight not just with rhythms but with images and dance moves. Often their structure is simpler, their emotional range narrower, and their dynamic range set permanently to ‘loud’. Personally, I find the results immensely boring, just as fast food’s dull compared with decent home cooking. But there are times when fast food’s what you want.)
As well as the emotions, how you listen to classical music can vary, often within a single piece. Whether you’re consciously savouring the flow, self-consciously attentive to the structure, letting the feelings wash over you, or even drifting off into other thoughts (i.e. hearing, not listening) – that depends on how well you know the music, your concentration, the performance, and much more besides.
It was the piano concerto that set me thinking about how we listen to, and recognise, music. It’s a piece I got to know when I was very young, as my parents had a tape of it, performed by the great Hungarian pianist Geza Anda. I’ve never formally studied it, though, so I don’t know it the way a musician would. And until the concert, I hadn’t heard it for years.
Yet as soon as it began, the gap of time was bridged. The feeling of recognition was like relaxing into a warm bath. I knew instantly what was coming next; I knew every point at which the performance differed from the Anda version, and if the pianist had put a finger wrong I’d have been instantly, wincingly aware of it. I’ve often, hearing something on the radio, known that it wasn’t ‘my’ recording, without being able to say what piece it was or who wrote it. And hearing something live, of course, is quite different from hearing recordings, especially when the acoustics are as fantastic as they are in Symphony Hall.
At two points in the piece, there are cadenzas – show-off moments, basically – where the pianist has a choice of what to play. As soon as the pianist started his first cadenza, I knew it wasn’t the one Geza Anda played, and I felt the internal switch from warm emotional bath to cooler cognition. I became interested in the music, its structure, how each phrase reflected aspects elsewhere in the concerto … in other words, I was listening much more analytically.
The slow central movement switched me back into the fuzzy glow of – not memories, exactly, but the feelings associated with them. It’s surely one of the most beautiful pieces of music ever written (available on YouTube, if you can put up with the preceding ad for something very different). It needs limpid, delicate lightness – think sunlight gleaming through a waterfall – as so much of Mozart’s music does. I was on tenterhooks for the first few notes of the piano’s entry, until I realised he’d basically got it right. Phew!
And yet, as I said, I’ve never played this piece, and hadn’t heard it for years. Music digs deep tracks in the mind, especially in childhood. Works I’ve learned to love as an adult, and listened to much more recently, don’t bring the same intense awareness of details.
Research suggests that, like language, music is easily and naturally picked up in childhood, and that children who don’t encounter it early in life may lose the ability to revel in it later. Yet many schools, and parents, see classical music as too difficult, or an unnecessary luxury (even nowadays, when recordings are cheap and orchestras are working hard at outreach). Anyone classically-trained can easily move into pop music – and many have – but it’s harder to move the other way. Yet for many kids, all they ever hear is what Freddie Mercury (I think it was he!) called ‘Kleenex music’: simple, disposable, forgettable.
People claim that classical music is elitist. (Here in the UK concerts are cheaper than football matches.) I can’t help wondering how much of that response is defensive. Calling something elitist gives you an excuse for not making the effort required to learn more about it. Classical music needs work, certainly, unless you’re young enough to soak it up without effort. So does learning any new skill, but does that mean that anyone with a skill is somehow ‘elitist’?
A child who learns to love classical music has been given a great treasure. He or she will have immense resources to fall back on, in good times or bad. Music engages our brains much more extensively than many other activities. It’s good for us, too, reducing stress markers and promoting that sense of ‘flow’ which is associated with rest and relaxation. It’s “amongst the most rewarding experiences for humans”. And learning to play teaches teamwork and self-discipline, quite apart from being fun to do and a boost to self-esteem.
It’s a real shame that so many children miss out on these life-enhancing joys.
(The performers at the concert were the Orchestra of the Swan, with the City of Birmingham Choir, Anthony Hewitt piano, Rhian Lois soprano, Anna Huntley mezzo soprano, Samuel Boden tenor, Benjamin Cahn baritone, and Adrian Lucas conductor.)
Tags: humour, language of science, media, media science, satire, science communication, Science in Society
Like science itself, mainstream media reporting of scientific findings can be confusing, not least because ordinary words are given specialised meanings. To help the perplexed, here’s a light-hearted gloss of some of the commonest terms used by media folk to talk about science and scientists.
“Expert” – knows more than we do.
“A leading …” – we’ve heard of this person (and we can barely spell their research discipline), so they must be important.
“Professor” – can mean either a) a professor or b) anyone with a doctorate.
“Scientists believe …” – the ones we asked said …
“X says [something controversial]” – and so they did, with a little judicious editing. Whaddya mean, “out of context”?
“New” – almost certainly not, but if even the researchers aren’t thorough about their literature searches, you can’t expect us media types to know about previous work. We can barely remember what happened yesterday.
“Extraordinary” – it sounded weird to us (but then, so does most of this science stuff).
“Important” – we can see how this might have something to do with the real world.
“Breakthrough” – something useful may possibly result from this at some point in the future.
“Landmark” – every scientist we asked, the press release, and the journal’s editorial all said this might make a difference.
“Revolutionary” – contradicts something said by somebody else.
“Controversial” – we suspect the only researchers who think this are the study’s authors, but hey, it’s eye-catching.
“A study suggests” – even we recognize that this one’s so provisional we need to say so.
“Abstract” – we haven’t a clue what this means, let alone if it’s any practical use.
“Theoretical” – see “Abstract”.
“Challenging” – we have no idea what they were banging on about.
“Developing” – we’re pretty sure they haven’t either.
“Theory” – anything more than a guess, put forward by a scientist.
“Hypothesis” – an irate scientist complained about how we misused the word “theory”.
“Anecdote” – a term of abuse used by scientists to complain about the media. We prefer to see anecdotes as baby datapoints.
“X causes Y” – X has been linked to Y by some statistical method. You don’t want to know the details, do you? Good.
“Correlated with” – a patient scientist explained to us that “correlation is not causation”, so now we can show off.
“X doubles the risk of Y (no baseline risk given)” – trust us, we’re headline-writers. And we’re not telling you what the original risk was because it’s so tiny that you’d realise this is a total non-story.
“X doubles the risk of Y (baseline given)” – we know you can’t make any sense of this statement without knowing what was doubled, so we’ve fed you the number. What we haven’t told you is that there are probably so many other factors causing Y that you don’t need to panic over X, at least until you’ve stopped smoking, changed your diet, done more exercise, moved to somewhere less polluted, and stopped worrying about all the crap in the media.
“A gene for X” – this gene produces a protein which may have some small influence on something in the body which eventually has something to do with X.
“Brain regions associated with X” – these brain regions seemed to be doing something when the few people tested in the study were X-ing, so they may have something to do with X. They’ve also been associated with lots of other things, but we like to keep the story clear and simple.
“Brain activity” – some complex statistical measure which some specialised research-folk think may be quite highly correlated with changes in brain cells, and which a good many less specialised folk think may have something to do with “the mind”, whatever the hell that is.
“Neurologist” – anyone who does anything related to actual brains (i.e. not a psychologist).
“Neuroscientist” – any person working on brains who’s explained to us that they’re not a neurologist.
“Remarkable” – a scientist who shows signs of being successful despite being a woman.
“Brilliant” – this guy’s a better self-publicist than most of his colleagues.
“Maverick” – weird even by scientific standards, and quite likely to be wrong.
“Confident” – probably a bully, and even more likely to be wrong.
“A lone voice” – the probability of wrongness is close to 1.
“Professor X could not be contacted …” – Professor X has had dealings with the media before.
“Fluent communicator” – wow, a scientist who doesn’t just stare at their feet!
“Engaging communicator” – this one smiles!
“Brilliant communicator” – this one can talk and they’re not bad-looking, for a geek.
“Difficult” – we suspect this one has autism.
“Dedicated” – you really chose to spend your career doing that?
“Committed” – it’s ridiculous how seriously you take this stuff.
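A brief moment of seriousness behind the two ‘doubles the risk’ entries above: a doubled relative risk can correspond to a vanishingly small absolute risk, which is exactly why headlines omit the baseline. A minimal sketch of the arithmetic, with invented numbers purely for illustration:

```python
# Illustrative (made-up) numbers: why "X doubles the risk of Y" can mislead.
baseline_risk = 0.0001   # suppose 1 in 10,000 people develop Y anyway
relative_risk = 2.0      # the headline: "X doubles the risk of Y!"

risk_with_x = baseline_risk * relative_risk
absolute_increase = risk_with_x - baseline_risk

print(f"Risk without X: {baseline_risk:.4%}")  # a tiny number
print(f"Risk with X:    {risk_with_x:.4%}")    # double a tiny number
print(f"Extra cases per 100,000 people: {absolute_increase * 100_000:.0f}")
```

With a baseline of 1 in 10,000, ‘doubling the risk’ works out at roughly one extra case per 10,000 people – which is why the baseline matters far more than the headline.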
Tags: Academia, humour, management, management guide, managers, university
So you want to be a university manager? You’ve come to the right place! This short guide is all you’ll ever need to make a success of your new role as a steersperson of one of our great institutions of learning. Once you’ve worked through the Five Key Areas, and memorised the Six Key Messages, you’ll be ready and raring to set out on your new, exciting career path.
In fact, if you’ve got any other management guides (OMGs), you can chuck them in the garbage can right now. You’re a university manager (UM). The usual rules of management don’t apply to you.
That’s because OMGs are all about how to manage normal people. But being a UM – the privileged state of UMhood, as UMs like to think of it – isn’t about managing normal people. It’s about managing academics.
Key Message 0: Academics are not normal people.
In any management guide, there are five Key Areas you need to think about: Morale, Incentives, Listening, Respect, and Communicating. This guide will help you improve your managerial practice in all five areas. We’ll start with the slippery concept of staff morale.
Key Area 1: Morale
OMGs will tell you that good management needs to focus on staff morale. This is nonsense. Academics pride themselves on being rational thinkers. The scientists among them, especially the economists, will give you a useful tip: they don’t work with anything they can’t measure. Neither should you!
Staff morale is notoriously hard to measure. That’s because it’s touchy-feely, not rational. Even mentioning morale, let alone trying to improve it, will upset your more autistic academics, provoke the cynical ones, and make all of them less likely to read your emails. Besides, most academics are left-wing contrarians who don’t want their morale improved, because then they might have to approve of ‘the System’.
This is why you can’t just measure staff morale by asking the staff.
Important Note: Student morale is another matter, because students haven’t been at your institution long enough to turn into academics. So student morale can be measured by asking the students. As you know, it is measured, and it matters for funding. Staff morale isn’t, and doesn’t. So if you’re keen on morale, focus on the customers who pay your salary, not the people who take up too much of your time already.
Key Message 1: Leave academics’ morale to academics. They’re smart, aren’t they? They can figure it out.
Key Area 2: Incentives
OMGs say that a good manager needs to reward staff. They often quote research which is supposed to show that people respond better to positive incentives (rewards) than to negative ones (punishments).
Important note: ‘Reward’ doesn’t necessarily mean money – which is just as well, since your institution probably doesn’t have any. It means social rewards. A social reward can be anything from a smile, an honourable mention at a departmental meeting, or praise in an annual review, right through to rewards that will actually cost you: biscuits for committees, free drinks, or a departmental party.
Ignore the temptation to be nice. Academics are smart and will see through your attempts to conciliate them. Negative incentives are much more effective. Some UMs employ both, but carefully: an initial brief commendation followed by a long list of criticisms. Academics pride themselves on seeing both sides of an argument, and this move will make them feel uncomfortable about actively hating their UM.
Key Message 2: Never praise an academic if you can avoid it.
Key Area 3: Listening
OMGs say that a good manager is one who listens to staff, walking the floors, knocking on doors, inviting open, informal communication.
With academics? Are you joking, OMG? Given half a chance, most of these people would talk for hours on their specialist subject. You don’t have time for that.
Besides, many academics are introverts who are afraid of human contact. They’re also hugely overworked. They’re not going to thank you for coming and bothering them, especially as their natural left-wing cynicism leads them to believe you won’t pay any attention to what you hear.
For the same reasons, there’s no point encouraging social occasions, or, if they do take place, attending them.
Unfortunately, the idea that managers should listen has gained ground in recent years. So a good UM will occasionally schedule ‘listening forums’. If you plan to do this, make sure the setting is formal and that there are plenty of senior academics present. And be careful not to make any clear statements about how – or whether – the information you get will be processed at higher levels. (Promises are hostages.) That way you’ll deter people from making complaints that you might have to do something about.
Key Message 3: Academics don’t want to be listened to. They want to be left alone. If they were that social, they wouldn’t be academics.
Key Area 4: Respect
It’s a favourite OMG mantra: respect your staff.
Why? If they were worth respecting, they wouldn’t be academics. They’d be managers, like you, earning your salary. These people are grunts. They wouldn’t last an hour in the real world.
On the other hand, it is important that they respect you.
Some management theorists insist that respect is gained by soft skills: emotional intelligence, sociability, and so on. For normal people, this may be true, but remember: as a UM, you’re dealing with academics. They’re most at ease with abstractions, so make yourself an abstraction! Your people will respect you more if they hardly ever see you. At the same time, you need to make them feel that a word from you could ruin their future.
UMhood isn’t about soft soap. It’s about power.
Key Message 4: Respect, in academia, should flow one way only: from the bottom to the top.
Key Area 5: Communicating
OMGs will tell you that good communication is essential to successful management. Let’s unpack that a little.
Management jargon is often sneered at by people who say that language should be about communication. These people are idiots. They don’t understand what management jargon is for.
Besides, academics love jargon. For them to complain about your jargon is pure hypocrisy!
Remember, most of your institution’s recent funding increases have been spent on either your salary or your new office, not on hiring more academics. Academics know this. They see the UM as the enemy. They are also ideologically indoctrinated to perceive ‘the System’, and anyone who supports it, as evil.
You can’t sweet-talk these people. As that great management theorist Niccolo Machiavelli said, your only option is to crush them before they crush you.
Management language is about two things: making yourself look powerful, and making yourself look efficient. It’s a weapon in the fight all UMs have to fight, every hour of every day. The language you use is the headlock by which you subdue your staff.
Efficiency is why you’ll hear managers saying ‘actioning’ instead of ‘putting into action’, ‘progressing’ instead of ‘making progress’, and ‘less’ instead of ‘fewer’. Less syllables = more efficient.
Power is why a smart UM will often appear to act inconsistently. If you’re inconsistent, you’re unpredictable, and that makes people uncertain. As they get more anxious, they see you as more powerful.
Important note: Inconsistency also works with the OMG notion that managers should immerse themselves in details. You can’t possibly get your head round all the details of managing a set of academics, so don’t kill yourself trying. Instead, make sure you master a few details efficiently. That way, you’ll be able to make excellent use of random micromanagement.
A good UM is a master of this art. By micromanaging only a few aspects of your institution’s systems, you can ignore the constraints which, in practice, prevent the kind of changes you demand from actually happening. Then you can criticise the academics for not making those changes. Meanwhile, you display a hopelessly disorganised grasp of other institutional processes. This reminds your staff that life isn’t fair, thus lowering their expectations to realistic levels.
Remember, people who are depressed often slide into learned helplessness, so a depressed academic is likely to be an inert academic. And inert academics are much, much easier to manage.
Key Message 5: Management jargon is there for good reason. Use it well.
Congratulations! You’ve now reached the end of the only guide you’ll ever need to being a university manager.
That’s it. That’s all you need to know.
The path to UMhood lies before you. Go for it!
And remember, with great power comes …
Most OMGs would end that sentence with a boring old cliché: ‘great responsibility’. Not this guide. As a UM, you’ll learn to end it with ‘greater salary’. Enjoy!
Tags: Alzheimer's disease, Fish oil, health reporting, Institute for Food Brain and Behaviour, Jonathan Swift, nutrition, Omega-3 fatty acid, Prostate cancer, science journalism, science media
(Feel free to take it in the spirit of the great Jonathan Swift’s original.)
What prompted the proposal
Most mornings, I say hello to the Internet. In return, I find a slew of press releases, heralding recent, or even advance, publications in neuroscience, psychology, medicine and health. Most of it’s from just-published studies, dolled up by press officers and spoonfed to the media. And, as the UK’s Astronomer Royal Martin Rees has implied, a lot of it’s tosh.
I tell my students that it’s better to read first-rate science fiction than second-rate science. It’s more stimulating, and no more likely to be wrong.
In my previous post, I discussed two examples of high-profile science. One, a study published in the Journal of the National Cancer Institute, proposed a link between omega-3 fatty acids and prostate cancer. The second, in the prestigious journal PNAS, proposed a link between copper (Cu) and Alzheimer’s disease (AD). They triggered this particular piece of devil’s advocacy, but they’re only two examples of very many, and my beef is not with them, but with the system that produced them.
A modest proposal for reform
No media organisation with more than 10,000 regular readers should be allowed to publish news of any scientific research until one year has passed from the date of first publication, or the study has been successfully replicated.
The omega-3s/cancer and copper/Alzheimer’s studies were widely reported. What did this high profile achieve for the general public?
Many will have missed, ignored, or instantly forgotten the news. Some of the rest, however, may have worried about whether they should change their diet. They may have asked their doctors about it, or wasted time surfing dubious Internet sites. A few may even have used the study as an excuse for not eating more fruit and veg. Others may have wished, crossly, that the bloody scientists would make their minds up, or remarked that you can’t believe anything you read in the media these days.
I’d be willing to bet that very few will have rejoiced at the extent of their new empowerment, thanked the Press for bringing them the truth so quickly, and happily formulated a new, fish- and fruit-free diet in order to live long into healthy old age.
In other words, we may have a slight increase in anxiety, cynicism and distrust of science, but on the positive side we have … what, exactly? The thrill of novelty. Pages filled, buttons pressed, teeny-weeny neurotransmitter hits delivered. Readers fooled into thinking this particular organisation is hovering at the cutting edge and holding the boffins to account. And this benefits the public – how?
It benefits the media. It also benefits the authors and their institutions, given the current insistence of funding bodies on generating ‘impact’. But science, wisely, distrusts new findings and insists on them being repeated before it takes them too seriously. Might not the science-media-reading public benefit from knowing that a similar standard has been applied to what they are being told? We’re already overloaded with data, or at least information. The modest proposal would reduce quantity and boost quality. It should also give journalists more time to do investigative science journalism.
As it is, quite often reporters don’t understand what they’re reading, because they aren’t specialist science journalists, and the reader doesn’t bother glancing past the headline anyway. This isn’t a criticism of either: readers and journalists are busy people, and even experienced scientists can struggle to understand publications in other scientific disciplines.
But scientists are to some extent held to account by the profession’s self-correcting mechanisms, and its longer timescale – though publicity-chasing for ‘impact’ may distort these. In the media, sensational findings are common; reports of how they were discredited are rarer.
Six conditions for publishing science journalism
Obviously the modest proposal would need expansion. For starters, how about requiring the following six conditions to be met before the news is published:
1. At least two independent experts to be consulted about the study’s merits, and their views to be reported on whether the study is mainstream, minority, maverick, nonsense, or dangerous nonsense
a. If both experts class it in the ‘dangerous nonsense’ category, the journalist may consult three more experts
b. If these disagree, the views of all five should be reported
c. If all five concur that it’s dangerous nonsense, the report should be scrapped
2. Similarly, the experts to be asked for their judgement on whether the headline is appropriate, and that judgement either to be taken into account, or at least reported
3. The journalist to have read at least the introduction and discussion sections of the article, not just the press release
4. Potential conflicts of interests to have been checked by the journalist – and not just by reading the authors’ ‘yup, we’re clean’ statement
5. For health studies, a risk assessment to have shown that the putative risk to human health (the ‘culprit’) identified in the study is higher than the risk to health which would follow from giving up the culprit – or, at the least, both numbers to be stated in the report
6. Information about the study should, at the very least, include baseline values as well as the size of the effect/risk/change, the type of study, a link to the article, and the number of participants tested.
Or, how about a kitemark?
I try to be a realist, and realistically, the modest proposal is never actually going to happen. Instead, it would be great if someone could set up a kitemark, a signal of quality, for science journalism, administered by an independent body. I don’t think it would be fair to apply it wholesale to organisations, but why not to individual pieces and major blog posts? If organisations were required to display the percentage of their articles which have received a kitemark, that might encourage them to aim for a higher percentage.
Come to think of it, why don’t we do that for the whole of journalism, and make the industry fund the jobs required? Better quality and more employment, in one!
Tags: Alzheimer's disease, Fish oil, health reporting, Institute for Food Brain and Behaviour, Jonathan Swift, nutrition, Omega-3 fatty acid, Prostate cancer, science journalism, science media
(Feel free to take it in the spirit of the great Jonathan Swift’s original.)
What prompted the proposal
Most mornings, I say hello to the Internet. In return, I find a slew of press releases, heralding recent, or even advance, publications in neuroscience, psychology, medicine and health.
Two in particular have caught my eye of late. One, a study published in the Journal of the National Cancer Institute, proposed a link between omega-3 fatty acids and prostate cancer. The second, in the prestigious journal PNAS, proposed a link between copper (Cu) and Alzheimer’s disease (AD). Both were much discussed in the media. They triggered this particular piece of devil’s advocacy, but they’re only two examples of very many, and my beef is not with them, but with the system that produced them.
Everything’s bad for you
As it happens, I do some editorial work for a small research charity, the Institute for Food, Brain and Behaviour (IFBB), and I was recently at a colloquium with some people who’ve spent their careers researching omega-3 fatty acids. Let’s just say that when I heard about the fatty acids/prostate cancer article, I raised a sceptical eyebrow; when they heard, they tore its methods into little tiny pieces. You can read a brief summary of some of the criticisms on the IFBB website.
This isn’t just abstract theorising, or scientists quarrelling among themselves. It matters. I’ve already been told of one elderly man who’s very anxious about whether he should stop taking fish oil supplements. The stress has probably already done him more harm than the omega-3s ever will – but he believes what he reads in the papers, as many people do.
(And yes, a cynic might say he shouldn’t, but that’s close to accepting it’s OK for papers to print stuff that is, when you get down to it, not true.)
The copper/Alzheimer’s study is preliminary work, done in mice and in human cell cultures. What the authors say in the paper is this:
“Whereas the role of environmental factors in the development of the sporadic form of AD is controversial, long-term exposure to higher levels of Cu may contribute to this process, at least in some cases.”
Note the qualifications: ‘may’, ‘in some cases’.
By the time this reached the press release, it had acquired the headline, ‘Copper identified as culprit in Alzheimer’s disease’, a warning advertisement – how to tell if you’re about to get Alzheimer’s – and (because most of us don’t chew on copper pipes, though we may get our drinking water from them) a list of copper-containing foods.
Why? Are we to stop eating “red meats, shellfish, nuts, and many fruits and vegetables”, for fear of dementia? Or perhaps we should merely cut down on these foods? Red meat I can understand; both doctors and environmentalists keep telling us we should be eating less of that. But surely the benefits of fruit and veg outweigh the risk of dementia, especially given everything else that’s been linked to Alzheimer’s over the years?
The press release’s last two paragraphs, for those who get so far, are more cautious. They read as follows:
However, because metal is essential to so many other functions in the body, the researchers say that these results must be interpreted with caution.
“Copper is an essential metal and it is clear that these effects are due to exposure over a long period of time,” said Deane. “The key will be striking the right balance between too little and too much copper consumption. Right now we cannot say what the right level will be, but diet may ultimately play an important role in regulating this process.”
Indeed. So why isn’t this right at the top of the press release?
To be fair, the press release provides a link to the article, which not all of them do. It’s got quotes from the authors, and it does a good job of explaining. It’s not a bad example of the genre.
And yet …
This is one study. On animals and cells, not people. It has a plausible mechanism, which is more than many nutritional epidemiology studies do, but it seriously needs replication. Not everyone agrees on what causes Alzheimer’s, and no one’s saying copper is the only ‘culprit’, that morally-laden word. (Bad chemical! Stop tormenting those brains!) Possible causes for this awful disease frequently hit the headlines, long before we’ve any idea whether they really are ‘the’, or even ‘a’ culprit. We still can’t do much to help someone with Alzheimer’s.
Likewise, one study hinting at a possible link between omega-3 fatty acids and prostate cancer does not make a theory, let alone a major new truth.
Health reporting implies the impossible
Besides, even if ye ordinary hassled consumer swears off eating anything with either copper or omega-3 fatty acids in it, that doesn’t mean he’ll dodge either Alzheimer’s or cancer. There’s this implicit claim: if you can only eat/behave/think properly, your life will be long and healthy, your old age serene, and your death an easy one. The definition of ‘properly’ varies, but it’s always presented as having the authority of science behind it. This causes problems if the report is also trying to emphasise how new and revolutionary its findings are, because it makes science look as featherweight as the media.
With the claim comes a nasty moral innuendo: if you’re ill, it’s your fault, your failure. This victim-blaming isn’t just abstract psychologising. In the UK, disabled people are now often labelled as ‘benefit scroungers’, as if their inability to work is either a) a lie, or b) entirely self-inflicted. It isn’t.
Where’s the evidence that a single, perfect lifestyle for health exists? Or, for that matter, that people whose lifestyles aren’t perfect are morally flawed?
And this doesn’t just apply to health issues. The ideas that
• all our problems can be fixed by physicists, chemists, engineers, etc.
• good enough tech will obviate the need for hard work
• if we could only get the systems right we’d be able to erase human wickedness
• you are a failure if you do not know about — or at least, have an opinion on — far, far more than any previous generation of humans
are just as dubious, yet all are implied in the breathless reporting of advances in science, new technologies, and institutional failures.
This isn’t just froth. It matters. These assumptions worm their way into our heads, changing both our behaviour and our attitudes to others. They set up false expectations. When those expectations fail, we blame the people who couldn’t meet them, not the media who spread them. We also torment ourselves with our efforts to achieve an impossible perfection. The result is unnecessary unhappiness.
In my next post, I’ll consider a modest proposal for reform.
Tags: fiction, neuroscience, science communication, thalamus
Like any well-meaning science communicator, I’m on the lookout for new ways to communicate about my favourite topic, neuroscience.
Images, like the view of the thalamus shown here, look great, but they need words, or a lot of background knowledge, to communicate their meaning.
So it’s back to words. But what kind of words? Well, it’s said that fiction’s a good way of communicating … in some cases anyway!
I’m not too keen on the idea that science is science and should steer well clear of anything that smacks of the humanities. People have quite enough tricks for emphasizing their differences, and feeling superior in consequence, without barricading intellectual endeavours into disciplinary silos. (I know, I know, talk about championing a lost cause!) Besides, neuroscientists are happy to use the visual arts to put their work across, so why not other art forms?
All fiction is in a sense concerned with brains’ activities, but not many authors have tried writing fiction about brains themselves. Pondering this one day, I thought: “Someone should write a short story about brains! I’d read it! Probably.” Then I went back to reading whatever exciting science article had just fallen out of my inbox and onto my to-do list.
Some time later this short story wandered into my brain. It made me smile, so I caught the thing and submitted it to a writing competition (Writers and Artists) on the theme of ‘freedom’, where, to my astonishment, it was short-listed. You can download it as a PDF here (‘Freedom’, by Neurotaylor). I’m posting it on this blog as an exercise in alternative methods of science communication.
And if you know of any great fiction about brains themselves, let me know.
Here’s the opening of the story, set out as a taster so the post isn’t too long. The whole thing’s around 2000 words.
Alright mate? You on the tour? Welcome to the department. I’m Alfie.
Ever been inside a brain before? No? That’s unusual, not many people start their visit here.
Fully booked, eh? Don’t apologise, mate, we’re used to it. People wanna see the bits they’ve heard of. It’s all, “Ooh, can we go to the prefrontal cortex?”, like that’s the only department that matters. I tell you, mate, without us those guys in prefrontal would be twiddling their dendrites and rotting.
Yeah, they don’t half play up to it though. You’ll see ‘em later. All that, “Oh yes, it’s a great responsibility”, and banging on about their dopamine levels, as if the rest of us never get a sniff of dopamine. They don’t half try it on, that lot. They’re just passing messages, same as the rest of us.
Load of posers, if you ask me. Don’t tell ‘em I said that.
Anyway, welcome to the Thalamus Department. Here in the LGN section –
Tags: Big Society, feminism, Kirstie Allsopp
Sometimes it’s the little things that make you feel like letting off steam. Here’s the ever-excellent Maureen Bell, expert on book history and emerita of Birmingham University’s English Department, triggered by a chance, indirect, encounter with Kirstie Allsopp, of TV fame.
Scanning today’s TV listings in the Guardian (I know – anything to avoid leaving the breakfast table) I spotted a new treat:
9.0 Kirstie Allsopp: House Proud
New series. Crafty Kirstie Allsopp
makes a personalised doormat
using a vintage font.
I confess to a certain property-related Kirstie-and-Phil habit in the past. But it’s all getting sickeningly out of hand. Was the writer of this potted description on top satirical form? Whatever. The result is a perfect distillation of our lives and times, ladies!
The Allsopp name, of course, speaks for itself. Having extended the franchise from house-buying to home crafts and make-do-and-mend, the Kirstie brand guarantees the fun to be had from home-making. It’s especially useful at present to have a titled Tory Lady Bountiful on hand to cheer her less fortunate sisters. With paid part-time work (on anything other than zero-hours contracts) having disappeared, today’s women need hobbies. Kirstie’s contribution to the Big Society is to keep us happy in homes we can’t afford by giving us skills to keep our hands occupied. Otherwise we might venture into public spaces and use our hands foolishly and unproductively, e.g. to make and hold placards on anti-cuts marches. Crafty Kirstie indeed!
But what a bravura display of précis we have here. Every word counts in this tempting little description. Remember Rebecca West’s quotation? Perhaps you, too, had the postcard in the 1980s:
I myself have never been able to find out precisely what feminism is: I only know that people call me a feminist whenever I express sentiments that differentiate me from a doormat.
Well, now we can make doormats for, as well as of, ourselves. That’s the spirit for make-do-and-mend in austerity Britain. And a personalised doormat at that! Modernity’s fixation on the individual (I’m worth it) knows no bounds. Best of all, we can use a ‘vintage font’. Is that paradox or oxymoron? Or just moronic? ‘Vintage’ gives us all a boost of warm cup-cakey feeling and sends us into a nostalgic haze. China teacups, anyone? Remember that lace doilies, cake stands, pressed linen napkins and cake forks mean more work for those idle female hands. Bake, make, iron, smile. It’ll take your mind off it, dear. What’s not so frequently referenced in the world-of-vintage, however, is the army of servants who – in the olden days – made the graciousness of the drawing room possible. Let’s remember them as we sip the tea. They might have been treated like dirt, but at least they had jobs.
But what’s this word ‘vintage’ doing, coupled with ‘font’? Ah, ‘font’ is the modern twist. You, ladies — Britain’s makers of personalised doormats — aren’t only Victorian angels in houses, but also modern gals with computer skills. You not only know what a font is, but can tell the difference between Palatino and Dingbats. You can use the internet, visit Kirstie’s website, shoot through to M&S and buy a Kirstie Porcelain Tea for One set (£15). Transferable skills indeed!
Tags: Academia, Colleges and Universities, Creativity, Education, Isaac Asimov, Research Excellence Framework
Why do we have these people? We see plenty of them; they’re always in the media. They talk, a lot. But what’s the point of academics?
We know why we have universities and colleges. They’re safe houses, quarantining intellectuals so that the rest of us only have to put up with them at a distance, buffered by radio, TV or internet. People who don’t really fit are collected into pleasant refuges where they have others like themselves to keep them company. We’re OK with that. One mark of a good society is how it treats its academics.
All we ask in return is that they teach some vaguely useful stuff to kids to help them find jobs, while allowing the dear little things to a) make the friends who’ll help them out in later life and b) get the partying (mostly) out of their systems.
Oh, and there’s paperwork — and boy, do academics whinge about that. Like no one else faces hateful bureaucracies.
But why have academics at all? Can’t the teaching be done by teachers, instead of people who’d really rather be researching the sex lives of protons, or some such?
Traditional answers to this question are as follows:
1) Academics provide the creative sparks which drive innovation
Isaac Asimov’s short story ‘Profession’ makes this claim. In its future Earth, children are selected for specific jobs based on their brain structure, and programmed with the necessary knowledge. Asimov asks how, in such a system, creativity is sustained, and links creativity to rebelliousness and to — in conventional terms — failure.
Governments make this claim too, but they don’t see the need for rebelliousness, and they do appear to see creativity purely as a money-making tool. Oddly, this doesn’t stop my government piling up obstacles, like the costly, cruel and inane Research Excellence Framework. Poor management, a plethora of regulations, and soaring numbers of students and managers — but not academics — could have been designed to stamp out the creative impulse. Either our masters are stupid, or they’re not listening to front-line people, or ideology can trump both evidence and economics. My guess would be it’s a mix of the latter two.
2) Academics create, store and pass on knowledge
Computers can do that. They used only to store stuff, but online teaching’s becoming ever more common, despite its problems. And data-crunching algorithms are being loosed on big datasets to find new ideas. At present, we still need academics to program the crunchers and interpret the results. Students also seem to prefer being taught by humans. But part of the anxiety cascading through academia at the moment may be due to a feeling that they are undervalued, and may become superfluous, in a system whose overriding ethos is about money.
3) Academics speak truth to power
I wish. Those who do generally don’t get a hearing, and most don’t. Academics may be creative, but that doesn’t make them natural activists. Besides, the system comes down increasingly hard on troublemakers. (The current enthusiasm for open access publishing is, paradoxically, making things worse: because universities have to pay the author fee, they’re now deciding who gets published, giving them a hefty lever to use against irksome faculty.)
4) Academics help us become better citizens
Do we have any evidence for this? As an academic myself, I’d like to believe that we’re gatekeepers, helping people make sense of life’s complexities. Human beings do seem to enjoy understanding stuff, judging by the kind folks who’ve thanked me for ‘making it so clear’ about brain research. So maybe academics provide that particular reward.
Could it be that our education systems haven’t yet succeeded in starving the desire to think out of everyone, and our politics haven’t yet brought us all to believe that the only things that matter in life are cash and career?
5) Academics are part of the entertainment industry
In darker moments, I wonder: is this what it’s come down to? Watching the news, and the newsreaders’ expressions as they report on some new scientific finding, it’s sometimes hard not to hear a patronising tone. ‘Just look at what these inventive people have come up with now, girls and boys! Whatever will they think of next? And now, here’s Sally with the weather.’
Academics are more and more judged by impact: whether they can get a TV tie-in, or ‘engage the community’. But the community has plenty else to think about, so the intellectuals compete to offer light relief from the daily grind. And yet, some of the things they’re saying may really matter. Is ‘infotainment’ always appropriate?
Creators, interpreters, truth-tellers, guides, or entertainers: what are academics for? What should they be for? Are their purposes changing, and if so, for better or worse? And who’s driving that change?
Maybe that’s what academics are for! Asking questions …
Tags: Academia, Colleges and Universities, Education, REF, Research, Research Excellence Framework
Picture a middle-aged man in a small, unlovely office. He’s a thinker, a writer, an intellectual: he’s an academic at a UK university. (That’s already cause for suspicion.) Let’s call him Jim.
Jim’s trying to keep up with his gigantic and growing workload, but he’s finding it hard to concentrate. This isn’t laziness; he’s used to working evenings, weekends and national holidays. The problem isn’t just the amount of work, either. It’s that Jim is deeply, grindingly demoralised.
Why? Because his university’s acting as if all that matters is money — which wasn’t why Jim went into academia — and their management style leaves much to be desired. He’s fed up with being told to do X by people who don’t seem to realise that if X had been possible, it would have already been done — especially when other parts of the system are demanding that he simultaneously perform the contradictory action Y. Treat students as units of cash, and as little gods to be propitiated at all costs. Do less admin, but fill in this truckload of forms. Do more research, but give us a whole new set of teaching modules. Oh, and by the way, we’ve changed the system, again, without asking your opinion — so it’s an utter mess; make sure you have it sorted out by the beginning of term.
The work dumped on Jim by others – which needs doing yesterday, of course – is either teaching or admin. The criteria for promotion are purely based on research: the research he’d love to do, if he could only find the time. He can’t see how he’ll ever progress, because the university won’t fund replacements for the several colleagues who’ve left or gone off sick, so there’s no prospect of the deluge letting up. He’s signed up to jobs websites. So, he suspects, has everyone else in his beleaguered department.
As if that weren’t enough, Jim has a much more urgent fear than lack of promotion. Somewhere, he knows, his fate as a researcher is being decided.
Who are the judges? What do they know about him?
Our unlucky academic has no idea.
Are they his professional rivals? Has someone told tales about him? Do they remember his ill-judged remark at a meeting? Has he been too outspoken, too political?
Perhaps. Perhaps not. He can’t be sure.
On what grounds will they judge him?
He doesn’t know that either. They say there are rules, but the rules are so vague, who knows how they’ll be interpreted? The trial is secret, and there’s no appeal. Maybe they just don’t like his work, or him, irrespective of his smart and scholarly writing.
Have his colleagues been judged?
He doesn’t know. No one’s talking, from fear, or shame, or because they don’t yet know their fate.
What if the judgement goes against him?
Again, the judges aren’t talking; among the rest, dire rumours circulate. It may mean demotion, hardship, humiliation – the end of his life as an intellectual. He may be given lesser work, or he may be forced out of his job, thrown on society’s dust heap, his years of specialist training gone to waste.
Is this tormented individual in Stalinist Russia or Khmer Rouge Cambodia? Of course not! He’s not going to be shot, is he? He faces no beatings, no torture. Just the sweating anxiety of a secret, uncontestable judgement.
In academia, the freedom to think is essential. Yet in recent years, a venomous combination of ideology, inertia, and political convenience has crushed UK academics under an enormous workload and the heavy hand of over-management. If you’ve heard talk of intellectual freedom, of open science, of academia as a playground of ideas where truth is spoken to power, forget it. Judgement without appeal, lack of open process and accountability, and an increasingly controlling bureaucracy: this is how our academic institutions are operating as they decide upon their submissions for the Research Excellence Framework, the enormously expensive process that decides who gets what funding, next time round.
Where we most need freedom and open information, the chains are tightening and the veils being drawn. So much is old news.
However, it’s even worse than we believed – because as the REF, that academic Day of Judgement, looms ever larger, so does the ghost of a man so strongly associated with un-freedom that the last place you’d expect to find his ideas put into practice is a university. Critiqued, yes, for politics, history and other humanities courses often use Uncle Joe Stalin as target practice. But adopted as a strategy? And yet, when it comes to deciding which academics get entered for the REF, it seems some universities — because some are worse than others — have taken a tip or several from the Stalinist manual.
Researchers are assessed, by someone else in their university, as REF-able or not. If their work is not deemed sufficiently exciting, revolutionary, ground-breaking, or world-leading, they will not be included in the university’s REF submission. Are the assessors experts in all the work they’re reading? Not necessarily. Within the university, Jim could be rated by someone who knows very little about his work. Indeed, he may be the only scholar in his field at his institution. In which case, what of the supposed peer review?
The scope for conservatism, personal grudges, or downright prejudice is horrifying. Bizarrely, it’s worse than in the actual REF itself, because the national REF panels which make the final, overall judgements are picked for specialist expertise. So if Jim’s work is difficult, unfashionable, or interdisciplinary, he’s in trouble. Let’s imagine that he works on the history of women’s rights in Victorian England. Top-notch male academics have been known to write about women, but it’s generally not a fashionable field. Now, if he’d chosen something twentieth-century, preferably about war …
Worse still, academics tend to see themselves as highly-trained rational thinkers. Could this lead them to overestimate their capacity to make unbiased judgements? Is a sixty-year-old male professor of modern history, who isn’t interested in anything before 1900 and thinks research on women’s rights a waste of time, going to be a fair judge of Jim’s work? The professor thinks so. He may not know about the recent research, in the sciences, which showed that merely putting a woman’s name on a CV – let alone in the title of a paper – was enough to discount its worth in the eyes of academics.
But if the assessor marks a REF submission down, even though it was published in the field’s top journal, what can Jim do about it? He may not even know who assessed him, let alone what out-of-date beliefs they hold. He’ll never know whether another institution might have reacted differently. And because there’s no feedback, there’s no need for the assessors to justify their decisions.
Does it matter? Yes. If Jim’s not REF-able, he’s second-class, and the heart of his academic career may just have been torn out.
University management have been given a stranglehold on research. They can crush minority interests, less fashionable fields, or difficult, unpopular researchers. Bureaucracy has its hands round academia’s throat, and the worst of it is, it’s the universities themselves — and again, some are worse than others — who are doing this to their own researchers.
That’s no surprise, of course. It’s a pattern that crops up again and again: the bosses (here the government) are bad, but their subordinates, the university managers who have to implement their decisions, are worse.
What’s so depressing is that this situation could be improved without the need for a full-scale REF rebellion. It’s easily sorted. Just force all institutions to submit all their researchers to the REF, declare the criteria by which they’re judging ‘world-class’ work, and allow an appeal process. Or else boycott the damn thing, and break its Stalinist grip before it wrecks what’s left of academic freedom.
Or alternatively, pile on the kind of dumb management that treats people as units of production and nurtures the system but not the front-line worker. Tell staff how inadequate they are, how they’ll have to change, how all that matters is student satisfaction and bringing in money. (What about staff satisfaction?) Keep on driving the most original, rebellious minds out of UK academia, and treat creativity as if it can be snagged by a sufficiently detailed form.
The UK’s intellectual output used to be one of our greatest contributions to world culture. Where’s the evidence that current micromanagement is improving it?
Meanwhile, demoralised academics like our unlucky Jim look set to spend ever more time performing to meet REF demands, insofar as they find time to do research at all. That means concentrating on less interdisciplinary work in more established fields, which will enhance conservatism and make academia even more fashion-driven than it is already. It also means ever fewer hours on the kind of learning, teaching and free thinking that Jim and his colleagues went into their profession to do.
If a thinktank had been paid to devise a demonstration of how not to foster UK academia, they could hardly have done a better job than this.