What are academics for?

July 25, 2013 at 1:58 pm | Posted in Uncategorized | 5 Comments

Mendeleev, by Repin

Why do we have these people? We see plenty of them; they’re always in the media. They talk, a lot. But what’s the point of academics?

We know why we have universities and colleges. They’re safe houses, quarantining intellectuals so that the rest of us only have to put up with them at a distance, buffered by radio, TV or internet. People who don’t really fit are collected into pleasant refuges where they have others like themselves to keep them company. We’re OK with that. One mark of a good society is how it treats its academics.

All we ask in return is that they teach some vaguely useful stuff to kids to help them find jobs, while allowing the dear little things to a) make the friends who’ll help them out in later life and b) get the partying (mostly) out of their systems.

Oh, and there’s paperwork — and boy, do academics whinge about that. Like no one else faces hateful bureaucracies.

But why have academics at all? Can’t the teaching be done by teachers, instead of people who’d really rather be researching the sex lives of protons, or some such?

Traditional answers to this question are as follows:

1) Academics provide the creative sparks which drive innovation

Isaac Asimov‘s short story ‘Profession‘ makes this claim. In its future Earth, children are selected for specific jobs based on their brain structure, and programmed with the necessary knowledge. Asimov asks how, in such a system, creativity is sustained, and links creativity to rebelliousness and to — in conventional terms — failure.

Governments make this claim too, but they don’t see the need for rebelliousness, and they do appear to see creativity purely as a money-making tool.  Oddly, this doesn’t stop my government piling up obstacles, like the costly, cruel and inane Research Excellence Framework. Poor management, a plethora of regulations, and soaring numbers of students and managers — but not academics — could have been designed to stamp out the creative impulse. Either our masters are stupid, or they’re not listening to front-line people, or ideology can trump both evidence and economics. My guess would be it’s a mix of the latter two.

2) Academics create, store and pass on knowledge

Computers can do that. Once they only stored stuff, but online teaching is becoming ever more common, despite its problems. And data-crunching algorithms are being loosed on big datasets to find new ideas. At present, we still need academics to program the crunchers and interpret the results. Students also seem to prefer being taught by humans. But part of the anxiety cascading through academia at the moment may be due to a feeling among academics that they are undervalued, and may become superfluous, in a system whose overriding ethos is about money.

3) Academics speak truth to power

I wish. Those who do generally don’t get a hearing, and most don’t even try. Academics may be creative, but that doesn’t make them natural activists. Besides, the system comes down increasingly hard on troublemakers. (The current enthusiasm for open access publishing is, paradoxically, making things worse: because universities have to pay the author fee, they’re now deciding who gets published — a hefty lever to use against irksome faculty.)

4) Academics help us become better citizens

Do we have any evidence for this? As an academic myself, I’d like to believe that we’re guides, helping people make sense of life’s complexities. Human beings do seem to enjoy understanding stuff, judging by the kind folks who’ve thanked me for ‘making it so clear’ about brain research. So maybe academics provide that particular reward.

Could it be that our education systems haven’t yet succeeded in starving the desire to think out of everyone, and our politics haven’t yet brought us all to believe that the only things that matter in life are cash and career?

5) Academics are part of the entertainment industry

In darker moments, I wonder: is this what it’s come down to? Watching the news, and the newsreaders’ expressions as they report on some new scientific finding, it’s sometimes hard not to hear a patronising tone. ‘Just look at what these inventive people have come up with now, girls and boys! Whatever will they think of next? And now, here’s Sally with the weather.’

Academics are more and more judged by impact: whether they can get a TV tie-in, or ‘engage the community’. But the community has plenty else to think about, so the intellectuals compete to offer light relief from the daily grind. And yet, some of the things they’re saying may really matter. Is ‘infotainment’ always appropriate?

*******

Creators, interpreters, truth-tellers, guides, or entertainers: what are academics for? What should they be for? Are their purposes changing, and if so, for better or worse? And who’s driving that change?

Maybe that’s what academics are for! Asking questions …

Unlucky Jim and the ghost in the academic machine

July 18, 2013 at 11:27 am | Posted in Uncategorized | 1 Comment

Picture a middle-aged man in a small, unlovely office. He’s a thinker, a writer, an intellectual: he’s an academic at a UK university. (That’s already cause for suspicion.) Let’s call him Jim.

Jim’s trying to keep up with his gigantic and growing workload, but he’s finding it hard to concentrate. This isn’t laziness; he’s used to working evenings, weekends and national holidays. The problem isn’t just the amount of work, either. It’s that Jim is deeply, grindingly demoralised.

Why? Because his university’s acting as if all that matters is money — which wasn’t why Jim went into academia — and its management style leaves much to be desired. He’s fed up with being told to do X by people who don’t seem to realise that if X had been possible, it would already have been done — especially when other parts of the system are simultaneously demanding the contradictory action Y. Treat students as units of cash, and as little gods to be propitiated at all costs. Do less admin, but fill in this truckload of forms. Do more research, but give us a whole new set of teaching modules. Oh, and by the way, we’ve changed the system, again, without asking your opinion — so it’s an utter mess; make sure you have it sorted out by the beginning of term.

The work dumped on Jim by others – which needs doing yesterday, of course – is either teaching or admin. The criteria for promotion, meanwhile, are based purely on research: the research he’d love to do, if he could only find the time. He can’t see how he’ll ever progress, because the university won’t fund replacements for the several colleagues who’ve left or gone off sick, so there’s no prospect of the deluge letting up. He’s signed up to jobs websites. So, he suspects, has everyone else in his beleaguered department.

As if that weren’t enough, Jim has a much more urgent fear than lack of promotion. Somewhere, he knows, his fate as a researcher is being decided.

Who are the judges? What do they know about him?

Our unlucky academic has no idea.

Are they his professional rivals? Has someone told tales about him? Do they remember his ill-judged remark at a meeting? Has he been too outspoken, too political?

Perhaps. Perhaps not. He can’t be sure.

On what grounds will they judge him?

He doesn’t know that either. They say there are rules, but the rules are so vague, who knows how they’ll be interpreted? The trial is secret, and there’s no appeal. Maybe they just don’t like his work, or him, irrespective of his smart and scholarly writing.

Have his colleagues been judged?

He doesn’t know. No one’s talking, from fear, or shame, or because they don’t yet know their fate.

What if the judgement goes against him?

Again, the judges aren’t talking; among the rest, dire rumours circulate. It may mean demotion, hardship, humiliation – the end of his life as an intellectual. He may be given lesser work, or he may be forced out of his job, thrown on society’s dust heap, his years of specialist training gone to waste.

Is this tormented individual in Stalinist Russia or Khmer Rouge Cambodia? Of course not! He’s not going to be shot, is he? He faces no beatings, no torture. Just the sweating anxiety of a secret, uncontestable judgement.

In academia, the freedom to think is essential. Yet in recent years, a venomous combination of ideology, inertia, and political convenience has crushed UK academics under an enormous workload and the heavy hand of over-management. If you’ve heard talk of intellectual freedom, of open science, of academia as a playground of ideas where truth is spoken to power, forget it. Judgement without appeal, lack of open process and accountability, and an increasingly controlling bureaucracy: this is how our academic institutions are operating as they decide upon their submissions for the Research Excellence Framework, the enormously expensive process that decides who gets what funding, next time round.

Where we most need freedom and open information, the chains are tightening and the veils being drawn. So much is old news.

However, it’s even worse than we believed – because as the REF, that academic Day of Judgement, looms ever larger, so does the ghost of a man so strongly associated with un-freedom that the last place you’d expect to find his ideas put into practice is a university. Critiqued, yes: politics, history and other humanities courses often use Uncle Joe Stalin as target practice. But adopted as a strategy? And yet, when it comes to deciding which academics get entered for the REF, it seems some universities — because some are worse than others — have taken a tip or several from the Stalinist manual.

Researchers are assessed, by someone else in their university, as REF-able or not. If their work is not deemed sufficiently exciting, revolutionary, ground-breaking, or world-leading, they will not be included in the university’s REF submission. Are the assessors experts in all the work they’re reading? Not necessarily. Within the university, Jim could be rated by someone who knows very little about his work. Indeed, he may be the only scholar in his field at his institution. In which case, what of the supposed peer review?

The scope for conservatism, personal grudges, or downright prejudice is horrifying. Bizarrely, it’s worse than in the actual REF itself, because the national REF panels which make the final, overall judgements are picked for specialist expertise. So if Jim’s work is difficult, unfashionable, or interdisciplinary, he’s in trouble. Let’s imagine that he works on the history of women’s rights in Victorian England. Top-notch male academics have been known to write about women, but it’s generally not a fashionable field. Now, if he’d chosen something twentieth-century, preferably about war …

Worse still, academics tend to see themselves as highly-trained rational thinkers. Could this lead them to overestimate their capacity to make unbiased judgements? Is a sixty-year-old male professor of modern history, who isn’t interested in anything before 1900 and thinks research on women’s rights a waste of time, going to be a fair judge of Jim’s work? The professor thinks so. He may not know about the recent research, in the sciences, which showed that merely putting a woman’s name on a CV – let alone in the title of a paper – was enough to discount its worth in the eyes of academics.

But if the assessor marks a REF submission down, even though it was published in the field’s top journal, what can Jim do about it? He may not even know who assessed him, let alone what out-of-date beliefs they hold. He’ll never know whether another institution might have reacted differently. And because there’s no feedback, there’s no need for the assessors to justify their decisions.

Does it matter? Yes. If Jim’s not REF-able, he’s second-class, and the heart of his academic career may just have been torn out.

University management have been given a stranglehold on research. They can crush minority interests, less fashionable fields, or difficult, unpopular researchers. Bureaucracy has its hands round academia’s throat, and the worst of it is, it’s the universities themselves — and again, some are worse than others — who are doing this to their own researchers.

That’s no surprise, of course. It’s a pattern that crops up again and again: the bosses (here the government) are bad, but their subordinates, the university managers who have to implement their decisions, are worse.

What’s so depressing is that this situation could be improved without the need for a full-scale REF rebellion. It’s easily sorted. Just force all institutions to submit all their researchers to the REF, declare the criteria by which they’re judging ‘world-class’ work, and allow an appeal process. Or else boycott the damn thing, and break its Stalinist grip before it wrecks what’s left of academic freedom.

Or alternatively, pile on the kind of dumb management that treats people as units of production and nurtures the system but not the front-line worker. Tell staff how inadequate they are, how they’ll have to change, how all that matters is student satisfaction and bringing in money. (What about staff satisfaction?) Keep on driving the most original, rebellious minds out of UK academia, and treat creativity as if it can be snagged by a sufficiently detailed form.

The UK’s intellectual output used to be one of our greatest contributions to world culture. Where’s the evidence that current micromanagement is improving it?

Meanwhile, demoralised academics like our unlucky Jim look set to spend ever more time performing to meet REF demands, insofar as they find time to do research at all. That means concentrating on less interdisciplinary work in more established fields, which will entrench conservatism and make academia even more fashion-driven than it already is. It also means ever fewer hours on the kind of learning, teaching and free thinking that Jim and his colleagues went into their profession to do.

If a thinktank had been paid to devise a demonstration of how not to foster UK academia, they could hardly have done a better job than this.

Two myths, one method: the yin-yang of science

June 24, 2013 at 11:45 am | Posted in Uncategorized | 1 Comment

It’s often said that the key to science is the scientific method of testing hypotheses against reality using experiments.

It’s less often said that this looks a lot like what human brains do with the neural patterns which represent memories, beliefs and ideas. When incoming stimuli conflict with the stored patterns, the patterns can change to reflect the new information. That doesn’t mean they always do (see below), but they have the rather remarkable capacity to learn from experience.
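For readers who like to see an idea in miniature: the notion of a stored pattern nudged towards conflicting evidence can be caricatured with a simple error-driven update rule (in the spirit of the classic delta rule). This is a deliberately toy sketch — a single number standing in for a "belief", with a made-up learning rate — not a model of real neurons.

```python
# Toy sketch: a stored "belief" (just a number here) is repeatedly
# nudged towards conflicting evidence by a fraction of the mismatch.

def update(belief, evidence, rate=0.3):
    """Move the belief a fraction of the way towards the evidence."""
    return belief + rate * (evidence - belief)

belief = 0.0           # the initial stored pattern
for _ in range(20):    # repeated exposure to the same surprising stimulus
    belief = update(belief, 1.0)

print(round(belief, 3))  # → 0.999: the pattern has (nearly) learned
```

The point of the caricature is the mechanism, not the numbers: each mismatch between stored pattern and incoming evidence shrinks the gap, but — as the essay goes on to argue — nothing guarantees the update happens at all when the belief is one we cherish.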

It’s also less often said that one scientific method gives rise to two rather different ways of thinking about science. They’re ‘myths’, if you like, about what science is and what it does. By ‘myths’, I don’t mean they’re false; rather that they capture — and vastly simplify — two contrasting aspects of the immensely complex set of activities we call ‘science’ (or, slightly less inaccurately, ‘the sciences’). In my book The Brain Supremacy, I call these two ways of thinking, for convenience, the ‘Athenian’ and ‘Promethean’ myths, after the two ancient Greek supernatural entities most often associated with science.

Traditionally, the scientific method was represented as a linear process, where A. Scientist dreamed up an Hypothesis, conceived an Experiment which would test its Truth (or Falsehood), performed the Experiment (science these days is more and more about performance), collected the resulting Data, and Analysed them (I know, I know, everyone else treats data as singular) to determine whether the Hypothesis had survived the test, or was doomed to be hurled on the waste heap of dead ideas.

Nowadays the method tends to be seen as more of a circular or looping process, whereby A. Scientist is constantly reading the literature, getting ideas for experiments and discussing them with colleagues, running and modifying experiments, and analysing data in the light of the latest research.

(Or, more realistically, teaching and marking, fighting the admin hydra, sitting on committees, writing endless grant proposals, and squeezing in a little research in any spare moments.)

The modern view acknowledges, in other words, that running an experiment, or analysing data, may be as fruitful a way of generating new hypotheses as are reading the literature, thinking, sleeping, and having a bath — all traditional sources of ‘Eureka!’ moments.

Whether you prefer lines or loops, for scientists the crunch point of the method is the experimental test, which is why such a huge amount of time, effort and money goes into experimental design.

However, where the two myths arise is at the point after the experiment has been run and the results are in.

In the case where the hypothesis has survived, there’s no problem. But if the data challenge the hypothesis, what then?

Athene was the goddess of wisdom, and on the Athenian view of science, the wise thing to do is to accept reality’s verdict and ditch the hypothesis. If your beliefs about the world have been proved false, you abandon them, don’t you?

In theory, maybe, and sometimes in practice — but not always. Experiments are expensive. Before you discard a project to which you’ve committed so much, you’re going to make as sure as you can that the world’s unwelcome answer (yes, unwelcome; scientists aren’t robots) can’t be explained away by a technical fault, a design failure, or some quirk in your hypothesis which you hadn’t previously noticed.

If it can’t, then the good Athenian scientist will shrug her shoulders, maybe swear a little, and start over. Athene adapts her beliefs about the world to the world as she finds it. If her hypothesis that, say, people always make rational decisions is disproved by experiment, she’ll stop regarding people as rational decision-makers. She values accuracy, and prefers to keep it real.

There is, however, an alternative response: the Promethean one. This is the technological approach, and it has been the source of much of what we value about our fabulous modern civilisation. Prometheus, who in legend brought fire to mankind, isn’t content to sit back and adapt to how the world is. He’s a world-shaper; he changes reality to fit his ideas of how it should be. If the world doesn’t match his beliefs about it, he’ll adjust the world to fit. Confronted with the failure of his hypothesis about human rationality, he’s quite likely to set up an education programme to teach people how to think more rationally (or at least write a book denouncing their failure to do so).

Clearly, science as currently practised is and needs to be a mix of the two approaches. Science needs to keep it real, of course, but it also needs to give us tech toys, medical advances, better housing, etc. Imagine making the case to a government funding agency for a purely Athenian science; without the technological applications, would there be much chance of getting the large amounts of taxpayer spend which science needs? Especially given that (in the UK at least) the science budget is, shall we say, already on the stingy side.

It may be more controversial to argue that the balance, driven by the constant demands for ‘value-for-money’ and the media/impact agenda, may have tilted a little too far towards Prometheus, but that case has been made.

The argument seems persuasive because of what we know of human nature. Remember, human brains don’t just test beliefs against reality in labs; they do it whenever new stimuli challenge their ideas. If those ideas are important, cherished notions, then the human response tends to veer away from the purely Athenian. People defend their core beliefs as if they were part of their identity — as if an attack on the idea were an attack on the person. They get heated, stressed, sometimes abusive; or they practise denial and withdraw. World-shaping to protect the most sacred ideas of a person, or a group, can result in anything from great art to politics to murder. They’re dangerous things, strong beliefs, and they aren’t usually given up lightly — as a scientific hypothesis should be.

To weaken a strong belief, to ease humans away from the deadliest aspects of world-shaping towards a more Athenian acceptance that some ideas are better jettisoned, you have to find ways of making them care less about those ideas. Being Athenian about a belief is easy when you don’t give a damn about it; not being Promethean’s awfully hard when you do. That’s the challenge, for scientists and for everyone else. We all have beliefs it would hurt us to abandon.

Western culture tends to tackle the problem in three ways. First, it offers an ideological cornucopia, on the assumption that human passion’s a limited resource, so the more diverse beliefs, and multiple identities, a person has, the more weakly each of them will be held. Second, it offers a powerful competitor idea, in the form of money, to distract people away from other faiths. Finally, some beliefs and identities are, in effect, banned, by being rendered socially unacceptable. In mainstream British culture, for all the talk of free speech, there are certain ideas a citizen would be foolish to admit to holding.

Two myths, one method. The ancient legends of Athene and Prometheus are still relevant (as indeed is much of Greek thought) to how we see ourselves, and our science, today.

Ten books for stimulating thoughts about science

May 9, 2013 at 1:29 pm | Posted in Uncategorized | 1 Comment

‘Which books did you find helpful for thinking about science?’

Here’s my answer, off the top of my head: ten thought-provoking, intriguing, sometimes obstreperous works.

Note that they’re deliberately not science books, and some are only tangentially ‘about’ science. (Maybe I’ll try science books the next time I feel the need for a ‘top ten’ list.)

Note too that I don’t have images for all of them; this is because I borrowed some — a strategy I’d recommend if you’re not sure you’ll want to read and reread. If you do want to buy them, please consider either buying from an independent bookshop like Blackwell’s, or helping me out by going via my Amazon store. Thanks!

These books set readers a challenge: to think differently about science (and life), to see it from a new angle, and most of all to put it in context. If you’re a professional scientist, it’s hard not to spend all your time and energy reading about research, doing it, teaching it, keeping up with the literature, coping with admin, etc. But that leaves no time for stretching and shaking up thought patterns.

“What is this life if, full of care, we have no time to stand and stare?” the poet W.H. Davies asked. Each of these ten books can give you time, if you can make time for it, to stop and take a good clear look at science. Through the empathetic magic which only reading can offer, they also hold up different lenses with which to view a science-haunted world. All ten are fascinating reads, brim-full of ideas.

Two of them are, I must add, horrendous and gruelling, but this is a blog of suggestions, not instructions, and I trust my readers. You’re grown-ups, after all. You’ll know yourselves and act accordingly.

For some of these ten books, the connection with science is obvious (or blinding, since among them you’ll find several science-fiction classics). For others, the author’s targeting not science itself, but the assumptions which tend to accompany it. Science may seem a delight to people like me, but there are many who view it as harbouring some very nasty notions indeed, and some seriously old-fashioned prejudices too. To rebut the criticisms, you first have to acknowledge their existence, as some of the writers on this list certainly do.

Note that I’m not arguing for or against here, not recommending nor endorsing, just offering a list of books that challenged me with new ideas. Some I love; some were a real struggle. They aren’t books I necessarily agree with; where would be the fun in that? They’re books which made me think — one of the greatest services one mind can give to another.

May they do the same for you.

***

1) John Wyndham — The Midwich Cuckoos

John Wyndham seems very out-of-fashion these days. So much for fashion. His ‘logical fantasies’, as he called them, are thoughtful, well-constructed, humane explorations which reward even frequent re-reading (my copies are falling apart from years of use). Best-known is The Day of the Triffids (which has been filmed, badly, and done on TV, also badly), but any of his other work is well worth a look. I’d love to see a good film version of The Chrysalids, The Kraken Wakes, or Trouble With Lichen, or even a good updating of Triffids or The Midwich Cuckoos. Wyndham’s fully aware of the powers and perils of science, and you feel he’s longing for humanity to wake up and really get to grips with it, but he also likes people, and that warmth makes The Midwich Cuckoos a wonderful read.

2) Julien Benda — The Treason of the Intellectuals (La Trahison des Clercs)

From fiction to another kind of insight. This slight but hugely thought-provoking book rages against the control-freakery, utilitarianism, and techno-optimism that can accompany the gift of higher education.

PP. 151-2: “the man who loves science for its fruits commits the worst of blasphemies against that divinity.”

PP. 153-4: “Let me also point out their devotion to the doctrine (Bergson, Sorel) which says that science has a purely utilitarian origin — the necessity of man to dominate matter, “knowledge is adaptation”; and their scorn for the beautiful Greek conception which made science bloom from the desire to play, the perfect type of disinterested activity.”

Science as play may not sit well with today’s economaniac ‘impact’ agenda, but studies of creativity suggest Benda might have a point.

3) Ludwig Wittgenstein — Philosophical Investigations

I said in one of my books that Wittgenstein should be required reading for all neuroscientists, and I think there’s a good case for extending that to all the sciences. There are few better antidotes to the disturbingly closed-minded arrogance that can, alas, come with feeling you have privileged access to reality. Wittgenstein teaches us to question. He’s elliptical, sidelong, infuriatingly tentative: the very opposite of dogmatically secure. A brilliant philosopher, and an outsider at least twice over, he reminds scientific colleagues that doubt, uncertainty and awareness of one’s cognitive limitations are part of the human condition — and that includes science.

4) Robert Lifton — The Nazi Doctors

This mesmerising, profound and deeply humane work, by a highly influential psychiatrist, is the darkest book on my list: unsurprisingly, given its terrible subject matter. It makes a point no scientist should forget — that science, which can be used to do such good, can also lead people into doing appalling harm. Lifton makes clear that the scientific approach was neither an add-on nor a cover for Mengele and his ilk; instead, it was key to their intentions. Though Nazi science was undoubtedly ‘bad science’ in the moral sense, we cannot say it wasn’t science at all.

That means we can’t just say, ‘Oh, that was then, it could never happen nowadays!’ and assume that all scientists are rational, beneficent angels. Instead we should say, ‘OK, ‘doing science’ is one of many excuses used to maintain unjust systems and even justify atrocities, so how do we make sure that can’t happen again?’

When I come across an especially maddening techno-optimist, blithely convinced that science knows best, that history and human nature are irrelevant, and that the right research can fix any problem (and, worse, that this means we don’t really need to worry about the problems), Lifton’s is the book I wish I could make them read, mark, learn and inwardly digest.

5) Isaac Asimov — The Early Asimov

Asimov is best known for his ‘Robot’ stories and his Three Laws of Robotics, but I also like his early short-form work. It may be rougher, but it gives you a great sense of the fun in science, even though he’s not afraid to look at its moral implications. He’s one of the writers who did most to inspire my love of science — and I’m sure many other people’s too. I’m the lucky owner of a 3-volume edition of his early writings. A particular favourite is ‘The Endochronic Properties of Resublimated Thiotimoline’; a very academic joke, but great fun.

6) Sandra Harding — Science and Social Inequality

Sandra Harding is one of the most highly-regarded feminist critics of science, and her book asks some very awkward questions. Anyone who could find a way of making senior science men take her arguments on board would deserve a medal, especially as for ‘females’ one can easily substitute other minorities — in which most sciences are deplorably deficient. When was the last time you came across a publicly gay scientist, or one who was disabled and wasn’t Stephen Hawking? An argumentative book, this, and good to argue with. You don’t have to agree with everything Harding says in order to have your mind usefully stretched by her ideas. For example:

P. 10: “Conventionally, science has been thought of as fundamentally a set of statements or sentences — the laws of nature, observation sentences, and the like. Yet this way of conceptualizing it obscures how social and political values and interests seem to flow out of scientific work “behind the backs” of the scientists. The representationalist account seems to absolve the scientific enterprise of any responsibility for the various politics that flow from its representations.”

P. 64: “Imagine if every science department contained the proportion of “science critics” to scientists that there are of literary critics to creative writers in English departments.”

7) M.R. Bennett and P.M.S. Hacker — Philosophical Foundations of Neuroscience

Talking of mind-stretching: if you really want an intellectual workout, try this one. If Wyndham’s a brief stroll and Benda a jogging mile, this is a marathon. The authors lay into the sloppy thinking found in far too many neuroscience articles, with the exemplary hope of persuading researchers to think better. Not much sign of it working, so far, but you can’t fault them for effort.

PP. 77-78: “The relationship between a (conscious) belief and an unconscious belief, for example, is not akin to the relationship between a visible chair and an occluded chair – it is not ‘just like a conscious belief only unconscious’, but more like the relationship between √1 and √-1.”

P. 209: “Emotions are neither brain states nor somatic reactions”.

And there’s plenty more where those two snippets came from.

8) A. E. van Vogt — The Weapon Shops of Isher

Alfred van Vogt, like Asimov a giant from US sci-fi’s golden age, fell out of the mainstream years ago. No one I’ve asked lately has even heard of him. Yet his books, such as Slan, The World of Null-A and The Pawns of Null-A, are fast-paced thrillers which also ask riveting questions about the relationships between technology and power. They brim with technological wonders: totipotency, similarisation, mind control and much besides. His voice is distinctive, and his politics, for a British kid growing up in a soft-liberal household, were eye-opening; they’re especially clearly set out in The Weapon Shops.

9) Donatien de Sade — Justine

Yes, none other than the notorious Marquis de Sade, so be warned: if you attempt this monster of a book you’ll have to wade through a good deal of sexual and other violence, including rape, paedophilia and murder. The best that can be said about Justine‘s delight in human perversion is that 120 Days of Sodom is worse.

But … in among the tediously revolting depictions is one of the clearest, most honest, and most challenging presentations ever seen of the doctrine of nihilism. Sade takes modern secular arguments — including some which underpin such central scientific ideas as evolution — to their logical extreme, and says sardonically, ‘Is this really what you meant?’ He’s disgusting, sure, but he’s also smart. Unfortunately, the philosophy’s so embedded in the crud that you can’t easily skip the one to get to the other.

“One must never appraise values save in terms of our own interests. […] there is no rational commensuration between what affects us and what affects others; the first we sense physically, the other only touches us morally, and moral feelings are made to deceive; none but physical sensations are authentic”.

“virtue is not some kind of mode whose value is incontestable, it is simply a scheme of conduct, a way of getting along, which varies according to accidents of geography and climate and which, consequently, has no reality”.

“man has not been accorded the power to destroy; he has at best the capacity to alter forms, but lacks that required to annihilate them: well, every form is of equal worth in Nature’s view; nothing is lost in the immense melting pot where variations are wrought […] and whatsoever be our interventions in this process, not one of them, needless to say, outrages her, not one is capable of offending her. […] Why! what difference does it make to her creative hand if this mass of flesh today wearing the conformation of a bipedal individual is reproduced tomorrow in the guise of a handful of centipedes? Dare one say that the construction of this two-legged animal costs her any more than that of an earthworm, and that she should take a greater interest in the one than in the other? If then the degree of attachment, or rather of indifference, is the same, what can it be to her if, by one man’s sword, another man is transspeciated into a fly or a blade of grass? When they will have convinced me of the sublimity of our species, when they will have demonstrated to me that it is really so important to Nature, that her laws are necessarily violated by this transmutation, then I will be able to believe that murder is a crime”.

So, realism about feelings + moral relativism + realism about nature = it’s OK to murder someone. Oops.  To defend science, we have to be able to answer Sade’s challenge.

(Meanwhile, ‘transspeciated’ is quite a word.)

10) Kurt Vonnegut — Galapagos

From Sade’s cold, ferocious humour to this short novel — also brilliant, but far more humane. It’s Vonnegut, so it has his unmistakeable riotous style. It’s about evolution, sort of, but it’s also about human folly (a favourite theme of this particular author). It’s weird and zany and teasing, making you ponder as well as making you laugh. Plus the narrator’s a decapitated ghost. Enjoy.

Finally, if you’ve any suggestions for other stimulating books, do let me know.

Videos about cruelty

April 29, 2013 at 11:57 am | Posted in Uncategorized | Leave a comment
Tags: , , ,

‘April is the cruellest month’, the poet T S Eliot tells us. It seems an odd remark to make about spring, especially this year in Britain, with spring so late and longed-for. And yet mood disorders, hospital admissions, heart disease, even suicide statistics show a definite peak at this time of year.

Cruelty also fluctuates. Domestic violence charities, for instance, say Christmas is always a bad time. But there’s no time when someone, somewhere, isn’t suffering because of someone else’s cruelty.

The first thing most people say about cruelty — after their initial horrified/disgusted/angry/unwillingly fascinated reaction — is ‘Why?’ As a guide to possible answers, I’ve done a set of short introductory videos about cruelty, now available on YouTube.

By the way, it’s OK to feel the fascination as well as the horror, anger and disgust. It doesn’t mean you’re cruel yourself, deep down; it means you’re human. Cruelty, in evolutionary terms, is a significant threat. Evolution didn’t bargain for books and blogs and video games and movies. Our brains evolved to react to other people’s cruel behaviour — including the fictional kind — as if to a dire and imminent danger. We find it hard to tear our eyes and minds away, even as we’re repelled, because concentrating on a threat was better for our ancestors than ignoring it and hoping it would go away.

Ignoring it and hoping it’ll go away, incidentally, is one of the two most popular strategies for dealing with cruelty. The other is reacting with even more cruelty. Both have repeatedly failed. That’s why I wrote a book about cruelty: because the only other strategy we have is to try to understand why cruel behaviour happens.

It’s a good strategy. It’s worked for other kinds of human sickness. (We lucky Westerners forget how many afflictions, from malnutrition to cholera, used to kill children who now survive.)

Cruelty is much more about sickness, failure and inadequacy than it is about evil, glamour and excitement. That ice-cool psychopathic killer sexing-up a movie? In real life he started out as pathetic and miserable, and he stayed childish till the final shoot-out. To treat cruelty as evil is tempting, but unhelpful. Calling something evil won’t help us see what’s causing it, whereas delving into the backgrounds of serial killers might. Getting away from the idea that cruel people are unfathomable also reduces our fascination with them, converting it to pity (or contempt). Paradoxically, that helps focus attention on victims, who are often given much less thought than their attackers.

Breathing through a scented handkerchief fails to protect against cholera, but cleaning up the water supply saves lives. To decide to clean up the water, however, you first have to understand how cholera spreads. John Snow, the father of epidemiology, looked at disease patterns scientifically, and his understanding is still saving lives today.

Likewise, curing the malaise of cruelty requires us to understand what makes it spread and flourish, or decline. Science may or may not be able to convince politicians to act in ways which reduce cruelty, but without science, we’ll never know how best to stop it.

Here’s the playlist.

A pronunciation guide to great men (and the obligatory token woman)

April 3, 2013 at 1:04 pm | Posted in Uncategorized | 8 Comments
Tags: , , , , , ,

If you’re like me, when you’re reading about someone it helps to know how their name’s pronounced. Just seeing it on the page isn’t enough; there’s always that slight mental hiccup as your inner voice tries to say the word, and fails.

And for some of the great names who’ve shaped the history of Western thinking (see the slideshow below), pronunciation ain’t so easy to figure from the letter mix. If you’re learning about them in lectures and seminars, well and good, but what if your first acquaintance is purely visual?

To help out, and maybe raise a smile along the way, I’ve done a little guide to some of the trickiest stars of philosophy, psychology and literary theory. If there are other names, from other disciplines, whom you’d like to see included, let me know!

Continue Reading A pronunciation guide to great men (and the obligatory token woman)…

The Atheist Credo

March 22, 2013 at 12:54 pm | Posted in Uncategorized | 9 Comments
Tags: , , , ,

I’ve just been listening to Alain de Botton on the radio, talking about his favourite music and his book Religion for Atheists. He argues — and he’s not the first to do so — that religion has some good ideas, if you strip out all that supernatural gunk. I paraphrase. Rather than dismissing it wholesale, atheists should therefore borrow the good bits.

(So it’s OK to love Bach, as de Botton does. Indeed, I’ve never come across an atheist intellectual who doesn’t love Bach, though there may be some.)

Since statements of faith are an important part of religious practice, I’ve taken de Botton’s proposal one step further to come up with an atheist credo (an ‘I believe’ for non-believers). Since such apparent basics as prosody and rhythm have important effects on mood and atmosphere, and since it’s nearly Easter, I’ve modelled my attempt on the Christian equivalent. After all, they’ve had centuries to get it right.

The Atheist Credo

I believe in one method

of data, hypothesis, and experiment

which was conceived by ancient Greek thinkers,

born in the Age of Enlightenment,

suffered under superstition

is struggling under religion

is bound to make people’s lives better

and will one day bring about a perfect world.

Mind Control and Big Neuro

March 19, 2013 at 12:54 pm | Posted in Uncategorized | Leave a comment
Tags: , , , , , ,

The recent clutch of Big Neuroscience stories in the media (see my previous post) has raised an old concern in some of the media’s less — shall we say? — scientifically reputable outlets, like Esquire magazine. Could fearsome-sounding neuroscience technologies like nanoscience, optogenetics, deep brain stimulation and so on ever be used for mind control? The idea is frightening — and intriguing.

Some neuroscientists (not all) are instantly dismissive. Sample reaction on Twitter:

Mind control? Neuroscientists WISH this was in the realm of possibilities. More sensationalist BAM b.s. via @noahwg esquire.com/the-side/featu…

— Emilie Reas (@etreas) February 22, 2013

‘Mind control? I wish!’ is an understandable response, but is mocking humour the best response to anxiety? That’s a serious question, and the answer depends on the humourist’s goal. Is it to preserve status and protect neuroscience’s reputation, or is it to ease the concerns of people who take the prospect of imminent brainwashing seriously? The media do sensationalise, but they don’t do so at random; they generally know what works. People fear being manipulated.

Of course some researchers will react dismissively. Who wants their shiny new science tainted by association with the sordid cruelties of early brainwashing research? But before you dismiss the idea, bear in mind that the leading scientists behind the Brain Activity Mapping Project, which US President Obama hopes to back with $3 billion funding, raise the issue of mind control themselves as one of the difficult ethical problems which may arise in the course of their research (their article in Neuron is here).

The Frankenstein stereotype of scientists as seeking to dominate nature remains influential, and in brain research, of course, ‘nature’ means ‘us’. If you don’t know the gritty details which make neuroscience research so painstaking and difficult, it’s easy to imagine the worst.

And the pressure on scientists to hype up the ‘impact’ of their work, stretching steps to advances, advances to breakthroughs, and breakthroughs to exciting challenges, is not helpful either. If one neuroscientist’s press release says he’s used fMRI to decode what somebody’s thinking, is it quite fair for another’s blog to sneer at those poor fools who fear that the government may soon be reading their minds?

To be clear: the issue of mind control may arise. No way can we do it yet, and no researcher knows whether it will be possible or not in the future. The brain’s really hard, but science is littered with people who said ‘never’ and were proved wrong, so the opinions for and against are matters of personality and faith, not secret knowledge. Cynics will sneer, optimists hope, pessimists dread, psychopaths plot, and geeks plough on regardless — and entertaining though this all may be, it’s not science.

Meanwhile, there’s more danger of mind control from watching TV ads too long than from your local neuroscience lab. Even if precision brain control does become a real possibility, my own expression of faith is that you probably have more to fear from your (nonlocal) megacorporations, government and the military than your friendly neighbourhood brain researchers.

Having written books about both brainwashing and the future of neuroscience technologies, I herewith add my tuppenceworth on why we find the topic so enthralling, in the form of a short video about the ancient dream, and modern science, of mind control.

I hope you’ll find it useful.

The brain supremacy’s on its way with big neuroscience

February 27, 2013 at 1:39 pm | Posted in Uncategorized | 3 Comments
Tags: , , , , ,

Like the Big Genetics of the Human Genome Project before it, Big Neuroscience has gone mainstream. In his recent State of the Union address, US President Barack Obama mentioned brain research, and it’s thought that his next budget will seek $3 billion in funding for the Brain Activity Map Project, an ambitious attempt to use nanotechnology and genetics to investigate brain function. The way in which the excitement of many connected neurons gives rise to coordinated patterns of brain activity is not well understood, and the project hopes to start small and work up, one step at a time, to the human condition.

$3 billion! That should ease the physics envy somewhat.

The brain supremacy is on its way. Brain research has been moving up the science hierarchy for a while. The European Union has announced a large dollop of funding for another Big Neuro project, this time to build a computer model of a brain. While the President was setting his neurohare loose, ably assisted by the New York Times, the BBC confusingly chose to highlight another big brain mapping project, the Human Connectome Project. Big Neuro, big news.

Just to clarify, the Human Connectome Project has been going for a while, and its aim is to study the physical connections between brain areas (their structural connectivity). The new Brain Activity Map Project aims to study how brain areas interact (functional connectivity). Since you can in principle have a link between two neurons that is not used, or one that is created or that dies off, structural and functional connectivity aren’t the same.

Also, re terminology: the ‘connectome’ is the set of all the links between neurons in an organism, and the term was first used to refer to physical links. So you can have a connectome (i.e. a structural connectome, the wiring), and you can have a functional connectome, a list of which bits communicate.
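For the programmatically minded, the distinction can be sketched in a few lines of Python. This is purely illustrative, with made-up neuron labels — nothing here comes from the actual projects:

```python
# Structural connectome: the physical links between neurons (the wiring).
structural = {("A", "B"), ("B", "C"), ("A", "C")}

# Functional connectome: which pairs actually communicate. A physical
# link may go unused, so this need not match the wiring.
functional = {("A", "B"), ("B", "C")}

# Wired but silent: links present anatomically yet unused functionally.
unused = structural - functional
print(unused)  # prints {('A', 'C')}
```

The point the sketch makes is simply that the two sets can differ: mapping one doesn’t give you the other for free.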

Oh, and the Brain Activity Mapping Project ought to be abbreviated to BAMP, by analogy with the Human Genome Project (HGP) and the Human Connectome Project (HCP). But everyone’s calling it BAM, thereby proving that even scientists are susceptible to the irrational urge to prioritise sound over sense. Or maybe it’s all those happy memories of comic-book superheroes …

Human brains have around 86 billion neurons, roughly the same number of glia, and more neurotransmitters, hormones and receptors than you could shake a stick at. Human brains, therefore, are not where the new project will start. Instead, it will spend an immense amount of taxpayer cash on animal research. The brains to be mapped will be those of worms, flies, small mammals, possibly primates. Like I said, one step at a time.

Nonetheless, if you spot a neuroscientist with an unaccustomed swagger, chances are those 3 billion dollars will be why.

For more comment from the online community, try the following:

Nucleus Ambiguous

Knight Science Journalism

Mo Costandi on ‘connectome-ism’

Mind Hacks

Videos about brainwashing

February 26, 2013 at 12:17 pm | Posted in Uncategorized | 3 Comments
Tags: , , , , ,

I’ve done a set of short introductory videos about brainwashing, now available on YouTube.

In them, I’m arguing that brainwashing is not magical super-tech. Nor is it entirely propaganda. It relies on human psychological weaknesses — or, to put it another way, on the remarkable flexibility of our brains.

I talk about what brainwashing is and whether it really exists, its history, types of brainwashing, and the five core psychological techniques of brainwashing. (That last one’s a bit longer.)

For a taster, here’s the first one, on ‘What is brainwashing?’
