Archive

Posts Tagged ‘science’

Is it or isn’t it?

Comet ISON, I mean. It went whizzing around the Sun on (US) Thanksgiving, and fizzled, thus ending the hopes of amateur astronomers like me for a December show.

[XKCD comic]

Except it didn’t, quite.

This ESA/NASA Solar and Heliospheric Observatory timelapse image shows the bright comet heading in, and something heading back out. (Remember that a comet’s tail points away from the Sun no matter which way it’s going.)

[SOHO LASCO C3 timelapse image]

While it looks as if ISON won’t be visible, watching the science unfold over the past few days has been utterly fascinating. Most people never get to see data come in and science happen in near real-time; they’re exposed only to the articles written after everything is known. This blog post especially highlights the joy and frustration.

Karl Battams writes there:

And I just want to end on this note: not long after comet ISON was discovered, it began to raise questions. Throughout this year, as many of you who have followed closely will appreciate, it has continued to confuse and surprise us. For the past few weeks, it has been particularly enigmatic and dynamic, in addition to being visually spectacular. This morning we thought it was dying, and hope was lost as it faded from sight. But like an icy phoenix, it has risen from the solar corona and – for a time at least – shines once more. This has unquestionably been the most extraordinary comet that Matthew and I, and likely many other astronomers, have ever witnessed. The universe is an amazing place and it has just amazed us again. This story isn’t over yet, so don’t stray too far from your computer for the next couple of days!

Phil Plait has done his usual good job summarizing the ups and downs and ups and downs of ISON-watching, with his post from yesterday offering video and analysis.

David Levy famously said, “Comets are like cats: they have tails, and they do precisely what they want.” Definitely.

Even through my disappointment, I’ve found the real-time science a lot of fun to watch: the data coming in, the changing interpretations, the frantic scientists trying to figure out what to say to the inquiring public. More science-fictional scientists should behave like this!

Rat Telepathy: Let There Be Nuance

The scientists developing rat telepathy have an aim. They want to answer the question: Can mammal brains be trained to communicate with each other electrically? Research shows that they can, in rats at least. Much remains to be seen: Will it work in other mammals? In humans? For now, the electrical communication is one-way; is ‘telepathic’ repartee possible? Is it possible for different species to communicate effectively brain-to-brain?

We know from other research that some mammal brains can control machines designed for that purpose. But what about feedback – sensory information simulating touch transmitted from prosthetic to brain, for example? Current science seems to indicate it’s possible. It’s certainly an easy leap to make in fiction, but any scientist worth her weight in pipettes will tell you that it’s usually a few orders of magnitude harder to manage in real life.

Just navigating the administrative and regulatory obstacle courses between the lab bench and the clinic is costly in terms of time and money, yes, but it also takes a toll on the heart. The business of science is hard on people. It consumes researchers in much the same way biologists burn through reagents, and at metaphorically the same rate. It’s dimensions harder for scientists who experiment on animals. Not only are the mountains of paperwork piled higher and the pitfalls dug deeper, but the researchers are also human, with animal lovers and vegans among their number.

How can anyone tolerate animal testing? For that matter, how can anyone eat meat? It’s all down to our capacity for cognitive dissonance. Hypocrites! Idealists!

It’s complicated. It’s hard and it should be. We should be suspicious of over-simplification, even in our own fiction. We should look closely, listen carefully, and imagine with depth. In our writing, we should resist the ‘mad scientist’ trope. For a fun change of pace, avoid pursuing plot devices to their logical extremes. Instead of painting science as the villain, how about shining a light on the tensions that emerge when budget constraints – sequestration, anyone? – force post-docs to compete with their mentors for increasingly limited federal funds? Why not examine the consequences of allowing basic science to languish while throwing money at the few headline-making scientists so adroit at standing on giant shoulders that they achieve celebrity status? What happens to a civilization after a generation of quiet giants is lost?

And what of the tender-hearted scientist? She’s no fool. She would never release lab animals into the wild. Instead, her data is better because she stacks the deck in favor of her furry subjects. They have the best care possible under the circumstances, and it shows in how well her test results stand up to peer review. Her work informs others’, and ultimately the care provided in her vivarium becomes a high standard to which others are held. She is loved and hated. As can happen to any normal person, she becomes known in certain circles for something other than what she intended. It’s a pain, but one she knows she’s lucky to have. Her social media presence is trolled by animal rights protesters and zealous novelists alike. In the end, she’s offered few choices: Embrace celebrity or obscurity. Chase research dollars or abide by evidence-based principles… What’s a tender-hearted scientist to do?

There is room in any given story for both technological advancement and, oh, the humanity! We can navigate the inelegant intersection of rat telepathy and animal rights. Why don’t we? We could place blame with ignorant writers, lazy readers, or publishers who aim for the lowest common denominator, but the real answer is more beautifully complex. It encompasses everything from public funding for STEM education to the social stigma of being an unironically enthusiastic science nerd. It defies gender binaries and thumbs its nose at stereotypes. There is no overly simple answer.

So, let’s look beyond the obvious extrapolations from this most recent piece of sensational science news. Let there be nuance.

Scientists Gone Wild

“He had the greatest mind since Einstein, but it didn’t work quickly. He admitted his slowness often. Maybe it was because he had so great a mind that it didn’t work quickly… I watched him hit that ball. I watched it bounce off the edge of the table and move into the zero-gravity volume, heading in one particular direction. For when Priss sent that ball toward the zero-gravity volume – and the tri-di films bear me out – it was already aimed directly at Bloom’s heart! Accident? Coincidence? …Murder?” The Billiard Ball – Isaac Asimov

In “The Billiard Ball”, first published in the March 1967 issue of If, Asimov presents a story in which scientific competition rises to the level of murder. Maybe.  Asimov understood that scientists are human beings, and can be arrogant, petty, cruel, and filled with hatred. These traits, in turn, can make for a compelling science fiction story. And if you’re looking for inspiration, there’s plenty to be found.

Lord Kelvin, a brilliant mathematician and physicist, accused Wilhelm Roentgen, who announced the discovery of X-rays in 1895, of fraud. He argued that the cathode-ray tube, which Roentgen had used in his discovery, had been in use for a decade, and therefore if X-rays actually existed, someone would have already discovered them. True, he eventually came around and apologized, but calling a fellow scientist a fraud is pretty serious.

But Kelvin was actually the least of Roentgen’s attackers. Roentgen had borrowed a cathode-ray tube from physicist Philipp Lenard, who had been exploring fluorescence using cathode-ray tubes before Roentgen, although Lenard had failed to pursue the phenomenon’s origins or photographically document his findings. Lenard became angry that Roentgen hadn’t acknowledged his work in developing some of the technology that led to Roentgen’s discovery, and for years he demanded credit for the discovery of X-rays while simultaneously (and wrongly) arguing that they were just a kind of cathode ray with new properties rather than a different phenomenon. Lenard’s attacks on Roentgen lasted until Lenard died in 1947, and because of them, Roentgen left orders that all his papers concerning X-rays prior to 1900 be burned, unopened, upon his death. Lenard went on to become an early member of the Nazi party, an advisor to Adolf Hitler, Chief of Aryan Physics, and a fierce opponent of Albert Einstein and “the Jewish fraud” of relativity.

Then there’s English inventor and scientist Robert Hooke. Hooke was a polymath, often referred to as the English Leonardo da Vinci. He discovered Hooke’s Law (the extension of a spring is proportional to the applied force), contributed to our knowledge of respiration, insect flight, and the properties of gases, coined the term “cell” to describe the individual units making up larger organisms, and invented the universal joint, the anchor escapement in clocks, and numerous other mechanical devices. His work on gravitation preceded Newton’s, his Micrographia was the first book on microscopy, his astronomical observations were among the best of his time, and he was an architect of distinction and a Surveyor for the City of London after the Great Fire.
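
In modern notation, the law that bears Hooke’s name is written

F = kx

where F is the applied force, x the resulting extension, and k the spring constant measuring the spring’s stiffness.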

But he was also an ass, especially when it came to Isaac Newton.

Hooke and Newton were involved in a dispute over the idea that the force of gravity follows an inverse square relationship, defining the elliptical orbits of the planets, as well as over Newton’s theory of light and colors. In 1672, Newton was elected Fellow of the Royal Society of London, and his first letter on Light and Colors was read to the Society. Hooke, at the time a respected senior scientist and Curator of Experiments for the Royal Society, attacked Newton’s theory, and also claimed that he had invented a reflecting telescope before Newton (Newton had actually invented his in 1668). Newton fought back, and won, but in January 1676 Hooke again attacked Newton, alleging that Newton had plagiarized Hooke’s Micrographia, which contained Hooke’s own theory of light.

Despite the attacks, the two men corresponded, and in a private letter Newton shared calculations that, he believed, showed that the path of a body falling to Earth would be a spiral. Unfortunately, Hooke realized that Newton’s argument only held true if the body were precisely on the equator, and that in the more general case the path would be an ellipse. In 1679, just after Newton’s mother had died, Hooke exposed the error to the Royal Society, and after briefly responding to Hooke, Newton stopped writing to anyone for over a year.

In 1686, when the first book of Newton’s Principia was presented to the Royal Society, Hooke claimed that Newton had obtained from him the “notion” of “the rule of the decrease of Gravity, being reciprocally as the squares of the distances from the Center”. Only the diplomatic intervention of Edmond Halley persuaded Newton to allow the publication of the final book of the Principia, with Halley telling Newton that Hooke was merely making a public fool of himself, and Newton removing every reference to Hooke from the volume.
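
In modern notation, the disputed “rule” is the inverse-square law of gravitation,

F ∝ 1/r²

that is, the attraction between two bodies falls off as the square of the distance r between their centers.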

But before you feel too bad for Newton, don’t, because Newton could be just as much of an ass as Hooke was.

John Flamsteed may not be a name that’s familiar to you, but he was Astronomer Royal, and over 30 years he had measured the positions of thousands of stars with a precision far exceeding anything undertaken before him. When Newton needed observations of the ‘double’ comet of 1680, he turned to Flamsteed, who provided them. There were some small errors in the data Flamsteed sent, and Flamsteed attempted to make amends by carrying out some of the calculations that Newton needed himself. Newton, however, caustically informed Flamsteed that he needed his observations, not his calculations. Feeling mistreated, Flamsteed threatened to withhold his data.

Newton needed these calculations for a new section he was planning for the second edition of the Principia, around 1703, on a “Theory of the Moon”, so using his courtly influence, he persuaded Queen Anne’s husband, Prince George, to commission a royal star catalogue, to be printed by the Royal Society. Flamsteed could hardly refuse this commission from his direct employer; but the moment he handed his draft data over to the Royal Society it was certain to go straight to Newton, who now dominated the Society. Flamsteed stalled, publishing the data as slowly as possible and making certain it wasn’t the data that Newton needed. When Flamsteed argued with Newton over an error in Newton’s measurement of the size of stars in Opticks, Newton deliberately excluded Flamsteed from the discussions about the publication of his catalogue, and Flamsteed’s request for a £2,000 grant to purchase a new telescope was rejected under Newton’s influence. In 1708, Prince George died, and the star catalogue project died with him. In retaliation, when Flamsteed’s membership of the Royal Society lapsed in 1709, Newton refused to renew it, effectively expelling him.

But Newton wasn’t through with Flamsteed. He needed Flamsteed’s data, and by 1711 he had persuaded Queen Anne to take up the mantle of sponsor of her late husband’s project. In a note to Flamsteed in 1711, Newton threatened that “[If you] make any excuses or unnecessary delays it will be taken for an indirect refusal to comply with Her Majesty’s order.”

The matter came to a head with the eclipse of 4 July 1711. Observations of the eclipse would have been invaluable to Newton’s calculations, but Flamsteed refused a direct order to observe it. He was ordered to explain himself before a panel of the Royal Society, and the council that was to sit in judgment of him was selected by the President of the Royal Society (Newton) and consisted of Newton and two of his most loyal supporters. The council, to no one’s surprise, ordered the immediate publication of all Flamsteed’s hard-won data.

Flamsteed’s masterwork, Historia Coelestis, was finally published in 1712, against Flamsteed’s wishes and without his involvement. The following year, Newton issued the second edition of his Principia, complete with a lunar theory based on Flamsteed’s data.

Scientists gone wild, indeed.

References

1. Asimov, Isaac, 1986. The Edge of Tomorrow. New York, NY: Tom Doherty Associates.

2. Kevles, Bettyann H., 1998. Naked to the Bone: Medical Imaging in the Twentieth Century. New York, NY: Basic Books.

3. Chapman, Allan, 2004. England’s Leonardo: Robert Hooke and the Seventeenth-Century Scientific Revolution. New York, NY: Taylor & Francis.

4. Clark, David H. and Clark, Stephen P. H., 2001. Newton’s Tyranny: The Suppressed Scientific Discoveries of Stephen Gray and John Flamsteed. New York, NY: W. H. Freeman.

5. Grant, John, 2007. Corrupted Science: Fraud, Ideology and Politics in Science. Wisley, Surrey, England: Facts, Figures & Fun.

Realism in combat: perceptual distortions

Writing combat sequences and traumatic events is always a challenge. There are plenty of questions and answers out there about the mechanics of sword fighting, bare-handed combat, or guns. Most of us can extrapolate our more ordinary experiences with adrenaline to that sort of situation, to get those flavorful extra details: cold sweat, pounding pulse, hands shaking. But it turns out there are even more options than those.

Perceptual distortions are common in combat situations. The following can be used for dramatic effect or to set up conflicts arising from differing accounts of what happened. According to the numbers reported, it’s not unusual to experience more than one of the following distortions — in fact, it would be more unusual not to experience any.

The distortions below are listed from most common to least common, as reported by Artwohl & Christensen in 1997.

Above 50%:

  • Diminished sound: Not to be confused with being deafened by the noise of gunshots or whatever else is going on. This is sound being actively screened out by the brain. It may be a case of sounds being lowered in volume, a complete blockage of all noise, or selective editing of certain noises (such as gunshots).
  • Tunnel vision: The brain can actively screen out visual information, too, so as to narrow one’s focus to the most important (threatening) thing on hand.
  • Automatic pilot: This is why soldiers and police officers drill certain sequences of actions into becoming reflexes — because when one’s conscious brain shuts down under the tidal wave of adrenaline, that’s what still works. If your character hasn’t trained his automatic pilot to fight…?
  • Heightened visual clarity: This is why some combat pilots can describe, 50 years later, the look on the face of the enemy pilot they shot down. Adrenaline can burn images onto the brain.
  • Slow motion time: In addition to being a cool movie effect, this can actually seem to happen. Some swear that they saw the bullets zipping by, and who’s to say they didn’t?
  • Memory loss: In addition to the sights and sounds that the brain might edit out, the entire memory can be simply lost. Or is it only misplaced and waiting to burble up in a nightmare…

Below 50%:

  • Dissociation: In these situations, some get the sensation of watching themselves from a distance.
  • Intrusive, distracting thoughts: Perhaps it’s thoughts of loved ones, one’s god/goddess, or “did I leave the oven on?”
  • Memory distortions: Not lost memories, but incorrect ones. This is part of why eyewitnesses are not as reliable as we wish they were.
  • Accelerated time: Blink and you missed it. Or was your brain editing stuff out?
  • Intensified sounds: Terror can crank everything up to eleven. This can make the situation even more overwhelming, and maybe accounts for the character losing nerve and running.
  • Temporary paralysis: This one is relatively rare, but terrifying. The more quickly the subject realizes the paralysis isn’t real, the better his chances of survival.

Science asks questions

Science in My Fiction is growing: new contributors will be coming on-board over the next few weeks to write about geology, paleontology, culture, technology.

But what have we been missing? Are there general areas you’d like to see us include, or specific topics that you think would be awesome to include?

Evolution: very important. Also awesome.

We’ll always be all about the science and the fiction, but that’s an enormous area. I’d love to hear from you, our readers, about what you think is most interesting and valuable.

That Shy, Elusive Rape Particle

Note: This article originally appeared on Starship Reckless

[Re-posted modified EvoPsycho Bingo Card]

One of the unlovely things that has been happening in Anglophone SF/F (in line with resurgent religious fundamentalism and erosion of democratic structures in the First World, as well as economic insecurity that always prompts “back to the kitchen” social politics) is the resurrection of unapologetic – nay, triumphant – misogyny beyond the already low bar in the genre. The churners of both grittygrotty “epic” fantasy and post/cyberpunk dystopias are trying to pass rape-rife pornkitsch as daring works that swim against the tide of rampant feminism and its shrill demands.

When people explain why such works are problematic, their authors first employ the standard “Me Tarzan You Ape” dodges: mothers/wives get trotted out to vouch for their progressiveness, hysteria and censorship get mentioned. Then they get really serious: as artists of vision and integrity, they cannot but depict women solely as toilet receptacles because 1) that has been the “historical reality” across cultures and eras and 2) men have rape genes and/or rape brain modules that arose from natural selection to ensure that dominant males spread their mighty seed as widely as possible. Are we cognitively impaired functionally illiterate feminazis daring to deny (ominous pause) SCIENCE?!

Now, it’s one thing to like cocoa puffs. It’s another to insist they are either nutritional powerhouses or haute cuisine. If the hacks who write this stuff were to say “Yeah, I write wet fantasies for guys who live in their parents’ basement. I get off doing it, it pays the bills and it has given me a fan base that can drool along with me,” I’d have nothing to say against it, except to advise people above the emotional age of seven not to buy the bilge. However, when they try to argue that their stained wads are deeply philosophical, subversive literature validated by scientific “evidence”, it’s time to point out that they’re talking through their lower digestive opening. Others have done the cleaning service for the argument-from-history. Here I will deal with the argument-from-science.

It’s funny how often “science” gets brandished as a goad or magic wand to maintain the status quo – or bolster sloppy thinking and confirmation biases. When women were barred from higher education, “science” was invoked to declare that their small brains would overheat and intellectual stress would shrivel their truly useful organs, their wombs. In our times, pop evopsychos (many of them failed SF authors turned “futurists”) intone that “recent studies prove” that the natural and/or ideal human social configuration is a hybrid of a baboon troop and fifties US suburbia. However, if we followed “natural” paradigms we would not recognize paternity, have multiple sex partners, practice extensive abortion and infanticide and have powerful female alliances that determine the status of our offspring.

I must acquaint Tarzanists with the no-longer-news that there are no rape genes, rape hormones or rape brain modules. Anyone who says this has been “scientifically proved” has obviously got his science from FOX News or knuckledraggers like Kanazawa (who is an economist, by the way, and would not recognize real biological evidence if it bit him on the gonads). Here’s a variation of the 1986 Seville Statement that sums up what I will briefly outline further on. It goes without saying that most of what follows is shorthand and also not GenSci 101.

It is scientifically incorrect to say that:
1. we have inherited a tendency to rape from our animal ancestors;
2. rape is genetically programmed into our nature;
3. in the course of our evolution there has been a positive selection for rape;
4. human brains are wired for rape;
5. rape is caused by instinct.

Let’s get rid of the tired gene chestnut first. As I’ve discussed elsewhere at length, genes do not determine brain wiring or complex behavior (as always in biology, there are a few exceptions: most are major decisions in embryo/neurogenesis with very large outcomes). Experiments that purported to find direct links between genes and higher behavior were invariably done in mice (animals that differ decisively from humans) and the sweeping conclusions of such studies have always had to be ratcheted down or discarded altogether, although in lower-ranking journals than the original effusions.

Then we have hormones and the “male/female brain dichotomy” pushed by neo-Freudians like Baron-Cohen. They even posit a neat-o split whereby too much “masculinizing” during brain genesis leads to autism, too much “feminizing” to schizophrenia. Following eons-old dichotomies, people who theorize thusly shoehorn the two into the left and right brain compartments respectively, assigning a gender to each: females “empathize”, males “systematize” – until it comes to those intuitive leaps that make for paradigm-changing scientists or other geniuses, whereby these oh-so-radical theorists neatly reverse the tables and both creativity and schizophrenia get shifted to the masculine side of the equation.

Now although hormones play critical roles in all our functions, it so happens that the cholesterol-based ones that become estrogen, testosterone, etc. are just a few among several hundred that affect us. What is most important is not the absolute amount of a hormone, but its ratios to others and to body weight, as well as the sensitivity of receptors to it. People generally do not behave aberrantly if they don’t have the “right” amount of a sex hormone (which varies significantly from person to person), but they may if there is a sudden large change to their homeostasis – whether this is crash menopause from ovariectomy, post-partum depression or heavy doses of anabolic steroids for body building.

Furthermore, as is the case with gene-behavior correlation, much work on hormones has been done in mice. When similar work is done with primates (such as testosterone or estrogen injections at various points during fetal or postnatal development), the hormones have essentially no effect on behavior. Conversely, very young human babies lack gender-specific responses before their parents start to socialize them. As well, primates show widely different “cultures” within each species in terms of gender behavior, including care of infants by high-status males. It looks increasingly like “sex” hormones do not wire rigid femininity or masculinity, and they most certainly don’t wire propensity to rape; instead, they seem to prime individuals to adopt the habits of their surrounding culture – a far more adaptive configuration than the popsci model of “women from Venus, men from Mars.”

So on to brain modularity, today’s phrenology. While it is true that there are some localized brain functions (the processing of language being a prominent example), most brain functions are diffuse, the higher executive ones particularly so – and each brain is wired slightly differently, dependent on the myriad details of its context across time and place. Last but not least, our brains are plastic (otherwise we would not form new memories, nor be able to acquire new functions), though the windows of flexibility differ across scales and in space and time.

The concept of brain modularity comes partly from the enormously overused and almost entirely incorrect equivalence of the human brain to a computer. Another problem lies in the definition of a module, which varies widely and as a result is prone to abuse by people who get their knowledge of science from new-age libertarian tracts. There is essentially zero evidence of the “strong” version of brain modules, and modular organization at the level of genes, cells or organ compartments does not guarantee a modular behavioral outcome. But even if we take it at face value, it is clear that rape does not adhere to the criteria of either the “weak” (Fodor) or “strong” version (Carruthers) for such an entity: it does not fulfill the requirements of domain specificity, fast processing, fixed neural architecture, mandatoriness or central inaccessibility.

In the behavioral domain, rape is not an adaptive feature: most of it is non-reproductive, visited upon pre-pubescent girls, post-menopausal women and other men. Moreover, rape does not belong to the instinctive “can’t help myself” reflexes grouped under the Four Fs. Rape does not occur spontaneously: it is usually planned with meticulous preparation and it requires concentration and focus to initiate and complete. So rape has nothing to do with reproductive maxima for “alpha males” (who don’t exist biologically in humans) – but it may have to do with the revenge of aggrieved men who consider access to women an automatic right.

What is undeniable is that humans are extremely social and bend themselves to fit context norms. This ties to Arendt’s banality of evil and Niemöller’s trenchant observations about solidarity – and to the outcomes of Milgram and Zimbardo’s notorious experiments which have been multiply mirrored in real history, with the events in the Abu Ghraib prison prominent among them. So if rape is tolerated or used as a method for compliance, it is no surprise that it is a prominent weapon in the arsenal of keeping women “in their place” and also no surprise that its apologists aspire to give it the status of indisputably hardwired instinct.

Given the steep power asymmetry between the genders ever since the dominance of agriculture led to women losing mobility, gathering skills and control over pregnancies, it is not hard to see rape as the cultural artifact that it is. It’s not a sexual response; it’s a blunt assertion of rank in contexts where dominance is a major metric: traditional patriarchal families, whether monogamous or polygynous; religions and cults (most of which are extended patriarchal families); armies and prisons; tribal vendettas and initiations.

So if gratuitous depictions of graphic rape excite a writer, that is their prerogative. If they get paid for it, bully for them. But it doesn’t make their work “edgy” literature; it remains cheap titillation that attempts to cloak arrant failures of talent, imagination and just plain scholarship. Insofar as such work has combined sex and violence porn as its foundation, it should be classified accordingly. Mythologies, including core religious texts, show rape in all its variations: there is nothing novel or subversive about contemporary exudations. In my opinion, nobody needs to write yet another hack work that “interrogates” misogyny by positing rape and inherent, immutable female inferiority as natural givens – particularly not white Anglo men who lead comfortable lives that lack any knowledge to justify such a narrative. The fact that people with such views are over-represented in SF/F is toxic for the genre.

Further reading:

A brief overview of the modularity of the brain/mind
Athena Andreadis (2010). The Tempting Illusion of Genetic Virtue. Politics Life Sci. 29:76-80
Sarah Blaffer Hrdy, Mothers and Others: The Evolutionary Origins of Mutual Understanding
Anne Fausto-Sterling, Sex/Gender: Biology in a Social World
Cordelia Fine, Delusions of Gender
Alison Jolly, Lucy’s Legacy: Sex and Intelligence in Human Evolution
Rebecca Jordan-Young, Brain Storm: The Flaws in the Science of Sex Differences
Kevin Laland and Gillian Brown, Sense and Nonsense: Evolutionary Perspectives on Human Behaviour
Edouard Machery and Kara Cohen (2012). An Evidence-Based Study of the Evolutionary Behavioral Sciences. Brit J Philos Sci 63: 177-226

Empty your memory trash can? (This action cannot be undone)

PKMzeta is shaping up to be a single, targetable protein in the brain responsible for reconsolidating memories. Discover ran a three-part article on it and there was a recent article in Wired, too — the original scientific papers are behind subscription walls, unfortunately.

In brief, reconsolidation is a maintenance process for long-term memories. We think our memories are firm and unchanging, but plenty of studies have proven that they aren’t. They shift a little each time we remember them, each time we reconsolidate them, and over time those shifts add up. (And they’re often inaccurate to begin with, but that’s another issue.)

PKMzeta is a protein that hangs out in the synapses between neurons and maintains a particular type of ion channel so that the neuron is able to receive signals from its neighbors. Without PKMzeta, the number of those ion channels drops and the neuron becomes less sensitive to nearby activity.

Block PKMzeta while a memory is undergoing reconsolidation and the memory will fade. We already have one drug (propranolol) that interferes with reconsolidation, and there are sure to be more.

There are tons of questions still to be answered, of course. And there are tons of possible uses and abuses of such a thing. This is such a gold mine of science fiction possibilities that I’m sure I don’t have to list them. But I would like to bring up one.

“This isn’t Eternal Sunshine of the Spotless Mind-style mindwiping,” the article in Wired says. That may be true, but it also doesn’t address an excellent question that movie poses (if you haven’t seen it, I recommend it). The question being: you can remove the memories associated with a bad relationship with a person, but what about the underlying attraction that drew you to that person in the first place? One of the implications I took from that movie was that the two of them were stuck in a cycle of attraction, falling apart, and voluntary mind-wipes.

For “person,” above, substitute anything you like. Kittens. Drugs. Street racing. World domination… like I said, a gold mine of possibilities here.

YouTube Is The New Substitute Teacher

School, like most of everyday life, is at times boring and occasionally a waste of time. We can place blame for that squarely upon the education system and teachers, or share it with parents if we’d like to keep diplomacy in the PTA. But although it’s true that the adults who shape and deliver education as we know it are largely responsible for what we learn and how well we learn it while we are children, we have nobody but ourselves to blame for allowing ignorance to persist after we grow up.

No matter how dreadful your education experience was as a child, if you reached adulthood literate enough to use the internet, then you should find developing a passing acquaintance with basic science concepts both convenient and entertaining. The idea that learning should be fun and easy is so compelling that YouTube is positively swarming with video bloggers enthusiastically sharing knowledge.

Because I am a science enthusiast and a lifetime devotee of independent study, I’ve compiled a video playlist of some of my recent favorites in that genre. To eliminate some common misconceptions, the playlist opens with the definition of science. From there, it builds from some interesting basics about water and carbon, covers some of the science frequently botched by Hollywood and in other fiction, and demonstrates that girls plus math equals win. Then follows a musical interlude, but it’s all science, so it’s all good. The last few are a sampler of videos posted by universities and science publishers for viewers who prefer productions with bigger budgets.

Now all you have to do is watch and learn.

Traumatic Brain Injury

Last spring, I put out a call on my public journal for topic suggestions. A friend of mine, a traumatic brain injury [Wikipedia] (TBI) survivor, suggested I explore what TBI [Mayo Clinic] has taught us.

Like many of the topics I’ve written about here, I had much to learn before I could begin. Once I researched TBI [Neurologic Rehabilitation Institute at Brookhaven Hospital], I had difficulty breaking the vast topic [Open Directory] back down into a streamlined piece. I have my former editor, Kay Holt, to thank for some of the links I will be including and also for the flow of the piece. As usual, the links will take you to articles that explore the main and related topics more thoroughly. Please have a look beneath the surface.


All Aboard The Science Bandwagon

The crime rate may be down, but there are still plenty of villains to catch. Fortunately, science is on the case. That’s true in real life, where physicists devise more accurate ways to interpret blood spatter, and mathematicians analyze the patterns in gang violence to help solve old crimes and suppress future criminal activity. And it’s true on television, where forensic science has developed a vast and squeeing fandom.

Predictably, that fandom overlaps speculative fiction fandom quite a bit, but sadly, television science appears to have eclipsed science in sci-fi altogether. While it’s wonderful that scientists and Hollywood are forging new alliances for the sake of conjuring realism and as a canny method of reminding the masses that science is relevant to their interests, it’s disheartening to watch literature surrender that influence one sparkly vampire at a time.

No, it’s worse than disheartening. It’s uninteresting. And it’s unhealthy for speculative fiction to eschew – even disdain – science. Reading science-less sci-fi is like eating a junk food diet. How can the genre with science in its name be taken seriously if it’s about as intellectually nutritive as a Twinkie? Was it inevitable that television would eventually surpass literature as inspiration as well as entertainment?

Wonder of wonders, TV viewers like a little science in their fiction! Given the overlap between television audiences and people who read books, it’s probably safe to assume that readers also like a little science in their fiction. We should get back on that bandwagon.