Hello everyone,

I'm afraid I haven't had much time to dedicate to the ol' blog recently. I've been hard at work co-writing op-eds on child development in preparation for the release of my upcoming book. The first one came out yesterday and has gotten a bit of press.

Check it out in The Huffington Post and The Guelph Mercury.
 
 
Genetic science brought into focus many of the murky and formless suspicions about inheritance that humans have held, in various tenebrous forms, for millennia. We've long known that lineage matters-- concepts crudely resembling our modern understanding of genes harken back to the ancient Greeks, and certainly even our earliest ancestors noticed the predilection of children to resemble their parents-- but the exact mechanism involved in passing along these traits eluded us until relatively recently.

With the discovery of DNA-- or, more accurately, the discovery of DNA's importance in heredity-- scientists felt they finally had the slippery business of inheritance nailed down. Genes accounted for everything, they believed, and with that assumption came a certain amount of fatalism. Everything about us, we were told, was hardwired from conception. Athleticism, intellect, skill, it all crystallized beneath a veneer of predestination. Why bother doing anything at which you're not genetically inclined to excel?

I don't intend to belittle the important role genes play in our development. They are a key element of what makes us us. But they are not the only element. 

More Than Just Genes

The phrase Nature versus Nurture describes a debate that raged for much of the twentieth century between nativists and behaviourists regarding whether certain traits are inborn or learned. Said war has reached something of a ceasefire. While many of the particulars remain the subject of dispute, scientific opinion concedes that the answer involves a blending of the two camps.

Many excellent and interesting books exploring these issues have been published recently. David S. Moore and Matt Ridley each wrote a treatise on the subject, emphasizing the nuanced interplay of genes and early childhood experiences and its effect on human development. 

Eva Jablonka and Marion J. Lamb took it one step-- or perhaps I should say two steps-- further in  Evolution In Four Dimensions: Genetic, Epigenetic, Behavioral, and Symbolic Variation in the History of Life. The book, as its title suggests, proposes a "four-dimensional" model of human inheritance in which each factor complements the other, its axis defining but one group of aspects from which the organism as a whole is assembled.

Two of the dimensions, genetic and behavioural, more or less resemble the old nature-nurture dichotomy. The third and fourth dimensions are more unusual. The epigenetic dimension refers to a collection of molecular switches attached to our genes and capable of turning them on or off (more about that in a later post). The symbolic dimension refers to our species' unique capacity to transmit information through print. Text, argue Jablonka and Lamb, is simply another form of inheritance, serving as it does to convey skills, ideas, and behaviours to a subsequent generation.

A Numbers Racket

You can agree with these authors or not. No one is saying any one of them has all the answers. The point is that, among the innumerable mysteries of human inheritance, a few things are certain. One is that genes are important. Another is that they aren't all-important. 

I like to think of genes as established probabilities. Based on your genetic blueprint, you can predict, with varying degrees of accuracy, certain parameters for your physical and mental development. 

Some of these parameters are pretty firm, such as height. If you are genetically inclined to be somewhere between 5'-2" and 5'-8", you probably aren't going to crack 6'-6", no matter how nourishing your diet or robust your stretching regime.  

Other parameters are more malleable. IQ, for instance. Genes have been shown to have some impact on IQ, but these effects really only appear in affluent homes. Among more impoverished families, where a child's access to the basic needs of intellectual development (books, decent schools, etc.) is less certain, the environment takes center stage.

Does that mean beyond providing a modest level of security, support, and stimulation, children's environments have no effect on their physical or mental development?

Not at all.

Sum of Your Parts

The truth is, much of what makes us who we are is still unknown. The general topography is pretty well mapped, but the finer points, the ponds and streams and hillocks, are more or less uncharted space. 

In a few instances, genes really do run the show. Cystic fibrosis is a good example. If you inherit two faulty copies of the gene, you get the condition. But in the vast majority of cases, things aren't so simple. Conceding to genetic determinism is a lot like playing the lottery, losing, and then quitting your job and dropping out of school because you're not rich. You didn't get the golden ticket, sure, but there are plenty of paths to the top. Sometimes genes lead the way, other times a good upbringing. There's no right answer.

The world would be a pretty boring place if there was.
 
 
As a place to explore, the surface of the Earth is pretty played out. There have been few if any big boons for land explorers in the last hundred years, and with the advent of GPS technology and global satellite mapping, the idea of hacking through a jungle canopy in search of some great lost city seems downright anachronistic.

We have new frontiers now. Space. The ocean floor. The unfathomable micro-cosmos of subatomic particles. And, perhaps the strangest and most entrancing shores of all, the human brain.

It's a Jungle In There

Despite a litany of Latin terminology, state-of-the-art fMRI technology, and hundreds of neuroscience programs around the world, much of the brain remains shrouded in mystery. We do not yet understand how memory works, what emotions or consciousness are, or why we dream. We fumble through the brain's almost limitless connections-- many people know the oft-touted statistic that there are as many neurons in the human brain as stars in the galaxy-- hoping to discover how the whole thing links together. A blueprint of consciousness.

We aren't there yet. Our maps are rough and sketchy and filled with yawning blank stretches where There Be Dragons. But the first few outlines of that strange and distant neural shore have already been charted.

And as far as those preliminary sketches go, Dr. Axel Visel has drafted a particularly useful one.

The Gene Atlas

The cerebrum is the home of the "higher" brain functions, by which I mean those associated with  conscious thought. In mapping the human genome, scientists have discovered most of the genes responsible for building the cerebrum; however, genes alone aren't enough to construct a mind, and the supplementary materials responsible for starting the process have, until recently, been less well understood.

Certain sequences of DNA are not genes in the traditional sense. Rather, they are regulatory elements that encourage or inhibit the expression of other DNA sequences. One such element is called a gene enhancer. Though closely partnered with the genes they modify, gene enhancers are not necessarily located anywhere near them. They can be upstream, downstream, buried in introns, or chilling out hundreds of thousands of bases away from their targets. This can make them difficult to find.

Dr. Visel knows this, which is why he and his team of researchers developed their atlas of gene enhancers, an electronic database chronicling the location and identity of hundreds of gene enhancers found within the human cerebrum.
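Just to make that concrete, here's a minimal sketch in Python of what querying such a database might look like. Everything in it-- the identifiers, coordinates, and gene names-- is invented for illustration; the real atlas is far larger and richer.

```python
from dataclasses import dataclass

@dataclass
class Enhancer:
    enhancer_id: str   # identifier within the atlas (made up here)
    chromosome: str    # where the enhancer itself sits
    start: int         # genomic coordinates (invented for this example)
    end: int
    target_gene: str   # the gene whose expression it boosts

# A toy "atlas". Note that an enhancer needn't sit beside its target gene.
ATLAS = [
    Enhancer("hs123", "chr2", 105_480_000, 105_481_200, "GENE_A"),
    Enhancer("hs456", "chr2", 109_912_000, 109_913_500, "GENE_A"),  # far downstream
    Enhancer("hs789", "chr7", 55_200_000, 55_201_000, "GENE_B"),
]

def enhancers_for(gene: str) -> list[Enhancer]:
    """Return every catalogued enhancer thought to regulate `gene`."""
    return [e for e in ATLAS if e.target_gene == gene]

for e in enhancers_for("GENE_A"):
    print(f"{e.enhancer_id}: {e.chromosome}:{e.start}-{e.end}")
```

The payoff of organizing the data this way is the reverse lookup: given a gene implicated in a disorder, researchers can pull up every switch known to influence it.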

Cross-Referencing

But what's the big deal about gene enhancers?

Basically, knowing every gene coding for the cerebrum isn't enough; too much of our DNA lends itself to other pursuits besides protein production (the task that makes a gene a gene and not simply a string of nucleotides) and too many problems stem from this mysterious code (as many as half, according to Dr. Visel). Without factoring in enhancers, silencers, and other intron-dwellers, we're only getting part of the story.

A better knowledge of DNA's less celebrated functions-- enhancers, silencers, and whatnot--  means a better knowledge of the brain as a whole. And knowing the location and output of certain gene triggers will allow researchers to more effectively search for, and perhaps even remedy, the causes of neurological disorders such as schizophrenia, autism, and epilepsy.

Of course, Visel's atlas is no panacea. But the  insight into the brain's inner workings gleaned from his database will likely prove a valuable tool in future research. 

 
 
Researchers at Johns Hopkins University have discovered a new way to protect certain immune cells against HIV infection. Here's how it works.

DNA consists of long strands of molecules called nucleotides, each of which consists of three components: a sugar, a phosphate, and a nucleobase. The sugar and phosphate are vital parts of the nucleotide, but it's the nucleobase that gives the molecule its distinct "personality."

The Genetic Alphabet

Nucleobases come in four varieties: cytosine, guanine, adenine, and thymine (or C, G, A, and T for short). Together, the four nucleobases comprise our genetic alphabet. Our cells use them to "read" our genes and carry out their instructions, which generally involve building a specific protein product.

But the transition from DNA to protein is not a one-step process. To deliver its message, a gene must first create a copy of itself that the enzymes responsible for building proteins can consult, utilize, and ultimately discard. This short-lived genetic missive is called RNA. Think of DNA as the master copy of a blueprint and RNA as the printout given to the builders on the ground, who smudge, crumple, stain, and tear it up mercilessly as they work.

DNA and RNA are both made up of C, G, and A nucleotides, but T is exclusively a DNA product.  When a T is called for, RNA subs in a fifth nucleotide called uracil (or U). Though not part of DNA's alphabet, the odd U occasionally finds itself tossed into the mix, often as an unprepared substitute for a T.

This could cause serious problems if left unchecked, but genes have a lot of fail-safes. One of them is the enzyme hUNG2, which peruses the genetic code for unwanted U nucleotides and snips out any it sees. Another enzyme keeps the U population in check in an attempt to cut down on the number of U-T swaps that need to be made in the first place. Work smarter, not harder.
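To make the division of labour a little more tangible, here's a toy simulation in Python. The real enzymes operate on molecules, not strings, and the biochemistry is vastly more involved, but the basic logic-- copy with a substitution, then proofread-- is the same.

```python
def transcribe(dna: str) -> str:
    """Toy transcription: RNA uses the same letters as DNA, except U replaces T."""
    return dna.replace("T", "U")

def proofread_dna(dna: str) -> str:
    """Toy hUNG2: scan a DNA strand and snip out any uracil that doesn't belong.
    (The real enzyme excises the base and leaves the gap to repair machinery.)"""
    return "".join(base for base in dna if base != "U")

gene = "ATGCGTTAC"
print(transcribe(gene))        # AUGCGUUAC -- a legitimate RNA working copy
damaged = "ATGCGUTAC"          # a stray U has crept into the DNA
print(proofread_dna(damaged))  # ATGCGTAC -- the intruder is snipped out
```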

The Nucleotides of War

Unlike bacteria, which are living beings in their own right, viruses straddle the border between the animate and the inanimate. They are essentially bits of genetic matter in protein sheaths, and they can survive only within the cells of other living things.

Once a virus penetrates one of your cells, it hijacks your enzymes and uses them to replicate itself. The tools of cell regeneration-- the very mechanisms responsible for keeping you alive-- are turned against you, falling into the virus' grubby, infectious hands. They behave like microscopic barbarians, storming the gates of a cell, butchering its inhabitants, plundering its armoury, and using the ruins as a base of operations from which to plan further raids.

But if these tiny intruders are going to use your own enzymes against you, why not leave them a few booby traps?

Amy Weil, a graduate student at Johns Hopkins, found that some immune cells do just that. These cells-- called resting cells-- don't reproduce, and so lack the enzymes responsible for clearing out unneeded U nucleotides. They do, however, have plenty of hUNG2.

The result? When HIV invades these cells and tries to reproduce, it gets a nasty surprise. Its once pristine genetic code becomes riddled with uracil. Clogged with incompatible nucleotides and unable to reproduce, the virus is soon discovered by hUNG2, which gleefully chops it to bits.

Going Nuclear In the Nucleus

Dr. James Stivers, the professor overseeing the process, claims the damage wreaked on viruses by hUNG2 is impressive. "It's like dropping a nuclear bomb on the viral genome," he said. 

Of course, the hUNG2 punch has its limitations, the most obvious being its ineffectiveness in a replicating cell where uracil levels are kept in check. In that situation, it would be the cell's own rightful DNA that hUNG2 chopped to bits, and the cell would destroy itself long before a virus got the opportunity.

But for non-replicating immune cells, it offers a tantalizing possibility for treatment. It will be interesting to see where this research heads.

It may be fighting dirty, but when it comes to viruses, dirty is pretty justified.

 
 
This Tuesday, I was invited to Carleton University to attend the launch of their FEI Tecnai G2 Ultra High Resolution Field Emission Transmission Electron Microscope. The event included a snack table complete with cheese plate (score!), a few brief talks by department heads and a representative from Environment Canada, and a tour of the facility.

The microscope itself is a sleek, imposing, and strangely elegant piece of machinery. For a layman unfamiliar with the trappings of a hard science research lab, it's hard to tell exactly what it is at first glance. It dominates the tiny, informal room like a science fiction supercomputer, the findings of its extraordinary eye displayed by a pair of computers stationed nearby.

I watched it for some time. On the screen floated a grainy grayscale image awash with curiously uniform static. Silver dots and lines formed complex patterns. Our guide pointed to the display and said "We're looking at a highly magnified piece of silver. The dots you see are individual silver atoms."

Atoms. I'd never seen an actual image of them before-- not one displayed live, anyway. They trembled as the guide spoke, the sound waves from his voice crashing against the highly delicate machine.

It was a pretty cool experience, but it made me realise that, although I knew it was physically possible to see atoms, I didn't have the slightest idea how. What exactly is an electron microscope, and how can it let us see something so small that even bacteria stand as giants before it?

A Finer Lens

A conventional or "light" microscope as you might see in a high school biology lab works, to put it very simply, by stacking two convex lenses atop one another. These lenses "bend" the light shining up from the microscope's stage, thus magnifying the image and allowing the human eye to perceive materials and organisms far smaller than it otherwise could.

However, light microscopes have their limitations, the most notable being the wavelength of visible light. Light waves within the visible spectrum have wavelengths of roughly 400-700 nanometres, and anything much smaller than that cannot be properly resolved through a light microscope. Hardly a problem when dealing with larger cells, but on a molecular level, a resolution limit in the hundreds of nanometres leaves a lot of stuff out of the equation (an atom's size, including its electron orbit, ranges from 0.1 to 0.5 nanometres).

For centuries, this limitation made the atom a purely theoretical construct, something that had long been postulated about-- permutations of atomic theory, albeit deeply flawed ones, extend as far back as 400 BCE-- but never properly seen. It wasn't until the early twentieth century, with the development of cathode ray technology, that the idea of using something with a wavelength far smaller than light to produce images became practical. That something was electrons.

Put simply, an electron microscope shoots a beam of electrons through a specimen, which can be anything from a prepared metal to a piece of organic matter. By meticulously observing the reactions of these electrons, the microscope constructs a highly accurate image of the specimen at a maximum resolution that far exceeds what a light microscope is capable of.
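The wavelength advantage is easy to put numbers on. The de Broglie relation ties a particle's wavelength to its momentum, so an electron accelerated through a high voltage has a wavelength many thousands of times shorter than visible light. Here's a quick back-of-the-envelope version in Python (non-relativistic, so it slightly overstates the wavelength at high voltages; the 200 kV figure is simply a typical operating voltage for instruments of this class, not a quoted spec):

```python
import math

PLANCK = 6.626e-34           # Planck's constant (J*s)
ELECTRON_MASS = 9.109e-31    # kg
ELECTRON_CHARGE = 1.602e-19  # C

def electron_wavelength_nm(accelerating_voltage: float) -> float:
    """Non-relativistic de Broglie wavelength of an electron, in nanometres."""
    momentum = math.sqrt(2 * ELECTRON_MASS * ELECTRON_CHARGE * accelerating_voltage)
    return (PLANCK / momentum) * 1e9

wavelength = electron_wavelength_nm(200_000)
print(f"{wavelength:.4f} nm")  # ~0.0027 nm
print(f"Visible light (400 nm) is roughly {400 / wavelength:,.0f} times longer")
```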

Sweat the Small Stuff

Obviously, the electron microscope dramatically increased our capacity to understand-- and tinker with-- the basic building blocks of matter. Carleton's new purchase can reach resolutions of 0.144 nm, allowing it to see not just atoms, but atomic bonds.

It's the most powerful microscope in Ottawa at the moment, and its uses are not wholly academic. Environment Canada intends to use the microscope for a variety of important projects, and the microscope will be made available to other clients as well.

So if you feel like taking a closer look-- a much, much, much closer look-- at that mole on your arm, give them a shout. I'm sure they'll hook you up (if the price is right).

 
 
On April 3rd, 1950, Macmillan Publishers-- a venerable and well-respected press known for releasing textbooks and the influential periodical Nature-- published a book by Immanuel Velikovsky entitled Worlds In Collision.
Though he was a psychoanalyst by trade, Velikovsky's theories ventured far outside the confines of the human mind. He proposed a radical restructuring of our understanding of the solar system, upturning centuries of scientific understanding at a single stroke-- or trying to, anyway.

Using a hodgepodge of vague scientific claims, myths, and literary allusions, Velikovsky claimed in Worlds In Collision that the planet Venus dislodged from Jupiter sometime around 1500 BCE, narrowly avoided a collision with Earth, and settled into its present orbit around the sun.

The near miss left our planet literally shaken, changing our orbit and axis and inciting a host of natural catastrophes recorded in the myths and stories of various societies around the world.

A Schism of Opinion

Scientists rejected Velikovsky's claims out of hand, and with good reason-- his understanding of basic scientific principles was often flat wrong, his predictions stolen or too vague to test, and his claims contradicted by far more established and well-verified theories.

But the public weren't so quick to dismiss him. Worlds In Collision topped the New York Times bestseller list for eleven weeks (no small feat for a book rife with footnotes, obscure literary allusions, and grad school diction), and factions supporting Velikovsky sprouted up in big cities and college campuses around the world. His spectre loomed large on the fringes of scientific discourse well into the seventies, occasionally bolstered by a new bit of press, until Velikovsky died and the public lost interest.

For anyone who studied astronomy or physics, Velikovsky's enduring legacy was both maddening and deeply puzzling. His acceptance by much of the public spoke to an indifference, even a downright distrust, of scientific orthodoxy. When considering a subject rooted in astrophysics, why would so many people take the word of a psychiatrist over hundreds and hundreds of astronomers and physicists?

Perception

Well, for one thing, Velikovsky was an intelligent man. He was educated and well-spoken, with the poise and bearing of an intellectual. He possessed an earnest confidence in his theory that never overextended into arrogance. His medical credentials gave him a veneer of scientific credibility-- he may not have studied anything even remotely related to the subject on which he spoke, but his doctorate was enough to get him into the lobby of the Ivory Tower. From there, finding a free balcony from which to spout his bizarre iconoclastic gospel was a fairly simple matter. Toss out a few buzzwords, cloak your argument in references to myths and legends, and you have a pretty convincing bit of theater. As with all great stories, you wanted to believe it. 

But the biggest key to his success was, arguably, the care he took not to upend any beliefs the public held too dear. Venus exploding from Jupiter's belly is big and exciting. Esoteric complaints about Venus' atmospheric makeup and Newton's laws of motion got lost in the razzle-dazzle. If Velikovsky's theory had attempted to refute the existence of God, it would have raised some serious hackles. But it didn't. 

God On Your Side

In fact, Velikovsky shrewdly tipped his hat to Christianity, claiming the catastrophes outlined in Worlds In Collision lent scientific credence to the miracles outlined in the Old Testament. Perhaps his most notorious example was the claim that Venus' close pass temporarily froze our planet on its axis, suspending the sun in the sky long enough for Joshua to conquer Gibeon.

By using a seemingly scientific claim to reinforce a biblical passage, Velikovsky garnered his theory a great deal of social capital. Worlds In Collision transformed from an absurd hallucination by a crackpot pseudoscientist into a serious academic tome debated by two parties on more or less equal intellectual footing. On one side stood the Godless elites, their smug faces wrinkled with disdain; on the other was an earnest, intelligent, well-spoken underdog with the Bible on his side.

Brilliant marketing.

Branding Facts

The Velikovsky Affair, as it colourfully came to be known, reflects a larger problem within scientific orthodoxy. It is a crisis of reputation. The more we learn, the larger and deeper and more abstract our thinking becomes, and the harder it becomes to accurately convey these ideas to the public. They become insular concepts, and the public is told they must accept them without reflecting on the evidence from which they came.

As a result, people not educated in a given discipline aren't likely to spot a fake based on his or her ideas alone. Much of Velikovsky's "evidence" was provably wrong, but most of us-- myself included-- would struggle to prove it wrong ourselves without the aid of a more informed party-- namely, a physicist or astronomer or expert in the given field.

But how do we know the expert is right? Because he's an expert? That's hardly a proof capable of sustaining the pressures of scientific rigour. Yet without a point of reference, fact and fiction can become hopelessly blurred. The truth (as the public sees it) then pivots on marketing. And the truth has never been the successful marketer's modus operandi.

This can be a serious problem. Take, for instance, climate change. Scientists are more or less unanimous in their opinion that burning fossil fuels has had a real and detrimental impact on Earth's climate. By contrast, a quarter of Americans believe climate change to be an unproven theory. 

Why do they believe this? Because believing it might mean making some unpleasant changes to their lifestyle? Because a politician or celebrity or crazy uncle told them so? Because Al Gore's a filthy commie and everything he says is a lie? Who knows? Whatever the reason, what the latest evidence suggests about an issue and how people feel about that issue are by no means causally linked.

And when it comes down to making changes and implementing solutions to serious, perhaps even life-threatening problems, evidence takes a back seat to public opinion every time.

Science For the People

Science works best when it subjects every hypothesis to the same rigorous scrutiny, never dismissing an idea because it sounds wacky or clashes with the status quo. Unfortunately, no research department on Earth has the resources for that.

So what's the solution? I'm not sure there is one. A greater focus on scientific literacy in schools would help, but I think the bigger issue might actually be on science's end. We need to educate, not simply inform. Pronouncements from on high might sound good, but they're easy enough to refute. There'll always be another guy with a bushier beard and a taller mountain making pronouncements of his own.

 
 
Well, that was a little oversold, wasn't it?

It's December 22nd, 2012. For most of us, it's a day like any other. We open our Facebook pages and Twitter feeds, bracing ourselves for one last barrage of Mayan apocalypse memes, then go about our business, savoring the few blissful moments of sanity and circumspection before the next oddball cranks out his own spin on the rapture and the media once again over-reports on it.

But a few people are sincerely shocked. They emerge bleary-eyed from their bunkers and basements, probably thinking much the same thoughts as victims of every other doomsday hoax: a mixture of relief, embarrassment, and anger-- either at the shysters who swindled them or at themselves for getting caught up in the hype.

Panic For Sale: All Your Money OBO

Fear is a marketplace-- an awful big one, in fact-- and like any marketplace, there are buyers and sellers. The sellers don't interest me much. I know what motivates them: greed, a desire for attention, or simply the joy of winding people up. Whatever the case, their behaviour is, though often unethical, at least comprehensible.

Much more interesting (to me, at least) are the buyers. Folks who invest wholesale in the concept of global destruction, often spending their life savings on survival gear, elaborate parties, or simply getting the word out about the rapture.

Admittedly, some of this behaviour can be seen as a sort of semi-ironic kitsch, an excuse for extravagance masquerading as some Prospero-like end of the world jubilee. Supposed safe havens like Rtanj, Serbia and Bugarach, France played up their reputations as oases of calm in a sea of coming chaos, and though I'm sure some people bought it, many more were probably just having a good time.

But in the fervor sandwiched between every apocalyptic announcement and the alleged big day of which it warns, dozens, hundreds, or even thousands of people fall prey to would-be soothsayers and opportunistic entrepreneurs eager to take their money in exchange for fleeting promises of safety, salvation, and even post-rapture pet care.
It's in Revelations, people!
My Oh Mayan

The so-called Mayan Doomsday is an especially interesting example of this phenomenon, for a number of reasons:
  • It isn't based on the Bible. Not all doomsday predictions stem from Christianity, but most of the "successful" ones (i.e. those that snag headlines and followers) use it to their advantage. Harold Camping constructed his narrative of global destruction on the Bible, a text so swollen with cultural significance that even the murkiest and most half-baked of its interpretations can carry a lot of weight for believers. Even the Heaven's Gate cult referenced Revelations in order to lend a little "credibility" to its movement.
  • Mayans didn't buy it, so why should we? Most, if not all, of the people who freaked out about the supposed "Mayan Doomsday" don't subscribe to a Mesoamerican religion, don't even know anything about said religions, and would probably consider them sacrilegious if they did. Mayans themselves think the whole thing is pretty hilarious. You're making us look like chumps in front of the Mayans, guys. Knock it off.
  • The "prophecy" the entire event is supposed to be based on is simply the result of a poor translation. The Mesoamerican Long Count Calender does note the conclusion of a b'ak'tun (or era) on December 21, 2012, but the majority of scholars consider this to be a transition point between b'ak'tuns, not a prediction of end times
I don't even want to get into the absurd specifics of how the world was supposed to end-- there's no planet Nibiru, people-- but suffice it to say, the Mayan doomsday is a great example of how easy people are to manipulate. We've been tricked before, and we'll be tricked again.

If the Mayan apocalypse has taught us anything, it's that critical thinking is seriously undertaught and undervalued. Until we make these skills a top priority in schools, these same claims are going to keep appearing and people are going to keep getting swindled.

Of course, a savvy, skeptical populace might actually start holding politicians accountable for things, so consider it a low priority in the elementary school curriculum.

In the New Year: an even bigger lie.
 
 
First, a disclaimer: I have yet to see Peter Jackson's take on The Hobbit. I certainly intend to, though I'll admit to feeling more than a little trepidation about it.

It's not that I think Jackson's movies are bad. The Lord of the Rings was a lot of fun, and as a fan of the books I enjoyed seeing some of my favourite scenes acted out on the big screen. The action scenes were a bit bombastic and over-long for my tastes, but I know that's what people want to see, and a completely faithful film adaptation of a 1,500 page epic packed to the gills with history, physical descriptions of landscapes, and arcane lore would not fare all that well at the box office.

But The Hobbit is different. It's not supposed to be a big story. The Lord of the Rings is a long, sprawling, episodic narrative. It doesn't just deserve multiple films; it requires them. The Hobbit, meanwhile, is a fable. It's light, playful, and filled with a child-like sense of discovery and wonder. 

Most importantly, it contains only one narrative arc.

The Hobbit has a single protagonist, the eponymous hobbit Bilbo Baggins, and the entirety of the book belongs to him. Characters support and oppose him, bully him and bolster him, walk with him for a while and leave to live their own lives outside the confines of the text, but the story remains Bilbo's. We get a few short asides where other characters are given the spotlight, but these are brief expository scenes used to ferry the story along, not separate narratives equipped with their own trajectories. They don't become new stories. They fill the main story in.

All of this is to say that The Hobbit doesn't need three movies. It shouldn't even need two. A single film could do it easily, and that film needn't be longer than two hours.

Jackson's first film is almost three.

This worries me, and because my blog is supposed to be about scientific research, I'm going to use some numbers to explain why.

By the Numbers: A Mathematical Proof of Excess

The Hobbit: An Unexpected Journey has a total running time of 169 minutes. Judging by how Jackson handled The Lord of the Rings series, we can probably assume that the following two movies will be the same length, if not longer.

169 minutes X 3 movies = a total running time of 507 minutes (8 hours 27 minutes). The first edition of The Hobbit spans 310 pages.

507 minutes / 310 pages = 1.64 minutes of film for each page of the book.

The average person can read approximately 300 words per minute. At 95,022 words long, a typical adult could read The Hobbit cover to cover in 5 hours and 17 minutes. That means watching the films back to back would actually take the average person three hours longer than reading the book.

If efficiency is your aim, The Lord of the Rings films are a much better strategy. The series clocks in at 558 minutes (178, 179 and 201 minutes for Fellowship, Two Towers, and ROTK, respectively) versus 1,571 pages in the books' respective first editions.

558 minutes / 1,571 pages = 0.35 minutes/page (extended edition numbers: 683 minutes / 1,571 pages = 0.43 minutes/page)

With a total word count of 455,125, the books take roughly 1,517 minutes-- or just over 25 hours-- to read at that same 300-words-per-minute pace. Compared to the books, even the extended editions of the movies (683 minutes, or eleven and a half hours) seem speedy.
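If you'd like to check my work-- or run the numbers on a favourite adaptation of your own-- here's the whole calculation as a short Python script, using the runtimes, page counts, and word counts quoted above:

```python
def minutes_per_page(runtime_min: float, pages: int) -> float:
    return runtime_min / pages

def reading_time_hours(words: int, words_per_minute: int = 300) -> float:
    return words / words_per_minute / 60

# The Hobbit: three films, each assumed to match the first one's 169 minutes.
hobbit_runtime = 169 * 3                      # 507 minutes
print(minutes_per_page(hobbit_runtime, 310))  # ~1.64 minutes per page
print(reading_time_hours(95_022))             # ~5.3 hours to read

# The Lord of the Rings: theatrical cuts.
lotr_runtime = 178 + 179 + 201                # 558 minutes
print(minutes_per_page(lotr_runtime, 1_571))  # ~0.35 minutes per page
print(reading_time_hours(455_125))            # ~25.3 hours to read
```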

So What?

Of course, I'm drawing subjective conclusions from objective data. The numbers prove that The Hobbit contains more footage per word of text than did The Lord of the Rings. What they can't prove is whether or not this is a bad thing. What's more, I'm aware that the films will draw on other source material, namely the appendices to The Lord of the Rings, in order to flesh out the story.

My question is why? Why dedicate so many hours of film to a single children's book? Why bring in a bunch of subplots when Tolkien didn't consider them relevant enough to the story to include them in the first place? Sure, it'll be nice to learn a bit more about the Necromancer ("Who is this guy?" I remember thinking at fourteen. "Surely he'll appear at some point." But nope.), but beyond that, I just don't see the point.

The Hobbit is a wonderful story as it is. It's quick and breezy and fun, a hero's journey with a lush and complex world skirting about its edges. The Lord of the Rings takes its time examining that world in detail. The Hobbit simply tromps through it. That's what I love about it.

I love The Lord of the Rings too, but for different reasons. The two stories have different tones, styles, and strategies for telling their stories. A director should play to each book's strengths, not shoehorn a previously successful formula into source material where it doesn't fit.

And the numbers show that Jackson is attempting to weave a wall-sized tapestry with what amounts to maybe a towel's worth of thread. It's lovely thread, but there simply isn't enough of it. Don't dilute it with outside fibers. Let the story be.

Bonus: Here are the running time/page length ratios for a few other sci-fi and fantasy novels and their film adaptations:

Do Androids Dream of Electric Sheep? (released in film as Blade Runner):
117 minutes (1982 release) / 210 pages = 0.55 minutes per page (approx. 30 seconds)

Watership Down:
101 minutes / 413 pages = 0.24 minutes per page (approx. 15 seconds)

Harry Potter and the Deathly Hallows:
276 minutes (combined running time of both films) / 759 pages (US first edition) = 0.36 minutes per page (approx. 20 seconds)

2001: A Space Odyssey:
141 minutes / 221 pages =  0.63 minutes per page (approx. 35 seconds)

Dune:
137 minutes / 412 pages = 0.33 minutes per page (approx. 20 seconds)





 
 
Meet the Kim Kardashian of rhesus macaques.

Stumbling stylishly into public consciousness in a shearling coat, Darwin-- or The Ikea Monkey, as he has come to be known-- rocketed to internet fame, spawning a host of memes and setting Twitter aflame with delight, bemusement, and snark.

The story itself is still unfolding, and has been covered to death elsewhere-- The Star has a couple of good articles about it, if you want to learn more-- so I'll leave all that aside. Instead, I'd like to talk about the species for which the adorable Darwin has become an unwitting representative.

All In the Family

Rhesus macaques are a species of old world monkey. They are brown or grey in colour with long pink faces, and stand a mere half-metre tall when fully grown. An adult male macaque weighs about 8 kg, while a female weighs only 5 kg.

Extremely social animals, rhesus macaques live in tight-knit communities called troops. Troops are matrilineal and largely female. Males are few in number, and invariably come from neighbouring troops, whereas female macaques generally descend from female macaques of the same troop.

Males born into the troop are treated well until they reach adolescence (age 4 or 5 in macaques), at which point the troop's dominant males chase them away. Exiled and at constant risk from predators, adolescent macaques form gangs and roam the countryside, living a semi-nomadic existence at the outer fringes of macaque society.

These teenage runaways live lives of savage bravado, undergoing a fierce sort of Darwinian trial by fire. The mortality rate among the gangs is many times that of the troops themselves, and the adolescent males are, understandably, eager to rejoin a troop. 

How do they do this? Much the same way that female macaques improve their social standing within the troop: by picking up on, and responding appropriately to, the thousands of subtle cues that make up macaques' social lives.

In this sense, macaques are very human-like. They form close familial bonds as humans do, observing a different but no less rigid code of behaviour. Top macaques engage in a constant struggle for power, endlessly sizing up rival families and waiting for the chance to topple a dynasty and take its place at the top. Male adolescent macaques, meanwhile, live in the moment. They tend to be brash, reckless, and preternaturally drawn to risk.

Sound familiar?

The similarities don't end there. Macaques get depressed, anxious, and impulsive, much like humans. They even suffer from alcoholism.

We're Not So Different, You and I

Of course, we share many of these traits with most of our primate cousins. And though macaques are our relations, sharing 93% of our DNA, they are distant indeed when compared to apes like chimpanzees, who share a whopping 95-98% (the exact number is disputed). To reach the common ancestor of macaques and man, you'd have to travel 25 million years back in time; with chimps, the trip would be 6 million years, a trifle by geological standards.

Yet there is one trait that is, among primates, unique to humans and macaques. It's a tricky sort of thing to pin down, a tenacious tendency I'll call adaptability. 

Take a group of apes or chimps or baboons from their natural habitat and drop them unceremoniously on a new continent. Pay no mind to the climate, geography, flora, or fauna with which this adoptive home provides them. Give them no familiar food, no prefabricated shelter, no help of any kind. Odds are they'll be dead in a year, victims of a culture shock so potent and profound it cuts through their intelligence, instinct, and strength, severing the cord of hope that tethers them to existence.

Humans don't have this problem. In fact, we've performed this very experiment on ourselves over and over again for hundreds of thousands of years. The only reason we stopped is because we ran out of empty places to inhabit (and even then we spent an embarrassing set of centuries conquering territories and kicking out their former residents so as to give ourselves a fresh shot at settling in an alien land).

While a little less proactive in their expansion, macaques are no less adaptable. Their native territory extends an impressive distance, stretching from the deserts of Afghanistan through the jungles of the Indian subcontinent, the forests of Bhutan, and the Himalayan foothills of Nepal, all the way to the eastern coast of China. 

Macaque troops have thrived in rural Maryland, the arid plains of Texas, the swamps of Louisiana, and the subtropical climes of the San Fernando Valley. Granted, these American colonies operate under human supervision.

But don't think macaques couldn't get along without us. 

In 1938, a Florida businessman known as "Colonel Tooey" released a troop of rhesus macaques into the Silver River State Park in order to give his jungle cruise attraction a touch of exotic flavour. Flash forward seventy years and the macaques are still there. Their numbers have expanded to the point where they have become the bane of local farmers, who have repeatedly attempted to exterminate them without success.

Think of that. An invasive species of primate taking over land and causing no end of trouble for the indigenous population.

Again, sound familiar?
 
 
The human genome. Is there any scientific phrase more evocative? We can picture it in an instant, the famed double helix descending through the generations like a corkscrew ladder. Climb down beyond the last rung and you'll land with a plop in the primordial ooze from which we-- and everything else that is, was, or ever will be alive on this planet-- sprang billions of years ago.

Because of this, perhaps, genes have captured our collective imagination. They are to science fiction of the last twenty years what nuclear power was to science fiction of the forties and fifties: a deep, mysterious, vaguely ominous prospect, the gateway to both a brave new world of scientific achievement and a dystopian realm of apocalyptic nightmare.

And certainly, genetic science has done a lot for us. It has allowed us to assess our individual risk for a whole host of diseases, from cancer to heart disease to alcoholism. It has revolutionized the ways in which we fight many of these diseases. Heck, I even got paid to write a book about it. So if you ask me, genetic science is pretty swell.

But considering how fascinated we the public are with our genes, it seems strange that we don't know all that much about them. Or perhaps the bigger problem isn't that we don't know much about genes, but that most of what we think we do know is at least partially wrong.

This subject deserves more than one post, so expect subsequent entries into the Genes Unzipped series (as well as a TV show tie-in, movie franchise, and unappetizing brand of Gene-e-O's breakfast cereal). For now though, I'll focus on a single issue: that of the oft-touted phrase "gene for [insert headline-catching disease, skill, or behaviour here]." 

The tricky thing about this "fact" is that it seriously oversimplifies a nuanced issue by implying that a gene "causes" something, meaning that a tiny sliver of an organism's genotype is single-handedly responsible for a complex problem in its phenotype*. 

This is especially misleading when applied to behavioral traits like alcoholism, drug addiction, and ADHD, which, though certainly heritable and possessed of a genetic component, are determined in large part by the environment.

Consider the Dopamine Receptor D4 (or DRD4) gene. I have a lot to say about this notorious little gene, and it may very well become the subject of its own post. For now, suffice it to say that DRD4 comes in a number of different varieties called alleles. One of these alleles is called the 7-repeat (so named because it contains a nucleotide sequence that repeats 7 times), an infamous mutation that stands accused of countless crimes against humanity.

Studies have tied DRD4's 7-repeat allele to addiction, disorganized attachment, ADHD, aggression, and excessive risk-taking, among other conditions. Many if not most of these studies have found a strong correlation between 7-repeat and every one of these behaviours.

The only problem is, 7-repeat didn't cause a single one of them. And the studies never claimed it did.

What 7-repeat does is increase a person's susceptibility to the environmental factors that promote these behaviours. Children with 7-repeat alleles are more likely to suffer from addiction or excessive aggression or ADHD, but only when they grow up in abusive, neglectful, impoverished or otherwise "at-risk" homes. 

For example, if you have the 7-repeat allele of the DRD4 gene and you do heroin, you are statistically more likely to become addicted to it than someone with a different version of the gene. But 7-repeat doesn't perch on your shoulder like a cartoon devil, goading you into leaving your nice suburban home, hitchhiking to the rough end of New York City, and picking up some smack. If you're never exposed to drugs, and if you grow up in a supportive, nurturing environment where the emotional disturbances that lead to drug use never occur, no gene can force you to start using. 

The 7-repeat allele isn't the culprit; it's an accomplice. An accessory to the crimes of inciting all manner of troublesome behaviour, perpetrated by substandard learning environments. Saying a gene causes a certain behaviour is like saying owning a car causes drunk driving. It makes it a lot more likely to occur in the right circumstances, but if you don't drink the booze in the first place, it ain't gonna happen whether you own a car or not.
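To put the "accomplice, not culprit" idea in more concrete terms, here's a toy model in Python. Every number in it is made up; the only thing it's meant to capture is the shape of the relationship, in which the allele amplifies environmental risk rather than generating risk on its own.

```python
def addiction_risk(has_7_repeat: bool, environment_risk: float) -> float:
    """Toy gene-environment interaction. All numbers are invented for illustration.

    environment_risk runs from 0.0 (supportive home, no drug exposure)
    to 1.0 (severe adversity). The allele multiplies environmental risk;
    with nothing to amplify, it contributes nothing.
    """
    baseline = 0.05 * environment_risk         # risk everyone faces
    amplifier = 2.0 if has_7_repeat else 1.0   # susceptibility, not destiny
    return baseline * amplifier

for env in (0.0, 0.5, 1.0):
    seven = addiction_risk(True, env)
    other = addiction_risk(False, env)
    print(f"environment={env:.1f}  7-repeat: {seven:.3f}  other alleles: {other:.3f}")
# At environment=0.0 the two risks are identical (zero): no gene can
# force you to start using if the environmental triggers never appear.
```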

News articles trumpeting the discovery of a "gene for" this or that are often misleading, the result of poetic license or simply lazy journalism. There are diseases that truly stem from a single genetic mutation, and have essentially no environmental cause-- cystic fibrosis would be the most well-known of them-- but for the most part, diseases are caused by a combination of the two.

All this is not to say that genetic testing isn't important. It is, for all sorts of reasons. But it's important not to succumb to genetic determinism, where every disease and misfortune and challenge stems from the chromosomal soup swimming about your cells.

We can't just look inside ourselves for answers. We have to look outside, too.



*phenotype refers to the sum total of an organism's observable characteristics. It stands in contrast to the genotype, which simply means the heritable characteristics coded into our genes. The phenotype encompasses the genotype plus any and all epigenetic and environmental influences that contribute to development. The genotype is the blueprint; the phenotype is the building. You need a blueprint to build a building, but you also need a foundation, building supplies, and workers to put it together.