
Archive for the ‘crackle – campus research’ Category

Sleep Deprivation: Normal Lifestyle or Dangerous Epidemic?

by: Susan Xie

Imagine an epidemic severe enough to impair memory and cognition, increase the risk of occupational and automobile injury, and alter typical brain responses to the extent that they resemble those of people with psychiatric disorders.1 After less than a week of experiencing these effects, otherwise healthy people succumb to a pre-diabetic state, a negative emotional outlook, and an inability to function normally in day-to-day activities.3,4 Furthermore, several large-scale studies from all over the world have reported an association between this epidemic and heart disease, high blood pressure, stroke, and obesity.3 With health risks this substantial, it is only logical that we would want to take all possible measures to prevent ourselves from becoming the next victims. But what if this so-called “epidemic” already runs rampant, especially on college campuses? What if our own habits and lifestyles naturally enable it to flourish and gradually claim student after student?

If you have ever stumbled back to your dorm early in the morning after camping out in Fondren Library all night, guzzling energy drinks or coffee to get yourself through a cram session, you are probably already well-acquainted with this bane of all things productive: sleep deprivation. If there is one thing we know for sure about sleep, it is that attaining rest is absolutely crucial, no matter how elusive or impossible it seems on some days. After all, biologically speaking, animals are most vulnerable when sleeping, yet every animal that has been studied thus far requires sleep, indicating that its importance outweighs the evolutionary risks of periodically losing consciousness.3 A well-established demonstration of the basic necessity of sleep was completed through a series of studies done in the 1980s. When a group of rats was kept awake indefinitely, individual rats started dying after only five days from sleep deprivation—the same amount of time that the animals would have lasted had they instead been subjected to food deprivation.3 Considering that sleep was shown to be just as essential to survival as sustenance, it should not be surprising that humans spend about one-third of their lives sleeping—time that, rather than being viewed as a repository of extra hours that we can access on busy days, should be valued as a prerequisite to achieving our full potential and performance levels.

However, to many college students, the suggestion to consistently get a full night’s sleep will seem more like a taunt than advice that can feasibly be followed. Even as students shoulder increasingly heavy loads of academic, extracurricular, and job commitments, they might still harbor reservations about reducing the amount of quality time devoted to friends or plain procrastination. Under these conditions, the daily requirement of 7-9 hours of sleep for healthy adults becomes nothing more than an ideal. Even worse, one particularly busy day has the potential to initiate a vicious cycle: when we get inadequate rest for one night, we become less productive the next day and are thus forced to stay up later again to finish our work. Further complicating matters is the fact that students routinely overestimate how quickly they can finish assignments, when in reality, projects typically require twice the number of predicted days to complete.8 Much of the time, however, we are not even completely aware of how impaired or inefficient we have become. In fact, staying up late is often considered a “badge of honor” among college students, with the all-nighter regarded as a major “rite of passage.”8

Still, it does not take extensive scientific investigation to figure out that depriving ourselves of sleep is ultimately detrimental. All health risks aside, there is still the question of whether substituting sleep with late-night studying has true academic worth. While most students will contend that it is worthwhile to sacrifice sleep for extra time to learn and review material before an exam or to finish a paper, studies indicate that pulling all-nighters is associated with slightly lower GPAs, decreased alertness and delayed reactions, increased tendencies to make mistakes, and impaired abilities to think, process, and recall information.1,8 These are all factors that compromise a student’s overall performance and somewhat defeat the purpose of staying up later in the first place.

The effects of sleep deprivation also have serious implications for learning and memory. In a study conducted by Matthew Walker, director of the Sleep and Neuroimaging Lab at the University of California, Berkeley, college students who had been awake for more than 24 hours performed 40% worse when memorizing lists of words than they would have with a night of sleep.3 Additionally, Walker found that sleep actually enhances memories—after a full night of rest, students not only came back the next day feeling refreshed, but also performed better than they did the day before. After experimental subjects learned and repeatedly typed a random sequence of numbers, they were tested at different times of the day to determine the extent and effectiveness of learning. The group that learned the sequence in the morning and was tested 12 hours later exhibited about the same performance level. On the other hand, the group that learned the sequence late in the day and was tested after a night of sleep showed a 20-30% improvement in performance.3 Therefore, according to Walker’s findings, the notion that we must stay awake longer to get more work done is misguided and counterproductive. Giving in to the temptation of sleep is not necessarily yielding to weakness; rather, it is making a rational decision that is most conducive to accomplishing the maximum amount of work in the long run.

Aside from curbing students’ academic potential, sleep deprivation also warps our personalities and strains our interactions with others. After only one night of complete or even partial sleep deprivation, people exhibit much stronger negative emotions the next day and are likelier to remember bad experiences as opposed to positive ones.4 According to a key study conducted by radiologist Seung-Schik Yoo of Brigham and Women’s Hospital, people who had been deprived of sleep for 35 hours showed much greater activation of the amygdala (a primitive part of the brain that controls emotional arousal) when viewing such upsetting images as pictures of mutilated animals.4 This change in brain activity in response to lost sleep is so significant that even after several nights of quality sleep, people still have a “horrible bias shift” regarding their memories of the day following insufficient sleep.4 Not surprisingly, sleep deprivation can lead to tension, depression, and confusion; it is also associated with a generally lower satisfaction with life, an increased number of absences from classes or work, and a heightened risk of inadvertent injuries and death.8 Currently, the definite long-term effects of chronic sleep deprivation on learning, emotion, social relationships, and health have not been completely elucidated, but problems controlling impulses and emotions, coupled with sleep deprivation, are likely to lead to a “negative spiral” of fatigue, fluctuating emotions, and risky behavior.2

So now that we know the grave extent of this self-inflicted epidemic, where do we go from here? Is there a cure, or are there at least preventative measures? If so, are they within our reach? Most of us can manage one or two late nights well enough with caffeine and a sufficient amount of ambition, interspersed with adequate physical activity to keep us awake, but such arrangements are never long-term. The most important consideration to keep in mind is that sleep deprivation has a cumulative effect. After just a single night of 4-6 hours of sleep (not to mention anything less than that amount), people already begin to experience difficulties in remembering information, thinking quickly, and reacting in a timely manner; thereafter, each additional night of sleep deprivation only contributes an added burden to the growing sleep debt.3 At some point, these deficits accumulate to an extent where the only effective cure is to—you guessed it—sleep away the cognitive impairment. This is especially true of people who are chronically sleep-deprived. Even though they tend to harbor the erroneous belief that they have trained themselves to work continuously and function perfectly fine with fewer hours of sleep per night, they actually underestimate their own impairment considerably, much like people under the influence of alcohol, and exhibit no such convenient adaptation to their hectic lifestyles.3
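The cumulative nature of sleep debt described above can be pictured with a toy running-total model. This is purely illustrative, not a clinical formula; the 7.5-hour nightly target is taken from the article's 7-9 hour guideline, and the assumption that extra sleep pays debt back down one-for-one is a simplification:

```python
# Toy model of cumulative sleep debt: each night's shortfall below a
# target (7.5 h, per the article's 7-9 h guideline) adds to the debt,
# and sleeping beyond the target pays some of it back.
TARGET_HOURS = 7.5

def sleep_debt(nightly_hours):
    """Return the running sleep debt (in hours) after each night."""
    debt = 0.0
    debts = []
    for hours in nightly_hours:
        # Debt never goes negative: you cannot "bank" surplus sleep.
        debt = max(0.0, debt + (TARGET_HOURS - hours))
        debts.append(debt)
    return debts

# A week of six-hour nights quietly accumulates 10.5 hours of debt.
week = [6, 6, 6, 6, 6, 6, 6]
print(sleep_debt(week)[-1])  # 10.5
```

The point the model makes concrete is that small nightly shortfalls compound: no single six-hour night feels catastrophic, but a week of them leaves more than a full night's sleep owed.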

From these findings, it seems that no man-made alternative will ever fully emulate the profound effect that sleep has on various aspects of our mental and physical health. Every single one of us would most likely jump at the opportunity to schedule some sort of session into our busy agendas that promises to improve our emotional outlook, decrease stress, boost memory and cognition, revive our physical wellbeing, and promote alertness and efficiency in handling everyday tasks. In fact, if a full night of sleep is unachievable, “power naps” can do just that. Taking the time to nap for about 20-30 minutes during the day boosts our working memory, information retention, alertness, and stamina; best of all, it eliminates drowsiness, counts toward the average total of 7.5-8 hours of sleep we need daily, and helps us make the most of limited time.3,5 So next time you need a boost, reaching for that energy drink may not be the best answer—especially not when you have a healthier, more effective option that is completely free of charge. In battling the ravages of this so-called epidemic, sleep is indeed one of the most underused, yet powerful, tools in our arsenal. Now, it is up to us to determine for ourselves whether groggily wading through that assignment would truly be more productive than catching a bit of proper, refreshing shut-eye.


1. Breus, Michael J. Sleep Habits: More Important Than You Think. http://www.webmd.com/sleep-disorders/guide/important-sleep-habits (accessed 10/25/09), article from WebMD. http://www.webmd.com/ (accessed 10/25/09).
2. Carpenter, Siri. Sleep deprivation may be undermining teen health. Monitor on Psychology [Online] 2001, 32, 9. http://www.apa.org/monitor/oct01/sleepteen.html (accessed 10/25/09).
3. Finkelstein, Shari (producer). The Science Of Sleep. 6/15/08. http://www.cbsnews.com/stories/2008/03/14/60minutes/main3939721.shtml (accessed 10/25/09), article from CBS News. http://www.cbsnews.com/ (accessed 10/25/09).
4. Foreman, Judy. Sleep deprivation and negative emotions. 8/3/09. http://www.boston.com/news/health/articles/2009/08/03/sleep_deprivation_and_negative_emotions/ (accessed 10/25/09), article from The Boston Globe. http://www.boston.com/news/ (accessed 10/25/09).
5. Marten, Sylvia. How to Power Nap at Work. 4/9/07. http://www.spine-health.com/blog/ergonomics/how-power-nap-work (accessed 11/26/09).
6. Myers, David G. Psychology: Ninth Edition; Worth Publishers: New York, 2008.
7. Student Health Services, Texas A&M University. Sleep and the College Student. 9/08. http://healthed.tamu.edu/pdfs/General/sleep.pdf (accessed 10/25/09).
8. Yahalom, Tali. College students’ performance suffers from lack of sleep. 9/17/07. http://www.usatoday.com/news/health/2007-09-16-sleep-deprivation_N.htm (accessed 10/25/09), article from USA Today. http://www.usatoday.com/ (accessed 10/25/09).

Fruitless, Behavioral Genetics

by: James Liu

Nearly a century ago, Morgan, Sturtevant, Bridges, and Muller, pioneers in Drosophila melanogaster (fruit fly) research, used this animal model to elucidate the chromosome model of heredity. They ultimately determined that chromosomes indeed carry genes and are responsible for sex determination. Since then, D. melanogaster has been one of the most widely and thoroughly studied animal models in biology. Much of this is due to the ease of working with D. melanogaster in the lab: its short generation time, lack of genetic recombination in males, and so on. Furthermore, advances in genetic manipulation for generating mutants (P-element insertions and irradiation) have allowed fly researchers to study the function of many genes in Drosophila.

One fascinating field that has gained momentum in fly research is the convergence of behavioral studies and molecular genetics – examining how genes, individually and in concert, influence fly behavior. Determining the link between a genetic defect and a behavioral alteration is much more complex than merely knocking down a gene and observing the resulting defect. Researchers have therefore studied the relationship between genes and behavior from a holistic perspective, beginning with the behavioral defects of these mutants, down to their neurological aberrations, and further down to their molecular changes.

Fruitless (fru), for example, is one of these genes. In fruit flies, it is well known as one of the sex determination genes. It was first identified in viable male fru mutants that demonstrated abnormal mating behaviors. Mutant fru males failed to distinguish between male and female mates, whereas mutant fru females were unaffected.1 Various fru mutants, ranging from complete abolition of the gene to mere alteration of it, have since been created. The “degree” of fru mutation has been shown to affect specific steps in male courting behavior. Severe deficiency in fru can abolish male mating behavior entirely.2 Less severe mutations have resulted in abnormalities in the male’s ability to produce the correct “song” during the mating ritual.3

Fru is also expressed in the central nervous system (CNS), and its expression peaks during the pupal phase. It has been speculated that sensory information transmitted to the CNS is processed by the neurons that express fru. At a cellular level, fru may be involved in shaping the sex-specific neuronal circuitry essential for proper mating behavior.4 However, much remains to be uncovered about the neurological role of fru in influencing D. melanogaster behavior.

Much is yet to be accomplished in understanding how genes affect behavior. D. melanogaster is an efficient model for this type of study because of the nature of its mating: mating is divided into specific steps, each requiring a set of visual, tactile, olfactory, and auditory cues. Studying mutants in courting behavior has revealed many interesting behavioral abnormalities, but we have yet to make a clear link between how genes coordinate neuronal development and circuitry, which ultimately influences behavior. Many neuronal diseases that we observe today – Alzheimer’s, Parkinson’s, Huntington’s – all have a genetic component. Thus, understanding how genes affect neurodevelopment and ultimately influence behavior can provide important lessons for understanding how our own genes affect behavior.


1. Sokolowski, M. B. Nature Reviews Genetics, 2001, 2, 879-890.
2. Goodwin, S. F.; Taylor, B. J.; Villella, A.; Foss, M.; Ryner, L. C.; Baker, B. S.; Hall, J. C. Genetics, 2000, 154, 725-745.
3. Villella, A.; Gailey, D. A.; Berwald, B.; Ohshima, S.; Barnes, P. T.; Hall, J. C. Genetics, 1997, 147, 1107-1130.
4. Baker, B. S.; Taylor, B. J.; Hall, J. C. Cell, 2001, 105, 13-24.

Programmable Nano Bio Chips

by: Eva Thomas

Medicine could not have advanced as it has without technology for diagnosing conditions such as diabetes, cancer, and other diseases. Biosensors are devices that provide information about analytes, the chemicals to be investigated within the human body. The information provided by biosensors can be analyzed to give diagnoses. These biosensors work by the same principle as the senses of smell and taste, in which receptors on the tongue and in the nasal cavity bind specific chemicals. In the field of bioengineering, there is a movement toward miniaturized design, driven by the necessity to be more cost-effective through the mass production of disposable devices. Beyond lower cost and increased availability, nanoscale assay technology offers rapid analysis and wastes less sample volume.

Jesse V. Jokerst and John T. McDevitt have teamed up to research the creation of nanoscale diagnostics. Jokerst recently obtained his doctorate in chemistry with McDevitt at the University of Texas at Austin before McDevitt moved to Rice University. Together they condensed an entire clinical laboratory into a chip to form a Programmable Nano-Bio-Chip (PNBC), or what they have dubbed the “Lab-On-A-Chip.”6,2,1 The goal of the McDevitt lab at Rice University is to optimize this device by making it more accessible and affordable and by discovering more applications for the biosensors. The advantages of programmability include modularity (with interchangeable parts), flexibility, and the ability to process and learn new biomarker signatures, which are indicators of various conditions. Even at the nanoscale, the chip achieves a strong signal over the noise within the body, largely because the assay is parallelized.1

The device was first designed as an electronic “taste chip,” as it emulates taste buds in its recognition of various substrates.1 A set of design criteria was considered during the development of the nano biosensors. A low cost per diagnosis would make the process more accessible. The device should be simple to use. It should be as small as possible to decrease invasiveness. The ability to test for multiple analytes would make the device more flexible and efficient. A quick reaction with a short turnaround time would allow for a faster response from the device.2 The device is intended to cost $1 per test, with a one-time-use, disposable labcard. However, as the field of nanotechnology is still growing, there are difficulties in reaching these goals. The PNBCs are entirely self-contained, which makes them different from other nanosensors that require laboratory support in clinics. Bioreactions occur and are subsequently measured on the chip.2 Agarose beads, acting as spongy capture agents, are conditioned, perhaps with antibodies, to be sensitive to the desired analytes.5 The 3D nature of the beads allows for more rapid isolation of the analytes than the planar arrays of other assay methods.2 Microfluidics are integrated with the beads in order to both treat the sample and detect the target molecules.1 Fluorescent dye can be added as an indicator; the strength of fluorescence of the beads can be correlated with the concentration of the analyte.2 The device can be modified from a chemical processing unit to a cellular processing unit by replacing the bead panel with a polycarbonate membrane, which acts as a filter at a larger scale than the bead-based chemical processing unit.2 The membrane-based cellular processing can be used for cell counting and typing.3
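The fluorescence-to-concentration correlation described above is, in practice, established through a calibration curve built from standards of known concentration. The sketch below uses a simple linear fit with entirely made-up numbers; real assays are calibrated against many standards and are not necessarily linear:

```python
# Hypothetical sketch of mapping a bead's fluorescence reading back to
# an analyte concentration via a linear calibration curve.  All values
# here are invented for illustration only.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Known standards: (concentration in ng/mL, measured fluorescence, a.u.)
standards = [(0, 2), (10, 52), (20, 102), (40, 202)]
slope, intercept = fit_line([c for c, _ in standards],
                            [f for _, f in standards])

def concentration(fluorescence):
    """Invert the calibration line for an unknown sample."""
    return (fluorescence - intercept) / slope

print(round(concentration(127), 1))  # 25.0
```

The inverse step is the key idea: the assay only ever measures light, and the calibration curve is what turns an arbitrary fluorescence unit into a clinically meaningful concentration.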

The McDevitt lab also examines biomarkers: analytes and the conditions that they indicate. The PNBCs can already be applied to a wide array of analytes: pH, electrolytes, short polypeptides, metal cations, sugars, biological cofactors, cytokines, toxins, proteins, antibodies, and oligonucleotides.1 A PNBC can be applied to saliva to diagnose acute myocardial infarction by using the biomarkers C-reactive protein, myoglobin, and myeloperoxidase.4

The field of nanochips shows great promise in its medical applications and the effect it could have on healthcare. PNBCs have demonstrated applications in HIV monitoring, chest pain diagnosis, and gynecological cancer screening,6 along with further potential cancer and cardiac applications.1 The use of PNBCs together with the electrocardiogram (ECG) has been shown to be more reliable than the ECG alone when diagnosing cardiac abnormalities.4 The modular design allows the device to accommodate new tests, and increased interest in the nano-bio-chip will lead to progress in all diagnostic research.1 Improvements in microchip production will allow for even smaller, less invasive biochips, and decreases in material costs are also to be expected.2 The PNBCs have global implications, because an affordable, mass-produced clinical test could revolutionize diagnosis worldwide.2 The PNBC is a self-contained analysis method, meaning that it can be used for diagnosis even in underdeveloped areas lacking laboratory supplies. A PNBC designed for HIV detection is particularly needed in developing nations.


1. Jokerst, J. V.; Floriano, P. N.; Christodoulides, N.; McDevitt, J. T.; Jacobson, J. W.; Bhagwandin, B. D. Programmable Nano-Bio-Chip Sensors: Analytical Meets Clinical. Analytical Chemistry, 2010, 82, 4533-4538.
2. Jokerst, J. V.; McDevitt, J. T. Programmable nano-bio-chips: multifunctional clinical tools for use at the point-of-care. Nanomedicine, 2010, 5, 143-155.
3. Jokerst, J. V.; Camp, J. P.; Wong, J.; Lennart, A.; Pollard, A. A.; Floriano, P. N.; Christodoulides, N.; Simmons, G. W.; Zhou, Y.; Ali, M. F.; McDevitt, J. T. Nano-Scale Control of Sensing Elements in the Programmable Nano-Bio-Chip. Small, 2010, 1-41.
4. Floriano, P. N.; Christodoulides, N.; Miller, C. S.; Ebersole, J. L.; Spertus, J.; Rose, B. G.; Kinane, D. F.; Novak, M. J.; Steinhubl, S.; Acosta, S.; Mohanty, S.; Dharshan, P.; Yeh, C.; Redding, S.; Furmaga, W.; McDevitt, J. T. Clinical Chemistry, 2009, 55, 1530-1538.
5. Jokerst, J. V.; Raamanathan, A.; Christodoulides, N.; Pollard, A. A.; Simmons, G. W.; Wong, J.; Gage, C.; Furmaga, W. B.; Redding, S. W.; McDevitt, J. T. Nano-biochips for high performance multiplexed protein detection: determinations of cancer biomarkers in serum and saliva using quantum dot bioconjugate labels. Biosensors & Bioelectronics, 2009, 24, 3622-3629.
6. Jokerst, J. V.; Bhagwandin, B. D.; Jacobson, J. W.; McDevitt, J. T. Clinical applications of a programmable nano-bio-chip. Clinical Laboratory International, 2009, 6, 24-27.

Two Photons Are Better Than One

by: Rahul Kamath

An Exciting Way to Image Neuronal Change Over Time

Andy feels his legs touch the bedpost when he wakes up in the morning. He also feels them when he sits down in his wheelchair every day. The problem is, Andy’s legs are both amputated. He is experiencing a condition known as phantom limbs; although an individual is missing an arm or a leg, he or she still feels its presence. Scientists believe that this condition is caused by the activation of underlying somatosensory cortex connections in regions adjacent to the “arm” or “leg” section of the brain.3 In other words, when one part of the body is touched or stimulated, the patient feels as if the amputated, non-existent body part is “touched” as well.

But studying such an issue proves quite difficult. In fact, many diseases that involve changes in the brain are tough to observe because of the minute changes in brain tissue and neural circuitry. However, there is a relatively new method that is helping to uncover exactly what happens in a diseased brain: two-photon microscopy.

This technique involves the absorption of two photons of light by target molecules; as these molecules leave their excited state, they emit light. This emitted light is then used to construct finely detailed images of the tissue. Microscopes that rely on single-photon absorption provide lower resolution and inferior spatial imaging, and have the potential to damage the sample.2 The reason for the late arrival of two-photon microscopy to the research scene was the previous unavailability of extremely high-powered lasers.2

The most interesting and no doubt revolutionary aspect of this microscope, concerning neural tissue in particular, is its ability to image in vivo tissue, tissue from live organisms. For example, by using surgical procedures, a clear glass lid can be placed over a mouse’s brain in an area where observation is desired. This allows the microscope’s laser to easily image the surface of the brain without harming the mouse.

In addition, the same tissue can be imaged over lengthy periods of time, opening up the possibility for longitudinal studies that track changes in neural structure. For example, the figure above shows a set of image slices taken by the microscope. The images have been overlapped onto one another in order to give a full picture of the neurons in that region of the brain. The clarity is great enough that one can observe a dendritic spine, a small protrusion from a neuron’s dendrite, shown by the yellow arrow and enlarged in panel D. Scientists could therefore detect whether this spine stopped fluorescing properly due to a lack of calcium in the cell. The other arrows in the image mark the location of various parts of the neuron that can also be tracked over long periods of time.
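The overlapping of image slices described above is commonly done as a maximum-intensity projection, which collapses a z-stack into one picture by keeping the brightest value seen at each pixel position. A minimal sketch, with toy 3x3 arrays standing in for real microscope frames:

```python
# Minimal sketch of a maximum-intensity projection, one common way to
# "overlap" a stack of two-photon image slices into a single image.
# Toy 3x3 integer grids stand in for real fluorescence frames.
def max_projection(stack):
    """Collapse a z-stack of 2-D images by keeping, for each pixel,
    the brightest value found at that position in any slice."""
    depth = len(stack)
    rows, cols = len(stack[0]), len(stack[0][0])
    return [[max(stack[z][r][c] for z in range(depth))
             for c in range(cols)]
            for r in range(rows)]

slice_a = [[0, 5, 0], [1, 0, 2], [0, 0, 9]]
slice_b = [[3, 1, 0], [0, 7, 0], [4, 0, 1]]
print(max_projection([slice_a, slice_b]))
# [[3, 5, 0], [1, 7, 2], [4, 0, 9]]
```

Because each slice is optically sectioned at a different depth, the projection lets a structure such as a dendrite, which weaves up and down through the tissue, appear in a single continuous image.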

This type of imagery is valuable for other reasons as well. With normal microscopes, neural tissue readily scatters light, resulting in poor imaging. The two-photon microscope employs longer-wavelength, near-infrared light when scanning tissue, eliminating this problem.4 In addition, scientists have developed techniques through which high resolution images can be obtained from live specimens. These images are usually free of motion artifacts, or problems in the image due to movements by the organism under investigation.1

The two-photon microscope’s advanced imaging technique allows for the proper study of underlying cortical brain function in organisms without the confounding use of sedatives. Any anesthetic would normally interfere with the background processing that occurs in neural tissue and hinder attempts to gain a better understanding of the ways in which neurons integrate and work together.

The only thing we can do now for Andy is to try to explain to him the cause of his condition. Hopefully, with advances like the two-photon microscope, the causes of conditions such as these can be determined, leading the way to potential therapeutic and maybe even prophylactic measures, especially for disorders such as Alzheimer’s disease and Rett syndrome.

Unnatural Numb3r

by: Thomas Sprague

the modern neuroscience of an ancient cognitive capacity

In 2551 BC, one of the most profound engineering accomplishments of the ancient world was made. 2.3 million stones, averaging around 5,000 pounds each, were carefully extracted, smoothed, and hauled up a complex system of ramps to create a lasting monument for the pharaoh Khufu: the Great Pyramid at Giza. For thousands of years, the 450-foot-tall structure stood as the tallest man-made monument in the world. Even now, its construction remains so precise that you cannot slide a postcard between two stones. All of this was accomplished without any modern engineering equipment – no calculators, computers, or advanced geographic surveying technology. Our unique advanced cognitive abilities – those that enable engineers and architects to undertake such endeavors – go back thousands of years. But where do these faculties come from?

When we consider those things that separate humans from nonhuman primates, it is easy to focus on human-specific capacities like language and music. These profound behavioral adaptations are certainly the result of immense evolutionary progress. But, as we all learned in high school biology, we share about 96% of our genetic material with our closest living relative, the chimpanzee. If all of our species’ new cognitive proficiencies are in fact brand new developments in the human genome, then that minuscule genetic change would have needed to go quite a long way. But mother nature is smarter than this – modern neuroscience is unveiling how the story of our escape from the jungle into the vast networked civilization we know today could be told in terms of adaptation rather than innovation: why reinvent the wheel when instead you could add tires and use a new alloy?

In the past twenty years, neuroscientists around the world have begun to examine where our understanding of and ability to work with complex symbolic numerical and mathematical concepts originated. To what extent can our numerical skills be called “unique” from the rest of the animal kingdom? What was it that changed – possibly within that 4% of genetic code – that adorned our brains with the capacity to count and comprehend precise numerical quantities?

First, let’s look more deeply into where our numerical knowledge originated from an evolutionary perspective. To do so, we can examine approximate numerical cognition in humans compared with other animals. Consider a small group of chimpanzees exploring a treacherous part of the jungle inhabited by a different hostile group. How does this group of explorers make a decision whether or not to engage in conflict with the hostiles? A basic numerical competence is required on the part of the animals to make such a judgment – the invaders must compare the number of members among their group to the number of enemies they may need to fight. In experimental settings, a group of chimpanzees does not approach a simulated intruder (signaled by a fake call played from a speaker) unless the group numbers three or more.1 Again, mother nature is efficient: three appears to be the number of adult male chimpanzees needed to kill another chimpanzee.2

But it’s not only primates that show this kind of numerical understanding – fish, bees, cats, dogs, and human infants show similar preferences for “more,” especially when detecting novelty or making decisions.3-6 These findings appear to be rather simple and self-explanatory – of course animals can distinguish more from less – but they tell us something quite deep. Understanding number, at least in an approximate sense, is not something that makes humans special.

Numerical ability, then, is clearly present throughout the animal kingdom. So what is different about the numerical cognition of humans compared to that of other animals? As mentioned above, human infants and many animal species can be remarkably good at making greater/fewer judgments about relevant objects in the world. But monkeys, unlike people, do not approximate pi or build enormous monuments to dying leaders. It instead appears to be an extension of this evolutionarily-conserved approximate-numerical system that results in the mathematical knowledge of number found across much of human civilization.

What evidence exists to suggest number is something the brain treats in a special way? Those who have taken a cognitive psychology class are likely familiar with the general (though not universal) understanding that the fusiform gyrus, a strip of cortex across the rear underside of the brain, is responsive to images that require fine visual expertise to discriminate. This part of the brain responds especially strongly when viewing an upright image of a face, but also when someone with expertise for identifying objects, such as classic cars or species of birds, views images of cars or birds, respectively. The brain is smart – why waste precious space and energy building a smattering of parts and pieces that all accomplish the same function on different kinds of inputs? Instead, mother nature appears to have found a way to allocate resources such that a single chunk takes care of the critical operation, and performs this task on different kinds of information received – in the previous example, discrimination of fine details of visual images, whether faces, birds, or cars.

Numerical cognition appears to work the same way. Early neuroimaging experiments, mostly using functional magnetic resonance imaging (fMRI), a technique whereby researchers can peek inside the skull and see where blood is flowing as a subject performs a task, found that a small area on both sides of the brain called the intraparietal sulcus (IPS) responded especially strongly when subjects performed tasks related to numbers. Whether the numbers were presented as dot patches, spoken or written words, or Arabic numerals, the same part of the brain about 4 inches diagonally above and behind each ear responded robustly.7-9,5 This piece of cortex was responsive to number, regardless of how the subject received this information.

Even more interesting is a follow-up experiment in which a team of researchers posed several cognitively relevant questions about an image consisting of two Arabic numerals with different sizes, brightnesses, and numerical values. Each subject was asked to determine which numeral was bigger in size, brighter in color, or larger in value. Even though the questions were quite varied, the subjects’ brains responded the same way to all of these judgments.10

But what do judgments of size, brightness and number have in common? It turns out that the brain may have found a way to represent and compare specifically the magnitude of a stimulus in an abstract fashion, without regard to which kind of magnitude is being compared.9 The IPS, like the entire brain, sits inside the pitch-black attic of the skull, with its only source of information being the thousands and thousands of cellular wires carrying signals from other parts of the brain and the sensory organs. It has no idea whether the signals it receives are coming from the ears or the eyes, or whether the information encoded is about brightness, size, or number. In a sense, where the information is coming from doesn’t really matter – the same neural algorithm can discriminate any of these different examples of magnitude information.

Given that our sense of number may just be a particular implementation of a more abstract sense of “how big” in general, it plausibly follows that individuals with a more keenly attuned magnitude-comparison system – that is, people who are better at comparing the number of objects presented in a quick display – are more likely to succeed at grasping advanced mathematical concepts. This indeed turns out to be the case: in a study of 64 high school students, those who had performed better in early math courses were also better at determining which of two dot fields contained more dots, a common test of the abstract number system.11 Though it is not clear whether these students had better abstract numerical abilities because of better symbolic math training, or were instead better at symbolic math as a function of better abstract numerical abilities, the result does tell us that symbolic math skill and abstract numerical ability are correlated. Symbolic math is not a separable skill, but rather sits on top of the ancient numerical abilities present across the animal kingdom.

How, then, do we learn our more precise concept of mathematical number? Though children of very young ages can reliably discriminate between two quantities in a 2:1 ratio, it takes longer before they are able to make more fine-grained distinctions. Abstract numerical ability is early and innate – but an understanding of numerical quantities, such as “5,” “13,” and “42,” rather than qualities such as “a few,” “some more,” and “a lot,” takes time. In the Presidential Lecture at the 2009 Society for Neuroscience annual meeting in Chicago, IL, Elizabeth Spelke of Harvard University argued that explicit verbal counting, which requires persistent practice and which animals never acquire, is the “missing link” between the abstract numerical abilities of the animal kingdom and the precise mathematical skills found exclusively in humans. At first, Spelke says, human infants can only indirectly represent quantity – for example, an infant would have an idea that two balls were more than one ball, but would not know that two balls were two balls (they would be understood as “ball and ball,” not “two balls”). It is not until the operation of counting is learned, typically verbally, that the concept of natural number emerges. Once the understanding is in place that each succeeding number, with its own linguistic label, is one more than the previous number, the entire realm of natural numbers becomes available to the child. They have gained access to a secret known only to humans that allows for equally precise representation of any quantity – from 4 marbles to 133 blocks.

Some cultures never acquire such an understanding of countable natural number, much less more advanced symbolic mathematical concepts. Thus, they should only be able to make numerical decisions through the abstract number system. Stanislas Dehaene of INSERM in Paris, France, set out to understand the way Amazonian tribes, who do not have words for numbers greater than 5, represent and make decisions about number. Each subject was asked to indicate where a number, as indicated by a quantity of physical objects presented to the subject, should fall on a number line. Performance suggested that number was represented in a logarithmic fashion, rather than the linear representation afforded by precise natural numbers.8,12,15 Performance of infants and animals trained to make numerical discriminations, along with recordings of single neurons in behaving monkeys and fMRI measurements in humans, also hints at a logarithmic representation of abstract number.5,13,14 Even without explicit numerical understanding, the brain can still make roughly accurate judgments of relative quantity, which are sufficient in most situations.
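The difference between a logarithmic and a linear mental number line can be made concrete with a short sketch. In the snippet below, the 0-to-100 line and the particular number pairs are purely illustrative choices, not values from the studies cited above; the point is only that under a logarithmic mapping, pairs with the same ratio (1 vs. 2, 50 vs. 100) sit equally far apart, while under a linear mapping they do not.

```python
import math

def log_position(n, max_n=100):
    """Place n on a 0-to-1 number line compressively (logarithmically)."""
    return math.log(n) / math.log(max_n)

def linear_position(n, max_n=100):
    """Place n on a 0-to-1 number line linearly."""
    return n / max_n

# Logarithmic mapping: equal ratios give equal distances.
d_small = log_position(2) - log_position(1)      # 1 vs. 2 (ratio 2:1)
d_large = log_position(100) - log_position(50)   # 50 vs. 100 (ratio 2:1)
assert abs(d_small - d_large) < 1e-9

# Linear mapping: the same two pairs are very different distances apart.
print(linear_position(2) - linear_position(1))     # 0.01
print(linear_position(100) - linear_position(50))  # 0.5
```

This ratio-dependence is exactly why a 2:1 ratio is easy for infants and animals to discriminate while a 9:10 ratio is hard: on a compressed line, the latter pair sits much closer together.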

It thus appears that a relatively small adaptation allowed humans to implement an old but powerful quantity-evaluation system in an important new way, giving rise to feats like the pyramids of Giza. But the ancient Egyptians built the pyramids with none of the advanced technologies we know and cherish today; similarly, species around the world live and thrive without erecting edifices or calculating interest rates. The primitive magnitude-estimation system is more than enough for survival among nonhuman species, just as ancient Egyptian construction techniques were more than suitable for building massive monuments. The numerical abilities we have now, despite their apparent uniqueness, may just be icing on the evolutionary cake.


1. Wilson, M.L., Hauser, M.D., & Wrangham, R.W., Does participation in intergroup conflict depend on numerical assessment, range location, or rank for wild chimpanzees? Animal Behaviour 61 (6), 1203-1216 (2001).
2. Wrangham, R.W., Evolution of coalitionary killing. Yearbook of Physical Anthropology 42, 1-30 (1999).
3. Agrillo, C., Dadda, M., Serena, G., Bisazza, A., & Chapouthier, G., Use of Number by Fish. PLoS ONE 4 (3), e4786 (2009).
4. Gross, H.J. et al., Number-based visual generalisation in the honeybee. PLoS ONE 4 (1), e4263 (2009).
5. Nieder, A., Counting on neurons: the neurobiology of numerical competence. Nat Rev Neurosci 6 (3), 177-190 (2005).
6. Thompson, R.F., Mayers, K.S., Robertson, R.T., & Patterson, C.J., Number coding in association cortex of the cat. Science 168 (3928), 271-273 (1970).
7. Cantlon, J., Platt, M., & Brannon, E., Beyond the number domain. Trends Cogn Sci (2009).
8. Dehaene, S., Dehaene-Lambertz, G., & Cohen, L., Abstract representations of numbers in the animal and human brain. Trends in Neurosciences 21 (8), 355-361 (1998).
9. Walsh, V., A theory of magnitude: common cortical metrics of time, space and quantity. Trends Cogn Sci (Regul Ed) 7 (11), 483-488 (2003).
10. Pinel, P., Piazza, M., Le Bihan, D., & Dehaene, S., Distributed and overlapping cerebral representations of number, size, and luminance during comparative judgments. Neuron 41 (6), 983-993 (2004).
11. Halberda, J., Mazzocco, M.M.M., & Feigenson, L., Individual differences in non-verbal number acuity correlate with maths achievement. Nature 455 (7213), 665-668 (2008).
12. Pica, P., Lemer, C., Izard, V., & Dehaene, S., Exact and approximate arithmetic in an Amazonian indigene group. Science 306 (5695), 499-503 (2004).
13. Nieder, A., Freedman, D.J., & Miller, E.K., Representation of the quantity of visual items in the primate prefrontal cortex. Science 297 (5587), 1708-1711 (2002).
14. Piazza, M., Izard, V., Pinel, P., Le Bihan, D., & Dehaene, S., Tuning curves for approximate numerosity in the human intraparietal sulcus. Neuron 44 (3), 547-555 (2004).
15. Dehaene, S., Izard, V., Spelke, E., & Pica, P., Log or Linear? Distinct Intuitions of the Number Scale in Western and Amazonian Indigene Cultures. Science 320 (5880), 1217-1220 (2008).

Implementing a DNA Library to Explore the Sequence Dependence of Dimerization of BNIP3-like Transmembrane Domains

by: Kushagra Shrinath

For the past decade here at Rice, the MacKenzie lab has been studying transmembrane proteins (proteins that span a biological membrane) and the self-association of alpha-helices within these transmembrane proteins. Standard biophysical and biochemical assays are carried out on point mutants of important transmembrane proteins such as BNIP3 (a protein that promotes apoptosis), Syndecan (a receptor protein linked to various growth factors), and Glycophorin A (a protein that spans the membrane of a red blood cell). By assaying how different point mutants affect the dimerization of the transmembrane proteins, it is possible to deduce both the importance of the location of residues to dimerization, as well as how different residues at particular locations affect dimerization. Due to the critical role that transmembrane domains play in cellular signaling, identifying motifs that cause dimerization is an important first step toward regulating abnormal cell signaling, which can lead to metabolic and immunological disorders, as well as cancer. In this paper we will discuss transmembrane proteins, methods for studying them biochemically, and recent studies in the MacKenzie lab involving DNA libraries.

Transmembrane proteins are integral to the structure and function of cells. They constitute 27% of all proteins made by humans1 and serve a variety of functions in cell signaling and in regulating proper ion concentrations. Although very important in biology, transmembrane proteins are not simple to study. Their helical structure, and moreover the helix-helix interactions caused by this structure, are a result of the amphiphilic (polar and non-polar) environment in which they reside. This amphiphilic environment makes biophysical and biochemical studies of transmembrane proteins difficult.5 Moreover, the accuracy of efforts to use detergent micelles to simulate this environment is unknown.

However, an assay called TOXCAT, established by Russ and Engelman, allows for in vivo measurement of an important facet of transmembrane domains: their ability to dimerize.6 By fusing ToxR, a transcription factor dependent on dimerization, to the transmembrane domain, one is able to detect dimerization when ToxR activates a reporter gene encoding chloramphenicol acetyltransferase (CAT) via the cholera toxin promoter (ctx) (Fig. 1). Russ and Engelman used TOXCAT to measure the effect of single point mutations on Glycophorin A and found that mutations to polar residues show specificity in TOXCAT while being disruptive in SDS.

The ability of a transmembrane domain to dimerize is largely dependent on the sequence of the interfacial residues that are active in dimerization.4 While point mutants are helpful in deducing how single residue changes affect dimerization, they are impractical to use when trying to uncover motifs and general patterns that cause tight dimers. The number of possible mutants that can be made is astronomically large, especially when considering double and triple mutants. Furthermore, the vast majority of these mutants are not strongly dimerizing, and it would not be viable to expend resources carrying out assays on them. What is needed is an assay that produces numerous mutants and also selects for mutants that are strongly dimerizing. It is here that a DNA library is a valuable tool for exploring the sequence-dependent dimerization of transmembrane proteins. Whereas mutagenesis alters a sequence at one position, a DNA library contains several degenerate nucleotides at key positions of interest in the transmembrane protein, allowing it to encode many mutants while keeping the overall structure of the protein the same. Furthermore, single point mutations ignore residues that work in tandem or in combination. Instead of focusing on residues at single locations, a library allows motifs of residues to be uncovered. By using a library it is also possible to define select residues at the interfacial region and to allow for cut sites that eliminate certain residues altogether. This project aims to assess the general framework necessary for dimerization via the use of a DNA library. The library codes for human BNIP3, a protein integral to apoptosis,2 as well as for C. elegans BNIP3, Glycophorin A, and Syndecan 3.
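The power of degenerate nucleotides comes from combinatorics: a single IUPAC ambiguity code at each codon position lets one oligonucleotide encode many amino acids at once. The sketch below is illustrative only – the degenerate codon “RST” is an invented example, not a position from the MacKenzie library – but the IUPAC codes and the standard genetic code are real.

```python
from itertools import product

# IUPAC degenerate nucleotide codes (each letter stands for a set of bases)
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
         "K": "GT", "M": "AC", "N": "ACGT"}

# Standard genetic code, built from the canonical 64-codon table layout
# (first base varies slowest, third base fastest; '*' marks stop codons).
BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {a + b + c: aa
               for (a, b, c), aa in zip(product(BASES, repeat=3), AMINO)}

def amino_acids(degenerate_codon):
    """All amino acids (and '*' for stop) a degenerate codon can encode."""
    return sorted({CODON_TABLE["".join(bases)]
                   for bases in product(*(IUPAC[n] for n in degenerate_codon))})

# Hypothetical example: 'RST' expands to ACT, AGT, GCT, GGT,
# so one oligonucleotide position encodes Ala, Gly, Ser, and Thr.
print(amino_acids("RST"))  # → ['A', 'G', 'S', 'T']
```

A handful of such positions multiplies quickly – which is exactly why a library, paired with selection, beats making point mutants one at a time.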

Synthetic oligonucleotides that have the same backbone but encode different chosen amino acids at the interfacial positions of the transmembrane domain will be amplified by PCR. They will be inserted into a pccKan vector between the N-terminal ToxR transcription factor and the C-terminal Maltose Binding Protein. The ligated vector will be transformed into E. coli, and the DNA harvested will represent the library from which future screening will take place. This DNA will then be transformed into NT326 cells, in which transmembrane domains that dimerize strongly confer resistance to chloramphenicol. By growing the NT326 cells on increasing concentrations of chloramphenicol and sequencing a number of the cells that survive at various concentrations, one can deduce the residues and motifs that favor dimerization. Finally, the sequences of interest can be assayed via TOXCAT and their relative dimerization values compared. Through many screenings, a large enough library can be constructed to reveal the effect of an amino acid at a single position on the transmembrane domain. In addition, comprehensive motifs that tightly dimerize will be found.
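The logic of the chloramphenicol screen – raise the drug concentration, and only clones with tighter dimers survive – can be illustrated with a toy simulation. Everything here is invented for illustration (the random “dimerization strengths,” the survival rule, and the concentrations are not from the lab protocol); the sketch only shows how stepping up selection pressure enriches the surviving pool for strong dimerizers.

```python
import random

random.seed(0)

# Toy model: each library clone gets a hypothetical dimerization strength
# in [0, 1]; in reality this would be set by its transmembrane sequence.
clones = [random.random() for _ in range(10_000)]

def survivors(strengths, concentration, scale=100.0):
    """Clones whose strength beats a bar that rises with drug concentration."""
    threshold = concentration / scale
    return [s for s in strengths if s > threshold]

# Higher chloramphenicol concentration -> fewer survivors, stronger dimers.
for conc in (10, 50, 90):
    pool = survivors(clones, conc)
    mean = sum(pool) / len(pool)
    print(f"{conc} units: {len(pool):5d} survivors, mean strength {mean:.2f}")
```

Sequencing the survivors at each concentration then amounts to asking which residues and motifs are overrepresented in the progressively stronger pools.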

One feature that has been observed by Russ et al. is the predominance of a GxxxG motif in dimerization.7 In the absence of a GxxxG motif, a library implemented by Dawson et al. showed that the most tightly dimerizing sequences had polar serines and threonines, which are thought to contribute to dimerization through hydrogen bonds.3 This library is distinctive in that it allows both polar and non-polar residues to lie at interfacial positions, making it possible to assess whether hydrogen bonding is in fact responsible for the tight dimerization seen with polar residues. Furthermore, this library allows the central glycine to be eliminated, to uncover tightly dimerizing motifs that lack a GxxxG motif. Results of a preliminary study involving this DNA library have shown that in the most tightly dimerizing transmembrane proteins there is always a glycine at the central position, along with a strong, statistically significant presence of phenylalanine three positions before it. Polar residues (serine and threonine) are prevalent at the first position of the transmembrane domain, while the branched amino acids leucine and valine predominate at the sixth position. These findings provide insight into general motifs that arise as a result of induced variation and selection, and give us the ability to predict the dimerization strengths of similar transmembrane sequences found in biological systems. Future work will include expanding the DNA library as well as creating a new library that asks different questions by changing the residues allowed at particular positions.

The use of a DNA library is an elegant method with which important questions can be answered in a holistic and encompassing manner without expending a great deal of resources. Its application to transmembrane domains will potentially reveal important motifs that lead to strong dimerization. This information can be used to make tailor-designed drugs or novel mechanisms that disrupt or enhance the dimerization of biological transmembrane proteins. Given the large number of biological mechanisms regulated by the self-association of alpha-helices, this information could be vital in designing methods to treat a myriad of disorders.


1. Almén, M.S., Nordström, K.J., Fredriksson, R., Schiöth, H.B. Mapping the human membrane proteome: a majority of the human membrane proteins can be classified according to function and evolutionary origin. BMC Biology 2009, 7:50.
2. Chen, G., Ray, R., Dubik, D., Shi, L., Cizeau, J., Bleackl, R.C., Saxena, S., Gietz, R.D., and Greenberg, A.H. The E1B 19K/Bcl-2-binding protein Nip3 is a dimeric mitochondrial protein that activates apoptosis. Journal of Experimental Medicine 1997, 186, 1975-1983.
3. Dawson, J.P., Weinger, D.S., Engelman, D.M. Motifs of serine and threonine can drive association of transmembrane helices. Journal of Molecular Biology 2002, 316, 799-805.
4. Lemmon, M.A., Flanagan, J.M., Treutlein, H.R., Zhang, J., Engelman, D.M. Sequence specificity in the dimerization of transmembrane helices. Biochemistry 1992, 31, 12719-12725.
5. MacKenzie, K.R. Folding and Stability of α-Helical Integral Membrane Proteins. Chemical Reviews 2006, 106, 1931-1977.
6. Russ, W.P., and Engelman, D.M. TOXCAT: A measure of transmembrane helix association in a biological membrane. Proceedings of the National Academy of Science 1999, 96, 863-868.
7. Russ, W.P., and Engelman, D.M. The GxxxG motif: a framework for transmembrane helix-helix association. Journal of Molecular Biology 2000, 296, 911-919.

The Philosophy of Stem Cell Biology

by: Casey O’Gradey

A Snapshot of Work Across the Sciences and Humanities

Dr. Melinda Fagan of the Rice philosophy department views her work from a unique academic vantage point. While her position is in the humanities department, she has spent much of her academic career in the sciences, obtaining her PhD in biology before moving on to study philosophy and the history of science. While she once practiced science in a laboratory, she now approaches the field exclusively as an academic. She currently teaches a course entitled “Perspectives on Stem Cells” – a collaboration between the bioengineering and philosophy departments. This past semester, I had the opportunity to work as a research assistant for her project in the philosophy of stem cell biology. This relatively untouched field of inquiry piqued Dr. Fagan’s scientific and philosophical interests. Connecting central concepts of stem cell biology with key issues in philosophy of biology, she takes an interdisciplinary approach that explores the horizons of both fields – drawing upon recent experimental advances in stem cell research along with innovative philosophical approaches to scientific understanding. Through a synthesis of these two sources, Dr. Fagan’s work articulates an accurate and useful model of scientific results in stem cell biology that benefits both philosophers of science and practicing experimental biologists. The following is a short exposé of what I gathered to be the aims and methods of her work.

Defining Stem Cells: An Epistemological Evaluation

While stem cells have played a prominent role in modern medical research, scientists are still struggling to precisely define their status and nature within biological theory. Because they are often perceived as objects of great promise, it is crucial that their place in modern cell biology be understood before their medical potential is assessed. The prospect of curative medical remedies makes a precise understanding of stem cells extremely important and urgent.

Stem cells are typically defined by two capacities: (1) production of more cells of the same type, i.e. self-renewal, and (2) production of more differentiated cell types, i.e. differentiation potential. They are further categorized by the developmental stage of their host organism – embryonic, fetal, or adult. In experimental research, the most important distinction is between embryonic and adult stem cells, as experimental practices on the two types have different standards and methods. In her epistemic criticism of stem cell biology, Dr. Fagan focuses on the experimental results of adult stem cell research, specifically research with blood stem cells, also known as hematopoietic stem cells (HSCs). These are the only stem cells routinely used in clinical practice, during bone marrow transplants, and they are comparatively well understood by the scientific community.

Dr. Fagan’s analysis of scientific results in this area of research takes an epistemological approach – that is, one that seeks to evaluate how knowledge in the field is garnered through experimental and/or social practices. Her effort in this regard locates the epistemic units of blood stem cell research and traces their history and development to the present day. Drawing on the history and sociology of science, Fagan locates a crucial turning point in the field: a debate between two groups of scientists, both claiming to have discovered the proper method to yield all and only blood stem cells. While debate still surrounds this issue, Dr. Fagan determines that the most widely accepted and used model of blood stem cells is one of cell-lineage hierarchy.

Under this model, cells are mapped according to their development from unspecialized stem cells toward fully specialized cells within an organism. Thus, at the top of this hierarchy is the unspecialized stem cell, and branching off from it are the different specialized cells it can potentially differentiate into. This model, while practically useful and accurately representative of the field, brings out many philosophical problems that Dr. Fagan uncovers and attempts to explain.

Questions in the Philosophy of Stem Cell Biology

The model is useful because it provides a meaningful definition of stem cells using the concepts of self-renewal and differentiation. The central question that Dr. Fagan addresses under this model is whether stem cells have an intrinsic nature; that is, whether stem cells are real ontological entities. This is a problem found in philosophy of science and philosophy of biology. In this instance, the concern is that the model fails to take into account the environmental conditions under which a cell can be described as a stem cell. The model tacitly assumes that cells are self-sufficient and isolatable entities, when in fact that may not be the case. Philosopher of science Nik Brown writes, “Of late, the traditional notion of stem cells as a clearly defined class of intrinsically stable biological objects that can be isolated and purified, has begun to give way to a view of ‘stem-ness’ as temporary, shifting and evanescent.”

Ultimately, whether stem-ness is an intrinsic quality of a cell depends upon the explanation of biological mechanisms invoking stem-ness. Because stem cell biology is a relatively new field of experimental biology, an assessment of mechanistic explanations in this field demands not only an evaluation of biological theory, but also a social analysis of stem cell experimental practices. Through a historical survey of the field’s work, Dr. Fagan identifies unification as a guiding norm of experimental practice in the field.

Ultimately, Fagan concludes that a reasonable way to conceive of objectivity in stem cell biology is to distinguish between experimental and biological mechanisms. This would allow scientists to delineate between experimental interventions and the actual nature of the biological objects on which they intervene. As she puts it, “For a therapy to work, it is not enough that the models it is based on mesh with our aspirations and hopes. They must also successfully predict what cells will do when let loose in the body. So it is crucial that stem cell researchers be able to distinguish between features of models of cell development that reflect our interventions and aspirations, and those that reflect ‘cell intrinsic’ pathways or stable features of physiological environments.”

Dr. Fagan is keen to note in this passage that ethical and epistemic values often intersect in medical science, particularly in fields of public interest like stem cell biology. The crucial aim of her project is to demonstrate the importance of this interplay for further understanding the science in which we are working and the goals we hope for it to accomplish. Stem cell biology is a unique and instructive case of this dynamic because it is both a relatively new area of experimental biology and a prominent topic of medical promise.