Hellenologophobia
Not my favorite, but not THAT scary. Photo by Erin Podolak
There are two things that particularly freak me out: spiders and people jumping out of the dark. The spiders are relatively self-explanatory – some spiders can kill you, they crawl on you, and they could be anywhere. The fear of people jumping out of the dark comes from the idea of things jumping out of my closet, made all the more scary by the fact that in movies bad things always happen when someone jumps out of a dark corner to attack.
***
For the record: Chromatin is a material made up of protein, RNA, and DNA that forms the chromosomes of eukaryotes (organisms whose cells have a nucleus). (Chromosomes are very small structures, found in the nucleus (center) of most cells, that carry the genetic code.)
A genome is the set of chromosomes in a cell, representing the complete set of genes and genetic information (the DNA) that it takes to make that organism. Humans typically have 46 chromosomes – 23 from the mother and 23 from the father.
Wisconsin’s Place in the History of Animal Research
I decided to apply to graduate school at the University of Wisconsin-Madison at the recommendation of my undergraduate advisor. I honestly wasn't thrilled with the idea of coming to the Midwest. I had never really considered what the cheese state was like before I applied – as a strictly East Coast girl it was so far removed from everything in my life that I couldn't even imagine living here. But when the college admission chips fell where they did, UW-Madison was the clear first choice for grad school.
That being said, when I arrived in Wisconsin nearly nine months ago, I knew very little about the history of the university I was attending. I knew that UW-Madison was home to an amazing amount of scientific research, but I had no idea how rich the tradition of scientific inquiry really was. I quickly became aware of the Wisconsin National Primate Research Center (WNPRC) and the notorious, and immensely important, psychology researcher Harry Harlow.
Those who follow this blog regularly know that I have written a lot of posts this semester inspired by my zoology class on human and animal behavior. It is this class that really motivated me to learn more about animal research, and in particular UW-Madison's role in it. That brought me to two books, both written by Deborah Blum, a professor in the journalism school here at UW.
In 1992 Blum won the Pulitzer Prize for a series of articles on the ethical dilemmas posed by primate research. She turned this into the 1994 book The Monkey Wars. I was enthralled by the history of primate research in the United States, and am ashamed to admit how little I knew prior to reading the book. The story of Edward Taub, the Silver Spring Monkeys (named after the site of the lab in Maryland), and the rise of PETA in 1981 had me riveted. The condensed version of that story is that PETA founder Alex Pacheco volunteered undercover in the lab of Taub, who was conducting neurological experiments on monkeys (severing the nerves that control a limb and then coaxing nerve regeneration). The monkeys were held in filthy conditions – but there was no legal standard for research animal care at the time. Pacheco took photographs (some admittedly staged) and went to the police to have Taub arrested (which he was – for animal cruelty).
The majority of events described in the book take place long before I was even born, and I suppose that's why I felt so removed from them. I didn't realize I was taking the idea that animals have rights for granted until I learned about the history of animal research in this country. I knew that people can be cruel to animals, but I was blissfully oblivious to the cruelty that was standard in research labs in the 1950s, 1960s, and 1970s. After finishing The Monkey Wars, my blissful respect for science felt somewhat dingy – and I needed more information.
The book I picked up next, to explore the history of animal research and in particular its role in Wisconsin, was Blum's 2002 biography of Harry Harlow, Love at Goon Park. I don't think I had ever heard the name Harry Harlow before moving to Wisconsin – yet his work is something I reap the benefits of in my daily life. Harlow is both famous and infamous for his "mother love" and "pit of despair" studies (the latter a catchy name for the isolation chamber he used to induce depression). His research used rhesus macaque babies to show that children need love and social interaction – particularly touch – to function and develop normally, and that being isolated can cause a complete psychological breakdown.
The reason Harlow is so controversial is that the way he studied depression and isolation from one's mother was to psychologically "break" baby monkeys. These were horrible studies. The monkeys were taken away from their mothers and given a variety of fake substitutes to see which the babies would cling to most (the warm, animated cloth mother was the winning surrogate, while the cold metal mother caused psychological damage to her babies). For the depression studies the babies were put in isolation cages for 3-6 months at a time, with no interaction at all. The monkeys suffered tremendously. The concept of love as a necessity needed to be proven to move parental nurturing into the mainstream. But the question remains whether it needed to be proven in that way.
Considering that I was surprised by just how awful the United States' history of animal research is, you can imagine how shocked I was to learn that studies were needed to prove that mothers should hug their children. But then again, as Blum so poignantly points out, the scientific standard at the time was to isolate children for health reasons (to limit the spread of bacteria and disease). The things that seem so obvious to me – that animals should be well taken care of, that children should be hugged – were really revolutions within the scientific community. Looking back we can say how ridiculous it is that such assertions needed to be scientifically proven, but then again, think about where we might be if these ideas had never been generally accepted.
This semester has really driven home for me just how much I owe to animals. The idea that my mom would have been condemned as a bad mother for hugging me when I cried, were it not for Harry Harlow and his baby rhesus macaques, makes me very appreciative of the role of animals in research. I remember so vividly crying on my mom's shoulder at maybe 4 or 5 years old. I remember the silky salmon-colored blouse she was wearing. I remember staining it mercilessly with my tears, but I don't know why I was crying. I do know that all I wanted was to be held, to have my hair stroked, and to be comforted. I can't imagine my parents keeping me at arm's length.
We owe a lot to the animals who started the social movement that changed the way people parented, and to the researcher who brought it all to light and made society take notice – and I had no idea about either before coming to Wisconsin. While I do my fair share of whining about being in the cheese state, my experiences here have opened my mind to a lot of new concepts – particularly with regard to the role animals play in society and how we as humans should regard them.
Nuclear Legacy: Chernobyl Turns 25
The worst nuclear disaster the world has ever known began with a trial run of an experimental cooling protocol on April 26, 1986. A power surge occurred in reactor #4 at the Chernobyl Nuclear Power Plant, near the town of Pripyat in Ukraine (then part of the USSR). An emergency shutdown was attempted, but the situation was already out of control. Another power surge – stronger than the first – ruptured the containment vessel through a series of explosions that launched radioactive fuel and core materials into the atmosphere. When the reactor's graphite moderator was exposed to open air, it ignited in a fire that sent a plume of smoke, rife with radioactive material, into the sky.
Map of Chernobyl's radioactive fallout
The plume drifted over parts of the Soviet Union and Europe, releasing more radioactive material into the open than the atomic bomb dropped on Hiroshima during World War II. The most affected regions include Belarus, Ukraine, and western Russia – though radioactive material was detected at elevated levels throughout Europe.
The disaster killed 31 people who either worked at the reactor or were part of the emergency response crew, but estimates of the number of people killed by subsequent radiation exposure vary from the World Health Organization's roughly 4,000 to the Greenpeace estimate of 200,000 or more.
The Soviet Union tried hard to downplay the April 26th fire and explosion, but two days later, on April 28th, workers at the Forsmark Nuclear Power Plant in Sweden – 680 miles from Chernobyl – detected radioactive particles on their clothes. Sweden's search for the source of the radioactivity (after it was determined that there was no problem at their own plant) led to the conclusion that a serious incident had occurred in the western part of the Soviet Union. Chernobyl became the center of worldwide attention.
On the 25th anniversary of the Chernobyl disaster, society is still dealing with the legacy of fear, misinformation, and health effects left by the destroyed power plant. Chernobyl was ranked as a level 7 disaster on the International Nuclear Event Scale (INES), the highest possible ranking. The world is still reeling from March's Fukushima nuclear disaster in Japan, the only other INES level 7 disaster in history. But Fukushima is not Chernobyl. Fukushima has not caused the level of death and destruction that Chernobyl did – and the plants were of completely different designs.
The nuclear reactors at Chernobyl were based on a now defunct Soviet design with known cooling problems. The plant's workers were testing a new cooling protocol because it was known that in the event of a power outage the system in place (backup generators, etc.) would not have been able to cool the reactors quickly enough. There has been much speculation about who is to blame for the Chernobyl incident – whether it was the reactor design or human error.
Chernobyl as it is today
The first reports out of Chernobyl blamed the workers – reporting that they didn't have adequate training and experience, that they were operating the plant with key safety systems (like the Emergency Core Cooling System) turned off, and that they knowingly ignored regulations. Over time, however, these accusations have been downplayed, while flaws in the design of the control rods (part of the cooling system) and the reactor's inability to deal with the buildup of steam have been blamed for the bulk of the incident.
The initial cleanup of Chernobyl was done by "liquidators," who moved the majority of the debris into the damaged reactor, which was then covered with sand, lead, and boric acid dropped from helicopters. A concrete enclosure was built around the damaged reactor – a task that exposed the construction workers to significant amounts of radiation.
In February I did a post on What We Don’t Know about Chernobyl – namely that the site of the damaged reactor has been without a proper containment vessel all these years. The concrete sarcophagus originally erected around the destroyed reactor is still in place, and there are cracks in it. The money was never raised to build a more permanent enclosure.
The Fukushima disaster has brought Chernobyl back into the headlines, and today on its 25th anniversary we have to stop and ask ourselves how our understanding of nuclear power has been influenced and shaped by that April day back in 1986. In the wake of a nuclear disaster many people question whether the science is really safe, but I think the question we should really be asking is whether the science, in human hands, is really safe. It isn’t an issue of nuclear power – it is an issue of what happens when people try to harness nuclear power.
Robby the Robot & The Power of Movies
Working on an article about robotics and biomimetic design, I've been thinking a lot lately about what it is that makes robots so enthralling. Thanks to the portrayal of robots in the entertainment industry, I think many of us view robots more as humanoid servants than as tools that humans can use to accomplish a task. But what has ingrained in us the idea that the "robot of the future" will serve our every whim?
The portrayal of robots in movies and television is one of the most persuasive and widespread ways of disseminating the idea of the robot servant. While I was interviewing robot researchers and connoisseurs, Robby the Robot from the movie Forbidden Planet kept coming up as the prime example of this ideal mechanical man. But I had never heard of Robby. Somewhere in the back of my mind I had heard of Forbidden Planet – so I decided to look into just what is so special (for so many people) about this one movie robot.
Robby the Robot was created in the mid-1950s, more my parents' era than mine (which is firmly rooted in the late 1980s and early 1990s). Designed for the 1956 sci-fi movie Forbidden Planet, Robby wasn't the star, but he certainly stole the screen. He is one of the first examples of a robot that broke into mainstream recognition – and he had a lasting effect on how the public viewed robots.
The movie's human star is Leslie Nielsen, playing Commander J.J. Adams, who is sent to a strange planet to check up on a colony of scientists who have stopped communicating with Earth. The plot is kind of twisted. I mean, it involves monsters powered entirely by the human brain, made possible because a race of aliens figured out how to enhance the brain's capabilities enough to give rise to the monsters. Twisted.
Robby is the servant of Dr. Morbius, the only scientist from the original expedition who wasn't killed under "mysterious" circumstances. As a character in the film, Robby is actually very important – he is the first being on the planet to meet Adams' expedition, and he appears throughout the film demonstrating his domestic abilities and his loyalty to his masters. He provides comic relief (learning how to make bourbon) and ultimately ends up a hero, short-circuiting rather than following his master's orders to murderous ends.
In searching YouTube for footage of Robby, I found this great history of his role in the film and how he became a cultural icon – even making it into the Robot Hall of Fame (yes, there is such a thing!)
What I find most interesting about Robby the Robot is the anthropomorphic way in which he was designed. Anthropomorphism is giving non-humans human traits. For example, when we say that our dog feels guilty – guilt is a complex human emotion, and even if dogs do experience certain emotions, they probably don't experience "guilt" as we humans would define it. Another example (and probably my favorite) is the 1987 classic movie The Brave Little Toaster. The title says it all – a talking toaster. Toasters don't talk, let alone go on adventures that require bravery. Yet, by putting human characteristics onto a metal box, we end up with quite the heroic toaster.
The idea that Robby is part vacuum cleaner (the head) and part washing machine (the body), but with arms and legs that can clearly be recognized as parts of a body, helps explain why he is so appealing. Because he "looks" human, we understand how to gauge his movements and gestures and what they mean. It makes the robot seem more real.
The fact that the robot was really a suit worn by a human shows just how human-like Robby was, despite being so complex in design and engineering. I think the comment made in the video clip – that finding out that Robby wasn’t really a working robot was like finding out that Santa Claus isn’t real – says it all. Robby set a standard of expectation for a generation of children/teens about what robots could and should be.
I think that this image of the robotic man has continued to permeate pop culture, so that even today, more than 50 years after Robby was designed, we all still want a robot butler. It can be hard to accept that even though Robby seemed so real, it was really just a suit worn by an actor. We still don't have robots so human-like that they can think for themselves or act the way that Robby does – and we probably won't in my lifetime. But that doesn't mean that today's robots aren't still useful and cool in their own right. We just have to be realistic about the capabilities of engineering – and learn to accept that a robot like Robby still exists only in the movies.
War Journalists: Casualties of Their Trade
It's hard to understand why something has the ability to punch you in the figurative gut – something so far removed from you that it should barely register a reaction. Yet it steals your breath anyway. That happened to me this week – with a tweet. The offending tweet (from a breaking news thread) said, "Reports: renowned war photojournalists Chris Hondros, Tim Hetherington killed in Libya. Details sketchy; awaiting more."
I don’t know why this news struck me so very hard. These are not the first journalists to be killed in a war zone – but Tim Hetherington is the first journalist whose work I have studied to be killed so shockingly, and yet so predictably. It registered. It hurt. Not for me personally, it hurt for everyone who knew him. It hurt for the people whose lives he brought to light. It hurt for the stories he won’t get to tell. It hurt because people you admire shouldn’t die. Not like that. At least not in a perfect world. But then again, in a perfect world there wouldn’t be a profession called “War Photojournalist.”
Hetherington is best known for the documentary film Restrepo, which he directed with Sebastian Junger and which was nominated for an Oscar this past winter. I've read Junger's book about the same events, War, and seen parts, though admittedly not all, of Restrepo. Both are about the time Junger and Hetherington spent embedded with a group of American soldiers in the Korengal Valley in Afghanistan. My post, Sebastian Junger's War Zone, ruminates on the book and on the topic of war. But looking back on it, I can't help but feel deeply how little I know about war.
There has been a lot of coverage of Hetherington and Hondros’ deaths, but the People Magazine of it all isn’t the story I hope most people will read to find out about these men, and the circumstances under which they died. Sebastian Junger has a tribute in Vanity Fair written as a personal letter to Hetherington that drips with grief and beauty. Susan Orlean has a post in The New Yorker about Hetherington and what it is to be brave. New York Times war correspondent C.J. Chivers has a post on his personal website, Almost Dawn in Libya: Chris and Tim heading home, that pauses amidst the chaos of tragedy to thank the people and groups that tried so hard to save, and then do right by the fallen photojournalists. These are the stories that I hope people will read. It is a tall order to memorialize the fallen, but these writers give it a damn good try.
Prosthetic Devices: The Mystery of Human Design
In the course of an average day I go up and down the stairs in my apartment building, I walk to class, and I run errands – all on foot. Not having a car, or even a bike, drives home just how much I rely on my legs to get me where I need to go. But what would I do if simply getting up in the morning and walking to my destination wasn't possible?
via Wikimedia Commons
The leg, knee, ankle, and foot (the lower extremity) perform two biological functions: stability and mobility. The lower extremity is designed to hold the body's weight. According to Dr. Mark Geil, director of the biomechanics laboratory in the department of kinesiology and health at Georgia State University, the lower extremity is amazing in its ability to support the body given our height and the relatively small surface area provided by the foot. In addition to providing stability, the muscles in the leg are key to making us mobile at various speeds, over a variety of terrains and conditions.
via US Army Flickr
According to Dr. Geil, replicating human anatomy and function is the "Holy Grail" of biomimetic design. But finding a way around the challenge posed by muscles is only part of the problem. The human leg is designed to accomplish a vast array of activities – and this diversity has proved difficult to replicate with a single prosthetic leg.
Synchrotron: The End of an Era?
I've said before that being back on a college campus offers so many unique opportunities. This week was no exception, with a visit from Bill Blakemore, ABC News climate change correspondent, AND a trip to UW's Synchrotron Radiation Center (SRC). I got several opportunities to talk to Blakemore, and I highly suggest checking out his show Nature's Edge – but rather than delve into climate communication (a topic on which I could spew my opinions for hours) I want to focus on the SRC.
Whenever I leave downtown Madison, I go through the same internal dialogue: "There are cows. Where am I? I don't belong here. There are cows. And nothing. As far as I can see. Cows and nothing. What am I doing in Wisconsin?" I hate to admit it, but I do still suffer from relocator's remorse. I don't dislike Madison, but seeing prairie or open fields for miles so close to town still shocks me every time.

Today, that internal dialogue was triggered by the trip I took with my colleagues from the School of Journalism and Mass Communications, through the cows and the nothing, to tour the SRC. Located about 30 minutes from campus, the SRC is a particle accelerator facility used by hundreds of researchers each year. Now, I make no bones about the fact that I am scared of physics – but even I was able to understand and enjoy learning about what the SRC does.
The "radiation" part of the name Synchrotron Radiation Center has nothing to do with nuclear radiation, which we have all been worrying about since the Japanese earthquake. Rather, radiation refers simply to the center's main purpose – creating light for scientific experiments. If you think back to what you know about the electromagnetic spectrum, you'll remember that there are different forms of light: visible light, microwaves, radio waves, UV rays, x-rays, etc.
The SRC supports a variety of experiments using the different forms of light (from infrared to x-rays) generated by accelerating electrons around the Aladdin storage ring. I am not going to do a better job of explaining how the ring works than the SRC does on its website, but I will say that the light created by winging the electrons around needs to be contained and controlled, and that is essentially what Aladdin does. It is the mechanism that harnesses the light so it can be used in experiments.
The center opened in 1981, and it has a special role as far as synchrotron facilities go: the UW center gives visiting researchers 2-3 weeks to work on their projects, unlike the 3-4 days they might get at another facility. Because the SRC is funded by the National Science Foundation, researchers don't have to pay to use it – it is free. Free resources that invest significant time in research projects are rare these days.
They are about to become even rarer. The SRC has not made it into the NSF's new budget, which means that its funding (the approximately $5 million it takes to run the center) will be cut off in August 2011. I realize that the SRC isn't cutting edge. It isn't shiny and flashy, but it still has scientific merit. The idea of the resource going dark seems like such an utter waste.
My colleague Eric, who works in outreach at the SRC and organized the JSchool's visit, has a terrific post on his blog about the closing of the SRC and of Fermilab near Chicago – closures that will leave a hole in the scientific research community in the Midwest. I encourage those of you in Madison to take the time to check out the SRC before the last electron goes shooting through the Aladdin ring, and those of you not in Madison to take a look at the federal science budgets – is there a resource near you that will be lost in 2011?
I chose to focus this post on the SRC rather than Blakemore's visit because the SRC is such a uniquely Madison, WI experience. It reminds me of why, in spite of the cows and the nothing, I came to Madison. This is the site of some extraordinary scientific research – discoveries that I find fascinating, that ignite the sense of awe and wonder about the world that I have tried so hard to cling to as I have transitioned into adulthood. Seeing the SRC's inquiries end, while sad, makes me appreciate that I was in Madison in time to experience it for myself.
Science For Six-Year-Olds: Giant Earthworms
This is my third post for Mrs. Podolak's first grade class at Lincoln-Hubbard Elementary School. We have been talking about animal behavior with Alex the Genius Parrot and Animals who use tools, but now, to kick off the first graders' new science unit, we are going to talk about worms.
***
When it comes to worms, there is no specimen more impressive than the Giant Gippsland Earthworm. Check out this video to learn more about this massive worm:
Giant Gippsland Earthworms are gross yet fascinating, but they are found only in Australia. What about the worms in your own backyard? They might not be giants, but common earthworms are still pretty impressive little animals.
The common earthworm (Lumbricus terrestris), also called a night crawler, is found throughout North America and Europe. Compared to the Giant Gippsland Earthworm, which can grow to a meter or more, the common earthworm is usually just 7-8cm long.
The earthworm has a mouth and a butt (anus). It also has a brain, a nerve cord (the way humans have a spinal cord), a heart, and a digestive system. So we know what worms look like: they can be big or small, and they are made up of several different parts. But what do earthworms do? Earthworms are experts on dirt. They tunnel through soil, making pockets of air and water that are important for plants and the microorganisms that live in the soil.
To help you learn more about worms, check out these Frequently Asked Questions posed by students just like you! As always, feel free to ask me questions, and I'll get back to you as soon as I can. Good luck with your new science unit and learning all about earthworms!
Hal Herzog, Animal Ethics & the Alien Problem
Last semester I read many more books (and thus did a lot more book reviews) than I have this semester, which has mostly been devoted to academic research papers. But I do have two books to read for my zoology class on human and animal behavior with Patricia McConnell.
I finally finished the first of the two assigned books, Hal Herzog's Some We Love, Some We Hate, Some We Eat: Why It's So Hard to Think Straight About Animals. I've been reading Herzog's book all semester, so my evaluation of it draws on a slightly disjointed memory, but I think I can summarize his main point with two statements:
1. Most people choose not to (or don’t know enough to) think about their personal moral philosophy. Not thinking about how we feel about animals is what allows us to love puppies so much while we happily chow down on a Big Mac.
2. Those people who have spent a tremendous amount of time trying to discern their personal moral philosophy about animals either A. remain horribly conflicted or B. choose a philosophy with regard to the treatment of animals that societal pressures make very difficult to implement (for example, all creatures are equal – if you save an iguana from a burning building instead of a human baby, society is not going to look kindly upon you, regardless of your belief that the iguana and the baby are equals).
Herzog's answer to his main question – why is it so hard to think straight about animals? – largely comes down to this: you're damned if you do and you're damned if you don't.
The book tries hard to cover a variety of topics that affect the way we feel about animals, some obvious (factory farming, animals in research, hunting) and some less so (cockfights, dog shows, gender roles). I don't intend to go into his arguments for and against certain behaviors, but to give an example of the kind of analysis he provides, I will share an anecdote from his chapter "The moral status of mice," on the use of animals in biological research.
Herzog frames animal research this way: Think of Steven Spielberg's 1982 classic film ET. Remember how close Elliott and ET became, and how heart-wrenching it was to see ET go back to his home planet? Well, what if there were a disease destroying the aliens on ET's home planet, and the reason he really came to Earth was to scout out organisms of lesser intelligence on which to test possible remedies? Elliott's intelligence is far less than ET's. So how would you feel if, at the end of the movie, ET kidnapped Elliott and took him back to his home planet to live the rest of his life as a research subject? It would save millions of aliens. But ET still essentially destroys Elliott's life. Not really a satisfactory ending, I'd say.
So if we don't want ET to kidnap Elliott just because Elliott is of lesser intelligence, then what do we do when humans are like ET and mice are like Elliott? Should we experiment on mice just because they are of lesser intelligence? That logic would lead us to say no, we should not experiment on the mice. And yet, I'm still in favor of animal research. Philosophically, I shouldn't be. But there is something about experimenting on a member of my own species that I find morally reprehensible. It is the reason we don't conduct experiments on people in comas or with severe intellectual disabilities. But if you are always putting humans first, how can you still treat animals with respect and moral standing?
I'm not here to answer the questions that thinking critically about animals poses. Herzog has 280 pages of highly intelligent, moving, and entertaining explanation, and he still doesn't answer most of them. But he will get you thinking about your own behavior, why some animals matter to us more than others, and why humans think the way we do.
It is important for everyone – meat eaters, vegetarians, pet lovers, people who avoid animals – to think about why they feel the way they do about animals. I was surprised by the conflicts in my own way of thinking, and sadly I now fall into column A: thinking critically, but still horribly confused. At least I'm thinking, right?
Robots on the Front Line
I'm working on an article about how robots can – and, more importantly, cannot – be designed and programmed to function like humans. Perhaps because of that, I've noticed that robots have been making headlines this week for their role in the war in Afghanistan. But while useful, robots are no replacement for the ingenuity, decision making, and critical thinking of real, human soldiers.
***