Thursday, June 30, 2011

'Land of fire' ignites love of science

Emory scientist Jamal Musaev celebrates both July 4th and Azerbaijan's May 28 independence day, marked above with fireworks over the capital, Baku. "Azerbaijan will always have a special place in my heart, but I'm very grateful and in love with my new country, the United States of America," he says. Credit: iStockphoto.com.

By Carol Clark

Jamal Musaev, director of Emory’s Cherry L. Emerson Center for Scientific Computation, has a worldly view of science.

He grew up in the small, mountain town of Ordubad, Azerbaijan, during the heyday of the space race between the U.S. and the U.S.S.R. “I loved physics, and from the time I was in sixth grade, I wanted to be an astronaut. That was my dream,” Musaev recalls.

Azerbaijan was founded in 1918 as the first democratic and secular Muslim republic. “It is one of the most tolerant places in the world, where about 90 ethnic groups and members of all different religions live in harmony,” Musaev says. “People in Azerbaijan have three key traditions: A love of science, music and openness.”

Notable Azerbaijan natives include Lotfi Zadeh, the developer of fuzzy logic, and Lev Landau, who won the 1962 Nobel Prize in physics.

Situated in the Caucasus region on the Caspian Sea, the country was incorporated into the Soviet empire in 1920, but regained its independence in 1991. Azerbaijan is rich in oil and gas reserves. The ancients called it the “land of fire,” due to the jets of flame that shot up from natural gas leaks. Before the Soviets took over, the Swedish Nobel brothers built one of the biggest oil companies in the world in Azerbaijan.
A natural gas fire burns in an ancient stone building in Baku, Azerbaijan. Credit: iStockphoto.com.

During World War II, Azerbaijan supplied both fuel and scientific brainpower to the Soviet military. A chemist from Musaev’s hometown, Yusif Mamedaliyev, developed a high octane aviation fuel in 1941 that allowed military airplanes to fly at higher altitudes.

Musaev gave up on his dream of becoming an astronaut, but he completed a master’s degree in physics at Azerbaijan State University in the capital of Baku. He continued his education at the U.S.S.R. Academy of Sciences in Moscow, where he became interested in physical chemistry. After just three years, Musaev earned his PhD at the academy, and then became the youngest senior scientific fellow in the history of the institution. He also met his wife, Matanat, a fellow student from Azerbaijan, who was in Moscow for a PhD in literature.

In 1991, Keiji Morokuma, a leading theoretical and computational chemist, recruited Musaev to join the Institute for Molecular Science in Okazaki, Japan, where he focused on investigating gas phase reactions involving transition metals.

“If living in Moscow was about 20 percent different from living in Azerbaijan, then moving from the Soviet Union to Japan was 200 percent different,” Musaev says.

In 1993, Morokuma moved to Atlanta to head up the Emerson Center for Scientific Computation at Emory. Musaev accompanied his mentor, uprooting once again to begin a new chapter in a new country. Musaev eventually took over as director of the Emerson Center, after Morokuma’s retirement in 2006.

His globetrotting career has been an interesting journey, he says, both culturally and scientifically. “I’ve had to learn three languages, in addition to my native Azeri, and also how to communicate with many scientists from different backgrounds and specialties.”

All of these experiences come to bear at the Emerson Center, where science is both interdisciplinary and international. The big problems in science today require scientists to think outside their disciplines and collaborate across campuses and even continents. And supercomputers are increasingly crucial for streamlining and linking their research.
An education and career spanning four very different countries gives Musaev a worldly view of science. Photo by Bryan Meltz.

Musaev recalls the first computer he used, in 1976, while he was a college student in Baku. “It was a Soviet model, as big as a house, and really noisy,” he says. “It was slow and not very reliable, but we were still happy to use it, because we could get information that people couldn’t get from just doing experiments.”

Today’s supercomputers have been reduced from the size of houses to that of large refrigerators, while their processing speeds and memory capacities have expanded exponentially. “Now we can use computers not just to explain experimental findings, but to actually design the experiments,” Musaev says.

Dozens of scholars, from Emory and different points of the globe, draw on the four large computer clusters, software library and expertise of the Emerson Center to conduct collaborative research projects. The center has played a key role in the development of two major research programs on campus: The Emory Bio-inspired Renewable Energy Center, which is seeking methods of developing sustainable fuels, and an NSF Center for Chemical Innovation, focused on stereoselective C-H functionalization, which aims to simplify drug synthesis.

“The Emerson Center brings people together,” Musaev says. “We don’t just provide the facilities and the expertise, we actually help do the science.”

Musaev travels a great deal in his role as center director, but he and his family have happily put down roots in Atlanta, he says. His daughter, Iten, graduated from Emory in 2008 and now attends law school at the University of Georgia. “Azerbaijan will always have a special place in my heart, but I’m very grateful and in love with my new country, the United States of America,” Musaev says.

Related:
Water oxidation advance aims at solar fuel
NSF Center aims to simplify drug synthesis

Tuesday, June 28, 2011

Why robots should care about their looks

As one movie reviewer points out, the Autobots and Decepticons in Transformers: Dark of the Moon “drool, bleed, have whiskers and even go bald with age.”

But what's going on beneath the surface? As real-life robots become increasingly sophisticated, how will we decide if they have enough of a sense of themselves to deserve certain levels of rights?

“It will be an interesting question, and it won’t be just an intellectual one,” says bioethicist Paul Root Wolpe, director of the Emory Center for Ethics. “A whole series of experiments show that if you create a robot that moves, but just looks like a whole bunch of gears, and you give someone a sledgehammer and say, ‘Smash it,’ they’ll smash it. If you put a little furry cover on it, so now it moves but it looks organic, they won’t hit it.”

The more humanoid or animal-like a robot appears, the more reluctant people are to harm it.

“We already have robots that in one sense or another are being treated more like animals,” Wolpe says. “As soon as you begin to give robots the appearance of life, people begin to project onto it the feelings that they project onto life.”


Related:
Dining with machines that feel
The real origins of the X-Men

Wednesday, June 22, 2011

AIDS: From a new disease to a leading killer



In the span of just three decades, AIDS went from a new and rare disease to the fourth leading cause of death in the world. On the 30th anniversary of the epidemic, it’s worth taking stock of the past, present and future of a disease that seems here to stay.

The complexity of the virus that causes AIDS pales in comparison to the social and cultural complexities still surrounding it, says James Curran, dean of Emory’s Rollins School of Public Health. “It makes it difficult to understand why everyone isn’t tested, why everyone isn’t treated, why everyone doesn’t have access to health care,” he says.

“In every society it’s difficult to talk about sex, and in every community it’s difficult to talk about sex,” he adds. “It’s not so difficult in many churches, because they just don’t do it.”

Many HIV-infected people, both alive and dead, have made a huge difference in society’s perception of the disease, Curran says. “They’ve become role models for others in how to live their lives openly with a fatal infection that scares the hell out of people. That includes people like Larry Kramer, Magic Johnson, Ryan White, and the supporters of people who are HIV-infected like Elton John, Michael Jackson and Elizabeth Taylor.”

Click here to see the distribution of HIV/AIDS cases in the United States today.

The search for a vaccine and better treatments continues. Françoise Barré-Sinoussi, co-winner of the 2008 Nobel Prize in Physiology or Medicine for her discovery of the virus responsible for AIDS, recently spoke at Emory on some of the most promising areas of the ongoing research. You can watch her talk in the video below. Barré-Sinoussi is director of the retroviral infections control unit at the Pasteur Institute.


Related:
Ryan White: A leader forged by AIDS
HIV/AIDS at 30

Thursday, June 16, 2011

Extra anchovies and enlightenment


Is it possible to achieve such an enlightened mental state that you feel one with everything? These days, it’s a question explored by both neuroscientists and religious leaders.

In the above video, Australian TV presenter Karl Stefanovic interviews the Dalai Lama, and puts a new spin on the age-old question. What starts as an awkward moment is transformed by the Dalai Lama’s laughter.

As a distinguished professor at Emory, the Dalai Lama is striving to blend knowledge about the human mind derived from Tibetan contemplative practices with the best of Western science. Perhaps more importantly, he's able to laugh at himself, and he spreads good cheer wherever he goes.

Kung Fu Cat? The nature of martial arts



Ever noticed how a startled cat reacts? It crouches low to the ground. The cat becomes a coiled spring, ready to move in any direction.

Kyle Albers, who graduated from Emory in May with a degree in environmental studies, notices these things. He holds a second-degree black belt in Taekwondo, studies To Shin Do, and is a student of Japanese swordsmanship.

For a senior project, Albers investigated the connections between martial arts and the natural world, and he has created a blog on the topic.

While a startled cat flexes into a crouch, or “grounds out,” he notes that humans in this situation tend to tense up and lock their knees, limiting their ability to react. The behaviors of cats and other animals are just one example of what nature can teach a martial arts practitioner.

“The adaptive cycle of a martial artist parallels how an ecosystem adapts and evolves to changes,” Albers says. “Every attack that is successfully blocked opens paths for new counters that can be successful offensive techniques.”

Thursday, June 9, 2011

Teen brain data predicts pop song success

By Carol Clark

An Emory University study suggests that the brain activity of teens, recorded while they are listening to new songs, may help predict the popularity of the songs.

“We have scientifically demonstrated that you can, to some extent, use neuroimaging in a group of people to predict cultural popularity,” says Gregory Berns, a neuroeconomist and director of Emory’s Center for Neuropolicy.

The Journal of Consumer Psychology is publishing the results of the study, conducted by Berns and Sara Moore, an economics research specialist in his lab.


Photo credit: iStockphoto.com.


In 2006, Berns’ lab selected 120 songs from MySpace pages, all of them by relatively unknown musicians without recording contracts. Twenty-seven research subjects, aged 12 to 17, listened to the songs while their neural reactions were recorded through functional magnetic resonance imaging (fMRI). The subjects were also asked to rate each song on a scale of one to five.

The data was originally collected to study how peer pressure affects teenagers’ opinions. The experiment used relatively unknown songs to try to ensure that the teens were hearing them for the first time.

Three years later, while watching “American Idol” with his two young daughters, Berns realized that one of those obscure songs had become a hit, when contestant Kris Allen started singing “Apologize” by OneRepublic.

“I said, ‘Hey, we used that song in our study,’” Berns recalls. “It occurred to me that we had this unique data set of the brain responses of kids who listened to songs before they got popular. I started to wonder if we could have predicted that hit.”

A comparative analysis revealed that the neural data had a statistically significant prediction rate for the popularity of the songs, as measured by their sales figures from 2007 to 2010.

“It’s not quite a hit predictor,” Berns cautions, “but we did find a significant correlation between the brain responses in this group of adolescents and the number of songs that were ultimately sold.”

Previous studies have shown that a response in the brain’s reward centers, especially the orbitofrontal cortex and ventral striatum, can predict people’s individual choices – but only in those people actually receiving brain scans.

The Emory study enters new territory. The results suggest it may be possible to use brain responses from a group of people to predict cultural phenomena across a population – even among people who are not actually scanned.

The “accidental discovery,” as Berns describes it, has limitations. The study included only 27 subjects, and they were all teenagers, who make up only about 20 percent of music buyers.

The majority of the songs used in the study were flops, with negligible sales. And only three of the songs went on to meet the industry criteria for a certified hit: More than 500,000 unit sales, including albums that had the song as a track and digital downloads.

“When we plotted the data on a graph, we found a ‘sweet spot’ for sales of 20,000 units,” Berns says. The brain responses could predict about one-third of the songs that would go on to sell more than 20,000 units.
Brain regions positively correlated with the average likability of the song: cuneus, orbitofrontal cortex and ventral striatum.

The data was even clearer for the flops: About 90 percent of the songs that drew a mostly weak response from the neural reward center of the teens went on to sell fewer than 20,000 units.
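
To make the analysis concrete, here is a minimal sketch in Python of how one might correlate a group's average neural response per song with later sales, then apply the 20,000-unit threshold described above. All of the numbers are simulated stand-ins, and the variable names and cutoffs are illustrative assumptions, not the study's actual data or code.

```python
# A minimal sketch of the kind of analysis described above, using
# simulated stand-in numbers (the study's actual data are not shown here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_songs = 120
# Hypothetical per-song averages of reward-center activation
# (e.g., ventral striatum) across the group of teenage subjects.
activation = rng.normal(0.0, 1.0, n_songs)
# Simulated unit sales, loosely tied to activation plus noise,
# so most songs end up as flops -- mirroring the skewed real market.
sales = np.round(np.exp(8.0 + 1.2 * activation + rng.normal(0.0, 1.0, n_songs)))

# 1) Correlate group brain response with eventual sales
#    (log-transformed, since sales are heavily skewed).
r, p = stats.pearsonr(activation, np.log(sales))
print(f"correlation r={r:.2f}, p={p:.4g}")

# 2) Threshold analysis around the 20,000-unit "sweet spot":
#    does a strong group response flag eventual 20,000+ sellers,
#    and does a weak response flag flops?
strong = activation > np.percentile(activation, 75)
hit = sales > 20_000
print("hits flagged by a strong response:", int((strong & hit).sum()), "of", int(hit.sum()))
print("flops among weak responses:", int((~strong & ~hit).sum()), "of", int((~strong).sum()))
```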

Another interesting twist: When the research subjects were asked to rate the songs on a scale of one to five, their answers did not correlate with future sales of the songs.

That result may be due to the complicated cognitive process involved in rating something, Berns theorizes. “You have to stop and think, and your thoughts may be colored by whatever biases you have, and how you feel about revealing your preferences to a researcher.”

On the other hand, “you really can’t fake the brain responses while you’re listening to the song,” he says. “That taps into a raw reaction.”

The pop music experiment is merely “a baby step,” Berns says. As a leader in the nascent field of neuroeconomics, he is interested in larger questions of how our understanding of the brain can explain human decision-making. Among his current projects is a study of sacred values, and their potential for triggering violent conflict.

“My long-term goal is to understand cultural phenomena and trends,” Berns says. “I want to know where ideas come from, and why some of them become popular and others don’t. It’s ideas and the way that we think that determines the course of human history. Ultimately, I’m trying to predict history.”

Related:
Surprising find on brains of risky teens
An inside look at outrage

Tuesday, June 7, 2011

Dawn of agriculture took toll on health

"Early agriculturists had a harder time adapting to stress," says anthropologist Amanda Mummert. 

By Carol Clark

When populations around the globe started turning to agriculture around 10,000 years ago, regardless of their locations and type of crops, a similar trend occurred: The height and health of the people declined.

Amanda Mummert
“This broad and consistent pattern holds up when you look at standardized studies of whole skeletons in populations,” says Amanda Mummert, an Emory graduate student in anthropology.

Mummert led the first comprehensive, global review of the literature regarding stature and health during the agricultural transition, to be published by the journal Economics and Human Biology.

“Many people have this image of the rise of agriculture and the dawn of modern civilization, and they just assume that a more stable food source makes you healthier,” Mummert says. “But early agriculturalists experienced nutritional deficiencies and had a harder time adapting to stress, probably because they became dependent on particular food crops, rather than having a more significantly diverse diet.”

She adds that growth in population density spurred by agricultural settlements led to an increase in infectious diseases, likely exacerbated by problems of sanitation and the proximity to domesticated animals and other novel disease vectors.

Eventually, the trend toward shorter stature reversed, and average heights for most populations began increasing. The trend is especially notable in the developed world during the past 75 years, following the industrialization of food systems.

“Culturally, we’re agricultural chauvinists. We tend to think that producing food is always beneficial, but the picture is much more complex than that,” says Emory anthropologist George Armelagos, co-author of the review. “Humans paid a heavy biological cost for agriculture, especially when it came to the variety of nutrients. Even now, about 60 percent of our calories come from corn, rice and wheat.”

"Humans paid a heavy biological cost for agriculture," says anthropologist George Armelagos.

In 1984, Armelagos and M. N. Cohen wrote a groundbreaking book, “Paleopathology at the Origins of Agriculture,” which drew from more than 20 studies to describe declining health and a rise in nutritional diseases as societies shifted from foraging to agriculture.

The book was controversial at the time, but the link between the agricultural transition and declining health soon became widely accepted in what was then the emerging field of bioarcheology.

The current review was undertaken to compare data from more recent studies involving different world regions, crops and cultures. The studies included populations from areas of China, Southeast Asia, North and South America and Europe. All of the papers used standardized methods for assessing health at the individual level and examined how stressors were exhibited within the entire skeleton, rather than concentrating on a particular skeletal element or condition.

“Unless you’re considering a complete skeleton, you’re not getting a full picture of health,” Mummert says. “You could have an individual with perfect teeth, for example, but serious markers of infection elsewhere. You could see pitting on the skull, likely related to anemia or nutritional stress, but no marks at all on the long bones.”

Adult height, dental cavities and abscesses, bone density and healed fractures are some of the markers used to try to paint a more complete picture of an individual’s health.

“Bones are constantly remodeling themselves,” Mummert says. “Skeletons don’t necessarily tell you what people died of, but they can almost always give you a glimpse into their ability to adapt and survive.”

While the review further supports the link between early agricultural practices and declining stature and health, it’s important to keep re-evaluating the data as more studies are completed, Mummert says.

"Even now, about 60 percent of our calories come from corn, rice and wheat," Armelagos says.

One confounding factor is that agriculture was not adopted in an identical fashion and time span across the globe. In some ancient societies, such as those of the North American coasts, crops may have merely supplemented a seafood diet. “In these cases, a more sedentary lifestyle, and not necessarily agriculture, could have perpetuated decreased stature,” Mummert says.

The way the human body adapted to changes we made in the environment 10,000 years ago could help us understand how our bodies are adapting now, she says.

Some economists and other scientists are using the rapid physiological increases in human stature during the 20th century as a key indicator of better health.

“I think it’s important to consider what exactly ‘good health’ means,” Mummert says. “The modernization and commercialization of food may be helping us by providing more calories, but those calories may not be good for us. You need calories to grow bones long, but you need rich nutrients to grow bones strong.”

Related:
Mummies tell history of 'modern' plague
Brain vs. gut: Our inborn food fight
Putting teeth into the Barker hypothesis

Friday, June 3, 2011

Ryan White: A leader forged by AIDS


This June is the 30th anniversary of the first report of the disease now known as AIDS. About 30 million people and counting have died of the disease that has pushed the boundaries of science, social norms and patients’ rights.

Are you old enough to remember Ryan White? He was one of the most poignant faces among the statistics in the early days of the epidemic. White died in 1990 at the age of 19 of complications from AIDS.

White was a hemophiliac who became infected with HIV through a blood transfusion. At that time, AIDS was poorly understood and the victims were largely shunned by policy makers, who considered it a gay man’s issue.

“White wanted only to be a teen-ager like everybody else, not to be a saint or a hero,” says James Curran, dean of Emory’s Rollins School of Public Health.

Curran was another key player during the early days of AIDS. When the first cases were reported, Curran was head of the Centers for Disease Control and Prevention task force that investigated what caused AIDS, and he wrote the first recommendations to limit its spread.

Doctors gave assurances that White posed no danger to his fellow students, but the youngster was pushed out of his middle school by frightened parents and administrators. Entertainers Elton John and Michael Jackson were among the few brave enough to step forward and publicly defend White.

White’s family won a lawsuit against the school’s ban, and he was eventually readmitted as a student. But the school made White eat with disposable utensils, use a separate bathroom and skip gym class. Despite his isolation, illness and young age, White remained compassionate, and became a voice for the rights of everyone suffering from the disease.

Four months after his death, Congress passed the Ryan White CARE Act, a federally funded program for people living with HIV/AIDS.

“He made a huge difference,” Curran says.

Click to check out AIDSVu, a new online tool put together by the Rollins School of Public Health to show the geographic breakdown of HIV in the United States today.

Related:
HIV/AIDS at 30
Hopes high for HIV vaccine

Wednesday, June 1, 2011

The real origins of the X-Men


“The X-Men take up a conversation that started in its modern form after World War II, in the Nuremberg Nazi trials,” says Paul Root Wolpe, director of the Emory Center for Ethics.

The judges were not just horrified by testimony about the horrific experiments done on humans by the Nazis, Wolpe says. They were also alarmed when the defense lawyers brought to light what was going on in the United States in regard to human experimentation.

The judges were so appalled that they wrote the Nuremberg Code, which laid out clear guidelines for human experimentation, such as working with animals first and gaining the full consent of the subjects.

“It took more than two decades before those kinds of standards began to be universally applied,” Wolpe says.

Marvel Comics, which created the X-Men, tackled many tough issues in its storylines, says Wolpe, who grew up with a love of comic books. “The X-Men became this fascinating discussion of majority-minority relationships, human experimentation and the coming genetic sophistication,” he says.

Related:
The ethics of X-Men
Blurring the lines between life forms