Saturday, June 30, 2012
Over the years, the telephone has gone mobile, from the house to the car to the pocket. The University of South Carolina's Xiaodong Li envisions even further integration of the cell phone – and just about every electronic gadget, for that matter – into our lives. He sees a future where electronics are part of our wardrobe. "We wear fabric every day," said Li, a professor of mechanical engineering at USC. "One day our cotton T-shirts could have more functions; for example, a flexible energy storage device that could charge your cell phone or your iPad." Li is helping make the vision a reality. He and post-doctoral associate Lihong Bao have just reported in the journal Advanced Materials how to turn the material in a cotton T-shirt into a source of electrical power. Starting with a T-shirt from a local discount store, Li's team soaked it in a solution of fluoride, dried it and baked it at high temperature. They excluded oxygen in the oven to prevent the material from charring or simply combusting. The surfaces of the resulting fibers in the fabric were shown by infrared spectroscopy to have been converted from cellulose to activated carbon. Yet the material retained flexibility; it could be folded without breaking. "We will soon see roll-up cell phones and laptop computers on the market," Li said. "But a flexible energy storage device is needed to make this possible." The once-cotton T-shirt proved to be a repository for electricity. By using small swatches of the fabric as an electrode, the researchers showed that the flexible material, which Li's team terms activated carbon textile, acts as a capacitor. Capacitors are components of nearly every electronic device on the market, and they have the ability to store electrical charge. Moreover, Li reports that activated carbon textile acts like a double-layer capacitor, which is also called a supercapacitor because it can have a particularly high energy storage density.
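The capacitance claim above can be made concrete with the textbook energy relation for capacitors. A minimal sketch, using illustrative capacitance and voltage values that are placeholders rather than figures from Li's paper:

```python
# Energy stored in a capacitor: E = 1/2 * C * V^2.
# The component values below are hypothetical, chosen only to show why
# "super" capacitors (whole farads) dwarf conventional ones (microfarads).

def stored_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    """Return the energy (in joules) stored in a capacitor."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# A conventional 100 uF electrolytic capacitor vs. a 10 F supercapacitor,
# both charged to the same 2.5 V:
conventional = stored_energy_joules(100e-6, 2.5)  # 3.125e-4 J
supercap = stored_energy_joules(10.0, 2.5)        # 31.25 J

print("conventional:", conventional, "J")
print("supercapacitor:", supercap, "J")
```

At equal voltage the stored energy scales linearly with capacitance, which is why double-layer capacitors can reach much higher energy storage densities than ordinary capacitors.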
A drug developed to treat certain strains of cystic fibrosis may be useful in the treatment of chronic obstructive pulmonary disease, one of the most common lung diseases, which is seen frequently in smokers and has no cure. The findings were published Friday, June 29, 2012, in the online journal PLoS ONE by researchers at the University of Alabama at Birmingham. Smokers with chronic obstructive pulmonary disease have problems with the accumulation of mucus in their lungs, which leads to chronic cough and sputum production and increases the risk of hospitalization and death, according to Mark Dransfield, M.D., medical director of the UAB Lung Health Center and co-author on the article. These symptoms are similar to those seen in patients with cystic fibrosis who are born without a protein needed to keep their airway hydrated and free of mucus, called the “cystic fibrosis transmembrane regulator” protein. “We found that cigarette smoking reduces the activity of CFTR both in the lab and in human volunteers who smoke and have COPD,” Dransfield says. The researchers showed that a new drug recently approved by the FDA for patients with certain types of CF, called ivacaftor (known by its brand name Kalydeco, VX-770), activated the protein and normalized airway hydration and mucus clearance. “This sets the stage to test the drug in clinical trials in patients with COPD to see if they would benefit.” “Our studies demonstrate that cigarette smoking causes an acquired abnormality of mucus clearance that is mediated through CFTR, and that this pathway can be pharmacologically reversed in the laboratory by drugs that activate CFTR,” says Steve Rowe, M.D., assistant professor in the Division of Pulmonary, Allergy and Critical Care Medicine and lead author on the paper. “This opens the potential of a new therapeutic paradigm to treat individuals with COPD, and deserves testing in well-controlled clinical trials using agents known to augment CFTR function.”
Johns Hopkins researchers have discovered that a single protein molecule may hold the key to turning cardiac stem cells into blood vessels or muscle tissue, a finding that may lead to better ways to treat heart attack patients. Human heart tissue does not heal well after a heart attack, instead forming debilitating scars. For reasons not completely understood, however, stem cells can assist in this repair process by turning into the cells that make up healthy heart tissue, including heart muscle and blood vessels. Recently, doctors elsewhere have reported promising early results in the use of cardiac stem cells to curb the formation of unhealthy scar tissue after a heart attack. But the discovery of a “master molecule” that guides the destiny of these stem cells could result in even more effective treatments for heart patients, the Johns Hopkins researchers say. In a study published in the June 5 online edition of the journal Science Signaling, the team reported that tinkering with a protein molecule called p190RhoGAP shaped the development of cardiac stem cells, prodding them to become the building blocks for either blood vessels or heart muscle. The team members said that by altering levels of this protein, they were able to affect the future of these stem cells. “In biology, finding a central regulator like this is like finding a pot of gold,” said Andre Levchenko, a biomedical engineering professor and member of the Johns Hopkins Institute for Cell Engineering, who supervised the research effort. The lead author of the journal article, Kshitiz, a postdoctoral fellow who uses only his first name, said, “Our findings greatly enhance our understanding of stem cell biology and suggest innovative new ways to control the behavior of cardiac stem cells before and after they are transplanted into a patient. 
This discovery could significantly change the way stem cell therapy is administered in heart patients.” Earlier this year, a medical team at Cedars-Sinai Medical Center in Los Angeles reported initial success in reducing scar tissue in heart attack patients after harvesting some of the patient’s own cardiac stem cells, growing more of these cells in a lab and transfusing them back into the patient. Using the stem cells from the patient’s own heart prevented the rejection problems that often occur when tissue is transplanted from another person. Levchenko’s team wanted to figure out what, at the molecular level, causes the stem cells to change into helpful heart tissue. If they could solve this mystery, the researchers hoped the cardiac stem cell technique used by the Los Angeles doctors could be altered to yield even better results. During their research, the Johns Hopkins team members wondered whether changing the surface where the harvested stem cells grew would affect the cells’ development. The researchers were surprised to find that growing the cells on a surface whose rigidity resembled that of heart tissue caused the stem cells to grow faster and to form blood vessels. A cell population boom occurred far less often in the stem cells grown in the glass or plastic dishes typically used in biology labs. This result also suggested why formation of cardiac scar tissue, a structure with very different rigidity, can inhibit stem cells naturally residing there from regenerating the heart. Looking further into this stem cell differentiation, the Johns Hopkins researchers found that the increased cell growth occurred when there was a decrease in the presence of the protein p190RhoGAP. “It was the kind of master regulator of this process,” Levchenko said. “And an even bigger surprise was that if we directly forced this molecule to disappear, we no longer needed the special heart-matched surfaces. 
When the master regulator was missing, the stem cells started to form blood vessels, even on glass.” A final surprise occurred when the team decided to increase the presence of p190RhoGAP, instead of making it disappear. “The stem cells started to turn into cardiac muscle tissue, instead of blood vessels,” Levchenko said. “This told us that this amazing molecule was the master regulator not only of the blood vessel development, but that it also determined whether cardiac muscles and blood vessels would develop from the same cells, even though these types of tissue are quite different.” But would these lab discoveries make a difference in the treatment of living beings? To find out, the researchers, working on the heart-matching surfaces they had designed, limited the production of p190RhoGAP within the heart cells. The cells that possessed less of this protein integrated more smoothly into an animal’s blood vessel networks in the aftermath of a heart attack. In addition, more of these transplanted heart cells survived, compared to what had occurred in earlier cell-growing procedures.
Researchers create “Huntington’s disease in a dish” to enable search for treatment Newswise — Johns Hopkins researchers, working with an international consortium, say they have generated stem cells from the skin cells of a person with a severe, early-onset form of Huntington’s disease (HD), and turned them into neurons that degenerate just like those affected by the fatal inherited disorder. By creating “HD in a dish,” the researchers say they have taken a major step forward in efforts to better understand what disables and kills the cells in people with HD, and to test the effects of potential drug therapies on cells that are otherwise locked deep in the brain.
Researchers at the Buck Institute have corrected the genetic mutation responsible for Huntington’s disease (HD) using a human induced pluripotent stem cell (iPSC) that came from a patient suffering from the incurable, inherited neurodegenerative disorder. Scientists took the diseased iPSCs, made the genetic correction, generated neural stem cells and then transplanted the mutation-free cells into a mouse model of HD, where they are generating normal neurons in the area of the brain affected by HD. Results of the research are published in the June 28, 2012 online edition of the journal Cell Stem Cell. iPSCs are reverse-engineered from human cells, such as skin cells, back to a state where they can be coaxed into becoming any type of cell. They can be used to model numerous human diseases and may also serve as sources of transplantable cells that can be used in novel cell therapies. In the latter case, the patient provides a sample of his or her own skin to the laboratory. “We believe the ability to make patient-specific, genetically corrected iPSCs from HD patients is a critical step for the eventual use of these cells in cell replacement therapy,” said Buck faculty Lisa Ellerby, PhD, lead author of the study. “The genetic correction reversed the signs of disease in these cells – the neural stem cells were no longer susceptible to cell death and the function of their mitochondria was normal.” Ellerby said the corrected cells could populate the area of the mouse brain affected in HD; the next stage of research therefore involves transplantation of corrected cells to see if the HD-afflicted mice show improved function. Ellerby said these studies are important because researchers can now deliver patient-specific cells for cell therapy that no longer carry the disease-causing mutation. Huntington's disease (HD) is a devastating, neurodegenerative genetic disorder that affects muscle coordination and leads to cognitive decline and psychiatric problems.
It typically becomes noticeable in mid-adult life, with symptoms beginning between 35 and 44 years of age. Life expectancy following onset of visible symptoms is about 20 years. The worldwide prevalence of HD is 5-10 cases per 100,000 persons. More than a quarter of a million Americans have HD or are "at risk" of inheriting the disease from an affected parent. Key to the disease process is the formation of specific protein aggregates (essentially abnormal clumps) inside some neurons. All humans have two copies of the Huntingtin gene (HTT), which codes for the protein Huntingtin (Htt). Part of this gene is a repeated section called a trinucleotide repeat, which varies in length between individuals and may change between generations. When the length of this repeated section reaches a certain threshold, it produces an altered form of the protein, called mutant Huntingtin protein (mHtt). Scientists in the Ellerby lab corrected the mutation by replacing the expanded trinucleotide repeat with a normal repeat using homologous recombination. Homologous recombination is a type of genetic recombination in which nucleotide sequences are exchanged between two similar molecules of DNA. In this case, the diseased DNA sequence is exchanged for the normal DNA sequence.
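The repeat-length threshold described above can be sketched as a simple classifier. The CAG-count cutoffs used here (<36 normal, 36-39 reduced penetrance, 40 and above fully penetrant) are commonly cited figures for HD, not values taken from the Cell Stem Cell paper itself:

```python
# Illustrative classifier for HTT trinucleotide (CAG) repeat length.
# Cutoffs are commonly cited HD thresholds, included here only as an
# assumption to make the threshold idea concrete.

def classify_htt_repeat(cag_count: int) -> str:
    """Classify an HTT CAG-repeat count against commonly cited thresholds."""
    if cag_count < 36:
        return "normal"
    elif cag_count <= 39:
        return "reduced penetrance"
    return "fully penetrant (mutant Htt)"

for n in (17, 38, 45):
    print(n, "->", classify_htt_repeat(n))
```

In this picture, the Ellerby lab's homologous-recombination correction amounts to moving a patient's expanded repeat count back below the disease threshold.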
Newswise — A recent study led by Andrew Puca, Ph.D., under the supervision and direction of Antonio Giordano, M.D., Ph.D., set out to illustrate novel mechanotransduction properties of hematopoietic stem cells, examining how microgravity facilitates paracrine/autocrine signalling and thereby shapes the cells' expression of humoral factors.
Wednesday, June 27, 2012
Newswise — Justin M. Brown, MD, reconstructive neurosurgeon at UC San Diego Health System, is one of only a few specialists in the world who have pioneered a novel technique to restore hand function in patients with spinal cord injury. In a delicate four-hour procedure, Brown splices together tiny nerve endings, only one millimeter in width, to help restore hand mobility. Most patients return home 24 hours after surgery. “Even if a patient appears to have lost total hand function, as long as there is some nerve in the arm or shoulder under the patient’s control, some mobility may be regained,” said Brown, director of the Neurosurgery Peripheral Nerve Program and co-director of the Center for Neurophysiology and Restorative Neurology at UC San Diego Health System. “With a nerve transfer, the goal is to reverse paralysis. This means achieving functional grasp and release so that patients can eat independently, operate a computer or hold a loved one’s hand.”
UC San Diego doctors refute studies condoning moderate alcohol consumption during pregnancy Newswise — Experts at the University of California, San Diego School of Medicine disagree with a series of new studies from Denmark that suggest consumption of up to 8 alcoholic drinks a week or occasional binge drinking during pregnancy is generally safe for the developing baby. Kenneth Lyons Jones, MD, professor in the UCSD Department of Pediatrics and a renowned expert in birth defects, and Christina Chambers, MPH, PhD, director of the California Teratogen Information Service (CTIS) Pregnancy Health Information Line, say these studies are misleading to pregnant women, citing more than 30 years of research to the contrary.
An extensive review of pregnancies over the course of more than three decades shows that women with poorly managed asthma are at an increased risk of having a low-birth-weight baby, a premature baby and other pregnancy complications, such as preeclampsia. The new study was recently published in the British Journal of Obstetrics and Gynaecology. Christina Chambers, a professor of pediatrics at the University of California, San Diego, was part of the team of seven researchers who reviewed data involving more than 1 million women between 1975 and 2009.
Baby born with six separate heart defects whose parents were told he would not survive stuns doctors with recovery
A toddler who is thought to be the only child in the world born with SIX separate heart defects has stunned doctors by making an incredible recovery - after they warned his parents he would not survive. Medics told Michelle and Darryl Lewis that their baby had an unheard-of number of heart defects - and suggested they terminate the pregnancy at a 21-week scan. But the devastated couple refused - despite doctors’ fears that the baby would not survive the birth. However, despite a difficult birth and surgery at just eight hours old, followed by four more operations, young Riley recovered. He made such good progress that he was even well enough to be a pageboy at his parents’ wedding just seven months later. Michelle, 32, from Tunbridge Wells, Kent, said: 'The surgeons told us that it is nothing short of a miracle that Riley is here at all. 'Doctors told us having one heart defect was unlucky, but six was unheard of. They said he probably wouldn’t make it through the pregnancy, and even if he did, wouldn’t survive long after birth. 'We were devastated - but Darryl and I just turned to each other and we knew straight away there was no way either of us was going to give up on our baby. 'The rest of my pregnancy was terrifying - I lived in fear of something going wrong. 'When I went into labour, Riley’s heart stopped every time I had a contraction - we nearly lost him so many times. But he’s a tough little boy and he’s amazed all of us - he’s so strong, and we are so proud of him.' The couple had no idea there were any complications with Michelle’s pregnancy until their 21-week scan - when nurses struggled to hear the heartbeat clearly. Michelle was booked in for a specialist scan a week later at the Evelina Children’s Hospital, London, where the stunned couple were told the extent of Riley’s condition.
They were warned their unborn child was suffering from back-to-front valves, a large hole in the heart, as well as various conditions that made his arteries thick and made it difficult for blood to pump around the body. Michelle said: 'The nurse’s face just dropped, and she went out of the room to get a doctor. 'I knew something was wrong and I just burst into tears before they had even told us what was wrong. It was the worst feeling I’ve ever experienced in my life. The doctors said our baby’s heart was struggling to pump blood around his body, and he had so many complications he wouldn’t live. 'They said they’d never seen anything like it - having so many conditions was totally unheard of. 'They took us into a little room and advised us to terminate the pregnancy - but there was no way we would even consider it. 'They agreed to monitor me closely throughout the rest of the pregnancy, but it was terrifying. I was always thinking the worst was going to happen, but Darryl kept me positive and was my rock. 'I was induced and went into labour on Christmas Eve, and it lasted 15 terrifying hours before doctors decided to perform an emergency caesarean on Christmas Day 2010. 'I heard this little scream and the nurses held him up for me to see before he was whisked away. I remember seeing his big beautiful eyes and just pleading that he would keep fighting.'
Scientists have successfully reversed diabetes in mice using stem cells, paving the way for a breakthrough treatment for the illness. The research is the first to show that human stem cell transplants can successfully restore insulin production and reverse diabetes in mice. Crucially, the team re-created the 'feedback loop' that enables insulin levels to automatically rise or fall based on blood glucose levels. Diabetes affects more than two million people in Britain. After the stem cell transplant, the diabetic mice were weaned off insulin, a procedure designed to mimic human clinical conditions. Three to four months later, the mice were able to maintain healthy blood sugar levels even when being fed large quantities of sugar. Transplanted cells removed from the mice after several months had all the markings of normal insulin-producing pancreatic cells.
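The "feedback loop" described above can be illustrated with a toy negative-feedback simulation: insulin secretion rises when glucose is above a set point, and insulin in turn drives glucose uptake. All constants here are invented for illustration and are not physiological measurements or values from the study:

```python
# Toy glucose-insulin negative-feedback loop. Constants are illustrative
# assumptions, not physiological data.

SET_POINT = 5.0        # target blood glucose, mmol/L (illustrative)
SECRETION_GAIN = 0.5   # insulin released per unit of excess glucose
CLEARANCE = 0.3        # glucose cleared per unit of insulin per step

def simulate(glucose: float, steps: int = 50) -> float:
    """Return blood glucose after `steps` rounds of feedback."""
    for _ in range(steps):
        excess = max(glucose - SET_POINT, 0.0)
        insulin = SECRETION_GAIN * excess   # insulin rises with glucose
        glucose -= CLEARANCE * insulin      # tissues take up glucose
    return glucose

# After a simulated sugar load, glucose relaxes back toward the set point:
print(simulate(12.0))
```

The point of the sketch is simply that a working loop pulls glucose back toward the set point after a sugar load, which is what the transplanted cells restored in the diabetic mice.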
Two mugs of coffee a day could help keep the heart healthy. A study has linked the drink with a lower risk of heart failure. With up to 40 per cent of those affected dying within a year of diagnosis, heart failure has a worse survival rate than many cancers. The latest research suggests that regularly drinking moderate amounts of coffee can cut the odds of cardiac trouble – though too much could be counter-productive. Crunching together the results of five previous studies, involving almost 150,000 men and women, showed that those who enjoyed one or two mugs of coffee a day were 11 per cent less likely to develop heart failure than those who had none. Heart attack survivors gained as much benefit as those with healthy hearts.
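Crunching together results from several studies is typically done by inverse-variance pooling on the log relative-risk scale. A minimal fixed-effect sketch, using made-up per-study relative risks and standard errors rather than the actual data behind the 11 per cent figure:

```python
# Minimal fixed-effect, inverse-variance meta-analysis on the log-RR scale.
# The study-level numbers below are hypothetical placeholders, not the
# five studies' real results.
import math

def pool_relative_risks(rrs, ses):
    """Pool relative risks with inverse-variance weights on log(RR)."""
    weights = [1.0 / se ** 2 for se in ses]        # precision of each study
    log_rrs = [math.log(rr) for rr in rrs]
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    return math.exp(pooled_log)

# Hypothetical relative risks of heart failure for one-to-two-mug drinkers
# vs. non-drinkers, with standard errors of the log RRs:
rrs = [0.85, 0.92, 0.88, 0.95, 0.86]
ses = [0.05, 0.08, 0.06, 0.10, 0.07]
print("pooled RR:", round(pool_relative_risks(rrs, ses), 2))
```

Larger studies (smaller standard errors) get more weight, which is how a pooled estimate across roughly 150,000 participants can be more precise than any single study.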
The world's first genetically modified humans have been created, it was revealed last night. The disclosure that 30 healthy babies were born after a series of experiments in the United States provoked another furious debate about ethics. So far, two of the babies have been tested and have been found to contain genes from three 'parents'. Fifteen of the children were born in the past three years as a result of one experimental programme at the Institute for Reproductive Medicine and Science of St Barnabas in New Jersey. The babies were born to women who had problems conceiving. Extra genes from a female donor were inserted into their eggs before they were fertilised in an attempt to enable them to conceive. Genetic fingerprint tests on two one-year-old children confirm that they have inherited DNA from three adults -- two women and one man. The fact that the children have inherited the extra genes and incorporated them into their 'germline' means that they will, in turn, be able to pass them on to their own offspring. Altering the human germline - in effect tinkering with the very make-up of our species - is a technique shunned by the vast majority of the world's scientists. Geneticists fear that one day this method could be used to create new races of humans with extra, desired characteristics such as strength or high intelligence. Writing in the journal Human Reproduction, the researchers, led by fertility pioneer Professor Jacques Cohen, say that this 'is the first case of human germline genetic modification resulting in normal healthy children'. Some experts severely criticised the experiments. Lord Winston, of the Hammersmith Hospital in West London, told the BBC yesterday: 'Regarding the treatment of the infertile, there is no evidence that this technique is worth doing . . . I am very surprised that it was even carried out at this stage. It would certainly not be allowed in Britain.'
Monday, June 25, 2012
We need to eat less meat and recycle our waste to rebalance the global carbon cycle and reduce our risk of dangerous levels of climate change, according to scientists. New research from the University of Exeter shows that if today's meat-eating habits continue, the predicted rise in the global population could spell ecological disaster. But changes in our lifestyle and our farming could make space for growing crops for bioenergy and carbon storage. Though less efficient as an energy source than fossil fuels, plants capture and store carbon that would otherwise stay in the atmosphere and contribute to global warming. Burning our waste from organic materials, such as food and manure, and any bioenergy crops we can grow, while capturing the carbon contained within them, could be a powerful way to reduce atmospheric carbon dioxide. Published June 20, 2012 in the journal Energy and Environmental Science, the research suggests that in order to feed a population of 9.3 billion by 2050 we need to dramatically increase the efficiency of our farming by eating less beef, recycling waste and wasting less food. These changes could reduce the amount of land needed for farming, despite the increase in population, leaving sufficient land for some bioenergy crops. To make a really significant difference, however, we will need to bring down the average global meat consumption from 16.6 per cent to 15 per cent of average daily calorie intake -- about half that of the average western diet. The researchers argue that if we change the way we use our land, recycle waste, and dedicate enough space to growing bioenergy crops we could bring down atmospheric carbon dioxide to safe levels. Not doing this means we would lose our natural ecosystems and face increasingly dangerous levels of atmospheric carbon dioxide.
The research team generated four different future scenarios, based on dietary preferences and agricultural efficiency up to 2050: 'high-meat, low-efficiency', 'low-meat, low-efficiency', 'high-meat, high-efficiency' and 'low-meat, high-efficiency'. The different agricultural options looked at the type of livestock being produced, with beef being the least energy-efficient and pork being the most. They also looked at how intensively animals are farmed and examined options for reducing food waste and making better use of manure to make livestock farming more efficient. They used established mathematical models to forecast the effects of each scenario on atmospheric carbon dioxide. By 2050, a 'high-meat, low-efficiency' scenario would add 55 ppm of carbon dioxide to the atmosphere, whereas a 'low-meat, high-efficiency' approach with carbon dioxide removal could remove 25 ppm. A 25 ppm reduction could mean we avoid exceeding the two-degree rise in global temperatures that is now widely accepted as a safe threshold. Lead researcher Tom Powell, of the University of Exeter's Geography department, said: "Our research clearly shows that recycling more and eating less meat could provide a key to rebalancing the global carbon cycle. Meat production involves significant energy losses: only around four per cent of crops grown for livestock turn into meat. By focusing on making agriculture more efficient and encouraging people to reduce the amount of meat they eat, we could keep global temperatures within the two degrees threshold." Co-author Professor Tim Lenton of the University of Exeter said: "Bioenergy with carbon storage could play a major role in helping us reduce future levels of atmospheric carbon dioxide. However, we only stand a chance of realising that potential, both for energy and carbon capture, if we increase the efficiency of agriculture. 
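The gap between the two reported scenario endpoints can be tallied directly. Only the +55 ppm and -25 ppm figures come from the study; the ppm-to-gigatonne conversion (roughly 2.13 GtC of atmospheric carbon per ppm of CO2) is a standard approximation, not a number from the Exeter paper:

```python
# Arithmetic sketch of the two reported 2050 scenario endpoints.
# Only the two ppm figures are from the study; the GtC-per-ppm factor
# is a standard approximation introduced here as an assumption.

HIGH_MEAT_LOW_EFF_PPM = +55   # CO2 added to the atmosphere by 2050
LOW_MEAT_HIGH_EFF_PPM = -25   # CO2 removed by 2050, with carbon capture
GTC_PER_PPM = 2.13            # approx. gigatonnes of carbon per ppm CO2

swing_ppm = HIGH_MEAT_LOW_EFF_PPM - LOW_MEAT_HIGH_EFF_PPM
swing_gtc = swing_ppm * GTC_PER_PPM

print(f"spread between scenarios: {swing_ppm} ppm (~{swing_gtc:.0f} GtC)")
```

The 80 ppm spread between the best and worst cases is what makes diet and farming efficiency look like such a large lever in the researchers' models.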
With livestock production accounting for 78 per cent of agricultural land use today, this is the area where change could have a significant impact."
New discovery expands our knowledge as to when the mammalian cell detects an incoming viral attack -- and what the cell does to protect the body: The new finding may improve vaccine efficiency and could provide better treatment of recurrent infections. Researchers from Aarhus University have now located the place in the human body where the earliest virus alert signal triggers the human immune system. They have also discovered a new alarm signal, which is activated at the very first sign of a virus attack. The groundbreaking finding has just been published in the scientific journal Nature Immunology. "It may turn out that patients suffering from frequent infections actually have problems with activating the mechanism that we have now detected," says Søren Riis Paludan, professor of immunology and virology at Aarhus University, who has completed the study together with Christian Holm, postdoc at Aarhus University. Cell membrane triggers the alarm Recent research indicates that our immune system is alerted about a threatening virus infection when genomic material from the virus enters the cell. Researchers from Aarhus University have revealed a process which is triggered already before the foreign genomic material enters the cell, i.e. in the membrane surrounding the cell. "We have detected a new immune alarm signal, which helps the cells realize that they may soon get infected with a virus," says Søren Riis Paludan. Without this knowledge, the body cannot start fighting the virus, which may then spread freely and possibly result in diseases such as AIDS, hepatitis, influenza and cold sores. Alarm signals in two directions "The cellular membranes are in this situation comparable to a borderline territory in looming war -- and this is the place to put an outpost," says Christian Holm. The 'outpost' will send alarm signals in two directions when danger is detected. 
One signal (outbound) will prepare the body for a possible attack, whereas the other signal (inbound) will make the cell investigate the threat. "In the present study, we have revealed that this happens -- and what this process means. In future studies, we will investigate how this happens," the researchers say. They add that this new knowledge could also lead to development of more efficient vaccines. The researchers from Aarhus University have collaborated on the project with colleagues from Yale University School of Medicine and the University of Massachusetts. Article: "Virus-cell fusion as a trigger of innate immunity dependent on the adaptor STING"
A fungicide used on farm crops can induce insulin resistance, a new tissue-culture study finds, providing another piece of evidence linking environmental pollutants to diabetes. The results were presented June 23 at The Endocrine Society's 94th Annual Meeting in Houston. "For the first time, we've ascribed a molecular mechanism by which an environmental pollutant can induce insulin resistance, lending credence to the hypothesis that some synthetic chemicals might be contributors to the diabetes epidemic," said investigator Robert Sargis, M.D., Ph.D., instructor in the endocrinology division at the University of Chicago. The chemical, tolylfluanid, is used on farm crops in several countries outside of the United States to prevent fungal infestation, and sometimes is used in paint on ships to prevent organisms from sticking to their hulls. Animal studies have indicated that the chemical may adversely affect the thyroid gland, as well as other organs, and that it may increase the risk of cancer in humans. Within the last decade, research attention has increasingly focused on the link between environmental contaminants and the rising rates of obesity and diabetes throughout many parts of the world. In the United States alone, nearly 26 million adults and children have some form of diabetes, according to the American Diabetes Association. A serious disease by itself, diabetes also increases the risk of other medical complications, including heart and blood-vessel diseases. Normally, the pancreas secretes the hormone insulin, which acts to regulate blood-sugar levels. Among diabetic patients, insulin secretion either decreases or stops altogether, or cells become resistant to the hormone's activity. These conditions then disrupt the process that transports sugar, or glucose, from the blood to the body's other cells, which can lead to the dangerously high blood-sugar levels associated with diabetes. 
In this project, Sargis and his co-investigators used mouse fat to examine the effects of tolylfluanid on insulin resistance at the cellular level. They found that exposure to tolylfluanid induced insulin resistance in fat cells, which play a critical role in regulating the body's blood glucose and fat levels. When exposed to tolylfluanid in culture, the ability of insulin to trigger action inside the fat cell, or adipocyte, was reduced, which is an early indication of diabetes. "The fungicide and antifouling agent tolylfluanid may pose a threat to public health through the induction of adipocytic insulin resistance, an early step in the pathogenesis of type 2 diabetes," Sargis said. "Based on these studies, further efforts should be undertaken to clarify human exposure to tolylfluanid and the possible metabolic consequences of that exposure." At the same time, tolylfluanid-exposed cells stored more fat, or lipids, acting similarly to a steroid called corticosterone. Like this steroid, tolylfluanid bound receptors in fat cells, called glucocorticoid receptors, which help regulate blood-sugar levels, as well as many other important body processes. "For the public, this raises the specter of environmental pollutants as potential contributors to the metabolic disease epidemic," said Sargis, adding that, "hopefully, it will put further pressure on public policy makers to reassess the contribution of environmental pollution as a contributor to human disease in order to encourage the development of strategies for reversing those effects." The National Institute of Environmental Health Sciences and the University of Chicago Diabetes Research and Training Center funded this research.
A 150-foot-high garbage dump in Colombia, South America, may have new life as a public park. Researchers at the University of Illinois have demonstrated that bacteria found in the dump can be used to neutralize the contaminants in the soil. Jerry Sims, a U of I associate professor of crop sciences and USDA-Agricultural Research Service research leader, and Andres Gomez, a graduate student from Medellín, Colombia, have been working on a landfill called "El Morro" in the Moravia Hill neighborhood of Medellín, which served as the city dump from 1972 to 1984. During that period, thousands of people came to the city from rural areas to escape diverse social problems. There was no housing or employment for them, so they made a living picking up trash from the dump and built their homes upon it. "There are some frightening pictures of this site on the Internet," said Sims. "At one point, close to 50,000 people lived there. They grew vegetables on the contaminated soil and hand-pumped drinking water out of the garbage hill." In recent years, the Colombian government decided to relocate the residents to different neighborhoods with better conditions. It then decided to see whether it was possible to clean up the area and turn it into a park. Unfortunately, the most reliable solution -- digging up the garbage and treating it -- is not economically feasible in Colombia. Another problem was that there were no records of exactly what was in the dump. "Apparently, hydrocarbon compounds were one of the main sources of contamination," said Gomez. "Phenyls, chlorinated biphenyls, and all kinds of compounds that are sometimes very difficult to clean up." Three professors from the National University of Colombia in Medellín -- Hernan Martinez, Gloria Cadavid-Restrepo and Claudia Moreno -- considered a microbial ecology approach.
They designed an experiment to determine whether bioremediation, which uses biological agents such as bacteria or plants to remove or neutralize contaminants, could be used to clean the site. Gomez, who was working on his master's thesis at the time, collaborated with them. He was charged with finding out whether there were microorganisms living in the soil that could feed on the carbon in the most challenging contaminants. This was not a trivial task. As Sims explained, "There are maybe 10,000 species of bacteria and a similar number of fungi in a gram of soil." Gomez's work was further complicated by the fact that the material in the hill was loose and porous, with air spaces and voids that resulted from dirt being thrown over layers of garbage. Because of the unusual physical structure and the contaminant levels, it was unclear whether the indigenous bacterial community would be as complex, and thus as effective at bioremediation, as those normally found in soils. Gomez analyzed bacteria at different depths in the hill, down to 30 meters. He found microbial communities with profiles typical of bacteria involved in bioremediation. The communities appeared to contain a robust set of organisms that could be expected to weather environmental insults or manipulations. Gomez then came to Sims's lab at the U of I on a grant from the American Society for Microbiology to perform stable isotope probing, a test linking diversity and function that he was not able to do in Colombia. Contaminants are labeled with a heavy isotope that serves as a tracer detectable in the end products of biodegradation. His results confirmed that the bacterial communities had, in fact, been carrying out bioremediation functions. In collaboration with Tony Yannarell, an assistant professor of microbial ecology who assisted with the microbial diversity analysis, he determined that the organisms involved changed at every depth.
Based on these results, the Colombian government decided to go ahead with the bioremediation project using the indigenous organisms. One of the professors who worked on the pilot study is looking at ways to provide the microorganisms with extra nutrients to speed up the process. Another project takes a phytoremediation approach, which uses plants to absorb heavy metals. Gomez has gone back to his first love, animal microbiology. While he was at U of I, he met animal sciences professor Bryan White and is now working on a Ph.D. studying the microflora of primates.
Steady advances in automated liquid handling at the µL to nL level are enabling more scientists around the world to reap the benefits of increased throughput, decreased costs, and more efficient use of reagents. Many of the latest techniques were discussed at the recent “European Lab Automation” conference. For example, Hugues Ryckelynck, scientific associate II in the biochemistry unit of the oncology disease area at the Novartis Institutes for BioMedical Research (NIBR), described how to quickly implement automatic liquid handlers to dispense accurately and rapidly in the nL and µL ranges. He noted that “automated liquid-handling systems represent a great opportunity to increase experimental throughput and reduce reagent costs through assay miniaturization.” However, he emphasized that setting up an automatic liquid dispenser to operate optimally in the nL to low µL range requires careful observation at the bench; otherwise, the results are likely to be highly variable and of low quality. Ryckelynck went on to describe simple operations to be performed by hand with air- and positive-displacement pipettes in order to anticipate problems such as high liquid viscosity, surface bonding, foaming, and surface tension that will be faced during the development of assays on nL and µL automatic liquid dispensers. He further stressed that “when pipetting nanovolumes, an important source of variation is also the carryover from the source and from well to well.” Ryckelynck gave practical examples of procedures developed by his group to optimize liquid handling on air/positive-displacement (pipette) and injector (pressure-based) systems that can be applied when working at submicromolar and subnanomolar concentrations of target proteins or reagents. He explained that the choice of robotic dispenser, i.e., dispensing technique (injectors vs. pipettes) and speed of dispensing, and the experimental setup in general, should be based on the biophysical properties of the compounds and solutions.
The usual programming for complex solutions would be to pipette in single-pipetting mode (take once, dispense once, change the tips). A faster and more efficient technique is serial dispensing (take once, dispense many), which saves reagents and consumables and, when optimally set up, reduces experimental variability (changing tips is like getting a new tool with its own new variability). By performing sequential dummy runs, Ryckelynck showed high variability at the beginning and at the end of serial dispenses with all the dispensing systems used. This variability can be compound- or reagent-related, a carryover effect, a result of forward pipetting, or inherent to the dispensers. Ryckelynck developed simple dispensing procedures to avoid these problems. These procedures involve: (i) working within the optimal dispensing series of the instruments and eliminating carryover by performing prime and post-dispenses, and (ii) saving time and minimizing well-to-well contamination by performing dispenses in logical series. He gave practical examples of how compound dilution protocols that avoid contamination and time loss in the nL range can be carried out quickly and easily implemented on a positive-displacement system. He explained his “reverse-half-Log” dilution technique, which provides the double advantage of an internal control on all dilutions and data best suited to log-scale presentations. In conclusion, Ryckelynck presented the flexible laboratory setup used in his unit for optimal use of nL and µL liquid handlers.
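The arithmetic behind a half-log series is simple: successive concentrations differ by a factor of √10 (about 3.162), so every second point lands on an exact power of ten, which is why the data plot cleanly on a log axis. As a minimal sketch of that concentration math only (the talk did not detail the actual “reverse-half-Log” protocol or its internal-control steps, so nothing here should be read as NIBR's procedure):

```python
import math

def half_log_series(top_conc, n_points):
    """Concentrations spaced by a factor of sqrt(10) (~3.162),
    the spacing implied by a half-log dilution series."""
    step = math.sqrt(10.0)
    return [top_conc / step ** i for i in range(n_points)]

# Example: a 5-point series starting at a 10 µM top concentration.
# Every second point is an exact power of ten: 10, ~3.16, 1, ~0.316, 0.1.
series = half_log_series(10.0, 5)
```

Because alternate points fall on powers of ten, any drift in the dispensed volumes shows up as a deviation from those round-number concentrations, which is one plausible reading of the "internal control" advantage mentioned above.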
Key features of this setup include the use of standalone instruments working on modules of experiments running in parallel: (i) bulk dispensing of microvolumes for limited experimental conditions is performed with pressure-based dispensers; (ii) filling from complex sources is performed using an air-displacement liquid handler for microvolumes and a nanoliter positive-displacement dispenser for nanovolumes; (iii) source and plate (re)organization is done using an air-displacement liquid handler; and (iv) compound dilutions and nL transfers, which limit solvent impact on the experiments, are performed with a nanoliter positive-displacement dispenser.

Cycloolefin Microplates

Rainer Heller, Ph.D., is a leader in Greiner Bio-One’s high-throughput screening (HTS) group, where he is responsible for launching many new HTS products. Greiner Bio-One boasts newly designed microplates made from cycloolefins. Owing to their excellent optical and physical properties and chemical resistance, cycloolefin microplates have become increasingly popular in research and high-throughput applications, Dr. Heller said. A variety of cycloolefin microplates are now, or will soon be, available from Greiner Bio-One for different purposes, including cell-based assays, compound storage, liquid handling, and biochemical assays. The new 1,536-well SCREENSTAR Microplate, for example, is a cycloolefin microplate designed for microscopic applications, high-content screening, and high-throughput screening. The SCREENSTAR Microplate was co-developed by Greiner Bio-One and GNF Systems and features a black-pigmented frame with a 190 µm ultra-clear film bottom for ideal compatibility with instrument optics. The well bottoms offer the highest optical transparency, with reduced autofluorescence in the lower UV range, low birefringence, and a refractive index of 1.53, the same as glass. Recessed microplate wells enable complete periphery access for high-magnification objectives.
Cell-culture treatment and sterility assure exceptional performance, he said, for high-content screening, especially with fluorescence microscopy in the lower UV range. A smooth microplate top, free of alphanumeric coding, facilitates flush lid mounting for use with ultra-high-throughput screening systems. A similarly constructed 96-well cycloolefin microplate will soon be available from Greiner Bio-One, and the company is currently developing a similar 384-well version. Another application of cycloolefin microplates is compound storage, as cycloolefins exhibit low water absorption, low impurities, high transparency, and resistance to polar solvents, particularly DMSO, the standard solvent for stored compound libraries. To make the latest technical and design innovations available for HTS, Greiner Bio-One will soon introduce a new 1,536-well cycloolefin microplate for compound storage, liquid handling (including acoustic systems and pin tools), and transmission measurements in biochemical assays. This new 1,536-well microplate will follow the most relevant ANSI recommendations and feature a smooth microplate top, also free of alphanumeric coding, to facilitate flush lid mounting for use with the GNF ultra-high-throughput screening system. The wells are more tapered than in classic 1,536-well microplates, reducing the dead volume in different liquid-handling applications.

Regulated Bioanalysis

Joseph A. Tweed, a bioanalytical scientist in the pharmacokinetics, dynamics, and metabolism department at Pfizer, described the development of an internal, regulated, automated sample-preparation and extraction platform for use on the Hamilton MICROLAB® STAR liquid-handling workstation.
Tweed noted that the sample-preparation and extraction techniques used for regulated preclinical and clinical bioanalysis of serum, plasma, urine, and cerebrospinal fluid are often repetitive and tedious tasks that can greatly benefit from automation. His group chose the Hamilton STAR because of its air-displacement pipetting technology and its ability to pipette microliter (µL) volumes with reliable precision and accuracy. In addition, the Hamilton STAR offers flexible and customizable deck platforms with robotic manipulation arm(s) and integrated one-dimensional (1-D) and two-dimensional (2-D) barcode scanners, he said. Tweed and his group developed a graphical user interface to couple with the Hamilton STAR liquid-handling method. This approach gives the bioanalytical scientist increased flexibility and customization of study- and assay-specific parameters for any given bioanalytical sample-preparation technique. Tweed said the platform can batch-process study samples using any of five automated extraction techniques: protein precipitation, solid-phase extraction, liquid-liquid extraction, plate-based protein precipitation, and supported liquid extraction. Additional features include the ability to process routine sample batches via an ordered laboratory information management system (LIMS) sequence or randomized 1-D or 2-D barcodes. Tweed said that the software package and the modular method design provide a flexible and versatile approach for routine bioanalytical sample preparation. The advantages of this technology are increased throughput, improved chain of custody for study-sample analysis, and a streamlined approach to routine bioanalytical sample preparation. Tweed noted that after sample preparation, specimens are analyzed by liquid chromatography-tandem mass spectrometry.
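The batch-ordering idea above, reconciling randomly scanned barcodes against the run order held in a LIMS, can be pictured with a short sketch. The function and names here are illustrative assumptions, not part of the Pfizer platform Tweed described:

```python
def order_batch(scanned_barcodes, lims_sequence):
    """Reorder randomly scanned sample barcodes to match the LIMS run order,
    rejecting any scan that is not part of the expected batch.
    Illustrative sketch only; not the platform described in the talk."""
    position = {bc: i for i, bc in enumerate(lims_sequence)}
    unknown = [bc for bc in scanned_barcodes if bc not in position]
    if unknown:
        raise ValueError("Barcodes not in LIMS sequence: %s" % unknown)
    return sorted(scanned_barcodes, key=position.__getitem__)

# Example: three tubes scanned out of order come back in LIMS order.
ordered = order_batch(["S3", "S1", "S2"], ["S1", "S2", "S3", "S4"])
```

Validating every scan against the expected sequence before processing is one simple way to support the chain-of-custody benefit mentioned above, since stray or mislabeled tubes are caught before extraction begins.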