Tuesday, November 8, 2011

Biochemist India: Lipid Tutorials: Tutorials and lectures on lipids. Index of a tutorial series on lipid chemistry and lipid metabolomics, presented as a set of Powe...

Biochemist India: Lipidomics Methods


  • Overview of Quantitative Lipid Analysis by Mass Spectrometry (png) (PDF) - Flow chart outlining the procedure for lipid analysis of various lipid classes, from extraction to laboratory analysis to bioinformatics.



  • Methods in Enzymology Volume 432, LIPID MAPS chapters - (PDF) For the complete Methods in Enzymology volumes, including LIPID MAPS chapters in Volumes 432, 433, and 434, please see http://www.sciencedirect.com/science/bookseries/00766879.
    References to individual chapters from these volumes can also be found on our LIPID MAPS Publications: 2007 page.

    • Qualitative Analysis and Quantitative Assessment of Changes in Neutral Glycerol Lipid Molecular Species Within Cells

    • Glycerophospholipid Identification and Quantitation by Electrospray Ionization Mass Spectrometry

    • Detection and Quantitation of Eicosanoids via High Performance Liquid Chromatography-Electrospray Ionization-Mass Spectrometry

    • Structure-Specific, Quantitative Methods for Analysis of Sphingolipids by Liquid Chromatography-Tandem Mass Spectrometry: "Inside-Out" Sphingolipidomics

    • Analysis of Ubiquinones, Dolichols, and Dolichol Diphosphate-Oligosaccharides by Liquid Chromatography-Electrospray Ionization-Mass Spectrometry

    • Extraction and Analysis of Sterols in Biological Matrices by High Performance Liquid Chromatography Electrospray Ionization Mass Spectrometry

    • The LIPID MAPS Initiative in Lipidomics

    • Bioinformatics for Lipidomics

    • Quantitation and Standardization of Lipid Internal Standards for Mass Spectrometry



  • Lipidomics workshop at EB2009 - (PDFs) The LIPID MAPS Consortium presented a workshop on lipidomics techniques at EB2009 under the auspices of the ASBMB. The slides from each of these presentations contain references to most of the methodologies described.


  • Lipidomics workshop at EB2007 - (PDFs) The LIPID MAPS Consortium conducted a workshop on lipidomics at Experimental Biology 2007 under the auspices of the American Society for Nutrition. A panel of eight experts discussed methods for analysis of various lipids by mass spectrometry, plus related issues such as sample extraction, internal standards, bioinformatics, and nomenclature.

Saturday, November 5, 2011

Biochemist India: Biostatistics: The ROC Curve: Uncovering the pearls. When clinicians order blood tests for patients, in essence they are asking for help in mak...

Biochemist India: Quality Management: Quality in Health Care: How Can Labs Improve the Total Testing Process? Over the past decade, results from published studi...

Biochemist India: Metabolic Syndrome: What Lab Tests Should Be Used for Assessing Patients? Seven years ago, the World Health Organization (WHO) published the first defini...


Quality Control: How Labs Can Apply Six Sigma Principles To Quality Control Planning


Since the publication of the Centers for Medicare and Medicaid Services’ (CMS) Clinical Laboratory Improvement Amendments (CLIA) final rules in January 2003, many in the clinical laboratory community have questioned the amount of quality control (QC) needed in labs today. Based on the improved performance of new analytical systems, some laboratorians might be misled into thinking that less QC is required for such devices. However, in order to determine the appropriate amount of QC, the question that needs to be answered is: How does an instrument’s analytical performance relate quantitatively to QC?
The emergence of Six Sigma Quality Management as a laboratory quality management tool provides the framework for addressing the issues surrounding what has become known as equivalent QC (EQC). While much has been written about the complexities of Six Sigma, laboratorians can actually use a simple calculation to determine a sigma metric that characterizes the level of quality required for a test and the precision and accuracy observed for the measurement procedure. This metric can then be related to the rejection characteristics of QC procedures in order to select the appropriate control rules and the number of control measurements for an individual lab test.
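
The simple calculation referred to here is the sigma-metric formula used in laboratory Six Sigma planning: the allowable total error for the test, minus the observed bias, divided by the observed imprecision (CV), all in percent. A minimal sketch in Python (the example numbers are illustrative, not from the article):

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma metric = (allowable total error - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative example: a test with a 10% total allowable error,
# 1% observed bias, and 2% observed CV.
sigma = sigma_metric(tea_pct=10.0, bias_pct=1.0, cv_pct=2.0)
print(f"sigma = {sigma:.1f}")  # 4.5
```

The higher the sigma, the more error margin the method has relative to its imprecision, so simpler control rules and fewer control measurements suffice; low-sigma tests call for stricter multirule QC.
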
This article will both describe the relationship of Six Sigma to QC, as well as provide a practical QC planning tool that will allow laboratorians to apply this approach in their own labs.
Read more at:
http://www.aacc.org/publications/cln/series/2006/Documents/qualityControl_January2006.pdf

Folate: An Overview

Researchers first documented an association between pregnant women’s folate levels and neural tube defects (NTDs) in newborns in the 1960s. Today, low folate is a well-established risk factor for NTDs, and in 1998, the Food and Drug Administration mandated fortification of cereal grain products in the U.S. to ensure that women who are unaware of their pregnancies do not compromise their fetuses during the critical stage when the neural tube closes. The primary driving force behind this policy was the recognition that periconceptional folate supplementation, in addition to normal dietary folate intake, significantly reduces the incidence of NTDs. In fact, many countries now have either mandatory or voluntary flour fortification regulations to reduce the risk of NTDs in newborns.

Since the fortification of grains with folate began, folate concentrations in the general population have risen substantially, so much so that low folate levels are now rare. Although folate levels in pregnant women are critical to the health of newborns, the majority of laboratory test orders for folate are made to investigate anemia along with vitamin B12 deficiency in nonpregnant patients.

This article will describe the metabolism of folate, the causes, clinical effects and prevalence of folate deficiency, as well as evidence about diagnostic thresholds and the clinical utility of folate testing.

Folate Facts

Folic acid (pteroylmonoglutamate) acts as a carrier of one-carbon units in a variety of metabolic reactions (Figure 1). This class of vitamins is essential for the synthesis of nucleic acids, thymidylate, neurotransmitters, phospholipids, and hormones. Folate is also integral to the de novo generation of methionine, which is required for genomic and nongenomic methylation reactions.

Humans lack the enzymes to synthesize folate, so dietary intake is necessary. Nutritionists estimate that the body stores 10–100 mg of folate, with 5–10 mg sufficient for about 4 months of normal metabolism.

Rich sources of dietary folate include green leafy vegetables, fruits, dairy products, cereals, yeast, and animal proteins. Cooking, however, destroys most folate in food. The dietary reference intake (DRI) recommendation for adults in the U.S. is 400 µg/day, and other countries have similar recommendations. For children, the recommended DRI is lower, but higher amounts are recommended for women during pregnancy and lactation. Natural folate is 50% bioavailable compared to 85% in fortified foods and almost 100% when taken as a supplement.
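
The differing bioavailabilities are usually reconciled through dietary folate equivalents (DFE), in which folic acid from fortified food or supplements counts for about 1.7 times as much as naturally occurring food folate (roughly the 85%/50% ratio quoted above). A minimal sketch (the intake amounts are illustrative):

```python
def dietary_folate_equivalents(food_folate_ug: float, folic_acid_ug: float) -> float:
    """ug DFE = ug food folate + 1.7 * ug folic acid from fortified food
    or supplements (folic acid is ~1.7x as bioavailable as food folate)."""
    return food_folate_ug + 1.7 * folic_acid_ug

# Illustrative day: 200 ug natural food folate + 120 ug folic acid from
# fortified cereal, compared against the 400 ug/day adult recommendation.
dfe = dietary_folate_equivalents(200, 120)
print(f"{dfe:.0f} ug DFE of the 400 ug/day DRI")  # 404 ug DFE
```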

Figure 1. Folate metabolism.

Thursday, November 3, 2011


A New Approach to Quality Control

biochemist INDIA (3 Nov 2011): When the Centers for Medicare and Medicaid Services (CMS) finalized the Clinical Laboratory Improvement Amendments regulations in 2003, many in the lab community expressed dissatisfaction with what was perceived as ambiguous and unscientific guidance on how to conduct quality control (QC). While the regulations set basic requirements for testing external QC materials, most laboratories found they needed to go above and beyond these standards to avoid quality problems. In the 8 years since the agency published the final regulation, exactly how often labs need to perform external QC and other quality checks has been widely debated. Quality tools like Six Sigma, Lean, and others abound, but so far, a comprehensive approach to QC that suits regulators and a majority of the laboratory community has not emerged.

Now the Clinical and Laboratory Standards Institute (CLSI) has published a long-awaited guideline that aims to fill this gap, enabling labs to customize QC to match both changing technology and the uniqueness of each lab. However, in what form CMS and other accrediting organizations will adopt or endorse it remains to be seen.

Published in October, the new guideline, EP-23, Laboratory Quality Control Based on Risk Management, translates the time-honored concept of risk management used in manufacturing, defense, aerospace, and other industries into the language of the clinical lab. Risk management has been standard for engineering and other professions for decades, and an application for laboratories is long overdue, said James Nichols, PhD, chair of the CLSI subcommittee that developed the guideline.

“We’re all using devices in different ways with different types of staff. And it’s becoming even more complicated the more laboratories decentralize testing to nursing units or physician offices,” Nichols said. “Risk management can help labs build a custom QC plan for each device and test that strikes the optimum balance of built-in internal monitoring systems with external QC, as well as all of the other processes that we have to reduce our risk of errors.” Nichols is a professor of pathology at Tufts University School of Medicine and director of clinical chemistry at Baystate Health in Springfield, Mass.

Risk in the Lab

Whether a lab produces 500 or 5 million results a year, the lab director bears ultimate responsibility for the accuracy and timeliness of each test result. Yet even as advanced instruments, information technology, and automation have enabled labs to accomplish more with less, the complexity of the lab as a web of systems within systems continues to swell, and the greater volume and power of lab testing means the stakes are even higher if a serious error occurs.

CLSI designed EP-23 to help labs create thoughtful, deliberate QC plans that tackle the many sources of error that other approaches do not always account for. In fact, laboratorians already engage in ad hoc risk management every day, according to Nichols. “Risk management is essentially the process of sitting down and saying, what can go wrong when I perform a test, and what do I do to prevent that error from happening? We as laboratorians are actually trained to think in that manner, and we do that every day,” he said. “With risk management in EP-23, we’re giving this a name and putting it together in a formal plan—for example, documenting why I run liquid QC, and when I do based on specific risks in my lab. Is liquid QC really telling me everything I need to know about what could go wrong with the system, or do I need other types of control processes to fill in those holes and improve my confidence?”

The heart of risk management is balancing potential sources of error, referred to as hazards, with the right controls. To help labs do this, EP-23 explains how risk management applies to the lab as a continuous process of risk assessment, control, monitoring, and failure investigation. Most of what will be new to laboratorians falls under the risk assessment phase, which includes tools for identifying hazards and estimating and evaluating the risk (see Figure 1, below).

Figure 1. Life Cycle Risk Management Process

Risk management aims to identify sources of error and match up these hazards with appropriate controls. It begins with creating a map of each lab process, such as performing a particular test, then considering how the process could fail at each step. Risk estimation follows, which includes assessing the likelihood of a given failure, as well as the severity of patient harm resulting from the failure. Third, risk evaluation compares the results of risk estimation to the lab’s risk controls. Risk monitoring and failure investigation complete the loop.

Source: CLSI EP-23A.
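
Risk estimation of this kind is often summarized with a simple scoring matrix that multiplies the probability of a failure by the severity of the resulting patient harm. A minimal sketch in Python (the scales and the acceptability threshold below are illustrative conventions, not values prescribed by EP-23):

```python
# Qualitative scales in the style of common risk-management practice;
# the 5-point levels and the threshold of 8 are illustrative assumptions.
PROBABILITY = {"frequent": 5, "probable": 4, "occasional": 3, "remote": 2, "improbable": 1}
SEVERITY = {"catastrophic": 5, "critical": 4, "serious": 3, "minor": 2, "negligible": 1}

def risk_score(probability: str, severity: str) -> int:
    """Risk estimation: combine likelihood of failure with severity of harm."""
    return PROBABILITY[probability] * SEVERITY[severity]

def acceptable(score: int, threshold: int = 8) -> bool:
    """Risk evaluation: compare estimated risk to the lab's acceptance criterion."""
    return score <= threshold

# Example: a failure judged 'occasional' whose effect on a patient is 'critical'.
score = risk_score("occasional", "critical")  # 3 * 4 = 12
print(score, acceptable(score))               # 12 False -> add a risk control
```

If a control measure lowers the probability to "remote", the score drops from 12 to 8 and the residual risk becomes acceptable under this illustrative threshold; that re-scoring loop is the essence of risk control and monitoring.
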

EP-23 encourages laboratorians to think globally and consider information on risk from many sources, for example: the environment of the lab, staff competence, internal and external evaluation/verification data, clinical application of tests, and the severity of patient harm that would result from an error. The idea is to make decisions about quality control based on greater depth and breadth of information.

As a primer on risk management, EP-23 can help laboratorians take a more comprehensive view of their operations that builds on the real-world consequences and sources of errors, according to Greg Cooper, CLS, MHA, CQA, who worked on the CLSI subcommittee responsible for EP-23 and also is chair of the organization’s Consensus Committee on Evaluation Protocols. “This is written by laboratorians for laboratorians—it’s not just something extracted from documents for another industry,” he said. “The lab is going to gain a better understanding of how it’s really operating and where its potential failures might come from, and what it needs to do to prevent those failures in the future. Labs will also better use their resources because QC is going to be focused where it needs to be focused.” Formerly a quality control and education expert for Bio-Rad Laboratories Quality Systems Division, Cooper now runs his own consulting practice.

Doing the Right QC

Although in some cases, risk management could enable labs to limit their dependence on external QC, EP-23 is about doing the right QC, not less, Cooper emphasized. “EP-23 is about right-sizing the QC for the test,” he said. “This gets labs thinking about all of the conditions, activities, and processes that they control that could potentially create a hazardous situation and cause harm to a patient. It’s not only about the device failing and not catching it.”

In effect, EP-23 targets the vacuum left after stakeholders could not agree on in-depth, evidence-based QC requirements for the 2003 CLIA final regulations, which now carry minimum requirements held over from older technology. The requirement for testing two levels of liquid QC every day a test is run comes from the days when labs ran just a few batches of patient samples a day, Nichols explained. “With the new, more automated analyzers, there is no longer batch analysis and patient samples are analyzed continuously,” he said. “So now the question is, do we hold those samples until the next QC run, or do we run QC continuously, every 10, 20, or 50 samples and release results in small batches? These operational considerations lead to turnaround time issues, cost issues, and resource issues. By analyzing the risk of errors and the mitigations in place, EP-23 can help provide a framework for explaining the reasoning behind the lab’s decision-making.”
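
The operational trade-off Nichols describes can be made concrete with a back-of-the-envelope sketch (our illustration, not a calculation from EP-23): the QC interval bounds how many continuously released patient results are exposed if an error condition begins right after the last passing QC event.

```python
def worst_case_exposure(qc_interval: int, hold_until_qc: bool) -> int:
    """Worst-case number of patient results affected by an error arising
    just after the last passing QC event.

    Holding results and releasing them in small batches after the next
    passing QC prevents erroneous reports, at the cost of turnaround time;
    releasing continuously exposes up to one full QC interval of results.
    """
    return 0 if hold_until_qc else qc_interval

for interval in (10, 20, 50):
    print(f"QC every {interval} samples -> up to "
          f"{worst_case_exposure(interval, hold_until_qc=False)} results to correct")
```
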

With a focus on the analytical phase of testing, EP-23 zeroes in on how to make such decisions about QC at a time when manufacturers advertise a litany of internal monitoring systems and other quality checks built into new instruments. These high-tech features minimize the potential for certain errors, in many cases duplicating external QC. For other potential errors, external QC is not redundant, leaving the lab to weigh all the potential causes of error against the options for checking quality. Risk management includes process mapping that lets laboratories examine all processes, how these processes relate or interact with one another, and potential sources of error so that the lab has a complete picture of its unique situation when preparing its customized QC plan (see Figure 2, below).

Figure 2. Hazard Identification Phase of Risk Management

A tool used in the hazard identification phase of risk management, a fishbone diagram identifies the possible causes of errors—called failure modes—and their effects. This example from EP-23 lists failures that could lead to incorrect test results.

Source: CLSI EP-23A.
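
To turn such a diagram into something a lab can maintain and review, the branches can be recorded as a simple mapping from cause categories to failure modes. A minimal sketch (the categories and entries are illustrative examples in the spirit of the EP-23 figure, not its actual contents):

```python
# Illustrative fishbone for the effect "incorrect test result".
fishbone = {
    "Specimen": ["hemolysis", "wrong tube type", "clotted sample"],
    "Reagent": ["lot-to-lot variation", "improper storage", "expired reagent"],
    "Operator": ["training gap", "procedure not followed"],
    "Environment": ["temperature excursion", "power fluctuation"],
    "Measuring system": ["calibration drift", "optics failure", "pipettor wear"],
}

effect = "incorrect test result"
for category, failure_modes in fishbone.items():
    for mode in failure_modes:
        print(f"{category}: {mode} -> {effect}")
```
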

A customized plan means that each test on each system can get the particular QC treatment it needs, noted Cooper. “Frankly, there are some tests—and everyone in the lab knows which ones those are—that are rock-solid performers that never, ever change. So, do you really need to do external QC every day for those? Probably not. But there are many other tests that are not quite so rock-solid in their performance, and they’re sensitive to reagent lot changes and other variables. For these tests, there is a need to have more intense scrutiny of your QC.” EP-23 also prompts the laboratorian to look at factors outside of the analytical phase that might inform the type and intensity of QC, Cooper said. For example, how the test is used clinically: an incorrect triglyceride would not carry the same risk as a false-negative HIV result.

Nichols emphasized that risk management would rarely mean complete reliance on an instrument’s internal monitoring systems, however. “A device may have an electronic check that looks at the device itself and tells us that it’s functioning to specification, but that doesn’t tell you anything about the liquid reagents that are read by that device,” he said. “That’s where we have to consider what other types of controls we have.”

A customized QC plan based on risk management can also help the lab justify elements of its budget. Testing external liquid QC means consuming reagents and takes staff time, Nichols noted. “More and more, hospital administrators are asking us why we have so much non-productive testing, and everyone is thinking about limited resources in terms of staff and money,” he said. “With risk management, you’ve documented why you run external QC or perform other controls when you do, and how it addresses your particular risks.”

Beyond the Analytical Phase

Despite EP-23’s focus on the analytical phase of testing, risk management could also do a lot to help labs deal with pre-analytical and post-analytical errors, stressed Jan Krouwer, PhD, an in vitro diagnostics industry consultant who has worked with test and instrument manufacturers on risk management. Manufacturers have employed risk management as an engineering standard for decades, but the concept has broad application outside that realm. “It could work for labs. In fact, it’s an essential part of how you would manage quality in a laboratory, because there are a lot of errors in a lab that are really dealt with best by risk management,” he said. “In particular, all of the pre-analytical and post-analytical errors, you’re not going to assess them very effectively by the usual evaluation means of method comparison and imprecision studies. But they are amenable to risk management.”

Risk management can help labs understand and manage the limitations of traditional external QC, according to Krouwer. “Essentially, external QC is just looking at differences from the target. But, if you’re asking about how all of the steps in your process could go wrong, and seeing whether or not you have effective means of preventing each item from going wrong, that’s quite a different process, and one that could be quite useful for laboratories.”

However, Krouwer warned that the well-established techniques in risk management need to be followed closely to achieve the maximum benefit. “This is one of the potential problems with risk management. Everyone does risk management to some degree; it’s in everyone’s life. For example, when do I change lanes on the freeway? Formal risk management comprises a set of techniques that’s not hard to learn. But to do the technique informally is not as valuable,” he said. “If risk management is implemented informally, you can miss things.”

Hospitals accredited by the Joint Commission must conduct a risk assessment of one area of their operations every 18 months. The organization does not track which operational area hospitals choose, and could not say which, if any, hospitals had chosen the lab. Krouwer’s experience with hospital risk management serves as a cautionary tale, though. He wrote a software application for hospitals to help them fulfill this requirement, but hospitals struggled with taking the time and effort the risk management process required even with the aid of the software, he said. “My sense was that they wanted something much quicker where they could just check off boxes,” he said.

A New Option for Regulators?

While quality experts in the lab community have mulled the concept of risk management for a long time, the current push that resulted in EP-23 began in 2005 when CMS convened its “QC For the Future” meeting that brought together professional associations, industry representatives, and other regulatory agencies to tackle the gaps left by the CLIA final rule.

In the original 1992 version, CLIA called for the FDA to review manufacturers’ claims about the built-in QC features of their devices, with labs responsible for following manufacturers’ instructions. When FDA could not meet its end of the bargain due to resource constraints, CMS had to regroup. The result was the basic 2-levels-a-day default for external QC, and the call for lab directors to develop their own quality plan that went beyond the minimum and that took into account the lab’s particular needs and environment.

CMS now plans to evaluate EP-23, in concert with other lab accreditors, to consider making risk management a part of CLIA’s interpretive guidelines. “We engaged in a partnership with CLSI because they use a consensus approach to their documents. We felt it would be good to work with an organization like CLSI so that people would not assume that we were sitting in our ivory tower and making up policy on quality control,” said Judith Yost, MA, MT (ASCP), director of CMS’s Division of Laboratory Services. “We have also provided, among the other government agencies involved in CLIA, representation on the subcommittee that developed the document.”

Yost pointed out that the top 10 deficiencies chart that CMS has tracked over the years always features QC problems. Although the number of labs cited by CLIA for QC failures has declined in recent years, other accreditors like the College of American Pathologists (CAP) and the Joint Commission rank QC issues at the top of their deficiencies. “Problems with quality control do not seem to be going away, and that’s why we feel that whatever we do next is so crucial,” Yost said. “That’s why we will be looking at EP-23 so closely.”

Wednesday, November 2, 2011

What role do cytokines play in autoimmune diseases?

biochemist INDIA (2 Nov 2011): Cytokines, a varied group of signaling chemicals in the body, have been described as the software that runs the immune system, but when that software malfunctions, dysregulation of the immune system can result in debilitating autoimmune diseases.
In the introductory Editorial, the researchers identify cytokines as the first step in the onset of immune responses in which the body attacks its own cells and tissues, leading to the development of autoimmune diseases. Drs. Moudgil and Choubey present an overview of the role cytokines play in the induction, regulation, and treatment of autoimmunity. An original research article, ‘Critical Cytokine Pathways to Cardiac Inflammation’, by Noel Rose, The Johns Hopkins Schools of Medicine and Public Health (Baltimore, MD), describes a mouse model of autoimmune myocarditis – inflammation of the heart muscle – that is triggered by infection with Coxsackievirus B3. The model allows researchers to study the cytokine pathways involved in this disease, with the goal of identifying chemical markers that could be used to predict which patients are more likely to experience an autoimmune reaction after infection.
The most up-to-date findings and unique perspectives on the role of cytokines in autoimmune diseases are published in a special issue of the Journal of Interferon & Cytokine Research, available free online for a limited time.

Monday, October 31, 2011

Monoclonal antibody therapy

biochemist INDIA (31 Oct 2011): The main objectives are to stimulate the patient's immune system to attack malignant tumor cells and to prevent tumor growth by blocking specific cell receptors.

Variations exist within this treatment: radioimmunotherapy, for example, delivers a radioactive dose directly to the target cell, while other variants deliver lethal chemical doses to the target.

Structure and function of human and therapeutic antibodies

Immunoglobulin G (IgG) antibodies are large molecules of approximately 150 kDa, composed of two different kinds of polypeptide chain, the heavy (~50 kDa) and the light (~25 kDa) chain, with two of each per antibody.

There are two types of light chains, kappa (κ) and lambda (λ).

By cleavage with the enzyme papain, the Fab (fragment antigen-binding) portion can be separated from the Fc (fragment crystallizable) portion of the molecule.

The Fab fragments contain the variable domains, which consist of three hypervariable amino acid regions responsible for antibody specificity, embedded in constant regions.
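
For orientation, the composition just described can be captured in a toy data structure (an illustrative sketch only, not a modeling tool; the masses are the approximate values from the text):

```python
from dataclasses import dataclass

@dataclass
class IgG:
    """Schematic IgG: two identical heavy and two identical light chains."""
    heavy_chain_kda: float = 50.0
    light_chain_kda: float = 25.0
    light_chain_type: str = "kappa"  # or "lambda"

    def total_mass_kda(self) -> float:
        # Two heavy chains + two light chains ~ 150 kDa
        return 2 * self.heavy_chain_kda + 2 * self.light_chain_kda

    def papain_fragments(self):
        # Papain cleavage separates two antigen-binding Fab fragments
        # (carrying the variable domains) from one Fc fragment.
        return ["Fab", "Fab", "Fc"]

ab = IgG()
print(ab.total_mass_kda())    # 150.0
print(ab.papain_fragments())  # ['Fab', 'Fab', 'Fc']
```
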

There are four known IgG subclasses, all of which are involved in antibody-dependent cellular cytotoxicity.

The immune system responds to the environmental factors it encounters on the basis of discrimination between self and non-self.

Tumor cells are not specifically targeted by one's immune system, since tumor cells are the patient's own cells. Tumor cells, however, are highly abnormal, and many display unusual antigens that are either inappropriate for the cell type or its environment, or are normally present only during the organism's development (e.g., fetal antigens).

Other tumor cells display cell surface receptors that are rare or absent on the surfaces of healthy cells, and which are responsible for activating cellular signal transduction pathways that cause the unregulated growth and division of the tumor cell.

Examples include ErbB2, a constitutively active cell surface receptor that is produced at abnormally high levels on the surface of approximately 30% of breast cancer tumor cells.

Such breast cancer is known as HER2-positive breast cancer.

Antibodies are a key component of the adaptive immune response, playing a central role both in the recognition of foreign antigens and in the stimulation of an immune response to them.

The advent of monoclonal antibody technology has made it possible to raise antibodies against specific antigens presented on the surfaces of tumors.

Therapy For Alzheimer Disease: CpG DNA

biochemist INDIA (31 Oct 2011):

Alzheimer disease is the most common form of dementia, affecting approximately 1.6% of the population in the United States (nearly 19% in the 75–84 age group). It is an incurable, degenerative, and terminal disease thought to be caused by accumulation of oligomeric amyloid β (oAβ).

Microglia are the resident immune cells in the central nervous system; they remove damaged neurons, plaques, and infectious agents from the brain and spinal cord. Microglia cluster around senile Aβ plaques in Alzheimer disease patients; however, the role of microglia in oAβ toxicity remains unclear. Doi et al. discovered that microglial activation with unmethylated CpG DNA, which binds to an immune receptor on microglia, prevented oAβ toxicity and enhanced oAβ peptide clearance in culture. Furthermore, injection of CpG DNA directly into the brain mitigated both cognitive impairment and learning defects in a mouse model of Alzheimer disease. CpG DNA may therefore be a therapeutic candidate for treatment of Alzheimer disease.

Researchers conclude that "CpG, especially class B and C, may also be effective therapeutic agents against oAβ1-42 neurotoxicity in [Alzheimer disease]."

Alzheimer's Disease Risk and Amyloid Beta Toxicity

biochemist INDIA (30 Oct 2011): In a development that sheds new light on the pathology of Alzheimer's disease (AD), a team of Whitehead Institute scientists has identified connections between genetic risk factors for the disease and the effects of a peptide toxic to nerve cells in the brains of AD patients.
The scientists, working in, and in collaboration with, the lab of Whitehead Member Susan Lindquist, established these previously unknown links in an unexpected way. They used a very simple cell type -- yeast cells -- to investigate the harmful effects of amyloid beta (Aβ), a peptide whose accumulation in amyloid plaques is a hallmark of AD. This new yeast model of Aβ toxicity, which they further validated in the worm C. elegans and in rat neurons, enables researchers to identify and test potential genetic modifiers of this toxicity.

"As we tackle other diseases and extend our lifetimes, Alzheimer's and related diseases will be the most devastating personal challenge for our families and one the most crushing burdens on our economy," says Lindquist, who is also a professor of biology at Massachusetts Institute of Technology and an investigator of the Howard Hughes Medical Institute. "We have to try new approaches and find out-of the-box solutions."

In a multi-step process, reported in the journal Science, the researchers were able to introduce the form of Aβ most closely associated with AD into yeast in a manner that mimics its presence in human cells. The resulting toxicity in yeast reflects aspects of the mechanism by which this protein damages neurons. This became clear when a screen of the yeast genome for genes that affect Aβ toxicity identified a dozen genes that have clear human homologs, including several that have previously been linked to AD risk by genome-wide association studies (GWAS) but with no known mechanistic connection.

With these genetic candidates in hand, the team set out to answer two key questions: Would the genes identified in yeast actually affect Aβ toxicity in neurons? And if so, how?

To address the first issue, in a collaboration with Guy Caldwell's lab at the University of Alabama, researchers created lines of C. elegans worms expressing the toxic form of Aβ specifically in a subset of neurons particularly vulnerable in AD. This resulted in an age-dependent loss of these neurons. Introducing the genes identified in yeast as suppressors of Aβ toxicity into the worms counteracted this toxicity. One of these modifiers is the homolog of PICALM, one of the most highly validated human AD risk factors. To address whether PICALM could also suppress Aβ toxicity in mammalian neurons, the group exposed cultured rat neurons to toxic Aβ species. Expressing PICALM in these neurons increased their survival.

The question of how these AD risk genes were actually impacting Aβ toxicity in neurons remained. The researchers had noted that many of the genes were associated with a key cellular protein-trafficking process known as endocytosis. This is the pathway that nerve cells use to move around the vital signaling molecules with which they connect circuits in the brain. They theorized that perhaps Aβ was doing its damage by disrupting this process. Returning to yeast, they discovered that, in fact, the trafficking of signaling molecules in yeast was adversely affected by Aβ. Here again, introducing genes identified as suppressors of Aβ toxicity helped restore proper functioning.

Much remains to be learned, but the work provides a new and promising avenue to explore the mechanisms of genes identified in studies of disease susceptibility.

"We now have the sequencing power to detect all these important disease risk alleles, but that doesn't tell us what they're actually doing, how they lead to disease," says Sebastian Treusch, a former graduate student in the Lindquist lab and now a postdoctoral research associate at Princeton University.

Jessica Goodman, a postdoctoral fellow in the Lindquist lab, says the yeast model provides a link between genetic data and efforts to understand AD from the biochemical and neurological perspectives.

"Our yeast model bridges the gap between these two fields," Goodman adds. "It enables us to figure out the mechanisms of these risk factors which were previously unknown."

Members of the Lindquist lab intend to fully exploit the yeast model, using it to identify novel AD risk genes, perhaps in a first step to determining if identified genes have mutations in AD patient samples. The work will undoubtedly take the lab into uncharted territory.

Notes staff scientist Kent Matlack: "We know that Aβ is toxic, and so far, the majority of efforts in the area of Aβ have been focused on ways to prevent it from forming in the first place. But we need to look at everything, including ways to reduce or prevent its toxicity. That's the focus of the model. Any genes that we find that we can connect to humans will go into an area of research that has been less explored so far."

This work was supported by an HHMI Collaborative Innovation Award, an NRSA fellowship, the Cure Alzheimer's Fund, the National Institutes of Health, the Kempe Foundation, and Alzheimerfonden.

Tuesday, October 25, 2011

Duchenne muscular dystrophy (DMD): the role of gene therapy.


Duchenne muscular dystrophy (DMD) is associated with mutations in the dystrophin gene that disrupt the open reading frame, whereas the milder Becker form is associated with mutations that leave an in-frame mRNA transcript that can be translated into a protein that includes the N- and C-terminal functional domains.
It has been shown that by excluding specific exons at, or adjacent to, frame-shifting mutations, the open reading frame can be restored in an otherwise out-of-frame mRNA, leading to the production of a partially functional, Becker-like dystrophin protein. Such targeted exclusion can be achieved by administration of oligonucleotides that are complementary to sequences crucial to normal splicing of the exon into the transcript.
This principle has been validated in mouse and canine models of DMD with a number of variants of oligonucleotide analogue chemistries, and by transduction with adeno-associated virus (AAV)-small nuclear RNA (snRNA) reagents encoding the antisense sequence. Two different oligonucleotide agents are now being investigated in human trials for splicing out of exon 51, with some early indications of success at the biochemical level.
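
The reading-frame rule behind exon skipping is simple to state computationally: a deletion, or a deletion plus a skipped flanking exon, leaves the downstream transcript translatable only if the total number of removed nucleotides is a multiple of three. A minimal sketch (the exon lengths are hypothetical placeholders, not actual dystrophin exon sizes):

```python
def in_frame(removed_exon_lengths: list[int]) -> bool:
    """The downstream reading frame survives only when the total
    number of removed nucleotides is divisible by 3."""
    return sum(removed_exon_lengths) % 3 == 0

# Hypothetical exon lengths in nucleotides (placeholders for illustration).
exon_len = {50: 109, 51: 233}

print(in_frame([exon_len[50]]))                # False: deleting exon 50 shifts the frame
print(in_frame([exon_len[50], exon_len[51]]))  # True: also skipping exon 51 restores it
```
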



Methylenetetrahydrofolate reductase gene polymorphisms and cerebral palsy in Chinese infants

Cerebral palsy (CP) covers a group of non-progressive chronic disorders of motor function and posture caused by lesions of the developing fetal or infant brain. CP is the most common cause of severe physical disability in childhood, occurring in 1–2 per 1000 live births. Many cases are multifactorial in origin and exhibit marked etiologic heterogeneity. Risk factors for CP can be categorized as prenatally, perinatally, and postnatally acquired, of which about 70–80% are acquired prenatally. CP may be related to genomic factors, as well as to environmental insults during brain development. Methylenetetrahydrofolate reductase (MTHFR) irreversibly catalyzes the conversion of 5,10-methylenetetrahydrofolate to 5-methyltetrahydrofolate. Studies have shown that MTHFR gene polymorphisms are associated with inherited thrombophilias, which can result in adverse pregnancy outcomes such as CP.
Researchers observed a significant difference in allele and genotype frequencies between CP + MR (mental retardation) patients and controls at rs4846049, rs1476413, and rs1801131, and there was a statistically significant difference in the frequencies of the three SNPs between CP + MR and CP-only cases.
This is the first report, to our knowledge, to demonstrate that MTHFR genetic polymorphisms are associated with CP combined with MR. It adds to the existing evidence that certain gene variants may in some way contribute to CP.
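
For readers curious how such an allele-frequency comparison is typically computed, here is a minimal sketch of a case-control chi-square test on a 2×2 table of allele counts (the counts are invented for illustration and are not the study's data):

```python
from scipy.stats import chi2_contingency

# Invented allele counts for one SNP: rows are cases and controls,
# columns are minor- and major-allele counts (2 alleles per subject).
table = [[180, 220],   # CP + MR cases
         [120, 280]]   # controls

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.2g}")  # p << 0.05 for these counts
```
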






Monday, October 24, 2011

UGA scientists team up to define first-ever sequence of biologically important carbohydrate

If genes provide the blueprint for life and proteins are the machines that do much of the work for cells, then carbohydrates that are linked to proteins are among the tools that enable cells to communicate with the outside world and each other.
But until now, scientists have been unable to determine the structure of a biologically important so-called GAG proteoglycan – or even to agree whether these remarkably complex molecules have well-defined structures.

In a paper published in the early online edition of Nature Chemical Biology, however, a team of scientists from the University of Georgia and Rensselaer Polytechnic Institute announced that it has, for the first time, determined the sequence and structure of a glycosaminoglycan, or GAG, proteoglycan.

“The fact that a structure even exists is surprising, because people had the sense that the complexity of these molecules pointed to a randomness,” said study co-author Jonathan Amster, professor and head of the department of chemistry in the UGA Franklin College of Arts and Sciences. “There are many different areas in medicine that will be enabled by understanding carbohydrates at this fundamental level.”

Modifications to the GAG, or carbohydrate biopolymer, portion of proteoglycans have been associated with the presence and malignancy of certain cancers, for example, and the researchers noted that the identification of carbohydrates that are involved in disease opens the door to the development of drugs that can block their action.

The field of glycobiology is still in its infancy, largely because attempts to sequence proteoglycans have, until now, ended in frustration and failure. A small sample of DNA can be amplified many times, and its sequence, or arrangement of molecules, can be determined quickly with modern tools. DNA is simply a set of instructions for making proteins, so a sample of DNA also can allow scientists to produce copious quantities of protein for study.

Carbohydrates, however, are a bit messier. Scientists don’t fully understand how cells create them, and a given proteoglycan exists in multiple forms that are similar but not quite the same.

The researchers chose the simplest known GAG proteoglycan, a compound known as bikunin that is used in Japan for the treatment of acute pancreatitis, for their study. Of course, simplicity is a relative term: the sugar is composed of up to 55 distinct carbohydrate rings, which means that there are 210 billion different sequence possibilities. Studies performed by the researchers over the past five years, which identified common sequences within the carbohydrate, had already decreased the expected number of possibilities to a mere 43 million.

Past attempts to sequence proteoglycans have relied on the so-called ‘bottom up’ approach, in which scientists use enzymes to chop a molecule into its component parts and then try to put it back together, like a puzzle. Using an alternative approach known as the ‘top down’ method, the scientists placed the compound into high-powered mass spectrometers in both the Amster and Linhardt labs, which allowed them to break the compound at predictable places. With larger puzzle pieces to work with, the scientists were able to deduce the structure of bikunin.

“Now that we have demonstrated that bikunin, a small chondroitin sulfate proteoglycan, has sequence, we are moving on to larger, more structurally complex dermatan sulfate and heparan sulfate proteoglycans,” said study co-author Robert Linhardt, professor at Rensselaer Polytechnic Institute. “These show important biological activities in development and in cancer, and we are optimistic that our sequencing approach will work here as well.”

Like all groundbreaking scientific discoveries, the finding actually raises more questions than it answers. Amster explained that the addition of sulfate to the sugar, for example, could in principle occur anywhere along the carbohydrate chain. What the researchers found, however, was that the sites of sulfation occur only in particular rings. “That was the unexpected finding,” Amster said, “because based on the current understanding of biology, there is no known mechanism for controlling that type of specificity.”

As they work to determine the structure of more complex proteoglycans, the scientists hope that their findings will encourage other researchers to consider the role that these molecules play in health.

“We know that carbohydrates are how cells communicate with each other and their environment, but they’re also likely to play many roles that we can’t even envision yet,” Amster said. “And in order to understand them, we need to be able to study them at this molecular level.”