Yearly Archives: 2016

A novel approach to analyzing brain structure that focuses on the shape, rather than the size, of particular features may allow identification of individuals who are in the early, pre-symptomatic stages of Alzheimer’s disease.

A team of investigators used advanced computational tools to analyze data from standard MRI scans. They found that people with Alzheimer’s disease, including those diagnosed partway through a multiyear study, had greater levels of asymmetry in key brain structures: differences in shape between the left and right sides of the brain. Their study has been published in the journal Brain.

The team developed a computer-aided system, called BrainPrint, for representing the whole brain based on the shape, rather than the size or volume, of structures. Originally described in a 2015 article in NeuroImage, BrainPrint appears to be as accurate as a fingerprint in distinguishing among individuals. In a recent paper in the same journal, the researchers demonstrated the use of BrainPrint for automated diagnosis of Alzheimer’s disease.

The current study used BrainPrint to analyze structural asymmetries in a series of MR images of almost 700 participants in the National Institutes of Health-sponsored Alzheimer’s Disease Neuroimaging Initiative. BrainPrint analysis of the data revealed that initial, between-hemisphere differences in the shapes of the hippocampus and amygdala—structures known to be sites of neurodegeneration in Alzheimer’s disease—were highest in individuals with dementia and lowest in healthy controls. Among those originally classified with mild cognitive impairment, baseline asymmetry was higher in those who progressed to Alzheimer’s dementia and became even greater as symptoms developed. Increased asymmetry was also associated with poorer cognitive test scores and with increased cortical atrophy.
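The asymmetry measure described above can be illustrated with a minimal sketch. This is an illustration only: BrainPrint itself derives shape descriptors from the spectra of segmented brain structures, and the descriptor vectors and `shape_asymmetry` helper below are hypothetical.

```python
import numpy as np

def shape_asymmetry(left_desc, right_desc):
    """Normalized distance between left- and right-hemisphere shape
    descriptors; larger values indicate greater lateral asymmetry."""
    left = np.asarray(left_desc, dtype=float)
    right = np.asarray(right_desc, dtype=float)
    return np.linalg.norm(left - right) / (
        np.linalg.norm(left) + np.linalg.norm(right)
    )

# Hypothetical shape descriptors for the left and right hippocampus:
control = shape_asymmetry([1.0, 2.1, 3.0], [1.0, 2.0, 3.1])
patient = shape_asymmetry([1.0, 2.1, 3.0], [1.6, 2.9, 4.0])
print(control < patient)  # the more dissimilar pair scores higher
```

Comparing such a score at baseline, and tracking its change over follow-up scans, is the kind of analysis the study used to relate asymmetry to disease progression.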

Paper: “Whole-brain analysis reveals increased neuroanatomical asymmetries in dementia for hippocampus and amygdala”

Reprinted from materials provided by Mass General.

Iron occurs naturally in the human body. However, in people with Parkinson’s disease it distributes in an unusual way over the brain, according to a new study that has been published in the journal Brain.

Researchers applied a special type of magnetic resonance imaging (MRI) that allowed them to map iron levels across the entire brain. It is the first time that this has been done in Parkinson’s disease.

Iron is indispensable for human metabolism. However, iron is also potentially harmful as it is able to trigger production of reactive molecular species that may cause "oxidative stress" and ultimately damage to neurons.

For the study, the researchers examined the brains of 25 people with Parkinson’s and 50 healthy subjects by using a special MRI technique called QSM, which is the acronym for "quantitative susceptibility mapping".

As with conventional MRI, QSM is non-invasive and relies on a combination of magnetic fields, electromagnetic waves and analysis software to generate pictures of the inside of the human body. However, QSM also exploits raw data (the phase of the MR signal) that is usually discarded in conventional MRI. As a consequence, QSM can probe magnetic susceptibility, a parameter that indicates the presence of metals such as iron.
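The distinction between the conventional magnitude image and the phase information QSM exploits can be sketched with a toy example. This is not an actual QSM reconstruction, which additionally requires steps such as phase unwrapping, background-field removal and dipole inversion; the two-voxel signal below is invented for illustration.

```python
import numpy as np

# A toy complex-valued MR signal: the magnitude carries the conventional
# image contrast, while the phase carries field perturbations caused by
# tissue magnetic susceptibility (e.g., iron deposits).
signal = np.array([1.0 * np.exp(1j * 0.05),   # background tissue
                   0.9 * np.exp(1j * 0.40)])  # iron-rich region

magnitude = np.abs(signal)    # what conventional MRI reconstructs
phase = np.angle(signal)      # the "raw data" QSM additionally exploits

print(magnitude)  # similar values: little contrast between the voxels
print(phase)      # clearly different: reflects the susceptibility shift
```

In this toy case the magnitude barely distinguishes the two voxels, while the phase separates them clearly, which is why retaining the phase lets QSM map iron where conventional contrast cannot.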

Paper: “The whole-brain pattern of magnetic susceptibility perturbations in Parkinson’s disease”

Reprinted from materials provided by DZNE.

Accumulating amounts of amyloid in the brain have been associated with the development of dementia, including Alzheimer’s disease. Now a team of neuroscience and biochemistry researchers has made a novel discovery that illustrates for the first time the difference between amyloid buildup in brain blood vessels and amyloid buildup around brain neurons. Their findings, which may provide a new path to research on Alzheimer’s disease and its cause, were published in Nature Communications.

The researchers mapped out the structural signature of amyloid that accumulates in brain blood vessels and compared it to the known structure of amyloid that accumulates in plaque around brain neurons.

The team found that the subunits of the amyloid that accumulates in vessels line up uniquely and in alternating patterns, a structure nearly opposite to that of the amyloid buildup in plaque around neurons.

They hypothesize that the unique structure of this brain blood vessel amyloid could promote different pathological responses, such as inflammation, which likely contributes differently to cognitive impairment and dementia than neuron amyloid.

Paper: “Cerebral vascular amyloid seeds drive amyloid β-protein fibril assembly with a distinct anti-parallel structure”

Reprinted from materials provided by Stony Brook University.

Three years ago the United States government launched the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative to accelerate the development and application of novel technologies that will give us a better understanding of how the brain works.

Since then, dozens of technology firms, academic institutions, scientists and others have been developing new tools to give researchers unprecedented opportunities to explore how the brain processes, utilizes, stores and retrieves information. But without a coherent strategy to analyze, manage and understand the data generated by these new technologies, advancements in the field will be limited.

For this reason, an international team of interdisciplinary researchers—including mathematicians, computer scientists, physicists and experimental and computational neuroscientists— was assembled to develop a plan for managing, analyzing and sharing neuroscience data. Their recommendations were published in a recent issue of Neuron.

To maximize the return on investments in global neuroscience initiatives, the researchers argue that the international neuroscience community should have an integrated strategy for data management and analysis. This coordination would facilitate the reproducibility of workflows, which then allows researchers to build on each other’s work.

As a first step, the authors recommend that researchers from all facets of neuroscience agree on standard descriptions and file formats for products derived from data analysis and simulations. After that, the researchers should work with computer scientists to develop hardware and software ecosystems for archiving and sharing data.

The authors suggest an ecosystem similar to the one used by the physics community to share data collected by experiments like the Large Hadron Collider (LHC). In this case, each research group has its own local repository of physiological or simulation data that it has collected or generated. But eventually, all of this information should also be included in “meta-repositories” that are accessible to the greater neuroscience community. Files in the “meta-repositories” should be in a common format, and the repositories would ideally be hosted by an open-science supercomputing facility.
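A shared meta-repository of the kind described above would need a common descriptive record for each dataset. The following sketch shows what such a record might look like; all field names and values are hypothetical illustrations, not a standard the paper defines (NWB, i.e. Neurodata Without Borders, is named here only as an example of a community data format).

```python
import json

# Hypothetical metadata record describing one dataset contributed by a
# research group's local repository to a community meta-repository.
record = {
    "dataset_id": "lab42-ephys-2016-001",
    "modality": "extracellular electrophysiology",
    "species": "mouse",
    "file_format": "NWB",  # an example community container format
    "local_repository": "https://example.org/lab42/data",
    "analysis_workflow": "spike-sorting-v2 (open source)",
}

# Serializing to a text format like JSON keeps records searchable and
# exchangeable across institutions and hosting facilities.
print(json.dumps(record, indent=2))
```

Agreeing on such a schema up front is what would let tools at an open-science supercomputing facility index and serve datasets from many labs uniformly.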

Because novel technologies are producing unprecedented amounts of data, the researchers also propose that neuroscientists collaborate with mathematicians to develop new approaches for data analysis and modify existing analysis tools to run on supercomputers. To maximize these collaborations, the analysis tools should be open-source and should integrate with brain-scale simulations, they say.

Paper: “High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination”

Reprinted from materials provided by Berkeley Lab.