Machine learning algorithm to predict mortality in heart failure patients

Researchers at the University of California San Diego developed a machine-learning model by training a boosted decision tree algorithm on de-identified electronic health record data from 5,822 hospitalized or ambulatory patients with heart failure treated at their institution.

The model is based on eight readily available variables: diastolic blood pressure, creatinine, blood urea nitrogen, haemoglobin, white blood cell count, platelets, albumin, and red blood cell distribution width. Using these variables, it was able to predict life expectancy in 88% of the patients.
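The study's actual pipeline and hyperparameters are not reproduced here; purely as an illustration of how a boosted-tree mortality model on these eight variables might be set up, a minimal sketch follows. The file name, column names, and outcome label are hypothetical.

```python
# Illustrative sketch only; not the study's actual code, preprocessing, or
# outcome definition. Column names are hypothetical stand-ins.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

FEATURES = [
    "diastolic_bp", "creatinine", "bun", "hemoglobin",
    "wbc_count", "platelets", "albumin", "rdw",
]

# ehr.csv: hypothetical de-identified extract, one row per patient,
# with a binary 'died' outcome column.
df = pd.read_csv("ehr.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df[FEATURES], df["died"], test_size=0.2, random_state=0
)

model = GradientBoostingClassifier()  # boosted decision trees
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```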

This study is published in the European Journal of Heart Failure.

The tool was additionally tested using data from the University of California, San Francisco, and a database derived from 11 European medical centers.

With advances in machine learning and artificial intelligence, the large amounts of health data available through electronic health records, and growing computing power, we are able to create increasingly accurate risk prediction tools. These tools will likely become commonplace in clinical practice to help with data-based decisions.

Calculator to predict five-year risk of chronic kidney disease

A new risk calculator to predict chronic kidney disease was developed by the Chronic Kidney Disease Prognosis Consortium, a large global collaboration led by researchers at the Johns Hopkins Bloomberg School of Public Health. It uses a mix of variables to accurately predict whether someone is likely to develop chronic kidney disease within five years.

The risk calculator, published in the Journal of the American Medical Association (JAMA), is based on an analysis of clinical data from more than five million people around the world. The underlying risk prediction model uses age, sex, race/ethnicity, eGFR, history of cardiovascular disease, smoking history, hypertension, body mass index, and albuminuria concentration. For participants with diabetes, the models also include diabetes medications, hemoglobin A1c, and the interaction between the two.
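The published CKD-PC equations and coefficients are not reproduced above; the sketch below is only a simplified stand-in showing how a five-year risk model on the listed predictors could be assembled, with hypothetical file and column names and a plain logistic regression in place of the consortium's actual equations.

```python
# Simplified, hypothetical sketch of a five-year CKD risk model using the
# variables listed above. Not the published CKD-PC model or coefficients.
import pandas as pd
from sklearn.linear_model import LogisticRegression

PREDICTORS = [
    "age", "sex", "race_ethnicity", "egfr", "cvd_history",
    "ever_smoker", "hypertension", "bmi", "acr",  # acr = albuminuria
]

cohort = pd.read_csv("cohort.csv")        # hypothetical pooled cohort extract
X = pd.get_dummies(cohort[PREDICTORS])    # encode categorical predictors
y = cohort["incident_ckd_5yr"]            # 1 if CKD developed within 5 years

risk_model = LogisticRegression(max_iter=1000).fit(X, y)
cohort["predicted_5yr_risk"] = risk_model.predict_proba(X)[:, 1]
```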

This calculator is available online at www.ckdpcrisk.org.

Risk prediction tools can help identify high-risk patients who can then be offered interventions that slow or stop disease progression. In the case of kidney disease, progression can be delayed or stopped with treatments that address kidney-harming disorders such as hypertension and diabetes, and by limiting the use of kidney-stressing substances such as certain antibiotics, NSAID painkillers, and imaging contrast agents.

Increased mortality associated with short sleep duration

According to a new study published in the Journal of the American Heart Association, middle-aged adults with high blood pressure, type 2 diabetes, heart disease or stroke may be at higher risk of cancer and early death when they sleep less than six hours per night.

The study included a total of 1,654 adults (aged 20–74 years) from the Penn State Adult Cohort. All adults in this cohort had cardiometabolic risk factors such as stage 2 hypertension or type 2 diabetes. In addition, some had been diagnosed with and treated for heart disease and/or stroke. Participants were studied in the sleep laboratory for one night (1991-1998), and researchers then tracked their cause of death up to the end of 2016.

Statistical analysis showed that participants who slept less than six hours had higher all-cause mortality. They also had higher cerebrovascular- and cardiovascular-related mortality. Another significant finding was increased cancer-related mortality in participants who slept less than six hours.
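The article does not reproduce the study's statistical methods; as a hedged sketch of how such a mortality comparison is commonly run, the example below fits a Cox proportional hazards model comparing short sleepers with longer sleepers. The file and column names are hypothetical, and the real analysis likely adjusted for additional covariates.

```python
# Hypothetical sketch of a survival analysis comparing mortality by sleep
# duration group; not the study's actual model or covariate set.
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("sleep_cohort.csv")
# Expected columns (numerically encoded): follow_up_years, died (0/1),
# short_sleep (1 if < 6 hours in the lab), age, sex.
cph = CoxPHFitter()
cph.fit(
    cohort[["follow_up_years", "died", "short_sleep", "age", "sex"]],
    duration_col="follow_up_years",
    event_col="died",
)
cph.print_summary()  # hazard ratio for short sleepers vs. others
```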

Although it was based on a single-night sleep assessment, this is an important study that shows a relationship between lack of sleep and mortality in patients with cardiometabolic risk factors. Further research is warranted to understand this relationship and to promote adequate sleep duration as an effective risk modifier.

Animal tests show promise for needle-free flu vaccine patch

Researchers from the University of Rochester Medical Center published a study in the Journal of Investigative Dermatology about a technology that could replace needle-based vaccination methods.

In this study, researchers used a synthetic peptide to bind and inhibit the claudin-1 protein. Claudin-1 is essential for skin barrier strength and decreases the permeability of the skin. Notably, eczema patients have significantly reduced claudin-1 and therefore have a leaky skin barrier.

The researchers combined the claudin-1-inhibiting peptide with a recombinant flu vaccine to create a patch. In testing on previously immunized mice, the patch elicited a significant immune reaction. Further testing showed no lasting damage to the skin at the site of the patch.

Although still in the early stages of development, this study shows the potential of needle-free vaccine patches. Needle-free vaccines are promising because they reduce the burden on health care professionals, reduce biohazardous waste, and are generally more appealing to people since no injection is involved.

New retinal biomarker for identifying early Alzheimer’s disease

Researchers at the Complutense University of Madrid (UCM) have identified changes in retinal layer thickness, due to inflammation or thinning, in patients with mild Alzheimer’s disease. These changes, detected with a non-invasive assessment using optical coherence tomography, may be an important biomarker for early diagnosis.

Researchers observed that in some patients diagnosed with Alzheimer’s disease, the retinal layers presented neurodegeneration, whereas in others they presented neuroinflammation, the stage prior to neurodegeneration, a finding which can be used to diagnose the disease before other tests.

The study was conducted with a group of 19 patients selected from 2,124 clinical histories at the San Carlos Hospital Clinic Geriatric Service in Madrid. These patients had very early stage Alzheimer’s disease and did not present any other disease that affected the retina. The study also included a control group comprising 24 volunteers similar in age and other characteristics but without any relevant disease. The results of this investigation have been published in Scientific Reports.

Use of mobile phone games to assess cognitive decline

Researchers from the University of Kent have shown that mobile phone games could be used as a new tool for identifying early signs of cognitive decline and, thus, a possible risk of developing dementia.

Investigating the link between patterns of tap, swipe and rotational gestures during mobile game play and users’ cognitive performance, the research shows that the speed, length and intensity of these motions correlate with brain function. In particular, the performance of these gestures reveals key information about players’ visual search abilities, mental flexibility and response inhibition, all of which offer clues about an individual’s overall brain health.

The results of the study, ‘Exploring the Touch and Motion Features in Game-Based Cognitive Assessments’, will be presented at the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp). Twenty-one participants were included in the study. All participants completed standard paper-based cognitive assessment tests, followed by 10-minute sessions of playing Tetris, Candy Crush Saga and Fruit Ninja over two separate periods, two weeks apart.

Using the sensors built into the mobile phones to collect data, the team showed how users interacted with the games and illustrated a clear link between the subjects’ touch gestures, or taps and swipes, their rotational gestures and their levels of cognitive performance. The study revealed the participants’ ability to perform visuo-spatial and visual search tasks, as well as testing their memory, mental flexibility and attention span.
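The study's feature set and statistics are not reproduced above; the snippet below is only an illustrative sketch of the kind of analysis described, correlating per-player gesture features with cognitive test scores. The file and column names are hypothetical.

```python
# Hypothetical sketch: correlate gesture features extracted from phone
# sensor logs with scores from standard cognitive tests.
import pandas as pd
from scipy.stats import pearsonr

data = pd.read_csv("gameplay_features.csv")   # one row per participant
gesture_features = ["swipe_speed", "swipe_length", "tap_intensity", "rotation_rate"]

for feature in gesture_features:
    r, p = pearsonr(data[feature], data["cognitive_score"])
    print(f"{feature}: r = {r:.2f}, p = {p:.3f}")
```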

The research team concluded that off-the-shelf, popular mobile games can provide an effective measure of brain function to spot changes in motor abilities which are commonly seen in patients with Alzheimer’s disease, stroke, traumatic brain injury, schizophrenia and obsessive-compulsive disorder. Early detection of the signs of cognitive decline is crucial to effective treatment and prevention, as well as identification of individuals at risk of brain disease.

Macrophage protein shown to have major role in fracture healing

Duke Health researchers previously showed that introducing bone marrow stem cells to a bone injury can expedite healing, but the exact process was unclear. Now the researchers have found that macrophages, and a protein they secrete called low-density lipoprotein receptor-related protein 1 (Lrp1), can have a rejuvenating effect on tissue. These findings are published in the journal Nature Communications.

After tissue injury, the body dispatches macrophages to areas of trauma, where they undergo functional changes to coordinate tissue repair. During fracture healing, macrophages are found at the fracture site, and when they are depleted, fractures do not heal effectively. Macrophage populations and characteristics can change with aging. In this research, scientists found that Lrp1, which is mostly found in young macrophages, may be responsible for the bone-healing effects.

Finding ways to speed bone repair is a public health priority that could save both lives and health care costs. The Centers for Disease Control and Prevention reports that more than 800,000 patients a year are hospitalized because of fall injuries, including broken hips, and these hospitalizations cost an average of $30,000.

Citation: Linda Vi, Gurpreet S. Baht, Erik J. Soderblom, Heather Whetstone, Qingxia Wei, Bridgette Furman, Vijitha Puviindran, Puviindran Nadesan, Matthew Foster, Raymond Poon, James P. White, Yasuhito Yahara, Adeline Ng, Tomasa Barrientos, Marc Grynpas, M. Arthur Mosely, and Benjamin A. Alman. “Macrophage Cells Secrete Factors including LRP1 That Orchestrate the Rejuvenation of Bone Repair in Mice.” Nature Communications 9, no. 1 (2018). doi:10.1038/s41467-018-07666-0.

Scientists identify neural pathways behind visual perceptual decision-making

Scientists at the National Eye Institute (NEI) have found that neurons in the superior colliculus are key players in allowing us to detect visual objects and events. This structure doesn’t help us recognize what the specific object or event is; instead, it’s the part of the brain that decides something is there at all. 
In this study, researchers used an “accumulator threshold model” to study how neuronal activity in the superior colliculus relates to behavior. By comparing brain activity recorded from the right and left superior colliculi at the same time, the researchers were able to predict whether an animal was seeing an event. The findings were published in the journal Nature Neuroscience.
This new study shows that the process of deciding that an object is present or that an event has occurred in the visual field is handled by the superior colliculus. The process of deciding to take an action (a behavior, like avoiding a chair) based on information received from the senses (like visual information) is known as “perceptual decision-making”. Most research into perceptual decision-making, in humans, non-human primates, or other animals, uses mathematical models to describe the relationship between a stimulus shown to an animal (like moving dots, changes in color, or the appearance of objects) and the animal’s behavior. But because visual information processing in the brain is highly complex, scientists have struggled to demonstrate that these mathematical models accurately reflect a biological process happening in the brain during decision-making.
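As a toy illustration of what an accumulator threshold model means, the sketch below sums noisy "evidence" over time and reports a detection once a running total crosses a threshold. The signal here is only a stand-in for the difference in activity between the two superior colliculi, and all parameter values are arbitrary rather than taken from the study.

```python
# Minimal accumulator-to-threshold sketch: noisy evidence is integrated over
# time; a "yes, something happened" decision fires when it crosses a threshold.
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps, threshold = 0.01, 500, 5.0

def detect_event(signal_strength):
    """Accumulate noisy evidence; return detection time in seconds, or None."""
    evidence = 0.0
    for t in range(n_steps):
        evidence += signal_strength * dt + rng.normal(scale=np.sqrt(dt))
        if evidence >= threshold:
            return t * dt
    return None  # threshold never reached -> event missed

print("event trial:   ", detect_event(signal_strength=2.0))
print("no-event trial:", detect_event(signal_strength=0.0))
```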
Citation: James P. Herman, Leor N. Katz, and Richard J. Krauzlis. “Midbrain Activity Can Explain Perceptual Decisions during an Attention Task.” Nature Neuroscience 21, no. 12 (2018): 1651-655. doi:10.1038/s41593-018-0271-5.

Early diagnosis of Alzheimer’s disease using artificial intelligence

According to a study published in the journal Radiology, artificial intelligence (AI) technology can predict the development of Alzheimer’s disease early.

Early diagnosis of Alzheimer’s is important as treatments and interventions are more effective early in the course of the disease. However, early diagnosis has proven to be challenging. Research has linked the disease process to changes in metabolism, as shown by glucose uptake in certain regions of the brain, but these changes can be difficult to recognize.


“Differences in the pattern of glucose uptake in the brain are very subtle and diffuse,” said study co-author Jae Ho Sohn, M.D., from the Radiology & Biomedical Imaging Department at the University of California in San Francisco (UCSF). “People are good at finding specific biomarkers of disease, but metabolic changes represent a more global and subtle process.”

The researchers trained the deep learning algorithm on a special imaging technology known as 18-F-fluorodeoxyglucose positron emission tomography (FDG-PET). In an FDG-PET scan, FDG, a radioactive glucose compound, is injected into the blood. PET scans can then measure the uptake of FDG in brain cells, an indicator of metabolic activity.

The researchers had access to data from the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a major multi-site study focused on clinical trials to improve the prevention and treatment of this disease. The ADNI dataset included more than 2,100 FDG-PET brain images from 1,002 patients. Researchers trained the deep learning algorithm on 90 percent of the dataset and then tested it on the remaining 10 percent of the dataset. Through deep learning, the algorithm was able to teach itself metabolic patterns that corresponded to Alzheimer’s disease.
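The study's actual network architecture and preprocessing are not described above; the sketch below only illustrates the 90/10 train/test split and a generic training loop, using a tiny placeholder 3D convolutional network and synthetic stand-in volumes rather than real FDG-PET data.

```python
# Placeholder sketch of a 90/10 split and training loop on volumetric images;
# not the study's architecture, data, or preprocessing.
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader, random_split

volumes = torch.randn(100, 1, 32, 32, 32)        # synthetic stand-in volumes
labels = torch.randint(0, 2, (100,)).float()     # 1 = progressed to Alzheimer's
dataset = TensorDataset(volumes, labels)
train_set, test_set = random_split(dataset, [90, 10])  # 90% train / 10% test

model = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool3d(1), nn.Flatten(), nn.Linear(8, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):
    for x, y in DataLoader(train_set, batch_size=10, shuffle=True):
        optimizer.zero_grad()
        loss = loss_fn(model(x).squeeze(1), y)
        loss.backward()
        optimizer.step()
```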

Finally, the researchers tested the algorithm on an independent set of 40 imaging exams from 40 patients that it had never studied. The algorithm achieved 100 percent sensitivity at detecting the disease an average of more than six years prior to the final diagnosis.

“We were very pleased with the algorithm’s performance,” Dr. Sohn said. “It was able to predict every single case that advanced to Alzheimer’s disease.”

Although he cautioned that their independent test set was small and needs further validation with a larger multi-institutional prospective study, Dr. Sohn said that the algorithm could be a useful tool to complement the work of radiologists, especially in conjunction with other biochemical and imaging tests, in providing an opportunity for early therapeutic intervention.

Future research directions include training the deep learning algorithm to look for patterns associated with the accumulation of beta-amyloid and tau proteins, abnormal protein clumps and tangles in the brain that are markers specific to Alzheimer’s disease, according to UCSF’s Youngho Seo, Ph.D., who served as one of the faculty advisors of the study.

Citation: Yiming Ding, Jae Ho Sohn, Michael G. Kawczynski, Hari Trivedi, Roy Harnish, Nathaniel W. Jenkins, Dmytro Lituiev, Timothy P. Copeland, Mariam S. Aboian, Carina Mari Aparici, Spencer C. Behr, Robert R. Flavell, Shih-Ying Huang, Kelly A. Zalocusky, Lorenzo Nardo, Youngho Seo, Randall A. Hawkins, Miguel Hernandez Pampaloni, Dexter Hadley, and Benjamin L. Franc. “A Deep Learning Model to Predict a Diagnosis of Alzheimer Disease by Using 18F-FDG PET of the Brain.” Radiology, 2018, 180958.
doi:10.1148/radiol.2018180958.


Keep up with your weight loss goals with daily weighing

According to research presented at the American Heart Association’s 2018 scientific meeting, daily weighing may help with weight loss goals. People who rarely or never weighed themselves were less likely to lose weight than those who weighed themselves often.

Researchers examined the self-weighing patterns of 1,042 adults (78 percent male, 90 percent white, average age 47) and whether there were differences in weight change by these self-weighing patterns over 12 months. They analyzed remotely transmitted self-weighing data from Health eHeart, an ongoing prospective e-cohort study. The participants weighed themselves at home as they normally would, without interventions, guidance or weight-loss incentives from researchers.

Researchers identified several categories of self-weighing adults, from those who weighed themselves daily or almost daily to those who never used at-home scales.

They found that people who never weighed themselves or only weighed themselves once a week did not lose weight in the following year. Those who weighed themselves six to seven times a week had a significant weight loss (1.7 percent) over 12 months.

Citation: Daily weighing may be key to losing weight
American Heart Association Meeting, Poster Presentation Sa2394 – Session: NR.APS.01
Yaguang Zheng, Ph.D., M.S.N., R.N., University of Pittsburgh School of Nursing, Pittsburgh, PA
