Blog

Small dietary changes reduce cardiovascular disease risk by more than a quarter


Exchanging a few regularly consumed commercial food items for versions with improved fat quality reduces total and LDL cholesterol. A new double-blind randomized controlled trial published in the British Journal of Nutrition suggests an almost 30% reduction in cardiovascular disease risk.

Polyunsaturated fat

Exchanging a few regularly consumed food items in the daily diet for versions with improved fat quality for eight weeks reduces serum total cholesterol and LDL-cholesterol by 9 % and 11 %, respectively. This change corresponds to a 27 % reduction in cardiovascular disease risk. In the trial, saturated fat was replaced by polyunsaturated fat in key food items such as spread on bread, fat for cooking, cheese, bread and cereals.
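As a back-of-the-envelope illustration of how an LDL-cholesterol change maps onto a risk estimate, the short sketch below applies the commonly cited approximation that each 1 % reduction in LDL-C corresponds to roughly a 2-3 % reduction in coronary risk. The 2.5-per-percent factor is an assumption for illustration only; it is not taken from the trial itself.

```python
# Back-of-the-envelope sketch: translating an LDL-C reduction into a rough
# relative risk reduction. The 2.5 factor is an assumed rule of thumb,
# not a figure reported by the trial.

def approx_risk_reduction(ldl_reduction_pct: float, factor: float = 2.5) -> float:
    """Approximate % reduction in coronary risk for a given % LDL-C reduction."""
    return ldl_reduction_pct * factor

ldl_drop = 11.0  # % reduction in LDL-C reported in the trial
print(f"~{approx_risk_reduction(ldl_drop):.0f}% estimated risk reduction")  # ~28%, close to the reported 27%
```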

Cardiovascular disease

Cardiovascular disease (CVD) remains the largest contributor to the global burden of disease. Even though CVD mortality has fallen substantially over the last 30 years, new reports show an increase in acute myocardial infarction among younger people in Norway, and similar observations have been reported from other countries.

Cholesterol

Elevated plasma low-density lipoprotein cholesterol (LDL-C) is an established risk factor for CVD, and dietary fatty acids play a significant role in modulating plasma LDL-C and thereby influencing CVD risk. In particular, there is strong evidence that replacing saturated fatty acids (SFA) with polyunsaturated fatty acids (PUFA) reduces the risk of CVD. However, controversy still exists about the beneficial versus potentially harmful effects of n-6 PUFA, since n-6 PUFA has been suggested to promote inflammation.

The Nordic diet

Adherence to a healthy Nordic diet based on the Nordic nutrition recommendations has previously been shown to have beneficial effects on blood lipids among subjects at risk of CVD. However, the extent of dietary change needed to achieve these effects is less well explored. To increase compliance with dietary fat recommendations in the general population, it is important that an improved lipid profile can be achieved with relatively small dietary changes.

Scientific Abstract

The aim of the study was to investigate the effects of exchanging a few commercially available, regularly consumed key food items (e.g. spread on bread, fat for cooking, cheese, bread and cereals) for versions with improved fat quality on total cholesterol, LDL-C and inflammatory markers in a double-blind, randomized controlled trial.

In total, 115 moderately hypercholesterolemic, non-statin-treated adults (25-70 y) were randomly assigned to an experimental diet group (Ex-diet group) or a control diet group (C-diet group) for eight weeks. Both groups received commercially available food items, but with different fatty acid compositions: in the Ex-diet group, saturated fatty acids were replaced mostly with n-6 polyunsaturated fatty acids.

In the Ex-diet group, serum total cholesterol (P<0.001) and LDL-C (P<0.001) were reduced after eight weeks, compared to the C-diet group. The difference in change between the two groups at the end of the study was -9 % and -11 % in total cholesterol and LDL-C, respectively. No difference in change in plasma levels of inflammatory markers was observed between the groups.

In conclusion, exchanging a few regularly consumed food items for versions with improved fat quality reduces total cholesterol, with no negative effect on levels of inflammatory markers. This shows that exchanging a few commercially available food items is easy and manageable, and leads to a clinically relevant cholesterol reduction that may affect future CVD risk.

Read more

Live Longer


There’s a multi-billion-dollar industry devoted to products that fight signs of aging, but moisturizers only go skin deep. Aging occurs deeper — at a cellular level — and scientists have found that eating less can slow this cellular process.

Recent research published in Molecular & Cellular Proteomics offers one glimpse into how cutting calories impacts aging inside a cell. The researchers found that when ribosomes — the cell’s protein makers — slow down, the aging process slows too. The decreased speed lowers production but gives ribosomes extra time to repair themselves.

“The ribosome is a very complex machine, sort of like your car, and it periodically needs maintenance to replace the parts that wear out the fastest,” said BYU biochemistry professor and senior author John Price. “When tires wear out, you don’t throw the whole car away and buy new ones. It’s cheaper to replace the tires.”

So what causes ribosome production to slow down in the first place? At least for mice: reduced calorie consumption.

Price and his fellow researchers observed two groups of mice. One group had unlimited access to food, while the other consumed 35 percent fewer calories but still received all the nutrients necessary for survival.

“When you restrict calorie consumption, there’s almost a linear increase in lifespan,” Price said. “We inferred that the restriction caused real biochemical changes that slowed down the rate of aging.”

Price’s team isn’t the first to make the connection between cut calories and lifespan, but they were the first to show that general protein synthesis slows down and to recognize the ribosome’s role in facilitating those youth-extending biochemical changes.

“The calorie-restricted mice are more energetic and suffered fewer diseases,” Price said. “And it’s not just that they’re living longer, but because they’re better at maintaining their bodies, they’re younger for longer as well.”

Ribosomes, like cars, are expensive and important — they use 10–20 percent of the cell’s total energy to build all the proteins necessary for the cell to operate. Because of this, it’s impractical to destroy an entire ribosome when it starts to malfunction. But repairing individual parts of the ribosome on a regular basis enables ribosomes to continue producing high-quality proteins for longer than they would otherwise. This top-quality production in turn keeps cells and the entire body functioning well.

Despite this study’s observed connection between consuming fewer calories and improved lifespan, Price assured that people shouldn’t start counting calories and expect to stay forever young. Calorie restriction has not been tested in humans as an anti-aging strategy, and the essential message is understanding the importance of taking care of our bodies.

“Food isn’t just material to be burned — it’s a signal that tells our body and cells how to respond,” Price said. “We’re getting down to the mechanisms of aging, which may help us make more educated decisions about what we eat.”

Read more

Health Factory® provides accurate caloric intake amounts and consistent quality products for your weight loss program


Increased portion sizes in Americans’ diets are widely recognized as a contributor to the obesity epidemic, and now new research published in Obesity, the scientific journal of The Obesity Society, examines the effect of prepackaged, portion-controlled meals on weight loss. The researchers found that when combined with behavioral counseling as part of a complete weight-loss intervention, a meal plan incorporating portion-controlled, prepackaged, frozen lunch and dinner entrées can promote greater weight loss than a self-selected diet.

“Participants who were prescribed twice-daily prepackaged meals lost about eight percent of their initial weight, compared to participants in the control group — who could select their own diets — who only lost about six percent,” said Cheryl Rock, PhD, RD, lead researcher and Professor of Family Medicine and Public Health at the University of California San Diego School of Medicine. “What’s more, our study found that food satisfaction was comparable among all groups, which is a critical factor that may determine long-term usefulness of this strategy. We believe that removing the complexity of planning and preparing low-calorie meals was beneficial to the participants in the intervention.”

To conduct the study, Dr. Rock and colleagues assigned 183 study participants to three groups: one that was prescribed two prepackaged meals per day, one that was prescribed two prepackaged meals per day that were higher in protein (>25% energy), and the control group that was allowed to select their own meals. All participants met with a dietitian for a one- to two-hour personalized counseling session in which they determined their own weight-loss goals, received physical activity recommendations and learned behavioral strategies to help them achieve their goals.

After three months, 74% of the participants eating the prepackaged foods had achieved a 5% weight loss, whereas only 53% of the control group achieved that milestone. The greater weight loss also led to a decrease in other cardiovascular disease risk factors, such as total cholesterol and LDL cholesterol, for the participants consuming the prepackaged meals. Additionally, meal satisfaction ratings were similar among all groups, and the groups that consumed the prepackaged meals expressed greater confidence in their ability to follow a meal plan long-term.
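For readers who want to reproduce the headline figures, the sketch below shows the simple percent-of-initial-weight calculation behind statements such as “lost about eight percent of their initial weight” and the 5 % milestone. The body weights used are hypothetical example values, not data from the study.

```python
# Minimal sketch of the percent-of-initial-weight calculation used to report
# weight-loss outcomes. The weights below are hypothetical example values.

def percent_weight_loss(initial_kg: float, final_kg: float) -> float:
    return (initial_kg - final_kg) / initial_kg * 100

initial, final = 95.0, 87.4                    # hypothetical participant
loss = percent_weight_loss(initial, final)
print(f"{loss:.1f}% of initial weight lost")   # 8.0%
print("Reached the 5% milestone:", loss >= 5)  # True
```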

“Reduction in energy intake is a key factor to weight loss, but it can be difficult for most individuals with overweight or obesity to put into practice,” said Martin Binks, PhD, Associate Professor of Nutritional Sciences at Texas Tech University and spokesperson for The Obesity Society. “This type of strategy is a step toward implementing effective, evidence-based solutions to obesity.”

The biggest limitation to the study is the lack of detailed dietary intake data. Longer term studies that carefully measure adherence to this type of program would be beneficial.

 

To find out more, Shop here

Read more

Gray matters


General human intelligence appears to be based on the volume of gray matter tissue in certain regions of the brain, UC Irvine College of Medicine researchers have found in the most comprehensive structural brain-scan study of intelligence to date.

The study also discovered that because these regions related to intelligence are located throughout the brain, a single “intelligence center,” such as the frontal lobe, is unlikely.

Dr. Richard Haier, professor of psychology in the Department of Pediatrics and long-time human intelligence researcher, and colleagues at UCI and the University of New Mexico used MRI to obtain structural images of the brain in 47 normal adults who also took standard intelligence quotient tests. The researchers used a technique called voxel-based morphometry to determine gray matter volume throughout the brain, which they correlated with IQ scores. Study results appear in the online version of NeuroImage.
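The analytic core of the study, correlating regional gray matter volume with IQ across subjects, can be illustrated with a minimal sketch like the one below. This is not the authors’ voxel-based morphometry pipeline; the region names, volumes and IQ scores are invented purely for illustration.

```python
# Minimal sketch: correlating per-region gray matter volumes with IQ across
# subjects, in the spirit of (but far simpler than) voxel-based morphometry.
# All data below are invented for illustration.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects = 47
iq = rng.normal(100, 15, n_subjects)

regions = {
    "frontal": iq * 0.8 + rng.normal(0, 10, n_subjects),   # made up, IQ-related
    "parietal": iq * 0.5 + rng.normal(0, 10, n_subjects),  # made up, IQ-related
    "occipital": rng.normal(100, 10, n_subjects),          # made up, unrelated
}

for name, volume in regions.items():
    r, p = pearsonr(volume, iq)
    print(f"{name:9s}  r = {r:+.2f}  p = {p:.3g}")
```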

Previous research had shown that larger brains are weakly related to higher IQ, but this study is the first to demonstrate that gray matter in specific regions in the brain is more related to IQ than is overall size. Multiple brain areas are related to IQ, the UCI and UNM researchers have found, and various combinations of these areas can similarly account for IQ scores. Therefore, it is likely that a person’s mental strengths and weaknesses depend in large part on the individual pattern of gray matter across his or her brain.

“This may be why one person is quite good at mathematics and not so good at spelling, and another person, with the same IQ, has the opposite pattern of abilities,” Haier said.

While gray matter amounts are vital to intelligence levels, the researchers were surprised to find that only about 6 percent of all the gray matter in the brain appears related to IQ.

“There is a constant cascade of information being processed in the entire brain, but intelligence seems related to an efficient use of relatively few structures, where the more gray matter the better,” Haier said. “In addition, these structures that are important for intelligence are also implicated in memory, attention and language.”

The findings also suggest that the brain areas where gray matter is related to IQ show some differences between young-adult and middle-aged subjects. In middle age, more of the frontal and parietal lobes are related to IQ; less frontal and more temporal areas are related to IQ in the younger adults.

The research does not address why some people have more gray matter in some brain areas than other people, although previous research has shown the regional distribution of gray matter in humans is highly heritable. Haier and his colleagues are currently evaluating the MRI data to see if there are gender differences in IQ patterns.

Read more

Degeneration of knee cartilage in overweight people


Osteoarthritis is a degenerative disease in which the articular cartilage protecting the joint starts to degenerate and wear away over time. The main risk factors for osteoarthritis are advanced age and excess body weight, which places significant stress on the knee joint. Current imaging methods such as MRI and X-ray provide information about the thickness and composition of the cartilage, but they do not provide a quantitative estimate of the patient-specific risk of osteoarthritis or its progression.

A recent study from the University of Eastern Finland, published in Nature, developed and validated a novel computational modelling method for assessing the patient-specific progression of osteoarthritis in the knee joint using MRI data. The research group comprises researchers from the University of Eastern Finland and Lund University.

“The method we have developed is based on stresses experienced by the knee joint during walking, and these were simulated on a computer. Our idea was that walking-induced cumulative stresses that exceed a certain threshold will cause local degeneration in the articular cartilage of the knee,” says Postdoctoral Researcher Mika Mononen from the University of Eastern Finland’s Department of Applied Physics.
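The rule Mononen describes, that cartilage degenerates locally wherever walking-induced cumulative stress exceeds a threshold, can be sketched as a very simple update loop. The threshold, stress values and wear rate below are arbitrary illustrative numbers, not parameters from the published model.

```python
# Toy sketch of a threshold-based cumulative-stress degeneration rule.
# The stresses, threshold and wear rate are arbitrary illustration values,
# not parameters from the published knee model.
import numpy as np

threshold = 5.0      # assumed stress threshold (arbitrary units)
yearly_wear = 0.05   # assumed fractional thickness loss per year above threshold

thickness = np.full(10, 2.5)                # cartilage thickness at 10 sites (mm)
peak_stress = np.linspace(3.0, 7.0, 10)     # simulated walking stress per site

for year in range(4):                       # four-year follow-up horizon
    over = peak_stress > threshold          # sites where cumulative stress is excessive
    thickness[over] *= 1 - yearly_wear

print(np.round(thickness, 2))  # thinning appears only at the over-threshold sites
```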

The patient-specific estimates of the progression of osteoarthritis obtained by computer modelling were validated against four-year follow-up data from X-ray measurements, in which the thickness of the articular cartilage in the knee was evaluated by using the Kellgren-Lawrence method. For the validation of the model, two patient groups were established: the normal weight group and the overweight group.

In the normal weight group, the thickness of healthy cartilage did not change over the four-year follow-up, whereas significant degeneration was observed in the overweight group.

“The study shows that this new method, which is based on computer modelling, was able to predict similar changes in the articular cartilage of the knee as experimental follow-up data,” Mononen says.

In the future, the method can serve as a new tool for making patient-specific prognoses on the progression of osteoarthritis. Furthermore, the method can be used to assess the patient-specific effects of excess weight on the future health of the knee joint, as well as the success of clinical treatments such as meniscectomy, a widely used surgical procedure.

Read more

Diet a driver of evolution?


Until some 40,000 years ago, early Homo sapiens, the ancestors of modern humans, shared the planet with Neanderthals, a close, heavy-set relative that dwelled almost exclusively in Ice-Age Europe. Neanderthals were similar to Homo sapiens, with whom they sometimes mated — but they were different, too. Among these many differences, Neanderthals were shorter and stockier, with wider pelvises and rib-cages than their modern human counterparts.

But what accounted for these anatomical differences? A new Tel Aviv University study finds that the Ice-Age diet — a high-protein intake of large animals — triggered physical changes in Neanderthals, namely a larger ribcage and a wider pelvis.

According to the research, the bell-shaped Neanderthal rib-cage or thorax had to evolve to accommodate a larger liver, the organ responsible for metabolizing great quantities of protein into energy. This heightened metabolism also required an expanded renal system (enlarged bladder and kidneys) to remove large amounts of toxic urea, possibly resulting in a wide Neanderthal pelvis.

Seeing evolution from a new angle

“The anatomical differences between the thoraxes and pelvises of Homo sapiens and Neanderthals have been well-known for many years, but now we’re approaching it from a new angle — diet,” said Prof. Avi Gopher.

“During harsh Ice-Age winters, carbohydrates were scarce and fat was in limited supply. But large game, the typical prey of the Neanderthal, thrived,” said Ben-Dor. “This situation triggered an evolutionary adaptation to a high-protein diet — an enlarged liver, expanded renal system and their corresponding morphological manifestations. All of these contributed to the Neanderthal evolutionary process.”

“In a 2011 paper, which dealt with the demise of Homo erectus in the Levant, we had already tapped into the notion that diet played a major role in human evolution,” said Prof. Barkai. “We argued then that high fat consumption was one of the most important solutions to the predicament presented by human evolution. Humans are limited in the amount of protein they are able to turn into energy — protein provides just 30 percent of their overall diet. The solution, therefore, was to consume more fat and more carbohydrates when they were seasonally available.

“We found that, in the case of the Neanderthals, an acute shortage of carbohydrates and a limited availability of fat caused their biological adaptation to a high-protein diet.”

The proof in the dietary pudding

Numerous animal experiments have already demonstrated that a high-protein diet is likely to produce enlarged livers and kidneys. “Early indigenous Arctic populations who primarily ate meat also displayed enlarged livers and the tendency to drink a lot of water, a sign of increased renal activity,” said Ben-Dor.

According to the researchers, the total dependence of Neanderthals on large animals to answer their fat and protein needs may provide a clue to their eventual extinction, which took place at the same time as the beginning of the demise of giant animals or “Megafauna” in Europe some 50,000 years ago. The team is now researching this subject.

Prof. Gopher, Prof. Ran Barkai and PhD candidate Miki Ben-Dor, all of TAU’s Department of Archaeology and Ancient Near Eastern Cultures, co-authored the study, which was recently published in the American Journal of Physical Anthropology.

Read more

Better sleep with a high-protein diet


Overweight and obese adults who are losing weight with a high-protein diet are more likely to sleep better, according to new research from Purdue University.

“Most research looks at the effects of sleep on diet and weight control, and our research flipped that question to ask what are the effects of weight loss and diet — specifically the amount of protein — on sleep,” said Wayne Campbell, a professor of nutrition science. “We found that while consuming a lower calorie diet with a higher amount of protein, sleep quality improves for middle-age adults. This sleep quality is better compared to those who lost the same amount of weight while consuming a normal amount of protein.”

These findings are published in the American Journal of Clinical Nutrition, which is affiliated with the American Society for Nutrition. The research was funded by Beef Checkoff, National Pork Board, National Dairy Council, Purdue Ingestive Behavior Research Center and National Institutes of Health.

A pilot study found that in 14 participants, consuming more dietary protein resulted in better sleep after four weeks of weight loss. Then, in the main study, 44 overweight or obese participants were assigned to consume either a normal-protein or a higher-protein weight-loss diet. After three weeks of adapting to the diet, the groups consumed either 0.8 or 1.5 grams of protein for each kilogram of body weight daily for 16 weeks. The participants completed a survey to rate the quality of their sleep every month throughout the study. Those who consumed more protein while losing weight reported an improvement in sleep quality after three and four months of dietary intervention.

A dietitian designed a diet that met each study participant’s daily energy need, with 750 calories per day trimmed from fats and carbohydrates, while the protein amount was maintained according to whether they were in the higher- or normal-protein group. The protein sources used in the two studies included beef, pork, soy, legumes and milk protein.
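The prescription described above (daily energy need minus a 750-calorie deficit taken from fats and carbohydrates, with protein held at 0.8 or 1.5 g per kg of body weight) is easy to express as a small calculation. The body weight and energy need below are hypothetical example values.

```python
# Minimal sketch of the diet prescription described above. The body weight and
# estimated daily energy need are hypothetical example values.

def protein_target_g(body_weight_kg: float, higher_protein: bool) -> float:
    """Daily protein in grams: 1.5 g/kg (higher-protein) or 0.8 g/kg (normal)."""
    return body_weight_kg * (1.5 if higher_protein else 0.8)

def prescribed_energy_kcal(daily_need_kcal: float, deficit_kcal: float = 750) -> float:
    """Daily energy target: estimated need minus the deficit from fats and carbs."""
    return daily_need_kcal - deficit_kcal

weight_kg, energy_need = 90.0, 2400.0                     # hypothetical participant
print(protein_target_g(weight_kg, higher_protein=True))   # 135.0 g/day
print(protein_target_g(weight_kg, higher_protein=False))  # 72.0 g/day
print(prescribed_energy_kcal(energy_need))                # 1650.0 kcal/day
```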

“Short sleep duration and compromised sleep quality frequently lead to metabolic and cardiovascular diseases and premature death,” said Jing Zhou, a doctoral student in nutrition science and the study’s first author. “Given the high prevalence of sleep problems it’s important to know how changes to diet and lifestyle can help improve sleep.”

Campbell’s lab also has studied how dietary protein quantity, sources and patterns affect appetite, body weight and body composition.

“This research adds sleep quality to the growing list of positive outcomes of higher-protein intake while losing weight, and those other outcomes include promoting body fat loss, retention of lean body mass and improvements in blood pressure,” Campbell said. “Sleep is recognized as a very important modifier of a person’s health, and our research is the first to address the question of how a sustained dietary pattern influences sleep. We’ve shown an improvement in subjective sleep quality after higher dietary protein intake during weight loss.”

Read more

Cardiac Health


A new study analyzed heart disease risk factors among more than 3,900 patients who were treated for ST-elevation myocardial infarction, or STEMI, the most severe and deadly type of heart attack, at Cleveland Clinic between 1995 and 2014.

“On the whole, the medical community has done an outstanding job of improving treatments for heart disease, but this study shows that we have to do better on the prevention side,” said Samir Kapadia, M.D., professor of medicine and section head for interventional cardiology at Cleveland Clinic and the study’s primary investigator. “When people come for routine checkups, it is critical to stress the importance of reducing risk factors through weight reduction, eating a healthy diet and being physically active.”

A STEMI heart attack results when one of the heart’s main arteries becomes completely blocked by plaque, stopping the flow of blood. Immediate medical attention can increase the chances of survival, but STEMI carries a high risk of death and disability.

Many factors are known to increase a person’s heart attack risk. While some, such as age and family history, are beyond the individual’s control, many risk factors can be reduced through lifestyle choices, such as exercising more, quitting smoking and adopting a heart-healthy diet.

The researchers divided the records of Cleveland Clinic’s STEMI patients from 1995 to 2014 into four quartiles, each representing a span of five years. Analyzing the baseline risk factors and health conditions of patients in each grouping, they found the average age of STEMI patients decreased from 64 to 60, and the prevalence of obesity increased from 31 to 40 percent between the first five-year span and the last five-year span. The proportion of patients with diabetes increased from 24 to 31 percent, the proportion with high blood pressure grew from 55 to 77 percent, and the proportion with chronic obstructive pulmonary disease rose from 5 to 12 percent over the same period. All changes were statistically significant.
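As an illustration of how a change in risk-factor prevalence between the first and last five-year span could be checked for statistical significance, the sketch below runs a simple two-proportion z-test on the reported obesity figures (31 % vs 40 %). The per-quartile sample size of roughly 975 patients is an assumption (about a quarter of the ~3,900 patients), and the study’s actual statistical methods may well differ.

```python
# Hedged sketch: a two-proportion z-test on the reported change in obesity
# prevalence (31% -> 40%). The per-quartile n of ~975 is an assumption; the
# study's actual statistical methodology may differ.
from math import sqrt
from scipy.stats import norm

n1 = n2 = 975            # assumed patients per five-year span
p1, p2 = 0.31, 0.40      # reported obesity prevalence, first vs last span
x1, x2 = p1 * n1, p2 * n2

p_pool = (x1 + x2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
p_value = 2 * norm.sf(abs(z))  # two-sided p-value

print(f"z = {z:.2f}, p = {p_value:.2g}")  # a small p-value is consistent with a significant change
```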

One of the most striking findings, according to the study authors, was the change in smoking rates, which increased from 28 to 46 percent. This runs counter to national trends, which reflect an overall decline in smoking rates over the past 20 years. All of the other risk factor trends seen in the Cleveland Clinic study were in line with national trends.

The study also revealed a significant increase in the proportion of patients who have three or more major risk factors, which grew from 65 to 85 percent. Kapadia said the findings carry strong messages for both the medical community and the general public.

“Prevention must be kept in the forefront of primary care,” Kapadia said. “Cardiac health is not just dependent on the cardiologist. The primary care physicians and the patient need to take ownership of this problem.”

For patients, taking ownership means adopting a heart-healthy lifestyle early. “Don’t wait until you have a diagnosed heart problem to start taking care of yourself and paying attention to your lifestyle and dietary choices. You should be working hard to avoid developing heart disease in the first place,” Kapadia said.

Read more

Simplification is not always best


It was a startling realization for University of Saskatchewan accounting professor Fred Phillips: a recent study he conducted shows that making accounting problems simple does not help students as much as making those same problems difficult.

“When I first started teaching, I thought my role as a teacher was to take difficult topics and make them easy,” said Phillips, who has been teaching in the Edwards School of Business for the past 20 years. “While there is some immediate value in that, it is fleeting — it degrades in memory over time.”

By making students struggle with problems — introducing designed difficulty into problem solving — Phillips has discovered that students fare better on topics over the long term.

“When students have to really think and evaluate what they have to do, this desirable difficulty contributes to meaningful learning,” explained Phillips, a recipient of the 3M National Teaching Fellowship, the highest teaching honour in Canada.

To gain a better understanding of this concept, Phillips recruited 170 business students to take part in the study outside of class.

One set of students was given a series of accounting problems in successive order, each concept building on the next: essentially they learned “A,” then “B,” then “C” in a grouped pattern (think practicing a sequence of problems as AAABBBCCC).

The other group received interleaved problems where A, B and C were presented in a non-grouped order (ABCABCABC). This group did not practice A, B or C in successive order and students took longer to solve the problems.
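The two practice schedules compared in the study, blocked (AAABBBCCC) and interleaved (ABCABCABC), are easy to generate programmatically. The sketch below is a simple illustration of the two orderings, not the materials Phillips actually used.

```python
# Minimal sketch of the two practice schedules compared in the study:
# blocked (AAABBBCCC) versus interleaved (ABCABCABC). Illustration only.
from itertools import chain

topics = ["A", "B", "C"]  # three accounting concepts
reps = 3                  # practice problems per concept

blocked = list(chain.from_iterable([t] * reps for t in topics))
interleaved = topics * reps

print("blocked:    ", "".join(blocked))      # AAABBBCCC
print("interleaved:", "".join(interleaved))  # ABCABCABC
```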

“The theory is that struggle leads to longer-term connections in memory that won’t degrade as much over time,” said Phillips.

Immediately following the practice problems, Phillips tested both groups on the concepts. The first group, Phillips explained, could do the problems faster and scored higher (about 8 per cent higher). Phillips tested the students once more, this time a week later. This time the second group came out on top by about 15 per cent. Interestingly enough, the first group’s score dropped significantly compared to the previous scores (a 27 point decline), while the second group’s score dropped on average by only four per cent.

“Desirable difficulty contributes to meaningful learning,” said Phillips, adding that he has a hunch that the difference would dissipate with time.

“The real challenge is to help students see the value in struggling, failing and overcoming. It’s challenging for professors as well because we are evaluated by students on how easy we make their learning feel. It’s not intuitive for students or instructors to value learning difficulties. It doesn’t feel good.”

Phillips said he reminds himself “our job is to help students overcome difficulties. We need to think carefully about the hurdles students struggle with and make those hurdles an intentional part of the instructional process. Let students struggle, but be there to help.”

Read more

Why some people are more prone to anxiety


People with anxiety fundamentally perceive the world differently, according to a study reported in the Cell Press journal Current Biology on March 3. They aren’t simply making the choice to “play it safe.”

The new study shows that people diagnosed with anxiety are less able to distinguish between a neutral, “safe” stimulus (in this case, the sound of a tone) and one that was earlier associated with the threat of money loss or gain. In other words, when it comes to emotional experiences, they show a behavioral phenomenon known as over-generalization, the researchers say.

“We show that in patients with anxiety, emotional experience induces plasticity in brain circuits that lasts after the experience is over,” says Rony Paz of Weizmann Institute of Science in Israel. “Such plastic changes occur in primary circuits that later mediate the response to new stimuli, resulting in an inability to discriminate between the originally experienced stimulus and a new similar stimulus. Therefore, anxiety patients respond emotionally to such new stimuli as well, resulting in anxiety even in apparently irrelevant new situations. Importantly, they cannot control this, as it is a perceptual inability to discriminate.”

In the study, Paz and his colleagues trained people with anxiety to associate three distinct tones with one of three outcomes: money loss, money gain, or no consequence. In the next phase, study participants were presented with one of 15 tones and were asked whether they’d heard the tone before in training or not. If they were right, they were rewarded with money.

The best strategy was not to mistake (or over-generalize) a new tone for one they’d heard in the training phase. But the researchers found that people with anxiety were more likely than healthy controls to think that a new tone was actually one of the tones they’d heard earlier. That is, they were more likely to mistakenly associate a new tone with money loss or gain. Those differences weren’t explained by differences in participants’ hearing or learning abilities. They simply perceived the sounds that were earlier linked to an emotional experience differently.
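The behavioral measure described here, how often a genuinely new tone is mistaken for one heard in training, can be summarized as a simple over-generalization rate. The tones and responses below are invented to illustrate the scoring, not data from the study.

```python
# Toy sketch: scoring over-generalization as the fraction of genuinely new
# tones that a participant labels as "heard before". Data are invented.
trained_tones = {440, 550, 660}                        # Hz, the three training tones
test_tones = [440, 470, 500, 550, 580, 610, 660, 700]  # a subset of the 15 test tones
responses = ["old", "old", "new", "old", "old", "new", "old", "new"]

new_tones = [(t, r) for t, r in zip(test_tones, responses) if t not in trained_tones]
false_old = sum(1 for _, r in new_tones if r == "old")

print(f"over-generalization rate: {false_old / len(new_tones):.0%}")  # 40%
```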

Functional magnetic resonance images (fMRIs) of the brains of people with anxiety versus healthy controls showed differences in brain responses, too. Those differences were mainly found in the amygdala, a brain region related to fear and anxiety, and also in primary sensory regions of the brain. These results strengthen the idea that emotional experiences induce changes in sensory representations in anxiety patients’ brains.

The findings might help to explain why some people are more prone to anxiety than others, although the underlying brain plasticity that leads to anxiety isn’t in itself “bad,” Paz says.

“Anxiety traits can be completely normal, and even beneficial evolutionarily. Yet an emotional event, even minor sometimes, can induce brain changes that might lead to full-blown anxiety,” he says.

Read more

Identifying whether you are being selfish or altruistic


To understand human behaviors, it is crucial to understand the motives behind them. So far, there has been no direct way to identify motives: simply observing behavior or asking individuals to explain their actions does not give reliable results, because motives are private and people can be unwilling to unveil — or even be unaware of — their own motives. A new study now suggests that the specific interplay of neural networks in the brain indicates the motive behind a person’s altruistic behavior.

Psychologist and neuroscientist Grit Hein and Ernst Fehr from the Department of Economics, University of Zurich, teamed up with Yosuke Morishima, Susanne Leiberg and Sunhae Sul, and found that the way relevant brain regions communicate with each other is altered depending on the motive driving a specific behavioral choice. This interplay between brain regions allowed them to identify the underlying motives, which could not be uncovered by observing the person’s choices or from the brain regions activated during decision-making.

Connections between brain regions linked to motives

During the study, participants were placed in an fMRI scanner and made altruistic decisions driven by an empathy motive (the desire to help a person for whom one feels empathy) or a reciprocity motive (the desire to reciprocate an individual’s previous kindness). Simply looking at the functional activity of specific regions of the brain couldn’t reveal the motive underlying the decisions. Broadly speaking, the same areas in the brain lit up in both settings. “However, using Dynamic Causal Modeling (DCM) analyses, we could investigate the interplay between these brain regions and found marked differences between empathy-based and reciprocity-based decisions,” explains Grit Hein. “The impact of the motives on the interplay between different brain regions was so fundamentally different that it could be used to classify the motive of a person with high accuracy,” she continues.
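Hein and colleagues used Dynamic Causal Modeling, which is well beyond a blog snippet, but the basic idea of classifying a motive from inter-region connectivity estimates can be sketched with an ordinary classifier. Everything below (the simulated connectivity features, labels and the choice of logistic regression) is an illustrative assumption, not the authors’ analysis.

```python
# Toy sketch: classifying a decision motive (empathy vs reciprocity) from
# connectivity features between brain regions. The simulated features, labels
# and the use of logistic regression are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per_class, n_connections = 30, 6  # invented dimensions

empathy = rng.normal(0.2, 0.1, (n_per_class, n_connections))       # simulated connectivity
reciprocity = rng.normal(-0.2, 0.1, (n_per_class, n_connections))  # simulated connectivity

X = np.vstack([empathy, reciprocity])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = empathy, 1 = reciprocity

scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```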

Empathy motive increases altruistic behavior in selfish people

A further important result was that motives are processed differently in selfish and prosocial people. In selfish people, the empathy motive, but not the reciprocity motive, increased the number of altruistic decisions. After activating the empathy motive, selfish individuals resembled people with prosocial preferences in terms of brain connectivity and altruistic behavior. In contrast, prosocial people behaved even more altruistically after activating the reciprocity motive, but not the empathy motive.

Read more

How to kick the sugar habit


A little sugar now and then isn’t a big deal to most children. But sugar has been shown to have an effect similar to an addictive drug, and many kids across the United States are in fact addicted to it. Sugar affects children’s short-term behavior and moods and also can seriously affect their long-term health.

Here are some ways to help kids cut back.

  1. Prevent unrelated cravings: sugar addiction causes tangible cravings, but so do other entirely unrelated factors, such as low blood sugar and thirst. So begin by dealing with them: Make sure kids always get enough protein, healthful fats, other nutrients and water to keep their energy at optimum levels. And watch out for social or psychological conditions — boredom, emotional issues, teenage hormone surges — that might lead to overindulging.
  2. Cut the culprits: soda, energy drinks and sports drinks make up 32 percent of an average child’s sugar intake, followed by desserts at 18 percent, fruit drinks at 15 percent, candy at 7 percent and ready-to-eat cereal at 6 percent. The less sugar children consume, the less they are going to crave it.
  3. Improve breakfast by adding protein and fresh fruit. Switch to 100 percent whole-grain cereal or toast. Use the weekend, rather than a school morning, to introduce a lower-sugar pancake or a new egg dish, as you will probably have more time to cook and to ride out complaints or questions.
    • Stock the fridge with healthful snacks, and don’t regularly buy sugary treats and then tell your kids they can’t have them. It is better to go out for ice cream occasionally than to have a tub of mint chocolate chip in the freezer marked off limits. But do still bake cookies with them once in a while.
    • Divorce dinner from dessert — sweet stuff should never be a reward for eating a healthy dinner. Pick one night a week when the family enjoys dessert together, and allow your children to take turns choosing what will be served. Spoon out appropriate portions, let the children have seconds if they are truly still hungry, and do not judge the amount they eat. Also avoid overeating yourself, and don’t express feelings of guilt after eating.
    • And watch out for hidden culprits in prepared foods. Read labels on these main offenders to be sure they are low in added sugar: yogurts, energy bars, ketchup, sauces (spaghetti, barbecue), salad dressings, breads, crackers, peanut butter, infant formula and drinks.
  4. Retrain addicted kids: our tastes for overly sweetened flavors are learned, which means they can also be unlearned.
    • Designate the number of “sometimes foods” (sports drinks, ice cream, cookies, hot chocolate — anything with added sugar) you believe is appropriate for your kids to consume in a day or week. If one a day or three a week sounds right to you, that’s fine. Just allow the children to choose what and when, and do not judge them. Your children should be in control. When they use up their daily or weekly allocation, they have to wait for the next day or week. Let them also choose the dinner menu one night a week, the healthy snacks they eat and what gets packed in their lunch. This way they learn to control the food they eat in a balanced way.
    • Remind kids there will always be another chance for something sweet. If they are afraid they won’t get another Oreo for a long time, they are more likely to eat past the point of satisfaction.
    • If a child has an acute problem with sugar, it is okay to incentivize him or her to kick the habit — with a reward that isn’t food. We are all motivated by an incentive, be it a bonus, a paycheck or kudos for a job well done. Once your child actively agrees to kick his or her sugar habit in return for an established prize, stock a healthy kitchen and start the child on this plan.
  5. Set a good example: enjoy all food with your children, never express guilt for enjoying a dessert, and model moderation.
    • It is important to get on the same page as your spouse and be consistent with your food rules and messages. Stay neutral, and do not judge your kids’ choices or mistakes. We are all drawn to desserts and sweeter items at times, and kids are no exception.
    • Lastly, support and encourage children when they make good choices.
Read more

Overweight people have the lowest all-cause mortality rates


Cells with higher fat content outlive lean cells, says a new study from Michigan State University.

This study has implications for larger organisms, such as humans, as the results support the phenomenon known as the “obesity paradox.” This paradox refers to findings that overweight people have the lowest all-cause mortality rates, while fit people, oddly enough, have mortality rates comparable to those categorized as slightly obese.

“The obesity paradox baffles scientists across numerous disciplines,” said Min-Hao Kuo, MSU biochemist and molecular biologist who published the study in the current issue of PLoS Genetics. “But when it comes to yeast, which is an excellent model for the studies of human aging, increasing the cellular content of triacylglycerol, or fat, extends the lifespan.”

Kuo’s team was the first to show a positive correlation between triacylglycerol, or TAG, content and lifespan. The connection provides support for the obesity paradox theory, he added.

TAG is a fat found in all eukaryotes, a group that includes animals, plants and fungi. The lipid’s ability to store excess energy, provide insulation and accumulate in response to many stressors is well known. What’s perplexing, though, is how TAG influences lifespan.

“Our team used genetic approaches to manipulate the cellular capacity for triacylglycerol production and degradation,” Kuo said. “Via sophisticated analyses, we demonstrated that it preserves life through a mechanism that is largely independent of other lifespan regulation pathways common in yeast as well as humans.”

The first thing Kuo’s team did was delete TAG lipases, enzymes that break down the lipid into smaller molecules for different uses including energy extraction. Unable to utilize TAG, these yeast accumulated fat inside the cells. In addition, Kuo and his colleagues boosted the production of the fat by increasing the enzyme for TAG synthesis.

In both cases (blocking TAG breakdown and forcing its production), yeast cells are fatter and have a longer lifespan. In contrast, yeast cells deprived of the ability to synthesize TAG are lean but die early. Overexpressing a TAG lipase in an otherwise normal strain forces TAG breakdown; these cells also suffer from a shorter lifespan.

Interestingly, those fat and long-living yeast cells do not seem to suffer from obvious growth defects. They mate and produce progeny well. They also have normal resistance to different environmental stresses. On the other hand, other common methods of extending lifespan, such as caloric restriction and deletion of genes key to nutrient sensing, frequently cause cells to grow slowly or be less tolerant of environmental stresses.

While the team suspects that the pro-longevity function exists in humans, they’ve yet to prove that triacylglycerol could drive the intriguing phenomenon in humans.

Read more

Does the food we eat affect how our genes behave?


Almost all of our genes may be influenced by the food we eat, suggests recent research published in the journal Nature Microbiology. The study, carried out in yeast — which can be used to model some of the body’s fundamental processes — shows that while the activity of our genes influences our metabolism, the opposite is also true and the nutrients available to cells influence our genes.

The behaviour of our cells is determined by a combination of the activity of their genes and the chemical reactions needed to maintain the cells, known as metabolism. Metabolism works in two directions: the breakdown of molecules to provide energy for the body, and the production of all the compounds the cells need.

Knowing the genome — the complete DNA ‘blueprint’ of an organism — can provide a substantial amount of information about how a particular organism will look. However, this does not give the complete picture: genes can be regulated by other genes or regions of DNA, or by ‘epigenetic’ modifiers — small molecules attached to the DNA that act like switches to turn genes on and off.

Previous studies have suggested that another player in gene regulation may exist: the metabolic network — the biochemical reactions that occur within an organism. These reactions mainly depend on the nutrients a cell has available — the sugars, amino acids, fatty acids and vitamins that are derived from the food we eat.

To examine the scale at which this happens, an international team of researchers, led by Dr Markus Ralser at the University of Cambridge and the Francis Crick Institute, London, addressed the role of metabolism in the most basic functionality of a cell. They did so using yeast cells. Yeast is an ideal model organism for large-scale experiments as it is much simpler to manipulate than animal models, yet many of its important genes and fundamental cellular mechanisms are the same as or very similar to those in animals and humans.

The researchers manipulated the levels of important metabolites — the products of metabolic reactions — in the yeast cells and examined how this affected the behaviour of the genes and the molecules they produced. Almost nine out of ten genes and their products were affected by changes in cellular metabolism.

“Cellular metabolism plays a far more dynamic role in the cells than we previously thought,” explains Dr Ralser. “Nearly all of a cell’s genes are influenced by changes to the nutrients they have access to. In fact, in many cases the effects were so strong, that changing a cell’s metabolic profile could make some of its genes behave in a completely different manner. The classical view is that genes control how nutrients are broken down into important molecules, but we’ve shown that the opposite is true, too: how the nutrients break down affects how our genes behave.”


Read more

Some words about diets


Thousands flock to the internet in search of ways to boost a healthy lifestyle. Many popular diet facts and trends are circulated so often in the media that it’s hard to know which tips to trust and which ones should be tossed. Underneath popular opinion and platitudes, the truth about eating healthy may surprise you. A Texas A&M Health Science Center registered dietician separates myths from fact when it comes to your diet.

Gluten-free desserts are healthier

“Gluten-free desserts are not healthier than ‘normal’ desserts,” said Lisa Mallonee, a registered dietician with the Texas A&M University Baylor College of Dentistry. “In fact, gluten substitutes may actually increase calorie content and contribute to weight gain. With that being said, gluten-free food is great to consume by those diagnosed with celiac disease or who are gluten-intolerant — but gluten-free desserts should be eaten in moderation and with a balanced diet.”

Sugar free and fat free foods lead to fat-free bodies

When the words ‘sugar free’ or ‘fat free’ are splashed across a box of chocolate it’s probably easy to feel less guilty about eating the entire box in one sitting. “Fat free and sugar free do not mean foods are calorie free,” Mallonee said. “It doesn’t matter what type of food you are eating, if you are consuming more calories than you’re expending, you will gain weight.”

While browsing fat free or sugar free treats it’s essential to be a conscious label reader. In fact, the fat content in many of these ‘sugar free’ items can be extremely high. Similar to gluten-free desserts, when nutrients like fat are removed from food, artificial ingredients may be added back to the food to account for taste. This filler may lead to more calories.

Carbs make you fat

Carbs alone do not cause weight gain — instead, it’s the type of carbs we choose to consume that lead to more fat cells in the body. “We need carbs because they are the body’s main source of fuel,” Mallonee said. “The real problem with carbohydrates lies in the American diet rich in refined carbs and processed foods. Binging on these carbohydrates will contribute to weight gain.”

Mallonee recommends eating a balanced diet higher in complex carbs and lower in simple or processed carbs. “The average American needs to be consuming more fruits, vegetables and whole grains and less processed foods, refined carbohydrates and white flour products,” she said.

Healthy food is more expensive

“Indeed, eating fresh may cost more than loading up your shopping cart with processed foods or fast food from restaurant value menus, but, in the big picture, it will likely cost you more in medical bills to maintain an unhealthy lifestyle,” Mallonee said. “You have to look at the long-term health impact.”

According to Mallonee, it is possible to eat clean at an economical price. “When it comes to fruits and vegetables my word of reason is to always buy in-season. We all have favorites but when we buy them year-round when they’re not in season we will see a price increase. You should always vary your palate — don’t be afraid to try the eggplant or cauliflower when it’s in season over broccoli or asparagus,” she said.

You’ll gain weight if you eat late at night

‘Eat breakfast like a king, lunch like a queen and dinner like a pauper.’ Have you ever heard this saying?

Mallonee said it doesn’t matter what time you’re eating as much as what you are eating. “This is more about portion control and how you’re expending calories,” she said. “It doesn’t matter what time of day you eat as long as you are eating a balanced diet, consuming foods in moderation and burning off more calories than you consume.”

Fasting is important to cleanse the body

Mallonee stressed she doesn’t recommend fasting unless it’s for religious purposes. “We already have a built-in cleansing system: our kidneys and liver,” she said. “Simply fasting to ‘cleanse,’ where you don’t eat for a certain number of days, can be dangerous. I recommend consulting a physician prior to any extreme diet that encourages fasting for an extended period of time.”

“Having a diet that’s fiber-rich is what moves toxins out of your body naturally,” she added. “The more fiber you consume the more it’s able to move food and the related toxins out of the body. Unfortunately, most Americans have a refined diet that is too low in fiber. This is what allows toxins to thrive inside our bodies. It’s important to know we all have cells with the potential to turn into cancer cells. The way we fuel our body determines if these are transformed into cancer cells or are terminated.”

Energy bars are good for weight loss

Our busy lives often don’t allow for adequate meal preparation and many Americans turn to energy bars as a quick and easy meal replacement. Mallonee stressed that while energy bars are convenient, they need to be consumed along with a balanced diet and we should be wary of their ingredients.

“Most of the time I refer to energy bars as glorified candy bars,” she said. “They can be extremely high in fat and sugar content. While they may be a good way for athletes to consume extra calories, I wouldn’t recommend them for a person trying to boost fat loss.”

You can’t always trust the internet

The internet is an excellent resource for diet tips and healthy living, but it can be untrustworthy. It’s always best to talk to your health care provider or a registered dietician to get the most up-to-date and factual nutrition advice.

Read more

Barley’s health benefits


A study from Lund University in Sweden shows that barley can rapidly improve people’s health by reducing blood sugar levels and the risk for diabetes. The secret lies in the special mixture of dietary fibres found in barley, which can also help reduce people’s appetite and risk for cardiovascular disease.

“It is surprising yet promising that choosing the right blend of dietary fibres can — in a short period of time — generate such remarkable health benefits,” says Anne Nilsson, Associate Professor at the Food for Health Science Centre and one of the researchers behind the study.

The study was conducted with healthy middle-aged participants who were asked to eat bread largely made out of barley kernels for three days — at breakfast, lunch and dinner. Approximately 11-14 hours after their final meal of the day, participants were examined for risk indicators of diabetes and cardiovascular disease.

The researchers found that the participants’ metabolism improved for up to 14 hours, with additional benefits such as decreases in blood sugar and insulin levels, increases in insulin sensitivity and improved appetite control. The effects arise when the special mixture of dietary fibres in barley kernel reaches the gut, stimulating the increase of good bacteria and the release of important hormones.

“After eating the bread made out of barley kernel, we saw an increase in gut hormones that regulate metabolism and appetite, and an increase in a hormone that helps reduce chronic low-grade inflammation, among the participants. In time this could help prevent the occurrence of both cardiovascular disease and diabetes,” says Anne Nilsson.

In a previous related study, conducted with a team from the University of Gothenburg in Sweden, researchers also found that dietary fibres from barley kernel generate an increase in the gut bacterium Prevotella copri, which has a direct regulatory effect on blood sugar levels and helps decrease the proportion of a type of gut bacteria that is considered unhealthy.

The effects from barley kernel are influenced by the composition of the individual’s gut microbiota, meaning people with low concentrations of the Prevotella copri bacteria experienced less effect from their intake of barley products. Eating more barley could, however, help stimulate growth of the bacteria.

The results are timely, as rates of obesity and type 2 diabetes have increased significantly in the past few years. The researchers hope that greater knowledge of how specific dietary fibres affect people’s health will lead to stores stocking more food products with healthy properties, such as barley kernels. The ambition is also to get more people to use barley in meals, for example in salads, soups and stews, or as an alternative to rice or potatoes.

Read more

Developing more effective therapies for mental health


After just nine weeks of internet-delivered cognitive behavioral therapy, the brains of patients suffering from social anxiety disorder change in volume. Anxiety is reduced, and parts of the patients’ brains decrease in both volume and activity.

We have known for many years that the brain is remarkably adaptable. For instance, previous studies have shown that juggling and video games affect brain volume. However, many questions about how the brain adapts remain unanswered.

A group of researchers from Linköping University and other Swedish universities has studied how internet-delivered cognitive behavioral therapy (ICBT) affects brain volume and activity. The researchers focused on patients with social anxiety disorder (SAD), one of the most common mental health problems. Before and after treatment the brains of patients were examined with magnetic resonance imaging (MRI).

The researchers found that in patients with SAD, brain volume and activity in the amygdala decrease as a result of ICBT. The results are presented in Translational Psychiatry, a Nature publication.

“The greater the improvement we saw in the patients, the smaller the size of their amygdalae. The study also suggests that the reduction in volume drives the reduction in brain activity,” says doctoral student Kristoffer NT Månsson, who led the study together with Linköping colleague Gerhard Andersson and researchers from the Karolinska Institutet, Uppsala University, Umeå University and Stockholm University.

The study comprised 26 individuals treated over the internet for nine weeks, making it a relatively small study. However, it is unique in that it investigates multiple factors at the same time: post-treatment changes in both brain volume and brain activity.

Read more

Normalize Diabetes


On February 26, 2016, MCI Health will begin enrollment for Normalize Diabetes, a national multi-center, Phase 1 clinical trial for reversal of Type 2 Diabetes Mellitus. The trial will consist of a 12-week intervention and a 40-week maintenance phase. The end point will be normalization of beta-cell function and insulin resistance.

For more information, call 877 395 6731.

Read more

Using antiperspirant and deodorant completely rearranges the microbial ecosystem of your skin


Wearing antiperspirant or deodorant doesn’t just affect your social life; it substantially changes the microbial life that lives on you. New research finds that antiperspirant and deodorant can significantly influence both the type and quantity of bacterial life found in the human armpit’s “microbiome.” The work was done by researchers at North Carolina State University, the North Carolina Museum of Natural Sciences, North Carolina Central University, Rutgers University and Duke University.

“We wanted to understand what effect antiperspirant and deodorant have on the microbial life that lives on our bodies, and how our daily habits influence the life that lives on us,” says Julie Horvath, head of the genomics and microbiology research laboratory at the NC Museum of Natural Sciences, an associate research professor at NC Central, and corresponding author of a paper describing the work published in the journal PeerJ. “Ultimately, we want to know if any changes in our microbial ecosystem are good or bad, but first we have to know what the landscape looks like and how our daily habits change it.”

“Thousands of bacteria species have the potential to live on human skin, and in particular in the armpit,” says Rob Dunn, a professor of applied ecology at NC State and co-author of the paper. “Just which of these species live in any particular armpit has been hard to predict until now, but we’ve discovered that one of the biggest determinants of the bacteria in your armpits is your use of deodorant and/or antiperspirant.”

“Within the last century, use of underarm products has become routine for the vast majority of Americans,” says Julie Urban, co-author of the paper, assistant head of the genomics and microbiology laboratory at the NC Museum of Natural Sciences, and adjunct professor of entomology at NC State. “Yet, whether use of these products favors certain bacterial species — be they pathogenic or perhaps even beneficial — seems not to have been considered, and remains an intriguing area needing further study.”

To learn about the microbial impact of antiperspirant and deodorant, the researchers recruited 17 study participants: three men and four women who used antiperspirant products, which reduce the amount we sweat; three men and two women who used deodorant, which often includes ethanol or other antimicrobials to kill off odor-causing microbes; and three men and two women who used neither product. They then launched an eight-day experiment, in which all of the participants had swabs taken of their armpits between 11 a.m. and 1 p.m.

On day one, participants followed their normal hygiene routine in regard to deodorant or antiperspirant use. On days two through six, participants did not use any deodorant or antiperspirant. On days seven and eight, all participants used antiperspirant.

The researchers then cultured all the samples to determine the abundance of microbial organisms growing on each participant and how that differed day to day.

“We found that, on the first day, people using antiperspirant had fewer microbes in their samples than people who didn’t use product at all — but there was a lot of variability, making it hard to draw firm conclusions,” Horvath says. “In addition, people who used deodorant actually often had more microbes — on average — than those who didn’t use product.”

By the third day, participants who had used antiperspirant were beginning to see more microbial growth. And by day six, the amount of bacteria for all study participants was fairly comparable.

“However, once all participants began using antiperspirant on days seven and eight, we found very few microbes on any of the participants, verifying that these products dramatically reduce microbial growth,” Horvath notes.

The researchers also did genetic sequencing on all of the samples from days three and six, to determine how antiperspirant and deodorant might affect the microbial biodiversity — the composition and variety of types of bacteria — over time.

They found that, among study participants who hadn’t worn deodorant or antiperspirant, 62 percent of the microbes they found were Corynebacteria, followed by various Staphylococcaceae bacteria (21 percent), with a random assortment of other bacteria accounting for less than 10 percent. Corynebacteria are partially responsible for producing the bad smells we associate with body odor, but they are also thought to help us defend against pathogens. Staphylococcaceae are a diverse group of bacteria that are among the most common microbes found on human skin and, while some can pose a risk to human health, most are considered beneficial.

The participants who had been regular antiperspirant users coming into the study had wildly different results. Sixty percent of their microbes were Staphylococcaceae, only 14 percent were Corynebacteria, and more than 20 percent were filed under “other” — meaning they were a grab-bag of opportunistic bacteria.

“Using antiperspirant and deodorant completely rearranges the microbial ecosystem of your skin — what’s living on us and in what amounts,” Horvath says. “And we have no idea what effect, if any, that has on our skin and on our health. Is it beneficial? Is it detrimental? We really don’t know at this point. Those are questions that we’re potentially interested in exploring.”

The new findings also highlight how human behavior can have a profound, if unintended, impact on the evolution of microbial organisms.

In another paper, published last month in Proceedings of the Royal Society B, the researchers, together with collaborators at Duke and the University of Pennsylvania, examined the diversity and abundance of microbes found in the armpits of humans, compared to other primates: chimpanzees, gorillas, baboons and rhesus macaques. In that paper, the researchers found that armpit microbes have evolved over time in conjunction with the primates they live on. But the microbial ecosystems found in the armpits of humans are vastly different — and far less diverse — than those found in our primate relatives.

“One exciting finding was that the non-human primates were more covered in fecal and soil associated microbes, which we often view as dirty,” Horvath says. “Perhaps the diversity of fecal and soil microbes on non-human primate skin serves some benefit that we don’t yet understand or appreciate.

“Over evolutionary time, we would expect our microbes to co-evolve with us,” Horvath says. “But we appear to have altered that process considerably through our habits, from bathing to taking steps to change the way we look or smell.”

Read more

Age-dependent alterations in metabolism and gene regulation are linked to a reduction in lifespan

Share


Midlife crisis in the insect world: In a new study, researchers at Ludwig-Maximilians-Universitaet (LMU) in Munich have detected age-dependent alterations in metabolism and gene regulation in middle-aged fruitflies, and show that these effects are linked to a reduction in lifespan.

The aging process is accompanied by characteristic changes in physiology whose overall effect is to decrease the capacity for tissue repair and increase susceptibility to metabolic disease. In particular, the overall level of metabolic activity falls, and errors in the regulation of gene activity become more frequent. Now, a collaborative study by two research groups at LMU’s Biomedical Center, led by Axel Imhof (Professor of Molecular Biology) and Andreas Ladurner (Professor of Physiological Chemistry), has shown in the fruitfly Drosophila melanogaster that such age-dependent changes are already detectable in middle age. Genetic investigation of the signal pathways involved in mediating this effect identified a common process — the modification of proteins by the attachment of so-called acetyl groups (CH3CO–) — that links the age-related changes at the metabolic and genetic levels. Their findings appear in the journal “EMBO reports.”

As we age, the efficiency of the mitochondria progressively declines. Mitochondria are subcellular organelles in the cells of higher organisms that convert nutrients into biochemically usable energy. Mitochondria also possess their own genome, and mutations in this mitochondrial DNA have been linked to a reduction in lifespan. Paradoxically, however, several studies have shown that reducing levels of mitochondrial activity — by restricting food intake, for instance — can actually extend lifespan. “These findings imply that the primary cause of aging cannot simply lie in a reduction in overall metabolic activity, so the whole issue must be more complicated than that,” Imhof points out. Most studies of the aging process employ comparisons between young and old individuals belonging to the same species. “However, in aged animals, many of the potentially relevant physiological operations no longer function optimally, which makes it difficult to probe their interactions. That is why we chose to look in Drosophila to see whether we could find any characteristic metabolic changes or other striking modifications in flies on the threshold of old age and, if so, ask how these processes interact with each other,” he explains.

Rates of protein modification rise

The two teams first made the surprising discovery that middle-aged male flies (7 weeks old) actually consume more oxygen than their younger conspecifics. This points to a metabolic readjustment accompanied by an increase in mitochondrial activity. Indeed, the researchers noted a rise in the intracellular concentration of acetyl-CoA in these flies. Acetyl-CoA is a metabolite produced in the mitochondria that participates in a large number of processes in energy metabolism. Furthermore, it is an important source of acetyl groups for the chemical modification of proteins. “Acetyl groups are attached to specific positions in certain proteins by dedicated enzymes, and can be removed by a separate set of enzymes. These modifications modulate the functions of the proteins to which they are added,” Ladurner explains. “And our experiments have shown that many proteins are much more likely to be found in acetylated form in middle-aged flies than in younger individuals.”

Strikingly, this is true not only for proteins that are involved in basic metabolism, but also for proteins that are directly responsible for regulating gene expression. In the cell nucleus, the genomic DNA molecules are wrapped around “spools” made of proteins called histones. These spools or “nucleosomes” are tightly packed together, and keep the nuclear DNA in a compact, condensed form. Various chemical modifications of the nucleosomal histones — including acetylation — regulate the accessibility of the DNA to the enzymes required for gene expression, and thus determine which genes are active at any given time. “We were able to show that the histones in middle-aged flies are overacetylated,” Imhof says. “This reduces the packing density of the DNA, and with it the stringency of gene regulation. The overall result is a rise in the level of errors in the expression of the genetic information, because genetic material that should be maintained in a repressed state can now be reactivated.” And Ladurner adds: “In the prime of their lives, fruitflies begin to produce a surfeit of acetylated proteins, which turns out to be too much of a good thing.”

Inhibiting acetylation increases lifespan

Taken together, these findings indicate that changes in acetylation may be a key factor in the process of natural aging, reflecting alterations in basic metabolism as well as modifying gene regulation. “A rise in the level of protein acetylation seems to be linked to a decrease in life expectancy,” says Ladurner. “Indeed, inhibition of an acetylase enzyme which specifically attaches acetyl groups to histones, or attenuation of the rate of synthesis of acetyl-CoA, which reduces the supply of acetyl groups, reverses many of the age-dependent modifications seen in these animals, and both interventions are associated with a longer and more active lifespan.”

The researchers are now planning to look for comparable effects in mammals. “If that turns out to be the case, then the enzymes that specifically acetylate histones might well be interesting targets for the development of novel therapeutic agents that correct age-dependent dysregulation,” says Imhof. “Partial inhibitors that reduce enzyme activity without completely blocking it would probably be most effective in this context.”

Read more

The maternal diet influences fat and glucose metabolism of offspring through epigenetic alterations

Share


As the study shows, a high-fat diet during pregnancy and lactation leads to epigenetic changes in the offspring. These changes affect metabolic pathways regulated by the gut hormone GIP, making the adult offspring more susceptible to obesity and insulin resistance, a precursor to type 2 diabetes. Similar mechanisms cannot be ruled out in humans, according to Pfeiffer.

The lead authors Michael Kruse and Farnaz Keyhani-Nejad recently published the results in collaboration with researchers of Helmholtz Zentrum München in the journal Diabetes.

As scientists throughout the world observe, children of obese mothers have a higher risk of obesity and metabolic disorders. Recent findings suggest that diet-related epigenetic effects may also play a causal role in this. Since such controlled studies are not possible in humans, and since humans and mice are genetically very similar, many scientists use mouse models to study these relationships.

This study focused on the epigenetic effects on the GIP-regulated metabolic pathways that are triggered by the maternal diet during pregnancy and lactation. GIP is a hormone that the gut releases after food intake and that stimulates the secretion of insulin from the pancreas. It influences the metabolism of fat cells and fat oxidation in skeletal muscles and, as an anabolic hormone, promotes the build-up of body mass. These effects are mediated by GIP via the GIP receptor. If this receptor is lacking, as in the Gipr-/- mouse, the hormone can no longer exert its natural effect, and the animals are normally protected from obesity and insulin resistance. Since the Gipr-/- mouse model is well suited for the study of GIP-regulated metabolic pathways, the researchers used this mouse strain for their study. The wild-type strain of the mouse model served as the control.

First, the researchers divided the mouse mothers into three groups, which were fed different chow during pregnancy and lactation:

Group 1: Gipr-/- mice who received a high-fat diet
Group 2: Gipr-/- mice who received regular chow
Group 3: Wild-type mice with intact GIP receptor, who received regular chow
After weaning, all offspring of the three groups were fed normal chow for 22 weeks followed by a high-fat diet for an additional 20 weeks.

As the scientists observed, the adult offspring of groups 1 and 3 gained a significant amount of fat mass during the 20-week high-fat diet, although they ate less than the offspring of group 2. They also had heightened levels of cholesterol, glucose, and insulin in the blood. In addition, they exhibited increased adipose tissue inflammation and enlarged fat cells and oxidized less fat in their muscles. Furthermore, the researchers found that the activity of different genes was altered in groups 1 and 3 in comparison to group 2. These genes play a role in fat oxidation in muscles and in inflammatory processes in adipose tissue or are involved in the regulation of energy consumption by the brain.

“The altered gene activity could partially be traced back to DNA methylation, that is, epigenetic changes,” said Pfeiffer. “Our results indicate that GIP also plays a role in energy consumption, which is controlled by the brain, probably indirectly by reducing the insulin sensitivity of the hypothalamus,” the endocrinologist added. This is an entirely new finding. It remains to be seen to what extent these results can be applied to humans. More research on this topic is needed. However, it is clear that diet not only has a direct influence on the individual, but also may affect the offspring.

Read more

9 methods to handle food cravings

Share

Food cravings are the dieter’s worst enemy.

These are intense or uncontrollable desires for specific foods, stronger than normal hunger.

The types of foods that people crave are highly variable, but these are often processed junk foods that are high in sugar.

Cravings are one of the biggest reasons why people have problems losing weight and keeping it off.

Here are 9 simple ways to prevent or stop unhealthy food and sugar cravings.


1. Drink Water

Thirst is often confused with hunger or food cravings.

If you feel a sudden urge for a specific food, try drinking a large glass of water and wait a few minutes. You may find that the craving fades away, because your body was actually just thirsty.

Furthermore, drinking plenty of water may have many health benefits. In middle-aged and older people, drinking water before meals can reduce appetite and help with weight loss.

Bottom Line: Drinking water before meals may reduce cravings and appetite, as well as help with weight loss.


2. Eat More Protein

Eating more protein may reduce your appetite and keep you from overeating.

It also reduces cravings, and helps you feel full and satisfied for longer (4).

One study of overweight teenage girls showed that eating a high-protein breakfast reduced cravings significantly.

Another study in overweight men showed that increasing protein intake to 25% of calories reduced cravings by 60%. Additionally, the desire to snack at night was reduced by 50%.

Bottom Line: Increasing protein intake may reduce cravings by up to 60% and cut the desire to snack at night by 50%.


3. Distance Yourself From the Craving

When you feel a craving, try to distance yourself from it.

For example, you can take a brisk walk or a shower to shift your mind onto something else. A change in thought and environment may help stop the craving.

Some studies have also shown that chewing gum can help reduce appetite and cravings.

Bottom Line: Try to distance yourself from the craving by chewing gum, going on a walk or taking a shower.


4. Plan Your Meals

If possible, try to plan your meals for the day or upcoming week.

By already knowing what you’re going to eat, you eliminate the factor of spontaneity and uncertainty.

If you don’t have to think about what to eat at the following meal, you will be less tempted and less likely to experience cravings.

Bottom Line: Planning your meals for the day or upcoming week eliminates spontaneity and uncertainty, both of which can cause cravings.


5. Avoid Getting Extremely Hungry

Hunger is one of the biggest reasons why we experience cravings.

To avoid getting extremely hungry, it may be a good idea to eat regularly and have healthy snacks close at hand.

By being prepared, and avoiding long periods of hunger, you may be able to prevent the craving from showing up at all.

Bottom Line: Hunger is a big reason for cravings. Avoid extreme hunger by always having a healthy snack ready.


6. Fight Stress

Stress may induce food cravings and influence eating behaviors, especially for women.

Women under stress have been shown to eat significantly more calories and experience more cravings than non-stressed women.

Furthermore, stress raises your blood levels of cortisol, a hormone that can make you gain weight, especially in the belly area.

Try to minimize stress in your environment by planning ahead, meditating and generally slowing down.

Bottom Line: Being under stress may induce cravings, eating and weight gain, especially in women.


7. Practice Mindful Eating

Mindful eating is about practicing mindfulness, a type of meditation, in relation to foods and eating.

It teaches you to develop awareness of your eating habits, emotions, hunger, cravings and physical sensations.

Mindful eating teaches you to distinguish between cravings and actual physical hunger. It helps you choose your response, instead of acting thoughtlessly or impulsively.

Eating mindfully involves being present while you eat, slowing down and chewing thoroughly. It is also important to avoid distractions, like the TV or your smartphone.

One 6-week study in binge eaters found that mindful eating reduced binge eating episodes from 4 to 1.5 per week. It also reduced the severity of each binge.

Bottom Line: Mindful eating is about learning to recognize the difference between cravings and actual hunger, helping you choose your response.


8. Get Enough Sleep

Your appetite is largely affected by hormones that fluctuate throughout the day.

Sleep deprivation disrupts the fluctuations, and may lead to poor appetite regulation and strong cravings.

Studies support this, showing that sleep-deprived people are up to 55% more likely to become obese, compared to people who get enough sleep.

For this reason, getting good sleep may be one of the most powerful ways to prevent cravings from showing up.

Bottom Line: Sleep deprivation may disrupt normal fluctuations in appetite hormones, leading to cravings and poor appetite control.


9. Eat Proper Meals

Hunger and a lack of key nutrients can both cause certain cravings.

Therefore, it’s important to eat proper meals at mealtimes. This way, your body gets the nutrients it needs and you won’t get extremely hungry right after eating.

If you find yourself in need of a snack between meals, make sure it’s something healthy. Reach for whole foods, such as fruits, nuts, vegetables or seeds.

Bottom Line: Eating proper meals helps prevent hunger and cravings, while also ensuring that your body gets the nutrients it needs.

Read more

Health deteriorates after adults stop driving

Share


For older adults, driving a car is an important aspect of having control over one’s life. While 81 percent of the 29.5 million U.S. adults aged 65 and over continue to hold a license and get behind the wheel, age-related declines in cognition and physical function make driving more difficult, and many seniors reduce or eventually stop driving altogether. Researchers at Columbia University’s Mailman School of Public Health examined the health and well-being of older adults after they stopped driving and found that their health worsened in a variety of ways. In particular, driving cessation nearly doubled the risk of depressive symptoms, while also contributing to diminished cognitive abilities and physical functioning. Findings are published online in the Journal of the American Geriatrics Society.

“For many older adults, driving is more than a privilege; it is instrumental to their daily living and is a strong indicator of self-control, personal freedom, and independence,” said Guohua Li, MD, DrPH, Mailman School professor of Epidemiology, the founding director of the Center for Injury Epidemiology and Prevention at Columbia, and senior author. “Unfortunately, it is almost inevitable to face the decision to stop driving during the process of aging as cognitive and physical functions continue to decline.”

Dr. Li and a team of researchers reviewed and analyzed quantitative health-related data for drivers aged 55 and older from 16 studies that met eligibility criteria and compared results with data from current drivers. The study updates and expands on earlier findings with more than 10 additional years of empirical research.

Data showed that older adults experienced faster declines in cognitive function and physical health after stopping driving. Driving cessation was also associated with a 51-percent reduction in the size of social networks of friends and relatives, something the researchers say can constrain the social lives of seniors and their ability to engage with others. Decline in social health after driving cessation appeared greater in women than in men.

Former drivers were also nearly five times as likely as current drivers to be admitted to a nursing home, assisted living community, or retirement home, after adjusting for marital status or co-residence.

“As older ex-drivers begin substituting outside activities with indoor activities around the home, these activities may not be as beneficial to physical functioning as working or volunteering on the outside,” said Thelma Mielenz, PhD, assistant professor of Epidemiology at the Mailman School and co-author. “When time comes to stop driving, it is important to make personalized plans to maintain mobility and social functions.”

The researchers note that merely making alternative transportation available to older adults does not necessarily offset the adverse health effects of driving cessation. “What we need most of all are effective programs that can ensure and prolong an older adult’s mobility, physical, and social functioning,” said Li

Read more

Epigenetics plays a key role in the evolution of memory and learning

Share


A well-known songbird, the great tit, has revealed its genetic code, offering researchers new insight into how species adapt to a changing planet. Their initial findings suggest that epigenetics — what’s on rather than what’s in the gene — may play a key role in the evolution of memory and learning. And that’s not just true for birds. An international research team led by the Netherlands Institute of Ecology (NIOO-KNAW) and Wageningen University will publish these findings in Nature Communications on Monday.

“People in our field have been waiting for this for decades,” explain researchers Kees van Oers and Veronika Laine from the Netherlands Institute of Ecology. The reference genome of their favourite model species, the great tit, is “a powerful toolbox that all ecologists and evolutionary biologists should know about.”

Coming from a single Dutch bird, the genetic code of the assembled reference genome will help to reveal the genetic basis of phenotypic evolution. This is essential for understanding how wild species adapt to our changing planet.

In addition to looking at the genome, the research team have also determined the so-called transcriptome and methylome. The latter belongs to the field of epigenetics: the study of what you can inherit not in but ‘on’ your genes. Specific DNA sequences in the genome can be ‘methylated’: methyl groups are added to them, modifying how the genes function.

The research team sequenced the complete genomes of a further 29 great tit individuals from different parts of Europe. This enabled them to identify regions in the great tit’s genome that have been under selection during recent evolution of the bird. These regions appeared to be overrepresented for genes related to learning and cognition.

“The great tit has evolved to be smart,” says Van Oers. “Very smart.” It’s not your average bird: it is among the smartest 3% of birds when it comes to learning new behaviour. That makes it a perfect candidate for research into the evolution of learning, memory and cognitive processes.

What that research has revealed are so-called conserved patterns of methylation in those same regions, present not only in birds but also in humans and other mammals. It’s evidence of a correlation between epigenetic processes such as methylation and the rate of molecular evolution: “the more methylation, the more evolution.”

And so the great tit has once more proved that its role as a model species in a variety of biological research fields for over 60 years is by no means coincidental

Read more

Getting hooked on a habit changes the brain

Share


By now, you might have discovered that taming your sweet tooth as a New Year’s resolution is harder than you think.

New research by Duke University scientists suggests that a habit leaves a lasting mark on specific circuits in the brain, priming us to feed our cravings.

Published online Jan. 21 in the journal Neuron, the research deepens scientists’ understanding of how habits like sugar consumption and other vices manifest in the brain, and suggests new strategies for breaking them.

“One day, we may be able to target these circuits in people to help promote habits that we want and kick out those that we don’t want,” said the study’s senior investigator Nicole Calakos, M.D., Ph.D., an associate professor of neurology and neurobiology at the Duke University Medical Center.

Calakos, an expert in the brain’s adaptability, teamed up with Henry Yin, an expert in animal models of habit behavior in Duke’s department of psychology and neuroscience. Both scientists are also members of the Duke Institute for Brain Sciences.

Their groups trained otherwise healthy mice to form sugar habits of varying severity, a process that entailed pressing a lever to receive tiny sweets. The animals that became hooked kept pressing the lever even after the treats were removed.

The researchers then compared the brains of mice that had formed a habit to the ones that didn’t. In particular, the team studied electrical activity in the basal ganglia, a complex network of brain areas that controls motor actions and compulsive behaviors, including drug addiction.

In the basal ganglia, two main types of paths carry opposing messages: One carries a ‘go’ signal which spurs an action, the other a ‘stop’ signal.

Experiments by Duke neurobiology graduate student Justin O’Hare found that the stop and go pathways were both more active in the sugar-habit mice. O’Hare said he didn’t expect to see the stop signal equally ramped up in the habit brains, because it has been traditionally viewed as the factor that helps prevent a behavior.

The team also discovered a change in the timing of activation in the two pathways. In mice that had formed a habit, the go pathway turned on before the stop pathway. In non-habit brains, the stop signal preceded the go.

These changes in the brain circuitry were so long-lasting and obvious that it was possible for the group to predict which mice had formed a habit just by looking at isolated pieces of their brains in a petri dish.

Scientists have previously noted that these opposing basal ganglia pathways seem to be in a race, though no one has shown that a habit gives the go pathway a head start. O’Hare said that’s because the go and stop signals had not been studied in the same brain at the same time. But new labeling strategies used by the Duke scientists allowed researchers to measure activity across dozens of neurons in both pathways simultaneously, in the same animal.

“The go pathway’s head start makes sense,” said Calakos. “It could prime the animal to be more likely to engage in the behavior.” The researchers are testing this idea, as well as investigating how the rearrangements in activity occur in the first place.

Interestingly, the group observed that changes in go and stop activity occurred across the entire region of the basal ganglia they were studying as opposed to specific subsets of brain cells. O’Hare said this may relate to the observation that an addiction to one thing can make a person more likely to engage in other unhealthy habits or addictions as well.

To see if they could break a habit, the researchers encouraged the mice to change their habit by rewarding them only if they stopped pressing the lever. The mice that were the most successful at quitting had weaker go cells. But how this might translate into help for humans with bad habits is still unclear. Because the basal ganglia is involved in a broad array of functions, it may be tricky to target with medicines.

Calakos said some researchers are beginning to explore the possibility of treating drug addiction using transcranial magnetic stimulation or TMS, a noninvasive technique that uses magnetic pulses to stimulate the brain. “TMS is an inroad to access these circuits in more severe diseases,” she said, in particular targeting the cortex, a brain area that serves as the main input to the basal ganglia.

For more ordinary bad habits “simpler, behavioral strategies many of us try may also tap into similar mechanisms,” Calakos added. “It may be just a matter of figuring out which of them are the most effective.”

Read more

Give your loved one the gift of Health this Valentine’s Day

Share


Buy it here

Read more

Social interactions are important for gut microbial diversity in chimps

Share


Spending time in close contact with others often means risking catching germs and getting sick. But being sociable may also help transmit ‘good’ microbes, finds a multi-institutional study of gut microbiomes in chimpanzees.

Researchers monitored changes in the gut microbes and social behavior of wild chimpanzees over eight years in Gombe National Park, Tanzania. They found that the number of bacteria species in a chimp’s GI tract goes up when the chimps are more gregarious.

The results help scientists better understand the factors that maintain a healthy gut microbiome.

The warm, soft folds of our intestines are home to hundreds of species of bacteria and other microbes that help break down food, synthesize vitamins, train the immune system and fight infections. Reduced gut microbial diversity in humans has been linked to obesity, diabetes, Crohn’s and other diseases.

“The more diverse people’s microbiomes are, the more resistant they seem to be to opportunistic infections,” said Andrew Moeller, research fellow at the University of California, Berkeley, who co-authored the study published in the Jan. 15, 2016 issue of Science Advances.

Moeller and colleagues analyzed the bacterial DNA in droppings collected from 40 chimpanzees between 2000 and 2008. The chimpanzees ranged in age from infants to seniors.

The researchers identified thousands of species of bacteria thriving in the animals’ guts, many of which are also commonly found in humans, such as species of Olsenella and Prevotella.

The team then combined the microbial data with daily records of what the animals ate and how much time they spent with other chimps versus alone.

“Chimpanzees tend to spend more time together during the wet season when food is more abundant,” said Duke University research scientist Steffen Foerster, who co-authored the study. “During the dry season they spend more time alone.”

The researchers found that each chimpanzee carried roughly 20 to 25 percent more bacterial species during the abundant and social wet season than during the dry season.

But the microbiome differences weren’t solely due to seasonal changes in the fruit, leaves and insects that make up their diet, the researchers found. The chimps’ shifts between hobnobbing and loner lifestyles were also important.

Gut bacteria likely pass from chimp to chimp during grooming, mating or other forms of physical contact, or when they inadvertently step where other chimps have pooped, said co-author Anne Pusey, chair of Duke’s department of evolutionary anthropology.

The mix of bacteria in the animals’ bowels was just as similar between unrelated individuals as it was between mothers and offspring, the researchers found. This was surprising because infants pick up their first microbiomes from their mother when they pass through her birth canal. The findings suggest that, over a lifetime, social interactions with other chimps are just as important for gut microbial diversity as initial exposure from mom.

Scientists don’t yet know if social networks help maintain gut microbiome diversity in humans. “One of the main reasons that we started studying the microbiomes of chimpanzees was that it allowed us to do studies that have not or cannot be done in humans,” said study co-author Howard Ochman of the University of Texas at Austin. “It’s really an amazing and previously underexploited resource.”

More about this study can be found at http://today.duke.edu/2016/01/chimpmicrobiome

Read more

Disjointed maternal care can create emotional disorders that manifest themselves throughout a child’s life

Share


Mothers, put down your smartphones when caring for your babies! That’s the message from University of California, Irvine researchers, who have found that fragmented and chaotic maternal care can disrupt proper brain development, which can lead to emotional disorders later in life.

While the study was conducted with rodents, its findings imply that when mothers are nurturing their infants, numerous everyday interruptions — even those as seemingly harmless as phone calls and text messages — can have a long-lasting impact.

Dr. Tallie Z. Baram and her colleagues at UCI’s Conte Center on Brain Programming in Adolescent Vulnerabilities show that consistent rhythms and patterns of maternal care seem to be crucially important for the developing brain, which needs predictable and continuous stimuli to ensure the growth of robust neuron networks. Study results appear today in Translational Psychiatry.

The UCI researchers discovered that erratic maternal care of infants can increase the likelihood of risky behaviors, drug seeking and depression in adolescence and adult life. Because cellphones have become so ubiquitous and users have become so accustomed to frequently checking and utilizing them, the findings of this study are highly relevant to today’s mothers and babies … and tomorrow’s adolescents and adults.

“It is known that vulnerability to emotional disorders, such as depression, derives from interactions between our genes and the environment, especially during sensitive developmental periods,” said Baram, the Danette “Dee Dee” Shepard Chair in Neurological Studies.

“Our work builds on many studies showing that maternal care is important for future emotional health. Importantly, it shows that it is not how much maternal care that influences adolescent behavior but the avoidance of fragmented and unpredictable care that is crucial. We might wish to turn off the mobile phone when caring for baby and be predictable and consistent.”

The UCI team — which included Hal Stern, the Ted & Janice Smith Family Foundation Dean of Information & Computer Sciences — studied the emotional outcomes of adolescent rats reared in either calm or chaotic environments and used mathematical approaches to analyze the mothers’ nurturing behaviors.

Despite the fact that quantity and typical qualities of maternal care were indistinguishable in the two environments, the patterns and rhythms of care differed drastically, which strongly influenced how the rodent pups developed. Specifically, in one environment, the mothers displayed “chopped up” and unpredictable behaviors. During adolescence, their offspring exhibited little interest in sweet foods or peer play, two independent measures of the ability to experience pleasure. Known as anhedonia, the inability to feel happy is often a harbinger of later depression. In humans, it may also drive adolescents to seek pleasure from more extreme stimulation, such as risky driving, alcohol or drugs.

Why might disjointed maternal care generate this problem with the pleasure system? Baram said that the brain’s dopamine-receptor pleasure circuits are not mature in newborns and infants and that these circuits are stimulated by predictable sequences of events, which seem to be critical for their maturation. If infants are not sufficiently exposed to such reliable patterns, their pleasure systems do not mature properly, provoking anhedonia.

With her UCI team, Baram is currently studying human mothers and their infants. Video analysis of care, sophisticated imaging technology to measure brain development, and psychological and cognitive testing are being employed to more fully understand this issue. The goal is to see whether what was discovered in rodents applies to people. If so, then strategies to limit chopped-up and unpredictable patterns of maternal care might prove helpful in preventing emotional problems in teenagers.

Read more

Eat when you’re hungry

Share


With the wide availability in the modern food environment of convenient foods engineered for maximum tastiness, such as potato chips, chocolates, and bacon double cheeseburgers, and with widespread advertising, the contemporary consumer is incessantly bombarded with the temptation to eat. This means that, in contrast to people in traditional societies, people in contemporary societies often eat not on account of hunger but because tasty food is available and beckoning at all hours of the day. New research published in the Journal of the Association for Consumer Research found that the tendency of today’s consumers to eat when they are not hungry might be less advantageous for health than eating when they are hungry.

The study participants were 45 undergraduate students. They were first asked to rate their level of hunger and then to consume a meal rich in carbohydrates. To measure how the meal affected participants’ health, their blood glucose levels were measured at regular intervals after they consumed the meal. Blood glucose levels tend to rise after a meal containing carbohydrates, and it is generally healthier if they rise by a relatively small amount, because elevated blood glucose is damaging to the body’s cells.

The results of the study showed that individuals who were moderately hungry before the meal tended to have lower blood glucose levels after consuming the meal than individuals who were not particularly hungry before consuming the meal. These findings suggest that it might be healthier for individuals to eat when they are moderately hungry than when they are not hungry.

Read more

Winter layer of fat

Share


People have evolved to have subconscious urges to over-eat, and limited ability to avoid becoming obese, especially in winter, a University of Exeter study has found.

According to the researchers, there is not yet an evolutionary mechanism to help us overcome the lure of sweet, fatty and unhealthy food and avoid becoming overweight, and for understandable and sensible reasons.

This is because in our past being overweight has not posed a significant threat to survival compared to the dangers of being underweight. The urge to maintain body fat is even stronger in winter when food in the natural world is scarce. This explains why we enjoy eating so much at Christmas, and our New Year’s resolutions to lose weight usually fail.

Researchers used computer modelling to predict how much fat animals should store, by assuming that natural selection gives animals, including humans, a perfect strategy to maintain the healthiest weight. Their model predicts how the amount of fat an animal stores should respond to food availability and the risk of being killed by a predator when foraging.

The model shows that the animal should have a target body weight above which it loses weight and below which it tries to gain weight. Simulations showed that there is usually only a small negative effect of energy stores exceeding the optimal level, so subconscious controls against becoming overweight would be weak and so easily overcome by the immediate rewards of tasty food.
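To illustrate the kind of trade-off the Exeter model captures, here is a minimal, hypothetical simulation sketch in Python: an animal forages only when its fat reserves fall below a target level, each foraging trip carries a small predation risk, and running out of reserves means starvation. The parameter values, the threshold foraging rule and the assumption that extra fat slightly raises predation risk are all illustrative assumptions, not the published model.

```python
import random

# Illustrative parameters; these are assumptions for the sketch, not the
# values used in the published Exeter model.
FOOD_GAIN = 3.0        # energy gained from a successful foraging bout
DAILY_COST = 1.0       # metabolic cost paid every day
FIND_FOOD_P = 0.6      # chance a foraging trip finds food (lower in winter)
BASE_PREDATION = 0.004 # baseline chance of being killed per foraging trip

def one_winter(target, days=120):
    """Simulate one animal using the rule: forage only when reserves < target.
    Returns True if it survives both starvation and predation."""
    reserves = target  # start at the target level
    for _ in range(days):
        if reserves < target:
            # Assumed cost of fatness: heavier animals are slightly less agile,
            # so carrying extra fat makes each foraging trip marginally riskier.
            if random.random() < BASE_PREDATION * (1.0 + 0.02 * reserves):
                return False                       # killed by a predator
            if random.random() < FIND_FOOD_P:
                reserves += FOOD_GAIN              # may overshoot the target
        reserves -= DAILY_COST
        if reserves <= 0:
            return False                           # starved
    return True

def survival_rate(target, trials=5000):
    return sum(one_winter(target) for _ in range(trials)) / trials

if __name__ == "__main__":
    # Survival collapses if the animal is too reluctant to forage (low target),
    # but barely drops when the target is set generously high: the penalty for
    # carrying too much fat is small, so the pressure against over-eating is weak.
    for target in (2, 6, 10, 14, 18):
        print(target, round(survival_rate(target), 3))
```

Under these assumptions, setting the target too low is catastrophic, while setting it generously high costs very little, which is the asymmetry the researchers argue leaves our subconscious controls against over-eating weak.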

Lead author Dr Andrew Higginson, from the College of Life and Environmental Sciences at the University of Exeter, said: “You would expect evolution to have given us the ability to realise when we have eaten enough, but instead we show little control when faced with artificial food. Because modern food today has so much sugar and flavour the urge humans have to eat it is greater than any weak evolutionary mechanism which would tell us not to.

“The model also predicts animals should gain weight when food is harder to find. All animals, including humans, should show seasonal effects on the urge to gain weight. Storing fat is an insurance against the risk of failing to find food, which for pre-industrial humans was most likely in winter. This suggests that New Year’s Day is the worst possible time to start a new diet.”

The evolutionary model also shows that there is no evidence to support the “drifty gene” hypothesis, which some researchers have previously suggested would explain why some people become overweight and others do not

Read more

The sooner your child exercises after birth, the faster their gut microbiome gets optimized

Share


The human gut harbors a teeming menagerie of over 100 trillion microorganisms, and researchers at the University of Colorado Boulder have discovered that exercising early in life can alter that microbial community for the better, promoting healthier brain and metabolic activity over the course of a lifetime.

The research, which was recently published in the journal Immunology and Cell Biology, indicates that there may be a window of opportunity during early human development to optimize the chances of better lifelong health.

“Exercise affects many aspects of health, both metabolic and mental, and people are only now starting to look at the plasticity of these gut microbes,” said Monika Fleshner, a professor in CU-Boulder’s Department of Integrative Physiology and the senior author of the new study. “That is one of the novel aspects of this research.”

Microbes take up residence within human intestines shortly after birth and are vital to the development of the immune system and various neural functions. These microbes can add as many as 5 million genes to a person’s overall genetic profile and thus have tremendous power to influence aspects of human physiology.

While this diverse microbial community remains somewhat malleable throughout adult life and can be influenced by environmental factors such as diet and sleep patterns, the researchers found that gut microorganisms are especially ‘plastic’ at a young age.

The study found that juvenile rats that voluntarily exercised every day developed a more beneficial microbial structure, including an expansion of probiotic bacterial species in their gut, compared to both their sedentary counterparts and to adult rats, even when the adult rats exercised as well.

The researchers have not, as of yet, pinpointed an exact age range when the gut microbe community is likeliest to change, but the preliminary findings indicate that earlier is better.

A robust, healthy community of gut microbes also appears to promote healthy brain function and provide anti-depressant effects, Fleshner said. Previous research has shown that the human brain responds to microbial signals from the gut, though the exact communication methods are still under investigation.

“Future research on this microbial ecosystem will hone in on how these microbes influence brain function in a long-lasting way,” said Agnieszka Mika, a graduate researcher in CU-Boulder’s Department of Integrative Physiology and the lead author of the new study.

Going forward, the researchers also plan to explore novel means of encouraging positive gut microbe plasticity in adults, who tend to have stable microbial communities that are more resistant to change.

Read more

Skin Cream that makes you healthy and extends your life

Share


A commonly used skin care ingredient is one of several newly identified compounds that can mimic the life-extending effect of a starvation diet, new University of Liverpool research has revealed.

Calorie restriction, a reduction in calorie intake without malnutrition, has been found to slow down the ageing process in several animal models from worms to mammals, and developing drugs that can reproduce this effect, without the side effects, could have widespread human applications.

Now, using complex genetic data analysis and testing, scientists have shown for the first time that allantoin, which is found in botanical extracts of the comfrey plant and is an ingredient of many anti-ageing creams, can mimic the effect of calorie restriction and increase lifespan in worms by more than 20%.

Dr João Pedro de Magalhães, from the University’s Institute of Integrative Biology, who led the study, said: “Calorie restriction has been shown to have health benefits in humans and, while more work is necessary, our findings could potentially result in human therapies for age-related diseases.”

To identify potential calorie restriction mimetic compounds, the team made use of existing molecular signatures from human cells treated with a variety of small-molecule drugs.

Using pattern-matching algorithms to make connections between drug compounds and calorie restriction effects, eleven potential compounds were identified. Five of these were then tested in nematode worms.
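The article does not describe the pattern-matching step in detail, but the general idea behind this kind of signature matching can be sketched as follows: represent the calorie-restriction response and each drug response as gene-level expression changes, then rank compounds by how well their signatures agree. The gene names, numbers and the use of a rank correlation below are illustrative assumptions, not the Liverpool group’s actual pipeline.

```python
from scipy.stats import spearmanr

# Hypothetical log fold-change signatures (gene -> expression change).
cr_signature = {"sirt1": 1.2, "mtor": -0.9, "foxo3": 0.8, "igf1": -1.1, "ampk": 0.7}

drug_signatures = {
    "allantoin": {"sirt1": 0.9, "mtor": -0.7, "foxo3": 0.6, "igf1": -0.8, "ampk": 0.5},
    "drug_x":    {"sirt1": -0.4, "mtor": 0.6, "foxo3": -0.2, "igf1": 0.9, "ampk": -0.3},
    "rapamycin": {"sirt1": 0.3, "mtor": -1.4, "foxo3": 0.5, "igf1": -0.6, "ampk": 0.4},
}

def similarity(drug_sig, reference):
    """Spearman rank correlation between a drug signature and the calorie-
    restriction signature, computed over the genes the two profiles share."""
    genes = sorted(set(drug_sig) & set(reference))
    rho, _ = spearmanr([drug_sig[g] for g in genes], [reference[g] for g in genes])
    return rho

# Rank candidate calorie-restriction mimetics by signature similarity.
ranked = sorted(drug_signatures,
                key=lambda d: similarity(drug_signatures[d], cr_signature),
                reverse=True)
print(ranked)
```

In a screen of this kind, the top-ranked compounds are the ones taken forward for experimental testing, as was done here with the candidates tested in nematode worms.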

The researchers found that worms treated with allantoin, rapamycin, trichostatin A and LY-294002 not only lived longer, but also stayed healthier longer. Additionally, when the same compounds were tested in mutant worms they extended lifespan in a way expected from calorie restriction. Further molecular analysis of allantoin suggests it acts by a different mechanism from rapamycin, a well-known longevity drug.

PhD student Shaun Calvert, who carried out the work, said: “Testing anti-aging interventions in humans is not practical, so developing computational methods to predict longevity drugs is of great use.

“We have shown so far that our compounds work in worms, but studies in mammalian models are now necessary. The next step for us is to understand the mechanisms by which allantoin extends lifespan, as this could reveal new longevity pathways.”

Read more

Too much sleep and too little exercise

Share


Sleeping more than nine hours a night, and sitting too much during the day could be a hazardous combination, particularly when added to a lack of exercise, according to new findings to emerge from the Sax Institute’s 45 and Up Study.

The findings, published in the journal PLOS Medicine, show that a person who sleeps too much, sits too much and isn’t physically active enough is more than four times as likely to die early as a person without those unhealthy lifestyle habits. (Too much sitting equates to more than 7 hours a day and too little exercise is defined as less than 150 minutes a week.)

“Evidence has increased in recent years to show that too much sitting is bad for you and there is growing understanding about the impact of sleep on our health but this is the first study to look at how those things might act together,” said lead author Dr Melody Ding.

“When you add a lack of exercise into the mix, you get a type of ‘triple whammy’ effect. Our study shows that we should really be taking these behaviours together as seriously as we do other risk factors such as levels of drinking and unhealthy eating patterns. ”

Dr Ding and her colleagues from the University of Sydney analysed the health behaviours of more than 230,000 of the participants in the 45 and Up Study — Australia’s largest study — which is looking at the health of our population as we age.

They looked at lifestyle behaviours that are already known to increase the risk of death and disease — smoking, high alcohol intake, poor diet and being physically inactive — and added excess sitting time and too little/too much sleep into the equation. They then looked at different combinations of all of these risk factors to see which groupings had the most impact on a person’s risk of dying prematurely from any cause.
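For readers curious what “looking at different combinations” means in practice, the sketch below enumerates every grouping of risk factors and computes a crude death rate among the participants exposed to that grouping. The factor names and the toy participant records are purely illustrative assumptions; the actual analysis used the full 45 and Up Study cohort with appropriate statistical adjustment, not raw rates like these.

```python
from itertools import combinations

RISK_FACTORS = ["smoking", "high_alcohol", "poor_diet", "inactivity",
                "excess_sitting", "unhealthy_sleep"]

# Toy participant records: the risk factors each person has, and whether they
# died during follow-up (purely illustrative, not study data).
participants = [
    ({"inactivity", "excess_sitting", "unhealthy_sleep"}, True),
    ({"smoking", "high_alcohol"}, False),
    (set(), False),
    ({"inactivity", "unhealthy_sleep"}, False),
]

def death_rate(group):
    """Crude death rate among participants exposed to every factor in `group`."""
    exposed = [died for factors, died in participants if group <= factors]
    return sum(exposed) / len(exposed) if exposed else None

# Enumerate every grouping of risk factors and report the crude death rate for
# each grouping that is actually observed in the data.
for size in range(1, len(RISK_FACTORS) + 1):
    for group in combinations(RISK_FACTORS, size):
        rate = death_rate(set(group))
        if rate is not None:
            print(sorted(group), rate)
```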

As well as new evidence on the risky combination of prolonged sleep, sitting and lack of exercise, the researchers also found another problematic triple threat: smoking, high alcohol intake and lack of sleep (less than 7 hours a night) is also linked to a more than four-times greater risk of early death.

And several other combinations led to more than double the risk of early death:

Being physically inactive + too much sleep
Being physically inactive + too much sitting
Smoking + high alcohol intake
“The take-home message from this research — for doctors, health planners and researchers — is that if we want to design public health programs that will reduce the massive burden and cost of lifestyle-related disease we should focus on how these risk factors work together rather than in isolation,” said study co-author Professor Adrian Bauman.

“These non-communicable diseases (such as heart disease, diabetes and cancer) now kill more than 38 million people around the world — and cause more deaths than infectious disease. Better understanding what combination of risk behaviours poses the biggest threat will guide us on where to best target scarce resources to address this major — and growing — international problem.”

Read more

Healthy sperm, healthy baby

Share


There is increasing evidence that parents’ lifestyle and the environment they inhabit even long before they have children may influence the health of their offspring. A current study, led by researchers from the Novo Nordisk Foundation Center for Basic Metabolic Research, sheds light on how.

Researchers in Associate Professor Romain Barrès’ laboratory compared sperm cells from 13 lean men and 10 obese men and discovered that the sperm cells in lean and obese men, respectively, possess different epigenetic marks that could alter the next generation’s appetite, as reported in the medical journal Cell Metabolism.

A second major discovery was made as researchers followed six men before and one year after gastric-bypass surgery (an effective intervention to lose weight) to find out how the surgery affected the epigenetic information contained in their sperm cells. The researchers observed an average of 4,000 structural changes to sperm cell DNA from the time before the surgery, directly after, and one year later.

“We certainly need to further examine the meaning of these differences; yet, this is early evidence that sperm carries information about a man’s weight. And our results imply that weight loss in fathers may influence the eating behaviour of their future children,” says Romain Barrès.

Inspiration

“Epidemiological observations revealed that acute nutritional stress, e.g. famine, in one generation can increase the risk of developing diabetes in the following generations,” Romain Barrès states. He also referenced a study showing that the availability of food in a small Swedish village during a time of famine correlated with the risk of the villagers’ grandchildren developing cardiometabolic diseases.

The grandchildren’s health was likely influenced by their ancestors’ gametes (sperm or egg), which carried specific epigenetic marks — e.g. chemical additions to the proteins that enclose the DNA, methyl groups that change the structure of the DNA once they are attached, or molecules known as small RNAs. Epigenetic marks can control the expression of genes, which has also been shown to affect the health of offspring in insects and rodents.

Molecular carrier

“In our study, we have identified the molecular carrier in human gametes that may be responsible for this effect,” says Barrès.

By detecting differences in small RNA expressions (where the function is not yet determined) and DNA methylation patterns, the researchers have proven that weight loss can change the epigenetic information men carry in their spermatozoa. In other words, what is transmitted in the father’s sperm can potentially affect the development of a future embryo and, ultimately, it can shape the child’s physiology.

“We did not expect to see such important changes in epigenetic information due to environmental pressure,” says Barrès. “Discovering that lifestyle and environmental factors, such as a person’s nutritional state, can shape the information in our gametes and thereby modify the eating behaviour of the next generation is, to my mind, an important find,” he adds.

Obesity

If we consider this in the context of obesity, a worldwide heritable metabolic disorder that is sensitive to environmental conditions (diet and physical activity), the discovery that weight loss in fathers-to-be potentially affects the eating behaviour of their offspring is ground-breaking.

“Today, we know that children born to obese fathers are predisposed to developing obesity later in life, regardless of their mother’s weight. It’s another critical piece of information that informs us about the very real need to look at the pre-conception health of fathers,” says Ida Donkin, MD and one of the lead authors of the paper. She continues, “And it’s a message we need to disseminate in society.”

“The study raises awareness about the importance of lifestyle factors, particularly our diet, prior to conception. The way we eat and our level of physical activity before we conceive may be important to our future children’s health and development,” says Soetkin Versteyhe, co-first author of the paper.

It is still early days in this field of research, but the study disrupts the current assumption that the only thing our gametes carry is genetic information, and there is nothing we can do about it. Traits that we once thought were inevitable could prove modifiable, and what we do in life may have implications not only for our own health but also the health of our children and even our grandchildren. This work opens up new avenues for investigations of possible intervention strategies to prevent the transmission of disorders such as obesity to future generations.

Read more

What’s your self image?

Share


You’re a careful eater, avoiding high-calorie snacks and meals as a rule. But one day at the lunch counter, instead of ordering the usual salad, you’re tempted by a cheeseburger. Will you give in?

The answer, according to a recent study from the Johns Hopkins Carey Business School, may be influenced by whether you view yourself as more or less of an independent type, and whether you generally try to be ambitious or maintain the status quo.

It’s information that not only could help individuals set goals they may reasonably hope to achieve but also could guide marketers in matching a product to a particular audience.

In their paper, lead author Haiyang Yang of the Johns Hopkins Carey Business School and his two co-authors examine two kinds of “self-construal” — that is, how people view themselves. Someone with an “independent” self-image sees himself as distinct from others, while a person with an “interdependent” view of himself aims to fit into the social structure and maintain harmonious relations with others.

Additionally, the paper identifies two kinds of goals — those of “attainment” and of “maintenance.” Someone with attainment goals seeks to reach a desired state, by losing weight, for example, or adding to a savings account. A person with maintenance goals would seek to keep his weight and savings account at least at their current levels.

Yang and his colleagues say that while previous studies have looked separately at self-construal and goals, their paper is among the first to look at how the two concepts jointly influence consumer behavior. Through six experiments involving more than 2,000 participants in the United States and China, the researchers found that, compared to people with a predominantly interdependent self-construal, those with a predominantly independent self-construal tend to be motivated more by goals of attainment and the accompanying potential for advancement and distinction. The more interdependent individuals, by contrast, tend to be motivated more by maintenance goals that emphasize stability and continuity.

“In one of our studies,” Yang said in an interview, “we observed people’s real-life bodyweight goal pursuit behaviors (that is, losing vs. maintaining bodyweight) over a period of 13 months. We found that people who had fewer social ties, and hence were more independent, were more likely to set the goal of reducing as opposed to maintaining bodyweight. Further, after people set their weight-management goals, the more independent individuals were more motivated, as measured by the amount of money they were willing to bet on their success, to pursue weight-loss goals as opposed to weight-maintenance goals.”

The researchers also found that appeals to a person’s sense of independence or interdependence can influence how goals are set. When study participants were asked about a series of possible actions — adding to a savings account, losing weight, and increasing their college grade-point averages — their motivation for attaining a better state was greater when the actions were posed as benefitting them as individuals, as opposed to benefitting their close social groups (relatives and friends). The opposite pattern emerged for the maintenance-goal version of the actions.

Companies should consider these findings when marketing products and services internationally, with an eye to whether the national culture leans toward independence or interdependence, Yang and his co-authors assert. They state in the paper: “Marketing practitioners should consider engineering purchase environments or consumption contexts to activate respective self-construal, nudging consumers toward goals congruent with firms’ marketing objectives and hence increasing the likelihood of consumers’ adoption of those consumption goals.”

The researchers further assert that consumers can apply the same kind of leverage to themselves by matching their goals to their self-construal (as independent or interdependent people), thus increasing the motivation to bring their actions to successful and satisfying conclusions.

Read more

Genes that extend your life

Share


Driven by the quest for eternal youth, humankind has spent centuries obsessed with the question of how it is exactly that we age. With advancements in molecular genetic methods in recent decades, the search for the genes involved in the aging process has greatly accelerated.

Until now, this search was mostly limited to the genes of individual model organisms such as the C. elegans nematode, which revealed that around one percent of its genes could influence life expectancy. However, researchers have long assumed that such genes arose in the course of evolution and are present in all living beings whose cells have a nucleus — from yeast to humans.

Combing through 40,000 genes

Researchers at ETH Zurich and the JenAge consortium from Jena have now systematically gone through the genomes of three different organisms in search of the genes associated with the aging process that are present in all three species — and thus derived from the genes of a common ancestor. Although they are found in different organisms, these so-called orthologous genes are closely related to each other, and they are all found in humans, too.

In order to detect these genes, the researchers examined around 40,000 genes in the nematode C. elegans, the zebrafish and the mouse. By screening them, the scientists wanted to determine which genes are regulated in an identical manner in all three organisms at each comparable aging stage — young, mature and old; i.e. whether they are upregulated or downregulated during aging.

As a measure of gene activity, the researchers measured the amount of messenger RNA (mRNA) molecules found in the cells of these animals. mRNA is the transcript of a gene and the blueprint of a protein. When there are many copies of an mRNA from a specific gene, the gene is very active, or upregulated. Fewer mRNA copies, by contrast, are regarded as a sign of low activity, explains Professor Michael Ristow, coordinating author of the recently published study and Professor of Energy Metabolism at ETH Zurich.

Out of this volume of information, the researchers used statistical models to establish an intersection of genes that were regulated in the same manner in the worms, fish and mice. This showed that the three organisms have only 30 genes in common that significantly influence the aging process.
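
To give a rough sense of the kind of cross-species comparison described above, here is a minimal sketch in Python. It only illustrates the intersection step (keeping genes whose regulation points in the same direction in all three species) and uses made-up gene labels; the actual analysis relied on statistical models applied to measured mRNA levels.

```python
# Illustrative sketch (not the JenAge pipeline): find orthologous genes that are
# regulated in the same direction -- up or down -- in all three species during aging.
# Gene IDs and the regulation calls are hypothetical placeholders.

def shared_aging_genes(regulation_by_species):
    """regulation_by_species maps species -> {orthologous_gene: 'up' or 'down'}."""
    species = list(regulation_by_species)
    first = regulation_by_species[species[0]]
    shared = {}
    for gene, direction in first.items():
        # Keep the gene only if every species reports the same direction.
        if all(regulation_by_species[s].get(gene) == direction for s in species[1:]):
            shared[gene] = direction
    return shared

regulation = {
    "nematode":  {"bcat-1": "up", "geneA": "down", "geneB": "up"},
    "zebrafish": {"bcat-1": "up", "geneA": "down", "geneB": "down"},
    "mouse":     {"bcat-1": "up", "geneA": "down"},
}

print(shared_aging_genes(regulation))  # {'bcat-1': 'up', 'geneA': 'down'}
```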

Reduce gene activity, live longer

By conducting experiments in which the mRNA of the corresponding genes was selectively blocked, the researchers pinpointed their effect on the aging process in nematodes. For a dozen of these genes, blocking them extended the lifespan by at least five percent.

One of these genes proved to be particularly influential: the bcat-1 gene. “When we blocked the effect of this gene, it significantly extended the mean lifespan of the nematode by up to 25 percent,” says Ristow.

The researchers were also able to explain how this gene works: the bcat-1 gene carries the code for the enzyme of the same name, which degrades so-called branched-chain amino acids. These protein building blocks occur naturally in food and include the amino acids L-leucine, L-isoleucine and L-valine.

When the researchers inhibited the gene activity of bcat-1, the branched-chain amino acids accumulated in the tissue, triggering a molecular signalling cascade that increased longevity in the nematodes. Moreover, the timespan during which the worms remained healthy was extended. As a measure of vitality, the researchers measured the accumulation of aging pigments, the speed at which the creatures moved, and how often the nematodes successfully reproduced. All of these parameters improved when the scientists inhibited the activity of the bcat-1 gene.

The scientists also achieved a life-extending effect when they mixed the three branched-chain amino acids into the nematodes’ food. However, the effect was generally less pronounced because the bcat-1 gene was still active, which meant that the amino acids continued to be degraded and their life-extending effects could not develop as effectively.

Conserved mechanism

Ristow has no doubt that the same mechanism occurs in humans. “We looked only for the genes that are conserved in evolution and therefore exist in all organisms, including humans,” he says.

In the present study, he and his Jena colleagues from the Leibniz Institute on Aging, the Leibniz Institute for Natural Product Research and Infection Biology, the Jena University Hospital and the Friedrich Schiller University purposefully opted not to study the impact on humans. But a follow-up study is already being planned. “However, we cannot measure the life expectancy of humans for obvious reasons,” says the ETH professor. Instead, the researchers plan to incorporate various health parameters such as cholesterol or blood sugar levels in their study to obtain indicators of the health status of their subjects.

Health costs could be massively reduced

Ristow says that branched-chain amino acids are already being used to treat liver damage and are also added to sports nutrition products. “However, the point is not for people to grow even older, but rather to stay healthy for longer,” says the internist. The study will deliver important indicators of how the aging process could be influenced and how age-related diseases such as diabetes or high blood pressure could be prevented. In light of unfavourable demographics and steadily increasing life expectancy, it is important to extend the healthy life phase rather than simply reach an even higher age marked by chronic diseases, the researchers argue. With such preventive measures, an elderly person could greatly improve their quality of life while at the same time cutting their healthcare costs by more than half.

Read more

What is your psychological relationship with food?

Share


Tens of millions of Americans vow each year to lose weight in the New Year, and while their intentions are good, most of the time their results are not. It’s estimated that only 8 percent of those who make New Year’s resolutions actually keep them.

Even if weight is lost initially, it usually returns. Studies show nearly 2 out of 3 people who lose 5 percent of their total weight will gain it back, and the more weight you lose, the lower your chances of keeping it off.

“That’s not surprising,” said Diane Robinson, PhD, a neuropsychologist and Program Director of Integrative Medicine at Orlando Health. “Most people focus almost entirely on the physical aspects of weight loss, like diet and exercise. But there is an emotional component to food that the vast majority of people simply overlook and it can quickly sabotage their efforts.”

A recent national survey of more than a thousand people commissioned by Orlando Health found that 31 percent of Americans think a lack of exercise is the biggest barrier to weight loss, followed by those who say it’s what you eat (26%) and the cost of a healthy lifestyle (17%). Another 12 percent said the biggest barrier to weight loss was the necessary time commitment.

Only 1 in 10, however, thought psychological well-being was a factor. “That may explain why so many of us struggle,” said Robinson. “In order to lose weight and keep it off long term, we need to do more than just think about what we eat, we also need to understand why we’re eating.”

From a very young age we’re emotionally attached to food. As children we’re often given treats, both to console us when we’re upset, and to reward us for good behavior. Most celebrations, like Halloween, Thanksgiving and Valentine’s Day are food-focused, and birthdays are spent sharing cake. Even the mere smell of certain foods, like cookies in grandma’s oven, can create powerful emotional connections that last a lifetime.

“If we’re aware of it or not, we are conditioned to use food not only for nourishment, but for comfort,” said Robinson. “That’s not a bad thing, necessarily, as long as we acknowledge it and deal with it appropriately.”

Whenever the brain experiences pleasure, for any reason, it reacts the same way.

Whether it’s derived from drugs, a romantic encounter or a satisfying meal, the brain releases a neurotransmitter known as dopamine. “We feel good whenever that process is activated,” said Robinson, “but when we start to put food into that equation and it becomes our reward, it can have negative consequences.”

In fact, researchers have found a link between emotional issues like stress, anxiety and depression, and higher body mass indexes (BMI). Many of us can relate to the idea of overindulging at happy hour after a bad day at the office, for example, or eating a pint of ice cream to help us deal with bad news.

That was a common coping mechanism for Shekyra DeCree, of Columbus, Ohio. “As a mental health therapist, my job can be very stressful, and every day when I got home from work, the first thing I would do is go to the refrigerator,” she said. “That was my way to calm down and relax.”

After recognizing the emotional attachment she had with food, DeCree started making conscious changes. In just over one year, she’s lost more than 100 pounds.

“I’d gone on countless diets and tried to exercise before, but this was different,” she said. “You have to change the way you deal with your emotions, your stress and anxiety. Once I understood the mental aspect, I felt free.”

Robinson offers these tips to help recognize the emotional connection you may have to food:

Keep a daily diary logging your food and your mood, and look for unhealthy patterns. Identify foods that make you feel good and write down why you eat them. Do they evoke a memory, or are you craving those foods out of stress? Before you have any snack or meal, ask yourself: Am I eating this because I’m hungry? If the answer is no, look for the root of your motive.

The goal is to take emotion out of eating and see food as nourishment, not as a reward or coping mechanism. If you struggle, don’t be shy about finding help. “When we’re focused on the physical aspects of weight loss, many of us have no problem joining a gym or hiring a trainer,” said Robinson. “How about joining a support group or hiring a psychologist?” she said. “If getting your body in shape hasn’t worked out yet, maybe this time start with your mind.”

Read more

Individualized diets based on your microbiome

Share


Ever wonder why that diet didn’t work? An Israeli study tracking the blood sugar levels of 800 people over a week suggests that even if we all ate the same meal, how it’s metabolized would differ from one person to another. The findings, published November 19 in Cell, demonstrate the power of personalized nutrition in helping people identify which foods can help or hinder their health goals.

Blood sugar has a close association with health problems such as diabetes and obesity, and it’s easy to measure using a continuous glucose monitor. A standard developed decades ago, called the glycemic index (GI), is used to rank foods based on how they affect blood sugar level and is a factor used by doctors and nutritionists to develop healthy diets. However, this system was based on studies that average how small groups of people responded to various foods.

The new study, led by Eran Segal and Eran Elinav of the Weizmann Institute of Science in Israel, found that the GI of any given food is not a set value, but depends on the individual. For all participants, they collected data through health questionnaires, body measurements, blood tests, glucose monitoring, stool samples, and a mobile app used to report lifestyle and food intake (a total of 46,898 meals were measured). In addition, the volunteers received a few standardized, identical meals for their breakfasts.

As expected, age and body mass index (BMI) were found to be associated with blood glucose levels after meals. However, the data also revealed that different people show vastly different responses to the same food, even though their individual responses did not change from one day to another.
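
As a rough illustration of how a single post-meal response can be quantified from continuous glucose monitor readings, the sketch below computes an incremental area under the glucose curve above the pre-meal baseline. This is a common textbook measure, not the scoring used in the Weizmann study, and the readings are hypothetical.

```python
# Minimal sketch: incremental area under the curve (iAUC) above the pre-meal
# baseline, computed with the trapezoidal rule. Illustrative only; not the
# study's actual analysis.

def incremental_auc(times_min, glucose_mg_dl):
    """times_min: sample times in minutes; glucose_mg_dl: glucose readings."""
    baseline = glucose_mg_dl[0]                       # pre-meal value
    excess = [max(g - baseline, 0) for g in glucose_mg_dl]
    auc = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        auc += (excess[i] + excess[i - 1]) / 2 * dt   # trapezoid for this interval
    return auc                                        # units: mg/dL * min

# Two people eating the same meal can show very different responses:
times = [0, 15, 30, 45, 60, 90, 120]
person_a = [90, 110, 140, 150, 130, 105, 92]
person_b = [90, 95, 100, 104, 100, 96, 92]
print(incremental_auc(times, person_a))  # large spike
print(incremental_auc(times, person_b))  # much flatter response
```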

“Most dietary recommendations that one can think of are based on one of these grading systems; however, what people didn’t highlight, or maybe they didn’t fully appreciate, is that there are profound differences between individuals–in some cases, individuals have opposite response to one another, and this is really a big hole in the literature,” says Segal, of Weizmann’s Department of Computer Science and Applied Math.

“Measuring such a large cohort without any prejudice really enlightened us on how inaccurate we all were about one of the most basic concepts of our existence, which is what we eat and how we integrate nutrition into our daily life,” says Elinav, of Weizmann’s Department of Immunology. “In contrast to our current practices, tailoring diets to the individual may allow us to utilize nutrition as means of controlling elevated blood sugar levels and its associated medical conditions.”

Moving Toward Personalized Nutrition

Compliance can be the bane of nutrition studies. Their outcomes rely on participants, away from the laboratory, rigidly following a diet and honestly recording their food intake. In the Weizmann study, the participants (representing a cross-section of Israel’s population and all unpaid) were asked to disrupt their weekly routine in two ways: They were to eat a standardized breakfast such as bread or glucose each morning and also enter all of their meals into a mobile app food diary. In return, the researchers would provide an analysis of the participants’ personalized responses to food, which relied on strict adherence to the protocol. Elinav and Segal say this proved to be a strong motivator, and participant meal reporting closely matched the biometric data obtained from their glucose monitors.

The individualized feedback yielded many surprises. In one case, a middle-aged woman with obesity and pre-diabetes, who had tried and failed to see results with a range of diets over her life, learned that her “healthy” eating habits may have actually been contributing to the problem. Her blood sugar levels spiked after eating tomatoes, which she ate multiple times over the course of the week of the study.

“For this person, an individualized tailored diet would not have included tomatoes but may have included other ingredients that many of us would not consider healthy, but are in fact healthy for her,” Elinav says. “Before this study was conducted, there is no way that anyone could have provided her with such personalized recommendations, which may substantially impact the progression of her pre-diabetes.”

To understand why such vast differences exist between people, the researchers conducted microbiome analyses on stool samples collected from each study participant. Growing evidence suggests gut bacteria are linked to obesity, glucose intolerance, and diabetes, and the study demonstrates that specific microbes indeed correlate with how much blood sugar rises post-meal. By conducting personalized dietary interventions among 26 additional study participants, the researchers were able to reduce post-meal blood sugar levels and alter gut microbiota. Interestingly, although the diets were personalized and thus greatly different across participants, several of the gut microbiota alterations were consistent across participants.

“After seeing this data, I think about the possibility that maybe we’re really conceptually wrong in our thinking about the obesity and diabetes epidemic,” says Segal. “The intuition of people is that we know how to treat these conditions, and it’s just that people are not listening and are eating out of control–but maybe people are actually compliant but in many cases we were giving them wrong advice.”

“It’s common knowledge among dieticians and doctors that their patients respond very differently to assigned diets,” he adds. “We can see in the data that the same general recommendations are not always helping people, and my biggest hope is that we can move this boat and steer it in a different direction.”

The researchers hope to translate what was learned in this basic research project so that it can be applied to a broader audience through further algorithmic developments that would reduce the number of inputs that are needed in order to provide people with personalized nutritional reports.

Read more

Small behavioral changes improve overall health

Share


Improving your heart health may be as simple as making small behavioral changes — a new study of behavioral health interventions suggests that they are effective at helping people alter their lifestyles and lead to physical changes that could improve overall health.

The findings also indicate a shift is needed in the way such interventions are evaluated by researchers and used by health care providers, said Veronica Irvin of Oregon State University, a co-author of the study just published in the Annals of Behavioral Medicine.

Behavioral treatments, such as individual counseling or group training to improve nutrition or physical activity, reduce or stop smoking, or adhere to a drug treatment plan, are often overlooked because medical care providers tend to believe it is too difficult for people to make changes to their established lifestyles, said Irvin, an assistant professor in the College of Public Health and Human Sciences at OSU.

But large clinical drug trials for potential new medications often fail to show that those treatments make patients better, and drugs sometimes are associated with undesirable side effects, she said. Modification of health behavior is another option for health providers and their patients, Irvin explained, but is underutilized in clinical medical practice as well as in public health policy because many providers remain unconvinced that people can change their behavior to improve their health.

She and her co-author, Robert M. Kaplan of the Agency for Healthcare Research and Quality, conducted a comprehensive and systematic review of large-budget studies funded by the National Institutes of Health that involved behavioral interventions such as individual counseling or group training to improve nutrition or physical activity, reduce or stop smoking, or adhere to a drug treatment plan.

More than 80 percent of the randomized clinical trials that included a behavioral intervention reported a significant improvement in the targeted behavior and a significant physiological impact, such as a reduction in weight or blood pressure. Greater improvements were observed when the intervention simultaneously targeted two behaviors, such as nutrition and physical activity, which are considered lifestyle behaviors.

“This research suggests that behavioral interventions should be taken more seriously,” Irvin said. “It indicates that people are able to achieve realistic behavioral changes and improve their cardiovascular health.”

But the researchers also noted that few of the studies documented morbidity and mortality outcomes that are often required for drug trials. Previous research by Irvin and Kaplan found that most drug trials fail to reduce mortality. Behavioral interventions should be studied in a similar fashion, Irvin said.

“There are more positive outcomes with these trials, but they don’t often measure mortality,” Irvin said. “The next step for behavioral trials should be to measure results using clinical outcomes, such as the number of heart attacks and hospitalizations, experienced by participants.”

Most behavior interventions reviewed for the study showed benefits using surrogate markers for these kinds of clinical events. For example, treatments for high cholesterol have the goal of reducing heart attacks and extending life. Measures of cholesterol are surrogate markers because they are believed to be related to the clinical goal of reducing deaths.

But the surrogate markers are not always predictive of clinical outcomes, which is a potential concern for medical researchers. Future behavioral trials should investigate these clinical events as they would be in a traditional drug trial, Irvin said.

In this study, 17 trials reported a morbidity outcome, with seven showing a significant effect on reducing morbidity outcomes such as hospitalization or cardiovascular events.

Irvin and Kaplan began work on the study while the two worked together in the National Institutes of Health’s Office of Behavior and Social Science Research. They reviewed all large-budget clinical trials evaluating behavioral interventions for the treatment or prevention of cardiovascular disease that had received funding from the National Heart, Lung and Blood Institute or the National Institute of Diabetes & Digestive and Kidney Diseases between 1980 and 2012.

In all, 38 studies were included in the research. The researchers did not include 20 large-budget trials from the same period because no results from those trials had been published.

This underscores the need for more publication of research even if the outcomes were not as expected, Irvin said. Publishing these null outcomes prevents the unnecessary replication of studies and also may inform doctors and patients about which treatments are not likely to be helpful.

Read more

Confidence is established early on

Share


By age 5 children have a sense of self-esteem comparable in strength to that of adults, according to a new study by University of Washington researchers.

Because self-esteem tends to remain relatively stable across one’s lifespan, the study suggests that this important personality trait is already in place before children begin kindergarten.

“Our work provides the earliest glimpse to date of how preschoolers sense their selves,” said lead author Dario Cvencek, a research scientist at the UW’s Institute for Learning & Brain Sciences (I-LABS).

“We found that as young as 5 years of age self-esteem is established strongly enough to be measured,” said Cvencek, “and we can measure it using sensitive techniques.”

In the new study, published in the January 2016 issue of the Journal of Experimental Social Psychology, the researchers used a newly developed test to assess implicit self-esteem in more than 200 5-year-old children — the youngest age yet to be measured.

“Some scientists consider preschoolers too young to have developed a positive or negative sense about themselves. Our findings suggest that self-esteem, feeling good or bad about yourself, is fundamental,” said co-author, Andrew Meltzoff, co-director of I-LABS. “It is a social mindset children bring to school with them, not something they develop in school.”

Meltzoff continued: “What aspects of parent-child interaction promote and nurture preschool self-esteem? That’s the essential question. We hope we can find out by studying even younger children.”

Until now no measurement tool has been able to detect self-esteem in preschool-aged children. This is because existing self-esteem tests require the cognitive or verbal sophistication to talk about a concept like “self” when asked probing questions by adult experimenters.

“Preschoolers can give verbal reports of what they’re good at as long as it is about a narrow, concrete skill, such as ‘I’m good at running’ or ‘I’m good with letters,’ but they have difficulties providing reliable verbal answers to questions about whether they are a good or bad person,” Cvencek said.

To try a different approach, Cvencek, Meltzoff and co-author Anthony Greenwald created a self-esteem task for preschoolers. Called the Preschool Implicit Association Test (PSIAT), it measures how strongly children feel positively about themselves.

Adult versions of the IAT, which was first developed by Greenwald, can reveal attitudes and beliefs that people don’t know they have, such as biases related to race, gender, age and other topics.

“Previously we understood that preschoolers knew about some of their specific good features. We now understand that, in addition, they have a global, overall knowledge of their goodness as a person,” said Greenwald.

The task for adults works by measuring how quickly people respond to words in different categories. For instance, the adult implicit self-esteem task measures associations between words like “self” and “pleasant” or “other” and “unpleasant.”

To make the task appropriate for preschoolers who can’t read, the researchers replaced words related to the self (“me,” “not me”) with objects. They used small unfamiliar flags, and the children were told which of the flags were “yours” and “not yours.”

The 5-year-olds in the experiment–which included an even mix of 234 boys and girls from the Seattle area–first learned to distinguish their set of flags (“me”) from another set of flags (“not me”).

Using buttons on a computer, they responded to a series of “me” and “not me” flags and to a series of “good” words from a loudspeaker (fun, happy, good, nice) and “bad” words (bad, mad, mean, yucky). Then, to measure self-esteem, the children had to combine the words and press the buttons to indicate whether the “good” words were associated more with the “me” flags or not.
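
The logic of such an implicit association measure can be sketched in a few lines: self-esteem is inferred from how much faster responses are when “me” shares a response button with “good” words than when it shares one with “bad” words. The snippet below is a simplified, D-score-style illustration on hypothetical reaction times, not the scoring procedure used for the PSIAT.

```python
# Illustrative sketch of the core IAT idea: a positive score means "me" is
# associated more strongly with "good". Hypothetical reaction times; this is
# not the published PSIAT scoring pipeline.

from statistics import mean, stdev

def implicit_association_score(congruent_rts_ms, incongruent_rts_ms):
    """Difference in mean latency, scaled by the variability of all responses."""
    pooled = congruent_rts_ms + incongruent_rts_ms
    return (mean(incongruent_rts_ms) - mean(congruent_rts_ms)) / stdev(pooled)

# Hypothetical trials: "me + good" paired block vs. "me + bad" paired block.
me_good_block = [750, 820, 790, 760, 810]   # faster: congruent pairing
me_bad_block  = [930, 980, 940, 1010, 960]  # slower: incongruent pairing

print(round(implicit_association_score(me_good_block, me_bad_block), 2))
```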

The results showed that the 5-year-olds associated themselves more with “good” than with “bad,” and this was equally pronounced in both girls and boys.

The researchers also did two more implicit tests to probe different aspects of the self. A gender identity task assessed the children’s sense of whether they are a boy or a girl, and a gender attitude task measured the children’s preference for other children of their own gender, called a “gender in-group preference.”

Children who had high self-esteem and strong own-gender identity also showed stronger preferences for members of their own gender.

Taken together, the findings show that self-esteem is not only unexpectedly strong in children this young, but is also systematically related to other fundamental parts of children’s personality, such as in-group preferences and gender identity.

“Self-esteem appears to play a critical role in how children form various social identities. Our findings underscore the importance of the first five years as a foundation for life,” Cvencek said.

The researchers are following up with the children in the study to examine whether self-esteem measured in preschool can predict outcomes later in childhood, such as health and success in school. They are also interested in the malleability of children’s self-esteem and how it changes with experience.

Read more

Telomeres contribute to inflammatory signaling

Share


Our chromosomes contain all of our genetic information, and it’s up to telomeres — structures of DNA and protein that cap off and protect the tips of chromosomes — to preserve the vital instructions necessary for life. There are even specific molecules, such as TERRA (telomeric repeat-containing RNA), that exist to regulate telomeres and promote chromosome end protection.

Now, a new study has found that TERRA can be found outside of cells and serve as a potentially important cell signaling molecule that induces an inflammatory response, and this activity may play an important role in the development of cancer.

The new findings, showing the first evidence of TERRA existing outside of cells, were published online by the Proceedings of the National Academy of Sciences.

Telomeres function very similarly to aglets, the plastic tips that are found on shoelaces. Like aglets, telomeres prevent chromosomes from “unraveling.” Telomeres are made up of short segments of DNA, and as we get older, they gradually erode. Shortening of telomeres is linked to age-related diseases like cancer.

TERRA is an example of an RNA that is found at telomeres. The traditional definition of RNA has it acting as an intermediate molecule that carries the code from DNA and translates that code into proteins for the cell. TERRA, however, is a form of non-coding RNA that is transcribed from the telomere itself and exerts its own effects in the cell, such as influencing how the cell’s DNA is actually read.

In prior studies, researchers from Wistar and other institutions observed that when DNA in telomeres is lost, small proteins called inflammatory cytokines are secreted from the senescent cell. These proteins promote inflammation as a response to stress and are often a hallmark of several chronic illnesses. Wistar scientists wanted to know how telomere dysfunction was linked to this inflammatory behavior.

“Nucleic acid structures like TERRA can have powerful effects on immune pathways, and since we know there is a link between telomere dysfunction and inflammation, it made sense to study TERRA in much greater detail,” said Zhuo Wang, a trainee in the laboratory of Paul Lieberman, Ph.D., at The Wistar Institute and first author of the study. “What we found was a completely new function of TERRA outside of telomeres that can provide us with more information as to how this molecule is linked to the development of certain types of cancer.”

Wistar scientists found TERRA in the extracellular environment, outside the nucleus of cells, while looking for the molecule in a mouse model of medulloblastoma (a type of brain cancer), in the developing embryonic brain, and in human tissue culture cell lines. In blood plasma, TERRA was one of the most highly represented RNA molecules found among the nucleic acids circulating outside of cells. The TERRA found outside of cells is actually shorter and more stable than that found inside cells.

Why does this cell-free TERRA leave the cell in the first place? The researchers believe that it’s performing a signaling function and specifically signaling that cellular stress or damage has occurred. This prompts the inflammatory response and the release of cytokines.

“This is the first time we’ve observed TERRA outside of cells, but it underscores our lab’s previous research into this RNA molecule and its connection to aging and cancer,” said Lieberman, professor and program leader of the Gene Expression and Regulation program, director of the Center for Chemical Biology and Translational Medicine, and the Hilary Koprowski, M.D., endowed professor at Wistar. “Shortened telomeres have been associated with inflammation, but now we have shown that telomeres directly or indirectly contribute to inflammatory signaling.”

Lieberman said the next steps based on these findings involve assessing the impact of TERRA secretion on the tissue microenvironment and cancer development. They are already coordinating efforts with Jose Conejo-Garcia, M.D., Ph.D., professor and program leader of the Tumor Microenvironment and Metastasis program at Wistar, to use an ovarian cancer mouse model to study this question. They also plan to analyze tumor tissue samples from Wistar’s collaborative partners at the Helen F. Graham Cancer Center of Christiana Care Health System in Delaware to study how TERRA correlates with cancer development. Additionally, Wistar is actively seeking partners interested in translating these novel discoveries into companion diagnostics for therapeutics targeting telomerase.

Read more

Overindulgence

Share


We hate to ruin Thanksgiving, but a new report appearing in the Nov. 2015 issue of The FASEB Journal suggests that for some people, overindulgence at the dinner table or at snack time is enough to trigger signs of metabolic disease. Specifically, a single high-calorie shake was enough to worsen markers in people who already had metabolic disease, while in others, relatively short periods of overeating triggered the beginnings of metabolic disease. This information could be particularly useful for health care providers, nutritionists, and others who counsel people on disease prevention and eating habits.

“Acute effects of diet are mostly small, but may have large consequences in the long run,” said Suzan Wopereis, Ph.D., a researcher involved in the work from TNO, Microbiology and Systems Biology Group in Zeist, The Netherlands. “Our novel approach allows detection of small but relevant effects, thereby contributing to the urgently needed switch from disease-care to health-care, aiming for a life-long optimal health and disease prevention.”

To make this discovery, Wopereis and colleagues studied two groups of male volunteers. The first group included 10 healthy men. The second group included nine volunteers with metabolic syndrome, who had a combination of two or more risk factors for heart disease such as unhealthy cholesterol levels, high blood pressure, high blood sugar, high blood lipids, and abdominal fat. Both groups had blood samples taken before and after consuming a high-fat milkshake, and in these samples the researchers measured 61 biomarkers, such as cholesterol and blood sugar. They found that biochemical processes related to sugar metabolism, fat metabolism and inflammation were abnormal in the subjects with metabolic syndrome.

The 10 healthy volunteers were then given a snack diet consisting of an additional 1300 kcal per day, in the form of sweets and savory products such as candy bars, tarts, peanuts and crisps, for four weeks, and the response of the same 61 biomarkers to the milkshake challenge was evaluated again. Signaling molecules such as the hormones controlling sugar and fat metabolism and inflammation had changed, resembling the very subtle start of the negative health effects seen in people with metabolic disease.

“Eating junk food is one of those situations where our brains say ‘yes’ and our bodies say ‘no,'” said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. “Unfortunately for us, this report shows that we need to use our brains and listen to our bodies. Even one unhealthy snack has negative consequences that extend far beyond any pleasure it brings.”

Read more

Depression: the early years

Share


Early life stress is a major risk factor for later episodes of depression. In fact, adults who are abused or neglected as children are almost twice as likely to experience depression.

Scientific research into this link has revealed that the increased risk following such childhood adversity is associated with sensitization of the brain circuits involved with processing threat and driving the stress response. More recently, research has begun to demonstrate that in parallel to this stress sensitization, there may also be diminished processing of reward in the brain and associated reductions in a person’s ability to experience positive emotions.

Researchers at Duke University and the University of Texas Health Sciences Center at San Antonio looked specifically at this second phenomenon in a longitudinal neuroimaging study of adolescents, in order to better understand how early life stress contributes to depression.

They recruited 106 adolescents, between the ages of 11 and 15, who underwent an initial magnetic resonance imaging scan, along with measurements of mood and neglect. The study participants then had a second brain scan two years later.

The researchers focused on the ventral striatum, a deep brain region that is important for processing rewarding experiences as well as generating positive emotions, both of which are deficient in depression.

“Our analyses revealed that over a two-year window during early to mid-adolescence, there was an abnormal decrease in the response of the ventral striatum to reward only in adolescents who had been exposed to emotional neglect, a relatively common form of childhood adversity where parents are persistently emotionally unresponsive and unavailable to their children,” explained first author Dr. Jamie Hanson.

“Importantly, we further showed that this decrease in ventral striatum activity predicted the emergence of depressive symptoms during this key developmental period,” he added. “Our work is consistent with other recent studies finding deficient reward processing in depression, and further underscores the importance of considering such developmental pathways in efforts to protect individuals exposed to childhood adversity from later depression.”

This study suggests that, in some people, early life stress compromises the capacity to experience enthusiasm or pleasure. In addition, the effect of early life stress may grow over time so that people who initially appear resilient may develop problems later in life.

“This insight is important because it suggests a neural pathway through which early life stress may contribute to depression,” said Dr. John Krystal, Editor of Biological Psychiatry. “This pathway might be targeted by neural stimulation treatments. Further, it suggests that survivors of early life trauma and their families may benefit from learning about the possibility of consequences that might appear later in life. This preparation could help lead to early intervention.”

Read more

Nonverbal communication

Share


When people talk or sing, they often nod, tilt or bow their heads to reinforce verbal messages. But how effective are these head gestures at conveying emotions?

Very effective, according to researchers from McGill University in Montreal. Steven R. Livingstone and Caroline Palmer, from McGill’s Department of Psychology, found that people were highly accurate at judging emotions based on head movements alone, even in the absence of sound or facial expressions.

This finding suggests that visual information about emotional states available in head movements could aid in the development of automated emotion recognition systems or human-interaction robots, the researchers say. Expressive robots could potentially serve a range of functions, particularly where face-to-face communication is important, such as at hotel reception desks and as interactive care robots for the elderly.

Tracking movement, not sound

Using motion-capture equipment to track people’s head movements in three dimensions, Livingstone and Palmer recorded vocalists while they spoke or sang with a variety of emotions. The researchers then presented these video clips to viewers without any sound, with the facial expressions of vocalists hidden so that only their head movements were visible. Viewers were then asked to identify the emotions that the vocalists intended to convey.

“We found that when people talk, the ways in which they move their head reveal the emotions that they’re expressing. We also found that people are remarkably accurate at identifying a speaker’s emotion, just by seeing their head movements,” says Palmer, who holds the Canada Research Chair in Cognitive Neuroscience of Performance.

Research idea emerged from a noisy pub

“While the head movements for happy and sad emotions differed, they were highly similar across speech and song, despite differences in vocal acoustics,” says Livingstone, a former postdoctoral fellow in the Palmer lab and now a postdoctoral fellow at McMaster University. “Although the research was based on North American English speakers, the focus on head movements creates the possibility for studying emotional communication in contexts where different languages are spoken.”

The idea for the study emerged from a noisy pub. “One night in Montreal I was in a bar with my lab mates,” explains Livingstone. “It was a lively evening, with lots of people, dim lights, and some very loud music. At one point my friend started to talk to me; I knew he was excited though I couldn’t make out what he was saying or see his face clearly. Suddenly I realized it was the animated way that he was bobbing his head that told me what he was trying to say.”

Adds Palmer, “Our discovery may lead to new applications in situations where sound is not available, such as automated recognition of emotional states in crowd behavior or in hearing impairments, by making use of head movements when watching someone talk. It also has applications in computing and robotics, where the addition of expressive head movements may help make humanoid robots more lifelike and approachable.”

Read more

Making permanent memories

Share


Rehearsing information immediately after being given it may be all you need to make it a permanent memory, a University of Sussex study suggests.

Psychologists found that the same area of the brain activated when laying down a memory is also activated when rehearsing that memory.

The findings, published on 27 October 2015 in the Journal of Neuroscience, have implications for any situation in which accurate recall of an event is critical, such as witnessing an accident or crime.

The study showed that the brain region known as the posterior cingulate – an area whose damage is often seen in those with Alzheimer’s – plays a crucial role in creating permanent memories.

This region not only helps us to recall the episodic details of an event but also integrates the memory into our knowledge and understanding, which makes it resistant to forgetting.

The study involved showing participants 26 short YouTube clips, each around 40 seconds long and with a narrative element. For example, one called “nasty neighbours” depicted two men playing practical jokes on each other.

For 20 of the videos, the participants were given around 40 seconds after each video to recount details of the clip, either in their heads or out loud. For the remaining six videos, this rehearsal period was not given.

Up to two weeks later, participants were still able to recall many details of the videos they had rehearsed, whereas the non-rehearsed videos were largely forgotten.

MRI scans revealed that the same area of the brain – the posterior cingulate – was most associated with the benefits of rehearsal. Here, the degree to which brain activity matched when watching and rehearsing the videos predicted how well the videos were remembered a whole week later.
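
As a toy illustration of the general idea, the sketch below correlates a region’s activity pattern during watching with its pattern during rehearsal and pairs that similarity with a later recall score. The numbers are invented, and this is not the study’s actual fMRI analysis.

```python
# Toy sketch: does watch-vs-rehearse pattern similarity track later memory?
# Hypothetical 4-"voxel" patterns and recall scores; illustrative only.

from statistics import mean

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# One row per video: activity pattern while watching, while rehearsing,
# and a later recall score.
videos = [
    {"watch": [1.0, 0.2, 0.8, 0.1], "rehearse": [0.9, 0.3, 0.7, 0.2], "recall": 0.85},
    {"watch": [0.4, 0.9, 0.1, 0.6], "rehearse": [0.5, 0.2, 0.9, 0.1], "recall": 0.30},
]

for v in videos:
    similarity = pearson_r(v["watch"], v["rehearse"])
    print(f"watch-rehearse similarity {similarity:+.2f} -> recall {v['recall']:.2f}")
```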

Lead researcher Dr Chris Bird said: “We know that recent memories are susceptible to being lost until a period of consolidation has elapsed. In this study we have shown that a brief period of rehearsal has a huge effect on our ability to remember complex, lifelike events over periods of 1-2 weeks.

“We have also linked this rehearsal effect to processing in a particular part of the brain – the posterior cingulate.

“The findings have implications for any situation where accurate recall of an event is critical, such as witnessing an accident or crime. Memory for the event will be significantly improved if the witness rehearses the sequence of events as soon as possible afterwards.”

Dr Bird’s research group is conducting new research to investigate how these processes relate to memory loss in Alzheimer’s disease.

Read more

What products expectant mothers should avoid

Share


Expectant mothers in their first trimester should avoid certain cosmetics, cleaning agents and medicines, to protect the developing fetal brain from chemicals that can trigger autism, York U health researchers have found.

“The products that we use on a daily basis, such as creams and cosmetics, contain chemicals that could potentially affect a developing baby during pregnancy,” says Professor Dorota Crawford in the School of Kinesiology and Health Science, Faculty of Health.

The list is long: cleaning solvents; pesticides; nonsteroidal anti-inflammatory drugs such as acetylsalicylic acid; misoprostol (a drug used for inducing labor); polychlorinated biphenyls used as industrial lubricants; polybrominated diphenyl ethers found in wood and textiles; and phthalates in PVC flooring, children’s toys, cosmetics and lotions.

The researchers, Crawford and co-authors Christine Wong and Joshua Wais, report that aside from the type of chemical a pregnant woman is exposed to, the duration, the frequency and the concentration level also impact a developing brain at the prenatal stage.

“We recommend that women learn about health effects from exposure to chemical substances in the environment,” says PhD candidate Wong, adding that assessment information is found in the Integrated Risk Information System (IRIS) database maintained by the US Environmental Protection Agency.

According to the researchers, prenatal brain development undergoes constant change, and its normal functioning depends greatly on the presence of specific genes at any given time. Since environmental factors influence the expression levels of these critical genes, it is important for an expectant mother to be aware and cautious of exposure to these factors.

Read more

What kind of music are you listening to?

Share


Brain imaging reveals how neural responses to different types of music affect people’s emotion regulation. The study shows that men in particular who process negative feelings with music react negatively to aggressive and sad music.

Emotion regulation is an essential component to mental health. Poor emotion regulation is associated with psychiatric mood disorders such as depression. Clinical music therapists know the power music can have over emotions, and are able to use music to help their clients to better mood states and even to help relieve symptoms of psychiatric mood disorders like depression. But many people also listen to music on their own as a means of emotion regulation, and not much is known about how this kind of music listening affects mental health. Researchers at the Centre for Interdisciplinary Music Research at the University of Jyväskylä, Aalto University in Finland and Aarhus University in Denmark decided to investigate the relationship between mental health, music listening habits and neural responses to music emotions by looking at a combination of behavioural and neuroimaging data. The study was published in August in the journal Frontiers in Human Neuroscience.

“Some ways of coping with negative emotion, such as rumination, which means continually thinking over negative things, are linked to poor mental health. We wanted to learn whether there could be similar negative effects of some styles of music listening,” explains Emily Carlson, a music therapist and the main author of the study.

Participants were assessed on several markers of mental health including depression, anxiety and neuroticism, and reported the ways they most often listened to music to regulate their emotions. Analysis showed that anxiety and neuroticism were higher in participants who tended to listen to sad or aggressive music to express negative feelings, particularly in males. “This style of listening results in the feeling of expression of negative feelings, not necessarily improving the negative mood,” says Dr. Suvi Saarikallio, co-author of the study and developer of the Music in Mood Regulation (MMR) test.

To investigate the brain’s unconscious emotion regulation processes, the researchers recorded the participants’ neural activity as they listened to clips of happy, sad and fearful-sounding music using functional magnetic resonance imaging (fMRI) at the AMI Center of Aalto University. Analysis showed that males who tended to listen to music to express negative feelings had less activity in the medial prefrontal cortex (mPFC). In females who tended to listen to music to distract from negative feelings, however, there was increased activity in the mPFC. “The mPFC is active during emotion regulation,” according to prof. Elvira Brattico, the senior author of the study. “These results show a link between music listening styles and mPFC activation, which could mean that certain listening styles have long-term effects on the brain.”

“We hope our research encourages music therapists to talk with their clients about their music use outside the session,” concludes Emily Carlson, “and encourages everyone to think about how the different ways they use music might help or harm their own well-being.”

Read more

Honey found to reduce harmful side effects of smoking

Share


Smoking is a known factor in many serious health issues: stroke, myocardial infarction, cardiovascular disease, coronary artery disease, to name but a few. In their recent research in Toxicological & Environmental Chemistry, Syaheedah et al. sought to study what impact antioxidants in honey have on the oxidative stress in smokers.

Smoking introduces free radicals into the body, resulting in oxidative stress, decreased antioxidant status and negative health impacts. Past research has shown that supplementation with herbs and algae with antioxidant qualities improves oxidative status in smokers. Honey supplementation has been effective in reducing the toxic effects of cigarette smoke in rats, yet prior to this study the effects of honey supplementation in chronic smokers had not been documented.

Honey, a natural product created by bees from nectar, contains sugars as well as minerals, proteins, organic acids and antioxidants. Syaheedah et al. set out to determine the effects of 12 weeks of Tualang honey supplementation in a group of 32 chronic smokers, with two equal-sized control groups: smokers who were not supplemented and non-smokers. Blood samples were taken pre- and post-intervention. Pre-intervention, antioxidant status in the smoking groups was significantly lower than in the non-smokers, and there was a high incidence of end products indicating oxidant activity in the smokers, likely caused by the free radicals in cigarette smoke. Oxidant activity can damage cell structure and function, leading to health problems. At the end of the 12 weeks, smokers in the honey group were found to have greatly improved antioxidant status, strongly suggesting that the honey’s antioxidants, with their strong free radical ‘scavenging’ activity, can offset oxidative stress.

Syaheedah concludes: “Our findings may suggest that honey can be used as a supplement among those who are exposed to free radicals in cigarette smoke either as active or passive smokers in order to protect or reduce the risk of having cardiovascular diseases.”

Read more

Sleep deprivation is not a modern phenomenon

Share


In America, it seems only unicorns get seven or eight hours of sleep a night, and the rest of us suffer. But people may be meant to sleep as little as 6 1/2 hours nightly and were doing so long before the advent of electricity and smartphones, researchers say.

To find that out, they consulted with some of the few people on the planet who live roughly the same lifestyle humans did in the Paleolithic.

Psychiatrist and sleep researcher Jerome Siegel at UCLA’s Semel Institute of Neuroscience and Human Behavior started studying three different hunter-gatherer groups in Africa and South America. “All three don’t have any electricity, don’t have any of the sort of modern electronic developments that many think have reduced our sleep,” he says.

Those hunter-gatherers spent about seven or eight hours a night in bed, but they slept for just five to seven of those hours, according to the study, published Thursday in Current Biology. “It’s clear that the amount of sleep that all of these groups get is at the low end of what we’d see in the United States today,” Siegel says. Sleeping that little has been linked to everything from shorter life span to stomach problems and weight gain in industrial societies.

But unlike many people in the United States or Europe who sleep less than seven hours a night, members of the Hadza in Tanzania, the San in Namibia, and the Tsimane in Bolivia tend to be very healthy. There’s virtually no obesity, many have very long lives, and hardly anyone in these societies has trouble sleeping. “Approximately 20 percent of our population complains of chronic insomnia at some point,” Siegel says. “The two groups we quizzed on this don’t have a word for insomnia.”

That raises a lot of questions about why we think we need eight hours of shuteye. “That classic teaching that adults need seven or eight hours of sleep has to do with population-based evidence,” says Dr. Indira Gurubhagavatula, a sleep expert at the University of Pennsylvania Perelman School of Medicine who was not involved with the study. “This paper questions, is that data flawed? And if so, how or why? Or it could be that the sleep we’re getting is lower quality, and we need more of it to feel restored?”

Siegel thinks that might be because we evolved in the environment’s natural 24-hour pattern of light and temperature, but we’re cut off from that rhythm now. By contrast, these hunter-gatherers go to sleep a few hours after sunset, when the night gets chilly. They wake up when the day begins warming from the sunrise.

Following Earth’s natural tempo in this way could improve the quality of their sleep, says Kristen Knutson, a sleep researcher and biomedical anthropologist at the University of Chicago. Our bodies’ core temperature also cycles this way, regardless of air conditioning or heating. “If their sleep is following the environment’s temperature rhythm more closely and naturally, then their sleep quality may indeed be better than what is happening in the United States,” she says.

Researchers already know that light and temperature play an important role in sleep. Light can reverse jet lag and help set internal clocks, and people fall asleep more easily when their core body temperature falls. This all could contribute to why hunter-gatherers sleep less than we do on average, Gurubhagavatula says.

And it could also mean that many non-hunter-gatherers may not need to sleep eight or more hours a night. “I think the beauty of this current study is that maybe we shouldn’t be ramming this requirement down [every person’s] throat so to speak,” she says.

That’s not to say that there aren’t lots of people who are incredibly sleep-deprived, Gurubhagavatula says. Light and temperature aren’t the only things dictating how much we sleep. “It’s our activity and diet and stress level. I see patients who are single parents and have three jobs, and they’ll be lucky to have five hours of sleep and are tired all the time.” Those people need more sleep.

There are other habitual short sleepers in our society — truck drivers, graduate students, and idiot reporters who should know better — with lifestyles vastly different from a hunter-gatherer. “[They’re] not the same as someone in our society who only sleeps 6 1/2 hours,” says Dr. Elizabeth Klerman, a sleep researcher at the Harvard Medical School and Brigham and Women’s Hospital in Boston. So it could be unhealthy for people in industrial societies to sleep that little.

What’s natural for a hunter-gatherer might not be natural for everyone, Siegel agrees. “I don’t think we could just fling someone back into an equatorial lifestyle, and that’ll be entirely beneficial,” he says. But he’s excited about other possibilities. If hunter-gatherers are sleeping better because they’re more in tune with the daily temperature cycle, maybe we can do the same by programming thermostats to echo conditions outside. “That’s a specific aim of my next grant,” he says.

 

by: Angus Rohan Chen

Read more

Liar Liar Pants on Fire


Kids who are taught to reason about the mental states of others are more likely to use deception to win a reward, according to new research published in Psychological Science, a journal of the Association for Psychological Science.

The findings indicate that developing “theory of mind” (ToM) — a cognitive ability critical to many social interactions — may enable children to engage in the sophisticated thinking necessary for intentionally deceiving another person.

“Telling a lie successfully requires deliberately creating a false belief in the mind of the lie recipient, and ToM could provide an important cognitive tool to enable children to do so,” the researchers write.

Research suggests that children begin to tell lies somewhere around ages 2 and 3, and studies have shown a correlation between children’s theory of mind and their tendency to lie. Psychological scientists Genyue Fu of Hangzhou Normal University in China, Kang Lee of the University of Toronto in Canada, and colleagues wanted to see if they could find causal evidence for a link between the two.

The researchers first conducted a hide-and-seek task to identify children who hadn’t yet started lying. The children were shown a selection of stickers and were asked to pick their favorite one — they were told that they could only keep the sticker if they successfully won 10 candies from the hide-and-seek game. In the game, the child was told to hide a candy under one of two cups while the researcher’s eyes were closed. The researcher then opened his or her eyes, asked the child where the candy was hidden, and chose whichever cup the child pointed to. Thus, the child could only win the candy by lying to the experimenter about its location.

A total of 42 children who never lied — who told the truth about the location of the candy on each of the 10 trials — were selected to continue with the study. The children, who were around 3 years old, were randomly assigned to complete either theory-of-mind training or control tasks focused on quantitative reasoning.

The theory-of-mind training included the standard false-contents task, in which children were shown a pencil box and asked what they thought was inside. When it was revealed that the box didn’t actually contain pencils, they were asked to reason about what other people would think was in the box. The goal of the training was to teach kids that people can know and believe different things — that is, even though the child has learned the true contents of the box, someone else would probably believe that the box contained pencils.

The children completed the training tasks or quantitative tasks every other day, for a total of six sessions. After the sessions were complete, the researchers again tested the children on the theory-of-mind tasks and the hide-and-seek tasks.

As expected, children who received theory-of-mind training showed improvement on the theory-of-mind tasks over time, while the children in the control group did not.

More importantly, the children who received the theory-of-mind training were also more likely to lie in the hide-and-seek task compared to those in the control group. And this difference held over a 30-day period.

While the findings don’t shed light on the specific components of training that underlie the effect, the researchers believe their findings provide concrete evidence for a causal link between theory of mind and social behaviors like lying.

“By increasing their sensitivity to mental states and engaging them in reasoning about false beliefs, we enabled young children not only to quickly apply their newly acquired knowledge to solve a problem in a social situation but also to continue to do so more than a month later,” Lee and colleagues write. “Taken together, these two findings also suggest that children were not just mechanically memorizing what they were taught in the ToM training sessions; rather, they were able to consolidate the knowledge and use it adaptively to solve a social problem they were facing.”

Read more

Available 24/7


The need to be constantly available and respond 24/7 on social media accounts can cause depression and anxiety and reduce sleep quality for teenagers, says a study presented at a British Psychological Society conference in Manchester.

The researchers, Dr Heather Cleland Woods and Holly Scott of the University of Glasgow, provided questionnaires for 467 teenagers regarding their overall and night-time specific social media use. A further set of tests measured sleep quality, self-esteem, anxiety, depression and emotional investment in social media, which relates to the pressure felt to be available 24/7 and the anxiety around, for example, not responding immediately to texts or posts.

Dr Cleland Woods explained: “Adolescence can be a period of increased vulnerability for the onset of depression and anxiety, and poor sleep quality may contribute to this. It is important that we understand how social media use relates to these. Evidence is increasingly supporting a link between social media use and wellbeing, particularly during adolescence, but the causes of this are unclear”.

Analysis showed that overall and night-time specific social media use, along with emotional investment in social media, were related to poorer sleep quality, lower self-esteem, and higher anxiety and depression levels.

Read more

Maintaining a healthy body weight is now more challenging than ever


If you are struggling with weight gain, you might be surprised to know that your parents had it easier – they could eat more and exercise less, and still avoid obesity, according to a recent study out of York University’s Faculty of Health.

“Our study results suggest that if you are 25, you’d have to eat even less and exercise more than those older, to prevent gaining weight,” says Professor Jennifer Kuk in the School of Kinesiology and Health Science. “However, it also indicates there may be other specific changes contributing to the rise in obesity beyond just diet and exercise.”

The study, funded by the Canadian Institutes of Health Research, analyzed dietary data from nearly 36,400 American adults collected by the National Health and Nutrition Examination Survey between 1971 and 2008. Available physical activity frequency data, covering 14,419 adults between 1988 and 2006, were also used.

“We observe that for a given amount of self-reported food intake, people will be about 10 per cent heavier in 2008 than in 1971, and about five per cent heavier for a given amount of physical activity in 2006 than in 1988,” notes Ruth Brown, lead researcher and York U graduate student, adding, “These secular changes may in part explain why we have seen the dramatic rise in obesity.”
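
As a purely illustrative sketch (synthetic numbers, not the study’s data or estimates), the kind of adjusted comparison behind a phrase like “heavier for a given amount of food intake” can be written as a regression of weight on calorie intake plus a survey-year indicator; the year coefficient is the weight shift at a fixed intake.

```python
# Illustrative only: synthetic data standing in for two survey cohorts.
# The coefficients and units below are hypothetical, not the study's estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
calories = rng.normal(2200, 400, size=2 * n)      # self-reported intake (kcal/day)
cohort_2008 = np.r_[np.zeros(n), np.ones(n)]      # 0 = 1971 cohort, 1 = 2008 cohort

# Hypothetical data-generating rule: same calorie effect in both cohorts,
# plus a fixed offset for the later cohort.
weight = 50 + 0.01 * calories + 7.0 * cohort_2008 + rng.normal(0, 5, size=2 * n)

# Ordinary least squares: weight ~ intercept + calories + cohort indicator.
X = np.column_stack([np.ones_like(weight), calories, cohort_2008])
coef, *_ = np.linalg.lstsq(X, weight, rcond=None)
print(f"Estimated weight difference at equal intake: {coef[2]:.1f} kg")
```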

The study, “Secular differences in the association between caloric intake, macronutrient intake and physical activity with obesity,” is featured in the upcoming issue of Obesity Research & Clinical Practice.

The researchers point out that although several studies have shown that eating less and exercising more produces weight loss, these strategies tend to be ineffective in the long term.

“This is because weight management is actually much more complex than just ‘energy in’ versus ‘energy out’,” says Kuk. “That’s similar to saying your investment account balance is simply your deposits subtracting your withdrawals and not accounting for all the other things that affect your balance like stock market fluctuations, bank fees or currency exchange rates.”

Kuk further explains that our body weight is impacted by our lifestyle and environment, such as medication use, environmental pollutants, genetics, timing of food intake, stress, gut bacteria and even nighttime light exposure.

Read more

Exposure to nature promotes good health


Research has found evidence that spending time in nature provides protection against a startling range of diseases, including depression, diabetes, obesity, ADHD, cardiovascular disease, cancer, and many more. How this exposure to green space leads to better health has remained a mystery. After reviewing hundreds of studies examining nature’s effects on health, University of Illinois environment and behavior researcher Ming Kuo believes the answer lies in nature’s ability to enhance the functioning of the body’s immune system.

“I pulled every bit of the research in this area together that I could find, and was surprised to realize I could trace as many as 21 possible pathways between nature and good health–and even more surprised to realize that all but two of the pathways shared a single common denominator,” Kuo said. She said it was remarkable to see how important a role the immune system plays in every one of the diseases that nature protects against.

“The realization that there are so many pathways helps explain not only how nature promotes health, but also why nature has such huge, broad effects on health,” she said. “Nature doesn’t just have one or two active ingredients. It’s more like a multivitamin that provides us with all sorts of the nutrients we need. That’s how nature can protect us from all these different kinds of diseases–cardiovascular, respiratory, mental health, musculoskeletal, etc. — simultaneously.”

One way to understand this relationship between nature, health, and the immune system, Kuo explains, is that exposure to nature switches the body into “rest and digest” mode, which is the opposite of the “fight or flight” mode. When the body is in “fight or flight” mode, it shuts down everything that is immediately nonessential, including the immune system.

“When we feel completely safe, our body devotes resources to long-term investments that lead to good health outcomes–growing, reproducing, and building the immune system,” Kuo said. “When we are in nature in that relaxed state, and our body knows that it’s safe, it invests resources toward the immune system.”

For those who prefer playing a board game or visiting an art gallery to taking a walk in the park, Kuo says some of the same restorative benefits can be obtained. “If you are absorbed and relaxed, chances are your parasympathetic system is happy and your immune system is going to get a boost. That said, these enjoyable indoor activities don’t provide the phytoncides, mycobacterium vaccae, negative air ions, vitamin D-producing sunlight, and other active ingredients found outdoors. So we’d expect a smaller boost than you’d get from being in nature.”

Kuo is the director of the Landscape and Human Health Laboratory at the U of I and has conducted numerous studies of her own linking green space and health. Kuo hopes her exhaustive compilation of studies will provide a map for what researchers in this field might study next.

“Finding that the immune system is a primary pathway provides an answer to the question of ‘how’ nature and the body work in concert to fight disease,” Kuo said.

“How might contact with nature promote human health? Exploring promising mechanisms and a possible central pathway” is published in Frontiers in Psychology and available online.

Read more

Coping with stress


New research out of Queen’s University has found evidence of emotional load sharing between partners in a close relationship. The study, co-authored by PhD candidate Jessica Lougheed, found that a strong relationship with a loved one can help ease stress when placed in difficult situations.

“We wanted to test a new evolutionary theory in psychology called Social Baseline Theory which suggests that humans adapted to be close to other humans,” says Ms. Lougheed. “The idea is that individuals function at a relative deficit when they are farther away from people they trust.”

In their study, Ms. Lougheed and co-authors measured the stress levels of 66 adolescent girls during a spontaneous speech task. Before the speech performance, the participants and their mothers rated the quality of their relationship. During the speeches, researchers tracked the participants’ level of stress via galvanic skin response (measuring the level of skin perspiration). To account for the effect of physical – rather than purely emotional – closeness, the participants’ mothers were instructed either to hold or not hold their daughters’ hand.

The researchers found that physical closeness allowed the participants to manage their stress more efficiently, regardless of how close the mother-daughter pair reported being. However, when physical contact was removed from the equation, only the participants who reported higher relationship quality showed signs of load sharing.

“Our results suggest that we are better equipped to overcome challenging situations when we are closer – either physically or in terms of how we feel in our relationships – to people we trust,” says Ms. Lougheed.

Participants who had reported the lowest level of mother-daughter relationship closeness and lacked physical contact during the task were the least efficient in managing emotional stress.

“We were somewhat surprised to find that mothers’ stress did not vary by physical closeness – after all, it can be stressful for parents to watch their children perform, but being able to offer physical comfort might have lessened the mothers’ stress,” says Ms. Lougheed.

“Thus, emotional load sharing in this context was not a function of the mothers’ stress level, and we expect that it occurred instead through the daughters’ perceptions of how stressful it was to give a speech. That is, higher physical and/or relationship closeness helped the daughters feel like they could overcome the challenging situation.”

The results suggest that physical contact can overcome some difficulties associated with relatively low relationship quality, or that being in a high-quality relationship is helpful for managing emotions in the same way as the physical comfort of a loved one. Lougheed does, however, note that the general level of relationship quality was relatively high in their sample, and that physical contact may function very differently in distressed families.

Read more

You’re Lazy, you just stay in bed.


Those of you who spend hours at the gym with the aim of burning as many calories as possible may be disappointed to learn that all the while your nervous system is subconsciously working against you. Researchers reporting in the Cell Press journal Current Biology on September 10 have found that our nervous systems are remarkably adept at changing the way we move so as to expend the least amount of energy possible. In other words, humans are wired for laziness.

The findings, which were made by studying the energetic costs of walking, likely apply to most of our movements, the researchers say.

“We found that people readily change the way they walk–including characteristics of their gait that have been established with millions of steps over the course of their lifetime–to save quite small amounts of energy,” says Max Donelan of Simon Fraser University in Canada. “This is completely consistent with the sense that most of us have that we prefer to do things in the least effortful way, like when we choose the shortest walking path, or choose to sit rather than stand. Here we have provided a physiological basis for this laziness by demonstrating that even within a well-rehearsed movement like walking, the nervous system subconsciously monitors energy use and continuously re-optimizes movement patterns in a constant quest to move as cheaply as possible.”

There is a bright side to this, lead author Jessica Selinger adds: “Sensing and optimizing energy use that quickly and accurately is an impressive feat on the part of the nervous system. You have to be smart to be that lazy!”

Donelan, Selinger, and their colleagues wanted to understand why people move the way they do, given that there are countless ways to get from point A to point B. This is partly a question of evolution and learning. But, the researchers wanted to know, to what extent can our bodies adapt movement based on real-time physiological inputs?

To find out, the researchers asked people to walk while they wore a robotic exoskeleton. This contraption allowed the researchers to discourage people from walking in their usual way by making it more costly to walk normally than to walk some other way. More specifically, the researchers made it more difficult for participants to swing their legs by putting resistance on the knee during normal walking, whereas the researchers eased this resistance for other ways of walking.

“We think of our experiment like dropping someone into a new world with all new rules,” Selinger says. “Any walking strategies that may have developed over evolutionary or developmental timescales are now obsolete in this new world.”

This scheme allowed the researchers to test whether people can sense and optimize the cost associated with their movements in real time. And it turns out we can.

The experiment revealed that people adapt their step frequency to converge on a new energetic optimum very quickly–within minutes. What’s more, people do this even when the energy savings is quite small: less than 5%. The findings show that the energetic costs of our activities aren’t just an outcome of our movements, but in fact play a central role in continuously shaping them.
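
A toy sketch of that re-optimization idea (hypothetical cost curve and numbers, not the authors’ model): treat step frequency as a knob, let the walker probe slightly faster and slower cadences, and nudge toward whichever is cheaper.

```python
# Toy model only: a quadratic energy cost with a hypothetical optimum imposed
# by the exoskeleton, and a simple gradient-following update on step frequency.
def energy_cost(step_freq_hz: float) -> float:
    new_optimum = 1.8            # hypothetical cheapest cadence under the new "rules"
    return 1.0 + 4.0 * (step_freq_hz - new_optimum) ** 2

freq = 2.0                       # start at a habitual cadence
learning_rate = 0.05
for _ in range(200):
    # Probe a slightly faster and a slightly slower cadence, estimate the slope,
    # and move toward the cheaper side, mimicking exploitation of gait variability.
    slope = (energy_cost(freq + 0.01) - energy_cost(freq - 0.01)) / 0.02
    freq -= learning_rate * slope

print(f"Converged step frequency: {freq:.2f} Hz, cost: {energy_cost(freq):.3f}")
```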

The researchers say they now plan to explore questions about how the human body measures the energetic costs associated with particular ways of moving. They are also keen to know how the body solves what is a very complex optimization problem.

“Walking requires the coordination of literally tens of thousands of muscle motor units,” Donelan says. “How do we so quickly discover the optimal combinations?”

 

Read more

Why drugs for addiction and depression don’t work for some patients


New research may help explain why drug treatments for addiction and depression don’t work for some patients.

The conditions are linked to reward and aversion responses in the brain. Working in mice, researchers at Washington University School of Medicine in St. Louis have discovered that brain pathways linked to reward and aversion behaviors are in such close proximity that they could unintentionally be activated at the same time.

The findings suggest that drug treatments for addiction and depression may simultaneously stimulate reward and aversion responses, resulting in a net effect of zero in some patients.

The research is published online Sept. 2 in the journal Neuron.

“We studied the neurons that cause activation of kappa opioid receptors, which are involved in every kind of addiction — alcohol, nicotine, cocaine, heroin, methamphetamine,” said principal investigator Michael R. Bruchas, PhD, associate professor of anesthesiology and neurobiology. “We produced opposite reward and aversion behaviors by activating neuronal populations located very near one another. This might help explain why drug treatments for addiction don’t always work — they could be working in these two regions at the same time and canceling out any effects.”

Addiction can result when a drug temporarily produces a reward response in the brain that, once it wears off, prompts an aversion response that creates an urge for more drugs.

The researchers studied mice genetically engineered so that some of their brain cells could be activated with light. Using tiny, implantable LED devices to shine a light on the neurons, they stimulated cells in a region of the brain called the nucleus accumbens, producing a reward response. Cells in that part of the brain are dotted with kappa opioid receptors, which are involved in addiction and depression.

The mice returned over and over again to the same part of a maze when the researchers stimulated the brain cells to produce a reward response. But activating cells a millimeter away resulted in robust aversion behavior, causing the mice to avoid these areas.

“We were surprised to see that activation of the same types of receptors on the same types of cells in the same region of the brain could cause different responses,” said first author Ream Al-Hasani, PhD, an instructor in anesthesiology. “By understanding how these receptors work, we may be able to more specifically target drug therapies to treat conditions linked to reward and aversion responses, such as addiction or depression.”

Read more

Get Smart


By altering a single gene, researchers from the University of Leeds have created unusually intelligent mice that are less likely to recall fear or experience anxiety.

It sheds light on the molecular underpinnings of learning and memory and could form the basis for research into new treatments for age-related cognitive decline, cognitive disorders such as Alzheimer’s disease and schizophrenia, and other conditions.

The researchers altered a gene in mice to inhibit the activity of an enzyme called phosphodiesterase-4B (PDE4B), which is present in many organs of the vertebrate body, including the brain. In behavioral tests, the PDE4B-inhibited mice showed enhanced cognitive abilities: learning faster, remembering events longer and solving complex exercises better than ordinary mice.

For example, the “brainy mice” showed a better ability than ordinary mice to recognise another mouse that they had been introduced to the day before. They were also quicker at learning the location of a hidden escape platform in a test called the Morris water maze.

However, the PDE4B-inhibited mice also showed less recall of a fearful event after several days than ordinary mice.

The published findings are limited to mice and have not been tested in humans, but PDE4B is present in humans. The diminished memory of fear among mice with inhibited PDE4B could be of interest to researchers looking for treatments for pathological fear, typified by Post-Traumatic Stress Disorder (PTSD).

The PDE4B-inhibited mice also showed less anxiety. They spent more time in open, brightly-lit spaces than ordinary mice, which preferred dark, enclosed spaces.

Ordinary mice are naturally fearful of cats, but the PDE4B-inhibited mice showed a decreased fear response to cat urine, suggesting that one effect of inhibiting PDE4B could be an increase in risk-taking behaviour.

So, while the PDE4B-inhibited mice excelled at solving complex exercises, their low levels of anxiety could be counterproductive for a wild mouse.

Dr Steve Clapcote, Lecturer in Pharmacology in the University of Leeds’ School of Biomedical Sciences, led the study. He said: “Cognitive impairments are currently poorly treated, so I’m excited that our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments”.

The researchers are now working on developing drugs that will specifically inhibit PDE4B. These drugs will be tested in animals to see whether any would be suitable for clinical trials in humans.

Dr Alexander McGirr, a psychiatrist in training at the University of British Columbia, who co-led the study, said: “In the future, medicines targeting PDE4B may potentially improve the lives of individuals with neurocognitive disorders and life-impairing anxiety, and they may have a time-limited role after traumatic events.”

Dr Laura Phipps of Alzheimer’s Research UK, which was not involved in the study, said:

“This study highlights a potentially important role for the PDE4B gene in learning and memory in mice, but further studies will be needed to know whether the findings could have implications for Alzheimer’s disease or other dementias. We’d need to see how this gene could influence memory and thinking in people to get a better idea of whether it could hold potential as a target to treat Alzheimer’s.

“There is currently a lack of effective treatments for dementia and understanding the effect of genes can be a key early step on the road to developing new drugs. With so many people affected by dementia, it is important that there is research into a wide array of treatment approaches to have the best chance of helping people sooner.”

Read more

Initial small study on mobile health (mHealth) tech


Smartphone applications and wearable sensors have the potential to help people make healthier lifestyle choices, but scientific evidence of mobile health technologies’ effectiveness for reducing risk factors for heart disease and stroke is limited, according to a scientific statement from the American Heart Association, published in the association’s journal Circulation.

The new statement reviewed the small body of published, peer-reviewed studies about the effectiveness of mobile health technologies (mHealth) for managing weight, increasing physical activity, quitting smoking and controlling high blood pressure, high cholesterol and diabetes.

“The fact that mobile health technologies haven’t been fully studied doesn’t mean that they are not effective. Self-monitoring is one of the core strategies for changing cardiovascular health behaviors. If a mobile health technology, such as a smartphone app for self-monitoring diet, weight or physical activity, is helping you improve your behavior, then stick with it,” said Lora E. Burke, Ph.D., M.P.H., lead author of the statement and professor of nursing and epidemiology at the University of Pittsburgh.

Currently, one in five American adults uses some form of technology to track health data, and the most popular health apps downloaded are related to exercise, counting steps, or heart rate.

The mHealth technologies examined in the statement correspond to the goals in the American Heart Association’s Life’s Simple 7, which are seven simple ways to improve your heart health — eating better, being more active, managing your weight, avoiding tobacco smoke, reducing blood sugar, and controlling both cholesterol and blood pressure. Here are some of the statement’s findings:

Managing weight – People who included mobile technology in a comprehensive lifestyle program for weight loss were more successful with short-term weight loss than those who tried to lose weight on their own, but there are no published data on whether participants maintained their weight loss beyond 12 months. When considering an mHealth weight loss program, healthcare practitioners should look for one that has many of the same elements as successful person-to-person individualized programs administered by healthcare professionals, which emphasize a calorie-controlled diet, physical activity, self-monitoring or recording of food intake and physical activity in a paper or digital diary, personalized feedback and social support.

Physical activity – While the majority of studies show that using an online program boosted physical activity more than not using one, there hasn’t been enough research to show whether wearable physical activity monitoring devices actually help you move more.

Smoking cessation – Mobile phone apps using text messaging to help quit smoking can almost double your chances of quitting, but about 90 percent of people using these apps fail to quit smoking after six months. Mobile health apps used in combination with a traditional quit-smoking program may help smokers kick the habit.

Currently, there is little or no U.S.-based mHealth technology research on diabetes, blood pressure or cholesterol management.

Statement authors reviewed mHealth randomized clinical trials and meta-analyses from the last decade. Most mHealth technology studies were short-term and limited in size.

“Nevertheless, don’t dismiss the possibility that these devices and apps can help you be heart healthy,” Burke said.

To choose a mobile health technology that works for you, ask your healthcare provider, fitness instructor, registered dietitian or similar expert to help you find an effective program, she added.

The statement also encouraged researchers to embrace the challenge of producing the needed evidence regarding how effective these new technologies are and how we can best adopt them into clinical practice to promote better patient health.

Read more

Nutrition influences inflammatory processes and helps reduce chronic disease risk


A recent article commissioned by the ILSI Europe Obesity and Diabetes Task Force presents new approaches to capture inflammatory status in humans and to help quantify how much diet can positively modulate inflammation.

Inflammatory responses are often part of the early stages of disease development and controlling them is seen as a key future preventative and therapeutic target.

“Inflammation acts as both a friend and foe, being essential in metabolic regulation, with unresolved low-grade chronic inflammation being a pathological feature of a wide range of chronic conditions including the metabolic syndrome and cardiovascular diseases,” commented Prof. Anne Marie Minihane, University of East Anglia (UK).

Inflammation is a normal component of host defenses, and elevated, unresolved chronic inflammation is central to a range of chronic diseases. Prevention or control of low-grade inflammation therefore seems to be an attractive target for healthy foods or food ingredients.

The nutritional status of the individual, for example a deficiency or excess of certain micronutrients (e.g. folate, vitamin B12, vitamin B6, vitamin A, vitamin E, zinc), may lead to an ineffective or excessive inflammatory response. Studies have shown that high consumption of fat and glucose may induce post-prandial inflammation (manifesting itself after the consumption of a meal), which may have consequences for the development of diabetes and cardiovascular diseases. The Western-style diet, rich in fat and simple sugars but often poor in specific micronutrients, is linked to the increased prevalence of diseases with strong immunological and autoimmune components, including allergies, food allergies, atopic dermatitis and obesity.

Read more

Let food be your medicine and medicine be your food


Among Hippocrates’ many contributions to the field of medicine was his notion of food as medicine and medicine as food.

Here are some powerful foods that can be used in place of medicine:

Buckwheat honey for a cough

Derived from the nectar of buckwheat flowers, buckwheat honey might eventually make its way into every parent’s medicine cabinet. Buckwheat honey is better than cough syrup for nocturnal cough in kids, and it is an especially useful food-as-medicine for children older than 1 but under 6, who are ill-advised to take over-the-counter cough medicines.

Pickled foods for diarrhea

Fermented foods include yogurt, kefir, pickled vegetables, miso, kimchi and poi. These foods contain living bacteria that help maintain the health of the digestive tract. These bacteria-filled foods can be used to prevent and treat antibiotic-associated diarrhea, irritable bowel syndrome, infantile diarrhea and eczema.

Ginger for menstrual cramps

Ginger, a pungent spice originating from Southeast Asia, is often taken in the form of tea for nausea and abdominal discomfort. Ginger can also be a helpful food-as-medicine for women, working as well as ibuprofen for menstrual cramps.

Peppermint for IBS

Think beyond candy canes and chewing gum. Peppermint is also found in supplement, essential oil and tea forms. When used medicinally, peppermint is prescribed to help treat abdominal cramping and irritable bowel syndrome (IBS), and it is among the most effective and least toxic options for these complaints.

Hibiscus tea for high blood pressure

Hibiscus tea has a greater anti-hypertensive effect than blueberries. Infused as an herbal tea, hibiscus flowers contain anthocyanins, which help lower blood pressure. The sepals of the flower are dried and made into a tea drink, which has a tart cranberry taste.

Turmeric for arthritis

Native to southwest India, turmeric has a warm, bitter flavor. Used medicinally, it helps treat inflammatory conditions; turmeric is used especially for brain-related conditions, to decrease the risk of Alzheimer’s disease, and it can also be used for arthritis. Add black pepper to turmeric to maximize the disease-fighting benefits: the pepper helps your body absorb more of the curcumin, which is the active ingredient in turmeric that delivers the positive health effects.

Chia seeds for high cholesterol

Despite their tiny size, chia seeds are nutrient-dense and often labeled as a “superfood.”

Steel-cut oatmeal for high LDL cholesterol

This is a no-brainer for lowering LDL: there are lots of studies showing that foods high in soluble fiber lower LDL cholesterol. One such study found that eating at least 3 grams of oats daily is associated with lower LDL cholesterol levels.

Beans for high blood sugar levels

Loaded with fiber, beans are useful for lowering blood sugar levels and managing high cholesterol.

Salmon for inflammation

With its pink to orange hue and distinct smell, salmon is one of the best dietary sources of omega-3 fatty acids. These essential fats are an important part of treating any inflammatory or autoimmune condition.

Read more

Sunscreen from algae


For consumers searching for just the right sunblock this summer, the options can be overwhelming. But scientists are now turning to the natural sunscreen of algae — which is also found in fish slime — to make a novel kind of shield against the sun’s rays that could protect not only people, but also textiles and outdoor materials. They report on their development in the journal ACS Applied Materials & Interfaces.

Existing sunblock lotions typically work by either absorbing ultraviolet rays or physically blocking them. A variety of synthetic and natural compounds can accomplish this. But most commercial options have limited efficiency, pose risks to the environment and human health or are not stable. To address these shortcomings, Vincent Bulone, Susana C. M. Fernandes and colleagues looked to nature for inspiration.

The researchers used algae’s natural sunscreen molecules, which can also be found in reef fish mucus and microorganisms, and combined them with chitosan, a biopolymer from crustacean shells. Testing showed their materials were biocompatible, stood up well in heat and light, and absorbed both ultraviolet A and ultraviolet B radiation with high efficiency.

Read more

U.S. parenting on a downward trajectory?


Social practices and cultural beliefs of modern life are preventing healthy brain and emotional development in children, according to an interdisciplinary body of research presented recently at a symposium at the University of Notre Dame.

“Life outcomes for American youth are worsening, especially in comparison to 50 years ago,” says Darcia Narvaez, Notre Dame professor of psychology who specializes in moral development in children and how early life experiences can influence brain development.

“Ill-advised practices and beliefs have become commonplace in our culture, such as the use of infant formula, the isolation of infants in their own rooms or the belief that responding too quickly to a fussing baby will ‘spoil’ it,” Narvaez says.

This new research links certain early, nurturing parenting practices — the kind common in foraging hunter-gatherer societies — to specific, healthy emotional outcomes in adulthood, and has many experts rethinking some of our modern, cultural child-rearing “norms.”

“Breast-feeding infants, responsiveness to crying, almost constant touch and having multiple adult caregivers are some of the nurturing ancestral parenting practices that are shown to positively impact the developing brain, which not only shapes personality, but also helps physical health and moral development,” says Narvaez.

Studies show that responding to a baby’s needs (not letting a baby “cry it out”) influences the development of conscience; that positive touch affects stress reactivity, impulse control and empathy; that free play in nature influences social capacities and aggression; and that a set of supportive caregivers (beyond the mother alone) predicts IQ and ego resilience as well as empathy.

The United States has been on a downward trajectory on all of these care characteristics, according to Narvaez. Instead of being held, infants spend much more time in carriers, car seats and strollers than they did in the past. Only about 15 percent of mothers are breast-feeding at all by 12 months, extended families are broken up and free play allowed by parents has decreased dramatically since 1970.

Whether as a corollary of these modern practices or the result of other forces, research now documents an epidemic of anxiety and depression among all age groups, including young children; rising rates of aggressive behavior and delinquency in young children; and decreasing empathy, the backbone of compassionate, moral behavior, among college students.

According to Narvaez, however, other relatives and teachers also can have a beneficial impact when a child feels safe in their presence. Also, early deficits can be made up later, she says.

“The right brain, which governs much of our self-regulation, creativity and empathy, can grow throughout life. The right brain grows through full-body experience like rough-and-tumble play, dancing or freelance artistic creation. So at any point, a parent can take up a creative activity with a child and they can grow together.”

Read more

Brain fatigue


Do you ever notice how stress and mental frustration can affect your physical abilities? When you are worried about something at work, do you find yourself more exhausted at the end of the day? This phenomenon is a result of the activation of a specific area of the brain when we attempt to participate in both physical and mental tasks simultaneously.

Ranjana Mehta, Ph.D., assistant professor at the Texas A&M Health Science Center School of Public Health, conducted a study evaluating the interaction between physical and mental fatigue and brain behavior.

The study showed that when we attempt mental tasks and physical tasks at the same time, we activate a specific area of the brain called the prefrontal cortex (PFC). This can cause our bodies to become fatigued much sooner than if we were solely participating in a physical task.

Typically, endurance and fatigue have been examined solely from a physical perspective, focused primarily on the body and muscles used to complete a specific task. However, the brain is just like any other biological tissue: it can be overused and can suffer from fatigue.

“Existing examinations of physical and mental fatigue have been limited to evaluating cardiovascular, muscular and biomechanical changes,” said Mehta. “The purpose of this study was to use simultaneous monitoring of brain and muscle function to examine the impact on the PFC while comparing the changes in brain behavior with traditional measures of fatigue.”

According to Mehta, study findings show that there were lower blood oxygen levels in the PFC following combined physical and mental fatigue compared to that of just physical fatigue conditions. Through simultaneous examination of the brain and muscle function it is apparent that when participating in highly cognitive tasks, brain resources are divided which may accelerate the development of physical fatigue.

It is critical that researchers consider the brain as well as the body when examining fatigue development and its impact on the body. Interdisciplinary work that combines neurocognitive principles with physiological and biomechanical outcomes can provide us with a comprehensive understanding of what is happening to the body when we perform our daily activities.

“Not a lot of people see the value in looking at both the brain and the body together,” said Mehta. “However, no one does purely physical or mental work; they always do both.”

This study was published online in Human Factors: The Journal of the Human Factors and Ergonomics Society. Co-author of the study is Raja Parasuraman, Ph.D., professor of psychology at George Mason University in Virginia.

Read more

Dean Ornish TED Talk

Healthy diets help prevent, even reverse, some health conditions. Dr. Dean Ornish believes they can also do the same for cancer.


Dr. Dean Ornish studied how lifestyle changes could help people with chronic heart disease; he wanted to figure out if there was a way to do the same with patients with some types of cancer. His research at the Preventive Medicine Research Institute clinically demonstrated that cardiovascular illnesses — and, most recently prostate cancer — can be treated and even reversed through diet and exercise.

To listen, copy the link below into your browser:

http://www.npr.org/2015/07/31/426842528/can-healthy-eating-reverse-some-cancers

Read more

Children start making predictions at a few months of age.


Infants can use their expectations about the world to rapidly shape their developing brains, researchers have found.

A series of experiments with infants ages 5 to 7 months has shown that portions of babies’ brains responsible for visual processing respond not just to the presence of visual stimuli, but also to the mere expectation of visual stimuli, according to the researchers from Princeton University, the University of Rochester and the University of South Carolina.

That type of sophisticated neural processing was once thought to happen only in adults and not infants, whose brains are still developing important neural connections.

“We show that in situations of learning and situations of expectations, babies are in fact able to really quickly use their experience to shift the ways different areas of their brain respond to the environment,” said Lauren Emberson, one of the researchers, who will join the Princeton faculty Sept. 1 as an assistant professor of psychology. She comes to Princeton from the University of Rochester, where she is a postdoctoral associate.

The research is described in the article, “Top-down modulation in the infant brain: Learning-induced expectations rapidly affect the sensory cortex at 6 months,” published online June 20 in the Proceedings of the National Academy of Sciences. The other authors are John Richards of the University of South Carolina and Richard Aslin of the University of Rochester.

The researchers exposed one group of infants to a pattern that included a sound — like a honk from a clown horn or a rattle — followed by an image of a red cartoon smiley face. Another group saw and heard the same things, but without any pattern.

The researchers used functional near-infrared spectroscopy, a technology that measures oxygenation in regions of the brain using light, to assess brain activity as the infants were exposed to the sounds and images.

After exposing the infants to the sounds and images for a little over a minute, the researchers began omitting the image. For the infants who had been exposed to the pattern, brain activity was detected in the visual areas of the brain even when the image didn’t appear as expected.

“We find that the visual areas of the infant brain respond both when they see things, which we knew, but also when they expect to see things but don’t,” Emberson said.

The finding could help shed light on the mysteries of neural development, the researchers said.

“Part of the reason I wanted to establish this type of phenomenon in infants is because I think it’s a really good candidate mechanism for how infants are using their experiences to develop their brains,” Emberson said. “There’s a lot of work that shows babies do use their experiences to develop. That’s sort of intuitive, especially if you’re a parent, but we have no idea how the brain is actually using the experiences.”

The findings offer insights that can shape future research in the area, said Janet Werker, a professor and Canada research chair in the Department of Psychology at the University of British Columbia who studies the roots of language acquisition.

“Most exciting to me is the evidence this work provides that from very early in infancy, the cortex is able to set up expectations about incoming events,” said Werker, who was not involved in the research. “This shows that infants not only learn about their external worlds, but are ready — from very early in life — to make predictions about the co-occurrence of events on the basis of very brief previous experience. This work thus has the potential to transform future research on infant learning to focus not on just what infants can learn, but to look at learning as a more active process, focusing more on how learning begets subsequent learning.”

Emberson is continuing to explore the topic by examining the phenomenon in infants who are at risk for poor developmental outcomes, specifically those who were born prematurely. She also is examining whether infants’ visual expectations boost their visual abilities.

The research was primarily funded by the National Institute of Child Health and Human Development.

Read more

Intestinal bacteria play an important role in inducing anxiety and depression


A new study, published in Nature Communications, is the first to explore the role of intestinal microbiota in the altered behavior that is a consequence of early life stress.

“We have shown for the first time in an established mouse model of anxiety and depression that bacteria play a crucial role in inducing this abnormal behaviour,” said Premysl Bercik, senior author of the paper and an associate professor of medicine with McMaster’s Michael G. DeGroote School of Medicine. “But it’s not only bacteria, it’s the altered bi-directional communication between the stressed host — mice subjected to early life stress — and its microbiota, that leads to anxiety and depression.”

It has been known for some time that intestinal bacteria can affect behaviour, but much of the previous research has used healthy, normal mice, said Bercik.

In this study, researchers subjected mice to early life stress with a procedure of maternal separation, meaning that from day three to 21, newborn mice were separated for three hours each day from their mothers and then put back with them.

First, Bercik and his team confirmed that conventional mice with complex microbiota, which had been maternally separated, displayed anxiety- and depression-like behaviour, with abnormal levels of the stress hormone corticosterone. These mice also showed gut dysfunction, as measured by the release of a major neurotransmitter, acetylcholine.

Then, they repeated the same experiment in germ-free conditions and found that, in the absence of bacteria, maternally separated mice still had altered stress hormone levels and gut dysfunction, but they behaved similarly to the control mice, not showing any signs of anxiety or depression.

Next, they found that when the maternally separated germ-free mice are colonized with bacteria from control mice, the bacterial composition and metabolic activity changed within several weeks, and the mice started exhibiting anxiety and depression.

“However, if we transfer the bacteria from stressed mice into non stressed germ-free mice, no abnormalities are observed. This suggests that in this model, both host and microbial factors are required for the development of anxiety and depression-like behavior. Neonatal stress leads to increased stress reactivity and gut dysfunction that changes the gut microbiota which, in turn, alters brain function,” said Bercik.

He said that with this new research, “We are starting to explain the complex mechanisms of interaction and dynamics between the gut microbiota and its host. Our data show that relatively minor changes in microbiota profiles or its metabolic activity induced by neonatal stress can have profound effects on host behaviour in adulthood.”

Bercik said this is another step in understanding how microbiota can shape host behaviour, and that it may extend the original observations into the field of psychiatric disorders.

“It would be important to determine whether this also applies to humans. For instance, whether we can detect abnormal microbiota profiles or different microbial metabolic activity in patients with primary psychiatric disorders, like anxiety and depression,” said Bercik.

Read more

A child’s scream


Our noisy world is no match for a screaming infant. An airplane could be flying by as a house party rages on downstairs while a literal cat fight takes place outside, and still a wailing baby will win your attention. One possible explanation, published July 16 in the journal Current Biology, is that human screams possess a unique acoustic property found to activate not just the auditory brain but also the brain’s fear circuitry.

“If you ask a person on the street what’s special about screams, they’ll say that they’re loud or have a higher pitch,” says study senior author David Poeppel, who heads a speech and language processing lab at New York University. “But there’s lots of stuff that’s loud and there’s lots of stuff that’s high pitched, so you’d want a scream to be genuinely useful in a communicative context.”

Humans make a variety of meaningful noises. Part of what makes us human is how our ears can distinguish speech patterns made from vowels and consonants, which is a step above being able to identify whether a sound is made by a male or female, or by our species or another species. Where in the brain we process this information is known, but there was one area that scientists assumed didn’t have much to do with human communication. This is where screams come in.

After noticing how little research had been done on human screams, Poeppel’s postdoc Luc Arnal, now at the University of Geneva, led a series of studies to analyze the properties of screams. Because there is no repository of human screams, the researchers used recordings taken from YouTube videos, popular films, and volunteer screamers, who were asked to give their all in the lab’s sound booth. The researchers plotted the sound waves in a manner that reflects the firing of auditory neurons, and they noticed that screams activate a range of acoustic information that scientists hadn’t considered to be important for communication.

“We found that screams occupy a reserved chunk of the auditory spectrum, but we wanted to go through a whole bunch of sounds to verify that this area is unique to screams,” says Poeppel, who also directs the Frankfurt Max-Planck-Institute Department of Neuroscience. “In a series of experiments, we saw [that] this observation remained true when we compared screaming to singing and speaking, even across different languages. The only exception–and what was peculiar and cool–is that alarm signals (car alarms, house alarms, etc.) also activate the range set aside for screams.”

What sets screams and alarms apart from other sounds is that they have a property called roughness, which refers to how fast a sound changes in loudness. Normal speech patterns only have slight differences in loudness (between 4 and 5 Hz), but screams can modulate very fast (varying between 30 and 150 Hz). When Arnal and his team asked people to judge screams on how frightening they were, those with the highest roughness came across as the most terrifying. Modifying the sound wave of a non-scream to be rougher can also make it scream-like. The researchers then confirmed that increases in roughness correspond to more activation of the fear response in the human amygdala.
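
To make “roughness” a bit more concrete, here is a minimal sketch (not the authors’ analysis pipeline; the file name is a placeholder) of how one might estimate how fast a recording’s loudness fluctuates: extract the amplitude envelope, then find the dominant frequency of its fluctuations.

```python
# Minimal sketch: estimate the dominant loudness-modulation rate of a recording.
# "scream.wav" is a placeholder; the method here is illustrative only.
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert

rate, audio = wavfile.read("scream.wav")
audio = audio.astype(float)
if audio.ndim > 1:                       # mix stereo down to mono
    audio = audio.mean(axis=1)

# Amplitude envelope: the magnitude of the analytic signal tracks loudness over time.
envelope = np.abs(hilbert(audio))

# Modulation spectrum: how fast the envelope itself fluctuates.
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), d=1.0 / rate)

# Speech tends to modulate around 4-5 Hz; screams and alarms around 30-150 Hz.
peak = freqs[np.argmax(spectrum)]
print(f"Dominant loudness-modulation rate: {peak:.1f} Hz")
```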

The finding is also evidence that acoustical engineers have been tapping into the property of roughness just by trial and error. Alarms and movie shrieks do their job of getting our attention, but perhaps they can be better. “These findings suggest that the design of alarm signals can be further improved,” Arnal says. “The same way a bad smell is added to natural gas to make it easily detectable, adding roughness to alarm sounds may improve and accelerate their processing.”

The researchers plan to continue investigating human screams in the lab, particularly those of infants, to see if their screams are particularly rough. The researchers would also like to apply their analyses to animal screams to learn how much this trait is conserved across species.

“Screaming really works,” Poeppel says. “It is one of the earliest sounds that everyone makes–it’s found across cultures and ages–so we thought maybe this is a way to gain some interesting insights as to what brains have in common with respect to vocalization.”

Read more

… like a hole in my head.

University of Adelaide researchers have shown that intelligence in animal species can be estimated by the size of the holes in the skull through which the arteries pass and that this connection between intelligence and hole size stems from brain metabolic rate.

“A human brain contains nearly 100 billion nerve cells with connections measured in the trillions,” says project leader, Professor Emeritus Roger Seymour. “Each cell and connection uses a minute amount of energy but, added together, the whole brain uses about 20% of a person’s resting metabolic rate.

“It is not known how humans evolved to this state because direct measurements of brain metabolic rate have not been made in living monkeys and apes. However, we found that it is possible to estimate brain metabolic rate from the size of the arteries that supply the brain with blood.

“Arteries continually adjust their diameter to match the amount of blood that an organ needs by sensing the velocity next to the vessel wall. If it is too fast, then the artery grows larger, too slow and the artery shrinks. If an artery passes through a bone, then simply measuring the size of the hole can indicate the blood flow rate and in turn the metabolic rate of the organ inside.”
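
For readers who want the physics intuition behind “hole size indicates flow”, here is a simplified, textbook-style relation (a sketch only; the study’s own empirical calibration may use a different exponent). For smooth, Poiseuille-like flow of volume rate Q through a vessel of inner radius r, the shear rate at the wall is

\[
\dot{\gamma}_{\mathrm{wall}} = \frac{4Q}{\pi r^{3}}
\quad\Longrightarrow\quad
Q \propto r^{3} \ \text{ at constant wall shear rate,}
\]

so if arteries remodel to keep the wall shear rate roughly constant, the flow they carry scales with the cube of their radius. A foramen that is only modestly wider therefore implies a much larger blood supply, and hence a higher metabolic rate for the organ it feeds.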

Professor Seymour and former Honours student Sophie Angove measured the ‘carotid foramina’ (which allow passage of the internal carotid arteries servicing the brain) in primates and marsupials and found large differences.

“During the course of primate evolution, body size increased from small, tree-dwelling animals, through larger monkeys and finally the largest apes and humans,” says Professor Seymour.

“Our analysis showed that on one hand, brain size increased with body size similarly in the two groups. On the other hand, blood flow rate in relation to brain size was very different. The relative blood flow rate increased much faster in primates than in marsupials.

“The significant result was that blood flow rate and presumably brain metabolic rate increased with brain volume much faster than expected for mammals in general. By the time of the great apes, blood flow was about 280% higher than expected.

“The difference between primates and other mammals lies not in the size of the brain, but in its relative metabolic rate. High metabolic rate correlates with the evolution of greater cognitive ability and complex social behaviour among primates.”

Read more

I spy with my little eye


For the first time, researchers have managed to reconstruct infants’ visual perception of the world. By combining technology, mathematics and previous knowledge of the visual perception of infants, researchers have finally succeeded in showing an adult audience how much of its environment a newborn baby can actually see.

The results tell us that an infant of 2 to 3 days old can perceive faces, and perhaps also emotional facial expressions, at a distance of 30 centimeters — which corresponds to the distance between a mother and her nursing baby. If the distance is increased to 60 centimeters, the visual image gets too blurred for the baby to perceive faces and expressions.
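
As a rough, do-it-yourself illustration (not the researchers’ reconstruction method; the file name is a placeholder, and the figure of newborn acuity being roughly one-thirtieth of adult acuity is an approximation), one can mimic the effect by low-pass filtering a face photograph, with heavier blur standing in for the greater viewing distance:

```python
# Rough illustration only: blur a face image to approximate a newborn's low acuity.
# "face_at_30cm.png" is a placeholder; the blur widths are arbitrary stand-ins.
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

face = plt.imread("face_at_30cm.png")
if face.ndim == 3:
    face = face[..., :3].mean(axis=-1)    # drop alpha channel, convert to grayscale

blur_30cm = gaussian_filter(face, sigma=3)   # stand-in for newborn acuity at 30 cm
blur_60cm = gaussian_filter(face, sigma=6)   # same face at twice the distance: roughly twice the blur

fig, axes = plt.subplots(1, 3, figsize=(9, 3))
for ax, img, title in zip(axes,
                          [face, blur_30cm, blur_60cm],
                          ["adult view", "newborn at 30 cm", "newborn at 60 cm"]):
    ax.imshow(img, cmap="gray")
    ax.set_title(title)
    ax.axis("off")
plt.show()
```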

Read more

More evidence that cognitive function declines with a high-sugar and high-fat diet

Share


A study at Oregon State University indicates that both a high-fat and a high-sugar diet, compared to a normal diet, cause changes in gut bacteria that appear related to a significant loss of “cognitive flexibility,” or the power to adapt and adjust to changing situations.

This effect was most serious on the high-sugar diet, which also showed an impairment of early learning for both long-term and short-term memory.

The findings are consistent with some other studies about the impact of fat and sugar on cognitive function and behavior, and suggest that some of these problems may be linked to alteration of the microbiome — a complex mixture in the digestive system of about 100 trillion microorganisms.

The research was done with laboratory mice that consumed different diets and then faced a variety of tests, such as water maze testing, to monitor changes in their mental and physical function, and associated impacts on various types of bacteria. The findings were published in the journal Neuroscience, in work supported by the Microbiology Foundation and the National Science Foundation.

“It’s increasingly clear that our gut bacteria, or microbiota, can communicate with the human brain,” said Kathy Magnusson, a professor in the OSU College of Veterinary Medicine and principal investigator with the Linus Pauling Institute.

“Bacteria can release compounds that act as neurotransmitters, stimulate sensory nerves or the immune system, and affect a wide range of biological functions,” she said. “We’re not sure just what messages are being sent, but we are tracking down the pathways and the effects.”

Mice have proven to be a particularly good model for studies relevant to humans, Magnusson said, on such topics as aging, spatial memory, obesity and other issues.

In this research, after just four weeks on a high-fat or a high-sugar diet, the performance of mice on various tests of mental and physical function began to drop, compared to animals on a normal diet. One of the most pronounced changes was in what researchers call cognitive flexibility.

“The impairment of cognitive flexibility in this study was pretty strong,” Magnusson said. “Think about driving home on a route that’s very familiar to you, something you’re used to doing. Then one day that road is closed and you suddenly have to find a new way home.”

A person with high levels of cognitive flexibility would immediately adapt to the change, determine the next best route home, and remember to use the same route the following morning, all with little problem. With impaired flexibility, it might be a long, slow, and stressful way home.

This study was done with young animals, Magnusson said, which ordinarily would have a healthier biological system that’s better able to resist pathological influences from their microbiota. The findings might be even more pronounced with older animals or humans with compromised intestinal systems, she said.

What’s often referred to as the “Western diet,” or foods that are high in fat, sugars and simple carbohydrates, has been linked to a range of chronic illnesses in the United States, including the obesity epidemic and an increased incidence of Alzheimer’s disease.

“We’ve known for a while that too much fat and sugar are not good for you,” Magnusson said. “This work suggests that fat and sugar are altering your healthy bacterial systems, and that’s one of the reasons those foods aren’t good for you. It’s not just the food that could be influencing your brain, but an interaction between the food and microbial changes.”

Read more

Snacking

Share


“Eating too frequently, especially when we’re not hungry, is a major potential cause of weight gain,” said Dr Stephanie Fay, author of “Psychological predictors of opportunistic snacking in the absence of hunger”.

“Excessive portion size and energy-dense foods are often blamed for weight gain, but the frequency of eating is a significant contributor too. Some people are more susceptible to the reward gained from food.

“This study investigated what would happen when we offered volunteers a chocolate snack right after they’d had as much as they wanted of a similar snack food.

“We also explored what might differentiate people who accepted the additional snack, despite not being hungry, from those who said they’d had enough.

“Three-quarters of the people involved, who were unexpectedly offered a second chocolate snack immediately after being given as much as they wanted of another chocolate snack food, ate that one too.

“Contrary to expectations, those who took the snack were better at inhibitory control, indicating that they were making a conscious decision. However, those who ate the most of the extra snack were more impulsive, and more responsive to food reward. They were also heavier (with a higher BMI), which suggests that repeated snacking in the absence of hunger is a risk factor for weight gain.”

Read more

Consciousness

Share


The internal dialogue that seems to govern one’s thoughts and actions is far less powerful than people believe, serving as a passive conduit rather than an active force that exerts control, according to a new theory proposed by an SF State researcher.

Associate Professor of Psychology Ezequiel Morsella’s “Passive Frame Theory” suggests that the conscious mind is like an interpreter helping speakers of different languages communicate.

“The interpreter presents the information but is not the one making any arguments or acting upon the knowledge that is shared,” Morsella said. “Similarly, the information we perceive in our consciousness is not created by conscious processes, nor is it reacted to by conscious processes. Consciousness is the middle-man, and it doesn’t do as much work as you think.”

Morsella and his coauthors’ groundbreaking theory, published online on June 22 by the journal Behavioral and Brain Sciences, contradicts intuitive beliefs about human consciousness and the notion of self.

Consciousness, per Morsella’s theory, is more reflexive and less purposeful than conventional wisdom would dictate. Because the human mind experiences its own consciousness as sifting through urges, thoughts, feelings and physical actions, people understand their consciousness to be in control of these myriad impulses. But in reality, Morsella argues, consciousness does the same simple task over and over, giving the impression that it is doing more than it actually is.

“We have long thought consciousness solved problems and had many moving parts, but it’s much more basic and static,” Morsella said. “This theory is very counterintuitive. It goes against our everyday way of thinking.”

According to Morsella’s framework, the “free will” that people typically attribute to their conscious mind — the idea that our consciousness, as a “decider,” guides us to a course of action — does not exist. Instead, consciousness only relays information to control “voluntary” action, or goal-oriented movement involving the skeletal muscle system.

Compare consciousness to the Internet, Morsella suggested. The Internet can be used to buy books, reserve a hotel room and complete thousands of other tasks. Taken at face value, it would seem incredibly powerful. But, in actuality, a person in front of a laptop or clicking away on a smartphone is running the show — the Internet is just being made to perform the same basic process, without any free will of its own.

The Passive Frame Theory also defies the intuitive belief that one conscious thought leads to another. “One thought doesn’t know about the other, they just often have access to and are acting upon the same, unconscious information,” Morsella said. “You have one thought and then another, and you think that one thought leads to the next, but this doesn’t seem to be the way the process actually works.”

The theory, which took Morsella and his team more than 10 years to develop, can be difficult to accept at first, he said.

“The number one reason it’s taken so long to reach this conclusion is because people confuse what consciousness is for with what they think they use it for,” Morsella said. “Also, most approaches to consciousness focus on perception rather than action.”

The theory has major implications for the study of mental disorders, Morsella said. “Why do you have an urge or thought that you shouldn’t be having? Because, in a sense, the consciousness system doesn’t know that you shouldn’t be thinking about something,” Morsella said. “An urge generator doesn’t know that an urge is irrelevant to other thoughts or ongoing action.”

The study of consciousness is complicated, Morsella added, because of the inherent difficulty of applying the conscious mind to study itself.

“For the vast majority of human history, we were hunting and gathering and had more pressing concerns that required rapidly executed voluntary actions,” Morsella said. “Consciousness seems to have evolved for these types of actions rather than to understand itself.”

Story Source: San Francisco State University. The original item was written by Beth Tagawa.

Read more

The value of fermented foods

Share


A possible connection between fermented foods, which contain probiotics, and social anxiety symptoms is the focus of a recent study. The study is just the first in a series that the researchers have planned to continue exploring the mind-gut connection, including another examination of the data to see whether a correlation exists between fermented food intake and autism symptoms.

Psychologists have traditionally looked to the mind to help people living with mental health issues. But a recent study led by William & Mary researchers shows that the stomach may also play a key role, suggesting that the old adage “you are what you eat” is more than a cliché.

W&M Psychology Professors Matthew Hilimire and Catherine Forestell recently joined with University of Maryland School of Social Work Assistant Professor Jordan DeVylder to investigate a possible connection between fermented foods, which contain probiotics, and social anxiety. The researchers found that young adults who eat more fermented foods have fewer social anxiety symptoms, with the effect being greatest among those at genetic risk for social anxiety disorder as measured by neuroticism.

The journal Psychiatry Research accepted the study in April for publication in August.

“It is likely that the probiotics in the fermented foods are favorably changing the environment in the gut, and changes in the gut in turn influence social anxiety,” said Hilimire. “I think that it is absolutely fascinating that the microorganisms in your gut can influence your mind.”

The researchers designed a questionnaire that was included in a mass testing tool administered in the university’s Introduction to Psychology courses during the fall 2014 semester; about 700 students participated. The questionnaire asked students about the fermented foods they had eaten over the previous 30 days; it also asked about exercise frequency and the average consumption of fruits and vegetables so that the researchers could control for healthy habits outside of fermented food intake, said Hilimire.

“The main finding was that individuals who had consumed more fermented foods had reduced social anxiety but that was qualified by an interaction by neuroticism. What that means is that that relationship was strongest amongst people that were high in neuroticism,” Hilimire said.

The secondary finding was that more exercise was related to reduced social anxiety. Although the researchers were pleased to see the findings so clearly support their hypothesis, the study is just the first in a series they have planned to continue exploring the mind-gut connection, including another examination of the data to see whether a correlation exists between fermented food intake and autism symptoms, said Hilimire.

The researchers will also soon create an experimental version of the study. Without that experimental phase, the researchers can’t make a causative connection between eating fermented foods and reduced social anxiety.

“However, if we rely on the animal models that have come before us and the human experimental work that has come before us in other anxiety and depression studies, it does seem that there is a causative mechanism,” said Hilimire. “Assuming similar findings in the experimental follow-up, what it would suggest is that you could augment more traditional therapies (like medications, psychotherapy or a combination of the two) with fermented foods — dietary changes — and exercise, as well.”

DeVylder noted that research over the past several years has increasingly supported a close relationship between nutrition and mental health. “This study shows that young adults who are prone towards anxiety report less social anxiety if they frequently consume fermented foods with probiotics. These initial results highlight the possibility that social anxiety may be alleviated through low-risk nutritional interventions, although further research is needed to determine whether increasing probiotic consumption directly causes a reduction in social anxiety,” he said.

Read more

A set of well-hidden lymphatic vessels

Share

In a stunning discovery that overturns decades of textbook teaching, researchers at the University of Virginia School of Medicine have determined that the brain is directly connected to the immune system by vessels previously thought not to exist. That such vessels could have escaped detection when the lymphatic system has been so thoroughly mapped throughout the body is surprising on its own, but the true significance of the discovery lies in the effects it could have on the study and treatment of neurological diseases ranging from autism to Alzheimer’s disease to multiple sclerosis.

“Instead of asking, ‘How do we study the immune response of the brain?’ ‘Why do multiple sclerosis patients have the immune attacks?’ now we can approach this mechanistically. Because the brain is like every other tissue connected to the peripheral immune system through meningeal lymphatic vessels,” said Jonathan Kipnis, PhD, professor in the UVA Department of Neuroscience and director of UVA’s Center for Brain Immunology and Glia (BIG). “It changes entirely the way we perceive the neuro-immune interaction. We always perceived it before as something esoteric that can’t be studied. But now we can ask mechanistic questions.”

“We believe that for every neurological disease that has an immune component to it, these vessels may play a major role,” Kipnis said. “Hard to imagine that these vessels would not be involved in a [neurological] disease with an immune component.”

New Discovery in Human Body

Kevin Lee, PhD, chairman of the UVA Department of Neuroscience, described his reaction to the discovery by Kipnis’ lab: “The first time these guys showed me the basic result, I just said one sentence: ‘They’ll have to change the textbooks.’ There has never been a lymphatic system for the central nervous system, and it was very clear from that first singular observation — and they’ve done many studies since then to bolster the finding — that it will fundamentally change the way people look at the central nervous system’s relationship with the immune system.”

Even Kipnis was skeptical initially. “I really did not believe there are structures in the body that we are not aware of. I thought the body was mapped,” he said. “I thought that these discoveries ended somewhere around the middle of the last century. But apparently they have not.”

‘Very Well Hidden’

The discovery was made possible by the work of Antoine Louveau, PhD, a postdoctoral fellow in Kipnis’ lab. The vessels were detected after Louveau developed a method to mount a mouse’s meninges — the membranes covering the brain — on a single slide so that they could be examined as a whole. “It was fairly easy, actually,” he said. “There was one trick: We fixed the meninges within the skullcap, so that the tissue is secured in its physiological condition, and then we dissected it. If we had done it the other way around, it wouldn’t have worked.”

After noticing vessel-like patterns in the distribution of immune cells on his slides, he tested for lymphatic vessels and there they were. The impossible existed. The soft-spoken Louveau recalled the moment: “I called Jony [Kipnis] to the microscope and I said, ‘I think we have something.'”

As to how the brain’s lymphatic vessels managed to escape notice all this time, Kipnis described them as “very well hidden” and noted that they follow a major blood vessel down into the sinuses, an area difficult to image. “It’s so close to the blood vessel, you just miss it,” he said. “If you don’t know what you’re after, you just miss it.”

“Live imaging of these vessels was crucial to demonstrate their function, and it would not be possible without collaboration with Tajie Harris,” Kipnis noted. Harris, PhD, is an assistant professor of neuroscience and a member of the BIG center. Kipnis also saluted the “phenomenal” surgical skills of Igor Smirnov, a research associate in the Kipnis lab whose work was critical to the imaging success of the study.

Alzheimer’s, Autism, MS and Beyond

The unexpected presence of the lymphatic vessels raises a tremendous number of questions that now need answers, both about the workings of the brain and the diseases that plague it. For example, take Alzheimer’s disease. “In Alzheimer’s, there are accumulations of big protein chunks in the brain,” Kipnis said. “We think they may be accumulating in the brain because they’re not being efficiently removed by these vessels.” He noted that the vessels look different with age, so the role they play in aging is another avenue to explore. And there’s an enormous array of other neurological diseases, from autism to multiple sclerosis, that must be reconsidered in light of the presence of something science insisted did not exist.

Read more

95% of the world’s population has health problems

Share


Just one in 20 people worldwide (4·3%) had no health problems in 2013, with a third of the world’s population (2·3 billion individuals) experiencing more than five ailments, according to a major new analysis from the Global Burden of Disease Study (GBD) 2013, published in The Lancet.

Moreover, the research shows that, worldwide, the proportion of lost years of healthy life (disability-adjusted life years; DALYS [1]) due to illness (rather than death) rose from around a fifth (21%) in 1990 to almost a third (31%) in 2013.

As the world’s population grows, and the proportion of elderly people increases, the number of people living in suboptimum health is set to rise rapidly over coming decades, warn the authors.

The findings come from the largest and most detailed analysis to quantify levels, patterns, and trends in ill health and disability around the world between 1990 and 2013.

In the past 23 years, the leading causes of health loss have hardly changed. Low back pain, depression, iron-deficiency anemia, neck pain, and age-related hearing loss resulted in the largest overall health loss worldwide (measured in terms of YLD — Years Lived with Disability — ie, time spent in less than optimum health [2]) in both 1990 and 2013.

In 2013, musculoskeletal disorders (ie, mainly low back pain, neck pain, and arthritis) and mental and substance abuse disorders (predominantly depression, anxiety, and drug and alcohol use disorders) accounted for almost half of all health loss worldwide.

Importantly, rates of disability are declining much more slowly than death rates. For example, while increases in rates of diabetes have been substantial, rising by around 43% over the past 23 years, death rates from diabetes increased by only 9%.

“The fact that mortality is declining faster than non-fatal disease and injury prevalence is further evidence of the importance of paying attention to the rising health loss from these leading causes of disability, and not simply focusing on reducing mortality,” [3] says Theo Vos, lead author and Professor of Global Health at the Institute of Health Metrics and Evaluation, University of Washington, USA.

The GBD 2013 Disease and Injury Incidence and Prevalence Collaborators analysed 35,620 sources of information on disease and injury from 188 countries between 1990 and 2013 to reveal the substantial toll of disabling disorders and the overall burden on health systems from 301 acute and chronic diseases and injuries, as well as 2337 health consequences (sequelae) that result from one or more of these disorders.

Key findings include:

In 2013, low back pain and major depression ranked among the top ten greatest contributors to disability in every country, causing more health loss than diabetes, chronic obstructive pulmonary disease, and asthma combined.

Worldwide, the number of individuals with several illnesses rapidly increased both with age and in absolute terms between 1990 and 2013. In 2013, about a third (36%) of children aged 0-4 years in developed countries had no disorder compared with just 0·03% of adults older than 80 years. Furthermore, the number of individuals with more than ten disorders increased by 52% between 1990 and 2013.

Eight causes of chronic disorders — mostly non-communicable diseases — affected more than 10% of the world population in 2013: cavities in permanent teeth (2·4 billion), tension-type headaches (1·6 billion), iron-deficiency anemia (1·2 billion), glucose-6-phosphate dehydrogenase deficiency trait (1·18 billion), age-related hearing loss (1·23 billion), genital herpes (1·12 billion), migraine (850 million), and ascariasis (800 million; giant intestinal roundworm).

The number of years lived with disability increased over the last 23 years due to population growth and aging (537·6 million to 764·8 million), while the rate (age-standardised per 1000 population) barely declined between 1990 and 2013 (115 per 1000 people to 110 per 1000 people).

The main drivers of increases in the number of years lived with disability were musculoskeletal, mental, and substance abuse disorders, neurological disorders, and chronic respiratory conditions. HIV/AIDS was a key driver of rising numbers of years lived with disability in sub-Saharan Africa.

There has also been a startling increase in the health loss associated with diabetes (136% increase), Alzheimer’s disease (92% increase), medication overuse headache (120% increase), and osteoarthritis (75% increase).

In central Europe, falls cause a disproportionate amount of disability and health burden, ranking as the second leading cause of disability in 11 of 13 countries. In many Caribbean nations, anxiety disorders ranked more highly, and diabetes was the third greatest contributor to disability in Mexico, Nicaragua, Panama, and Venezuela. Disability from past war and conflict was the leading contributor to health loss in Cambodia, Nicaragua, and Rwanda, and ranked second in Vietnam.

According to Professor Vos, “Large, preventable causes of health loss, particularly serious musculoskeletal disorders and mental and behavioural disorders, have not received the attention that they deserve. Addressing these issues will require a shift in health priorities around the world, not just to keep people alive into old age, but also to keep them healthy.”

FOOTNOTES:

This study was funded by the Bill & Melinda Gates Foundation.

[1] Years of healthy life lost are measured in terms of disability adjusted life years (DALYS). These are worked out by combining the number of years of life lost as a result of early death and the number of years lived with disability.

[2] Years lived with disability (YLD) are calculated by combining prevalence (the proportion of the population with the disorder in any given year) and the general public’s assessment of the severity of health loss (the disability weight).
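
As a rough illustration of how the quantities defined in these two footnotes fit together, the short sketch below combines them in code. The case count, disability weight and years of life lost are invented placeholders, not GBD 2013 estimates.

def years_lived_with_disability(prevalent_cases, disability_weight):
    # Footnote [2]: prevalence combined with a severity (disability) weight.
    return prevalent_cases * disability_weight

def disability_adjusted_life_years(years_of_life_lost, yld):
    # Footnote [1]: years lost to early death plus years lived with disability.
    return years_of_life_lost + yld

# Placeholder example: 1,000 prevalent cases with a disability weight of 0.2,
# plus 150 years of life lost to early death in the same population.
yld = years_lived_with_disability(1000, 0.2)        # 200 YLD
dalys = disability_adjusted_life_years(150, yld)    # 350 DALYs
print(yld, dalys)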

Read more

A balanced diet may be more important than you think

Share


The importance of nutrition for maintaining mental health has been highlighted by recent research. The human brain needs an adequate intake of key nutrients, such as omega-3 polyunsaturated fatty acids, essential amino acids, B-group vitamins (B12 and folate), vitamin D and minerals like zinc, magnesium and iron. A balanced and high-quality diet, such as the Mediterranean diet, provides all of these.

An international study involving the Faculty of Medicine and Dentistry of the University of Valencia, recently published in The Lancet Psychiatry, highlights the importance of nutrition for maintaining mental health. Lecturer of Psychiatry Vicent Balanzá has participated in this study.

Lecturer of Psychiatry Vicent Balanzá, also a psychiatrist at La Fe University Hospital, participated in the scientific review made by members of the International Society for Nutritional Psychiatry Research (ISNPR) on the importance, research and future of nutritional medicine, as “it has been proven that the quality of diet and the deficiencies in certain essential nutrients are determining factors for physical and mental health.”

In fact, nutrition “has become a key factor for the high prevalence and incidence of very frequent mental diseases, such as depression. A balanced diet is as important in psychiatry as it is in other medical specialties such as cardiology or endocrinology,” says Balanzá.

ISNPR is a scientific society founded in 2013, the purpose of which is to promote high-quality scientific research on the prevention and treatment of psychiatric disorders by means of nutritional interventions. Balanzá is a member of its executive committee, and he says that in order to perform at its best, the human brain “needs an adequate intake of key nutrients, such as omega-3 polyunsaturated fatty acids, essential amino acids, B-group vitamins (B12 and folate), vitamin D and minerals like zinc, magnesium and iron. A balanced and high-quality diet, such as the Mediterranean, provides all of these, but in cases of deficiencies, nutritional supplements are advisable.”

A broad approach to Psychiatry

With this publication in ‘The Lancet Psychiatry’, the world’s experts in nutritional psychiatry propose a debate on the growing role of diet in psychiatry and mental health. “At the population level, we had scientific evidence that Mediterranean diet is associated with a lower risk of cardiovascular disease, diabetes and cognitive impairment. Now we also know that it reduces the risk of depression. These are strong arguments to preserve a cultural -and wholesome- treasure that has been transmitted over time,” stresses Vicent Balanzá.

Moreover, the aetiology of mental illnesses is extremely complex and, therefore, so is their treatment. “Expecting that anyone with mental health problems would recover only with medicines is a very limited view of reality. In our article we argue that the future of psychiatry requires a broader approach in which nutritional factors are essential in order to provide better health outcomes, functioning and quality of life,” concludes the researcher.

Read more

Terrible Twos

Share


The next time your toddler acts adventurous, shy, fidgety or cuddly, you might be able to blame the bacteria in his gut.

Researchers from The Ohio State University studied microbes from the gastrointestinal tracts of children between the ages of 18 and 27 months, and found that the abundance and diversity of certain bacterial species appear to affect behavior, particularly among boys. The correlation held even after the scientists factored in history of breastfeeding, diet and the method of childbirth — all of which are known to influence the type of microbes that populate a child’s gut.

Study authors say they aren’t looking for a way to help parents modify the ‘terrible twos,’ but for clues about how — and where — chronic illnesses like obesity, asthma, allergies and bowel disease start.

“There is substantial evidence that intestinal bacteria interact with stress hormones, the same hormones that have been implicated in chronic illnesses like obesity and asthma,” said Lisa Christian, PhD, a researcher with Ohio State’s Institute for Behavioral Medicine Research. “A toddler’s temperament gives us a good idea of how they react to stress. This information combined with an analysis of their gut microbiome could ultimately help us identify opportunities to prevent chronic health issues earlier.”

Christian and study co-author, microbiologist Michael Bailey, PhD, studied stool samples from 77 girls and boys, and found that children with the most genetically diverse types of gut bacteria more frequently exhibited behaviors related with positive mood, curiosity, sociability and impulsivity. In boys only, researchers reported that extroverted personality traits were associated with the abundances of microbes from the Rikenellaceae and Ruminococcaceae families and Dialister and Parabacteroides genera.

“There is definitely communication between bacteria in the gut and the brain, but we don’t know which one starts the conversation,” said Dr. Bailey, who is currently a researcher with Nationwide Children’s Hospital and a member of Ohio State’s Institute for Behavioral Medicine Research. “Maybe kids who are more outgoing have fewer stress hormones impacting their gut than shy kids. Or maybe the bacteria are helping mitigate the production of stress hormones when the child encounters something new. It could be a combination of both.”

Overall, associations of temperament with the gut microbiome in girls were fewer and less consistent than boys. However, in girls, behaviors like self-restraint, cuddliness and focused attention were associated with a lower diversity of gut bacteria, while girls with an abundance of Rikenellaceae appeared to experience more fear than girls with a more balanced diversity of microbes.

To identify correlations between gut bacteria and temperament, researchers asked mothers to assess their child’s behavior using a questionnaire that measures 18 different traits that feed into three composite scales of emotional reactivity: Negative Affect, Surgency/Extraversion and Effortful Control. Scientists looked at the different genetic types and relative quantity of bacteria found in the toddlers’ stool samples along with their diets.

The average gastrointestinal tract contains 400-500 different species of bacteria, and most of them belong to one of ten phyla of bacteria. Advancements in DNA-based methods have allowed scientists to identify bacteria in stool, along with the relative concentration of those bacteria — giving them a much more accurate look at the diversity and composition of the microbial community.

“In the past, bacteria were cultured from samples in the lab, and scientists assumed that what grew was an accurate reflection of what was in the gut,” said Dr. Bailey. “Now we can see that’s not the case. All of the predominant bacteria that we found in our study have been previously linked to either changes in behavior or immune response, so I think we are definitely on the right track.”

Similar to other child behavior studies, researchers separated their findings by gender to analyze temperament. Overall, the study found few differences in the abundance and types of gut microbiota between girls and boys.

While scientists believe that the microbiome is generally set by the age of two, there are dramatic changes in gut microbes that take place during and after birth, as babies pick up bacteria from their mothers during labor and through breastfeeding. Babies born via C-section will have different microbes than babies delivered vaginally.

However, the researchers found that gut bacterial composition wasn’t impacted by delivery method, diet or length of breast feeding. The authors acknowledge that their study didn’t delve deeply into individual diets, but looked generally at when food types were introduced and the types and frequency of food consumed daily.

“In this study, the associations between temperament and the gut microbiome that we saw weren’t due to differences in the diets of children. However, it is possible that effects of diet would emerge if we used a more detailed assessment. It is certainly possible that the types or quantities of food that children with different temperaments choose to eat affect their microbiome,” said Dr. Christian, who also holds appointments in the departments of psychiatry, psychology and obstetrics/gynecology at Ohio State’s College of Medicine.

Both researchers say that parents shouldn’t try to change their child’s gut microbiome just yet. Scientists still don’t know what a healthy combination looks like, or what might influence its development.

“The bacterial community in my gut is going to look different than yours — but we are both healthy. The perfect microbiome will probably vary from person to person,” said Dr. Bailey.

Read more

Can epigenetic regulation control aging in humans?

Share


Can the process of aging be delayed or even reversed? Research led by specially appointed Professor Jun-Ichi Hayashi from the University of Tsukuba in Japan has shown that, in human cell lines at least, it can. They also found that the regulation of two genes involved with the production of glycine, the smallest and simplest amino acid, is partly responsible for some of the characteristics of aging.

Professor Hayashi and his team made this exciting discovery while in the process of addressing some controversial issues surrounding a popular theory of aging.

This theory, the mitochondrial theory of aging, proposes that age-associated mitochondrial defects are controlled by the accumulation of mutations in the mitochondrial DNA. Abnormal mitochondrial function is one of the hallmarks of aging in many species, including humans. This is mostly due to the fact that the mitochondrion is the so-called powerhouse of the cell as it produces energy in a process called cellular respiration. Damage to the mitochondrial DNA results in changes or mutations in the DNA sequence. Accumulation of these changes is associated with a reduced lifespan and early onset of aging-related characteristics such as weight and hair loss, curvature of the spine and osteoporosis.

There is, however, a growing body of conflicting evidence that has raised doubts about the validity of this theory. The Tsukuba team in particular has performed some compelling research that has led them to propose that age-associated mitochondrial defects are not controlled by the accumulation of mutations in the mitochondrial DNA but by another form of genetic regulation. The research, published this month in the journal Nature’s Scientific Reports, looked at the function of the mitochondria in human fibroblast cell lines derived from young people (ranging in age from a fetus to a 12 year old) and elderly people (ranging in age from 80-97 years old). The researchers compared the mitochondrial respiration and the amount of DNA damage in the mitochondria of the two groups, expecting respiration to be reduced and DNA damage to be increased in the cells from the elderly group. While the elderly group had reduced respiration, in accordance with the current theory, there was, however, no difference in the amount of DNA damage between the elderly and young groups of cells. This led the researchers to propose that another form of genetic regulation, epigenetic regulation, may be responsible for the age-associated effects seen in the mitochondria.

Epigenetic regulation refers to changes, such as the addition of chemical structures or proteins, which alter the physical structure of the DNA, resulting in genes turning on or off. Unlike mutations, these changes do not affect the DNA sequence itself. If this theory is correct, then genetically reprogramming the cells to an embryonic stem cell-like state would remove any epigenetic changes associated with the mitochondrial DNA. In order to test this theory, the researchers reprogrammed human fibroblast cell lines derived from young and elderly people to an embryonic stem cell-like state. These cells were then turned back into fibroblasts and their mitochondrial respiratory function examined. Incredibly, the age-associated defects had been reversed — all of the fibroblasts had respiration rates comparable to those of the fetal fibroblast cell line, irrespective of whether they were derived from young or elderly people. This indicates that the aging process in the mitochondrion is controlled by epigenetic regulation, not by mutations.

The researchers then looked for genes that might be controlled epigenetically, resulting in these age-associated mitochondrial defects. Two genes that regulate glycine production in mitochondria, CGAT and SHMT2, were found. The researchers showed that by changing the regulation of these genes, they could induce defects or restore mitochondrial function in the fibroblast cell lines. In a compelling finding, the addition of glycine for 10 days to the culture medium of the 97-year-old fibroblast cell line restored its respiratory function. This suggests that glycine treatment can reverse the age-associated respiration defects in the elderly human fibroblasts.

These findings reveal that, contrary to the mitochondrial theory of aging, epigenetic regulation controls age-associated respiration defects in human fibroblast cell lines. Can epigenetic regulation also control aging in humans? That theory remains to be tested, and if proven, could result in glycine supplements giving our older population a new lease of life.

Read more

Can aging be cured?

Share
"Things that only have a 50% chance of happening 20 years from now are supposed to sound like science fiction." - Aubrey de Grey

“Things that only have a 50% chance of happening 20 years from now are supposed to sound like science fiction.” – Aubrey de Grey

Cambridge researcher Aubrey de Grey argues that aging is merely a disease — and a curable one at that. Listen below as he calls for identifying the components that cause human tissue to age, and designing remedies for each, forestalling disease and prolonging healthy life.

http://www.npr.org/2015/05/22/408025154/can-aging-be-cured

Read more

Obese mothers are compromising their child’s immune system at the time of birth

Share


Obesity can complicate pregnancy by increasing the mother’s risk of having gestational diabetes, preeclampsia, preterm birth or a baby with birth defects. Maternal obesity is also linked to several adverse health outcomes for the infant that can persist into adulthood, such as type-2 diabetes, heart disease and mortality.

But when exactly does the immune system of babies born to obese mothers get compromised? Very early in the baby’s life, according to a new study by a research team led by Ilhem Messaoudi of the University of California, Riverside.

“A number of studies have linked maternal obesity — starting pregnancy with excess weight and gaining a lot of weight during pregnancy — to a higher incidence of cardiovascular disease and asthma in children,” said Messaoudi, an associate professor of biomedical sciences in the School of Medicine at UC Riverside. “Our study offers potential links between changes in the offspring’s immune system and the increased susceptibility and incidence of these diseases later in life.”

The team analyzed umbilical cord blood samples of infants born to lean, overweight and obese mothers, and found that pre-pregnancy maternal weight has a significant impact on the immune system of the neonate, putting such children at risk for potential diseases such as heart disease and asthma.

“We found that very specific immune cells in circulation — monocytes and dendritic cells — isolated from babies born to moms with high BMI were unable to respond to bacterial antigens compared to babies born to lean moms,” Messaoudi said. “Such babies also showed a reduction in ‘CD4 T-cells.’ Both of these changes could result in compromised responses to infection and vaccination.”

Further, the researchers found that cells (eosinophils) that play a role in allergic response and asthma pathogenesis were significantly reduced in the umbilical cord blood of babies born to obese mothers. One potential explanation for these observations is that these cells have already moved into the lungs, which could explain the increased incidence of asthma observed later in life in children born to obese mothers.

The research is the first to show the link between maternal obesity during pregnancy and neonatal immune outcomes, and shows that changes in immunity are already detectable at birth and could persist for the lifetime of the child into adulthood.

“This could change how we respond to vaccination and how we respond to asthma-inducing environmental antigens,” Messaoudi said. “As we know, in the first two years of life, children typically receive plenty of vaccines. The questions that arise are: Are the responses to vaccines in infants born to obese moms also impaired in the first two years of life? Should we change how often we vaccinate children born to obese moms? Should we change practices of how much and how often we vaccinate?”

Messaoudi sees the research paper as a launching point for further studies and a call to action.

“If you are thinking of becoming or are already pregnant, talk to your ob-gyn about weight management, weight gain and the ideal targets for weight gain,” she said. “When moms come in for prenatal visits, doctors tell them about smoking, recreational drug use, and alcohol. But they should be talking also about weight and weight management. Obesity has serious repercussions for maternal health. It is associated with low fertility and success with pregnancy. Rates of gestational diabetes, preeclampsia, placental abruption — all of these risks increase dramatically with weight gain and obesity. So it is important to talk to your doctor about ideal weight entering into pregnancy and throughout pregnancy.”

The pilot study, performed on 39 mothers in Portland, Ore., is published online in PubMed and will soon appear in the journal Pediatric Allergy and Immunology.

Read more

Fidgety kids need movement to concentrate better

Share


Are you a pen-clicker? A hair-twirler? A knee-bouncer? Did you ever get in trouble for fidgeting in class? Don’t hang your head in shame. All that movement may be helping you think.

A new study suggests that for children with attention disorders, hyperactive movements meant better performance on a task that requires concentration. The researchers gave a small group of boys, ages 8 to 12, a sequence of random letters and numbers. Their job: Repeat back the numbers in order, plus the last letter in the bunch. All the while, the kids were sitting in a swiveling chair.

For the subjects diagnosed with attention deficit hyperactivity disorder, or ADHD, moving and spinning in the chair were correlated with better performance. For typically developing kids, however, it was the opposite: The more they moved, the worse they did on the task.

Dustin Sarver at the University of Mississippi Medical Center is the lead author of this study. ADHD is his field, and he has a theory as to why fidgeting helps these kids.

“We think that part of the reason is that when they’re moving more they’re increasing their alertness.”

That’s right — increasing. The prevailing scientific theory on attention disorders holds that they are caused by chronic underarousal of the brain. That’s why stimulants are prescribed as treatment. Sarver believes that slight physical movements “wake up” the nervous system in much the same way that Ritalin does, thus improving cognitive performance.

However, he explains, alertness occurs on a “rainbow curve.” You want to maintain a “Goldilocks” level of alertness — not too much, not too little. That’s why moving around didn’t help the typically developing kids; it might even have distracted them.

Lots of popular classroom-management advice focuses on controlling students’ postures and movements, on the theory that sitting still is synonymous with thinking well. For example, Doug Lemov’s “Teach Like A Champion” model, used in many charter schools, uses the acronym SLANT, for “Sit up, Listen, Ask and answer questions, Nod, Track the speaker.”

This is one small study, not meant to provide conclusive evidence one way or another. But in his role as an ADHD researcher, Sarver often finds himself in conversation with teachers who ask him for his opinion.

Sarver tells them that it may make more sense to grant kids with ADHD some leeway — not to get out of their desk constantly or distract other students, but to move around as they need to.

“When I tell a kid, ‘Sit down, don’t move, stop tapping, stop bouncing,’ the kids are spending all their mental energy concentrating on that rule. And that doesn’t allow them to concentrate on what we’re asking them to do, which is their homework.”

Read more

UC Berkeley making good drugs

Share


Whether you’re brainy, brawny or both, you may someday benefit from a drug found to rejuvenate aging brain and muscle tissue.

Researchers at the University of California, Berkeley, have discovered that a small-molecule drug simultaneously perks up old stem cells in the brains and muscles of mice, a finding that could lead to drug interventions for humans that would make aging tissues throughout the body act young again.

“We established that you can use a single small molecule to rescue essential function in not only aged brain tissue but aged muscle,” said co-author David Schaffer, director of the Berkeley Stem Cell Center and a professor of chemical and biomolecular engineering. “That is good news, because if every tissue had a different molecular mechanism for aging, we wouldn’t be able to have a single intervention that rescues the function of multiple tissues.”

The drug interferes with the activity of a growth factor, transforming growth factor beta 1 (TGF-beta1), that Schaffer’s UC Berkeley colleague Irina Conboy showed over the past 10 years depresses the ability of various types of stem cells to renew tissue.

“Based on our earlier papers, the TGF-beta1 pathway seemed to be one of the main culprits in multi-tissue aging,” said Conboy, an associate professor of bioengineering. “That one protein, when upregulated, ages multiple stem cells in distinct organs, such as the brain, pancreas, heart and muscle. This is really the first demonstration that we can find a drug that makes the key TGF-beta1 pathway, which is elevated by aging, behave younger, thereby rejuvenating multiple organ systems.”

The UC Berkeley team reported its results in the current issue of the journal Oncotarget. Conboy and Schaffer are members of a consortium of faculty who study aging within the California Institute for Quantitative Biosciences (QB3).

Depressed stem cells lead to aging

Aging is ascribed, in part, to the failure of adult stem cells to generate replacements for damaged cells and thus repair the body’s tissues. Researchers have shown that this decreased stem cell activity is largely a result of inhibitory chemicals in the environment around the stem cell, some of them dumped there by the immune system as a result of chronic, low-level inflammation that is also a hallmark of aging.

In 2005, Conboy and her colleagues infused old mice with blood from young mice – a process called parabiosis – reinvigorating stem cells in the muscle, liver and brain/hippocampus and showing that the chemicals in young blood can actually rejuvenate the chemical environment of aging stem cells. Last year, doctors began a small trial to determine whether blood plasma from young people can help reverse brain damage in elderly Alzheimer’s patients.

Such therapies are impractical if not dangerous, however, so Conboy, Schaffer and others are trying to track down the specific chemicals that can be used safely and sustainably for maintaining the youthful environment for stem cells in many organs. One key chemical target for the multi-tissue rejuvenation is TGF-beta1, which tends to increase with age in all tissues of the body and which Conboy showed depresses stem cell activity when present at high levels.

Five years ago, Schaffer, who studies neural stem cells in the brain, teamed up with Conboy to look at TGF-beta1 activity in the hippocampus, an area of the brain important in memory and learning. Among the hallmarks of aging are a decline in learning, cognition and memory. In the new study, they showed that in old mice, the hippocampus has increased levels of TGF-beta1 similar to the levels in the bloodstream and other old tissue.

Using a viral vector that Schaffer developed for gene therapy, the team inserted genetic blockers into the brains of old mice to knock down TGF-beta1 activity, and found that hippocampal stem cells began to act more youthful, generating new nerve cells.

Drug makes old tissue cleverer

The team then injected into the blood a chemical known to block the TGF-beta1 receptor and thus reduce the effect of TGF-beta1. This small molecule, an Alk5 kinase inhibitor already undergoing trials as an anticancer agent, successfully renewed stem cell function in both brain and muscle tissue of the same old animal, potentially making it stronger and more clever, Conboy said.

“The key TGF-beta1 regulatory pathway became reset to its young signaling levels, which also reduced tissue inflammation, hence promoting a more favorable environment for stem cell signaling,” she said. “You can simultaneously improve tissue repair and maintenance in completely different organs, muscle and brain.”

The researchers noted that this is only a first step toward a therapy, since other biochemical cues also regulate adult stem cell activity. Schaffer and Conboy’s research groups are now collaborating on a multi-pronged approach in which modulation of two key biochemical regulators might lead to safe restoration of stem cell responses in multiple aged and pathological tissues.

“The challenge ahead is to carefully retune the various signaling pathways in the stem cell environment, using a small number of chemicals, so that we end up recalibrating the environment to be youth-like,” Conboy said. “Dosage is going to be the key to rejuvenating the stem cell environment.”

Read more

Are there six DNA bases?

Share


DNA (deoxyribonucleic acid) is the main component of our genetic material. It is built from four units: A, C, G and T (adenine, cytosine, guanine and thymine). These DNA bases combine in thousands of possible sequences to provide the genetic variability that underlies the wealth of forms and functions of living beings.

Two more bases: methyl-cytosine and methyl-adenine

In the early 1980s, a fifth base was added to these four “classic” DNA bases: methyl-cytosine (mC), derived from cytosine. In the late 1990s, mC was recognized as the main driver of epigenetic mechanisms: it can switch genes on or off depending on the physiological needs of each tissue.

In recent years, interest in this fifth DNA base has grown as studies have shown that alterations in methyl-cytosine contribute to the development of many human diseases, including cancer.

Today, an article published in Cell by Manel Esteller, director of the Epigenetics and Cancer Biology Program of the Bellvitge Biomedical Research Institute (IDIBELL), ICREA researcher and Professor of Genetics at the University of Barcelona, describes the possible existence of a sixth DNA base, methyl-adenine (mA), which would also help determine the epigenome and would therefore be key in the life of cells.

In bacteria and in complex organisms

“It was known for years that bacteria, living organisms evolutionarily very distant from us, had mA in their genomes, where it has a protective function against the insertion of genetic material from other organisms. But it was believed that this was a phenomenon of primitive cells and that it was very static,” says Manel Esteller.

“However, this issue of Cell publishes three papers suggesting that more complex cells (eukaryotes), such as human body cells, also contain this sixth DNA base. These studies suggest that algae, worms and flies possess mA and that it acts to regulate the expression of certain genes, thus constituting a new epigenetic mark. This work has been possible thanks to the development of highly sensitive analytical methods, because the levels of mA in the genomes described so far are low. In addition, it seems that mA may play a specific role in stem cells and in the early stages of development,” explains the researcher.

“Now the challenge we face is to confirm these data and to find out whether mammals, including humans, also have this sixth DNA base, and to determine what its role is.”

Article: Heyn H, Esteller M. An Adenine Code for DNA: A Second Life for N6-Methyladenine. Cell (2015). http://dx.doi.org/10.1016/j.cell.2015.04.021

Read more

Your Urine

Share


You might think it’s easy to guess if a person is at risk of becoming overweight or developing diabetes. The behavioral traits are pretty clear – that person might exercise less or eat more. He or she might have high blood pressure, or might have gained weight.

But now there’s another place to find evidence of those risk factors: in a person’s pee.

Researchers are finding clues about the metabolism in human urine – most recently in more than 2,000 samples kept frozen in the basement of Imperial College, in London.

“You would be amazed how much frozen piss we have,” Jeremy Nicholson, a professor of biological chemistry and the head of the National Phenome Center at Imperial College, tells Shots. Nicholson is part of a team of researchers that used spectroscopy to analyze the molecules in urine. Their results were published Wednesday in Science Translational Medicine.

The molecules the researchers sought aren’t exactly waste, even though they’re found in urine. They’re called metabolites, which are byproducts of the body’s metabolic processes. “The kidney doesn’t want those things at that particular moment,” says Nicholson. “But nonetheless, a lot of metabolites that are in [urine] are still very useful to the body.”

Nicholson and his colleagues tested two urine samples from each donor, one about three weeks after the other. This gave them a sense of which metabolites the body excretes regularly and which they excrete because the person ate certain foods. For example, if a person has eaten a lot of citrus lately, he will excrete a lot of a metabolite called proline betaine.

And then there are metabolites that remain consistent over long periods of time — these metabolites represent what Nicholson and his colleagues call the person’s metabolic phenotype. He used statistics to correlate the presence of certain metabolites with lifestyle habits from each donor, like their typical diet and how much they exercised.

Some metabolites showed very different ratios for obese people compared to those with normal weight, like the ratio of leucine to ketoleucine. Leucine is an amino acid that the skeletal muscles use for energy, and ketoleucine is one of the first products in the breakdown process.

But breaking leucine into ketoleucine depends on a particular set of enzymes, and if you’re active, the enzymes work better, Nicholson says. “You lose part of the ability to burn calories by not exercising because the enzymes that do it are turned off.”

The leucine-to-ketoleucine ratio is an indicator that dates back to 1969, when a group of researchers correlated certain levels of amino acids with blood insulin levels.

“Since the late 60s or early 70s, people have been trying to figure out the association with obesity,” Robert Gerszten, an associate professor at Harvard Medical School, tells Shots. “This paper very nicely adds to the breadth of metabolic disturbances that are associated with human obesity.”

Still, Gerszten is skeptical that certain enzymes simply switch on and off depending on whether someone exercises or not. There could be a genetic component as well. “There are lots of pathways that are aberrant in obesity,” he says. The enzyme converting leucine to ketoleucine is just one opportunity for error.

Nicholson estimates that about half of the metabolites he identified in the study have never been linked to obesity. That shows how complex the human metabolism is, he says, and leads him to the conclusion that there is no simple treatment for obesity. “If you’ve got so many different compartments and different pathways, how does a single drug fix all of those in one go?” he asks. “Obesity probably is not druggable in the traditional sense.”

But next, he and his colleagues are trying to follow up with donors from this study and to analyze additional urine samples from people in China. “The Chinese have become a lot fatter in the last 15 years because they’ve westernized,” he explains. They hope to make the study slightly more predictive by showing which metabolites appear in urine 15 years before a person develops obesity or diabetes.

Still, finding that the metabolites were different 15 years ago doesn’t mean they can predict a person’s health. After all, no one’s metabolites will guarantee a skinny or overweight life; with genes, behavior and the environment all weighing in, it’s a lot more complicated than that.

Take a walk

We know that sitting all day is hazardous to our health, increasing the risk of obesity, diabetes, hypertension, inflammation and atherosclerosis. It all sounds pretty dismal, since many of today’s jobs require us to be nearly glued to our computer screens. But a tiny two-minute break may help offset that hazard, researchers say.

People who got up and moved around for at least two minutes every hour had a 33 percent lower risk of dying, according to researchers at the University of Utah School of Medicine.

“It was fascinating to see the results because the current national focus is on moderate or vigorous activity,” says Dr. Srinivasan Beddhu, lead author of the study and a professor of medicine. “To see that light activity had an association with lower mortality is intriguing.”

The researchers looked at data from 3,626 participants in the federal National Health and Nutrition Examination Survey (NHANES), who wore accelerometers to measure the intensity of their activity during the day. They looked at “light” activities, things like walking around the office, up and down a few flights of stairs, even a short walk to get a cup of coffee. Beddhu says: “Our study suggests that even small changes can have a big impact.”

People who had chronic kidney disease saw a 41 percent lower risk of dying in the time period studied, which was just under three years on average. And those 383 people were also the most sedentary, spending 41 minutes of each hour immobile, compared to 34 minutes in the group as a whole. The study was published Thursday in the Clinical Journal of the American Society of Nephrology.

While many of us are getting standing desks to combat the ills of sitting all day, Beddhu says there’s not enough evidence to confirm solid benefit from standing all day versus sitting. So he suggests even if you do stand all day you may want to add a spin around the office every hour. Day to day, week to week, it adds up, he says.

If you’re awake, say, 16 hours a day and you move about two minutes every hour, that adds up to 32 minutes a day. Multiply that by five work days and you end up with 160 minutes per work week of light activity. And if you can do more, even better, says Beddhu. “If you can do five minutes every hour, you can actually end up burning 1,000 additional calories a week.” That can decrease fat tissue and help maintain or even lose weight.
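
Here is a rough sketch of that arithmetic in code. The 16 waking hours, the five work days and the two- and five-minute scenarios are the figures quoted above; the snippet just multiplies them out and is purely illustrative.

```python
# Weekly light-activity minutes from short hourly breaks.
# Figures follow the article's example: 16 waking hours/day, 5 work days/week.

WAKING_HOURS_PER_DAY = 16
WORK_DAYS_PER_WEEK = 5

def weekly_light_activity_minutes(minutes_per_hour: float) -> float:
    """Total minutes of light activity accumulated over a work week."""
    return minutes_per_hour * WAKING_HOURS_PER_DAY * WORK_DAYS_PER_WEEK

print(weekly_light_activity_minutes(2))  # 160 minutes, as in the article
print(weekly_light_activity_minutes(5))  # the five-minutes-per-hour scenario
```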

As for that two-minute burst of activity satisfying federal guidelines for physical activity, forget it. You still have to engage in at least 150 minutes of moderate to vigorous physical activity every week. This more intense activity has numerous benefits of its own, including strengthening the cardiovascular system.

Hunger sensitive neurons

Scientists have found that a set of neurons is responsible for the unpleasant feelings associated with hunger. The neurons do not drive an animal to eat, but rather teach an animal to respond to sensory cues that signal the presence of food.

If you’re finding it difficult to stick to a weight-loss diet, scientists at the Howard Hughes Medical Institute’s Janelia Research Campus say you can likely blame hunger-sensitive cells in your brain known as AGRP neurons. According to new experiments, these neurons are responsible for the unpleasant feelings of hunger that make snacking irresistible.

The negative emotions associated with hunger can make it hard to maintain a diet and lose weight, and these neurons help explain that struggle, says Scott Sternson, a group leader at Janelia. In an environment where food is readily available, their difficult-to-ignore signal may seem like an annoyance, but from an evolutionary point of view, they make sense. For earlier humans or animals in the wild, pursuing food or water can mean venturing into a risky environment, which might require some encouragement. “We suspect that what these neurons are doing is imposing a cost on not dealing with your physiological needs,” he adds.

AGRP neurons do not directly drive an animal to eat, but rather teach an animal to respond to sensory cues that signal the presence of food. “We suspect that these neurons are a very old motivational system to force an animal to satisfy its physiological needs. Part of the motivation for seeking food is to shut these neurons off,” says Sternson, whose team also demonstrated that a different set of neurons is specialized to generate unpleasant feelings of thirst. Sternson and his colleagues published their findings in the journal Nature on April 27, 2015.

Hunger affects nearly every cell in the body, and several types of neurons are dedicated to making sure an animal eats when energy stores are low. But Sternson says that until now, what scientists had learned about those neurons had not completely matched up to something we already know: hunger is unpleasant.

“There was an early prediction that there would be neurons that make you feel bad when you were hungry or thirsty. This made sense from an intuitive point of view, but all of the neurons that had been looked at seemed to have the opposite effect,” he says. In earlier studies, researchers found that neurons that promoted eating did so by increasing positive feelings associated with food. In other words — not surprisingly — hunger makes food tastes better.

Some scientists had begun to suspect their ideas about a negative signal in the brain motivating hunger might be wrong. But their knowledge of the system was incomplete. AGRP neurons, located in a regulatory area of the brain known as the hypothalamus, were clearly involved in feeding behaviors: When the body lacks energy, AGRP neurons become active, and when AGRP neurons are active, animals eat. But no one had yet investigated those cells’ strategy for generating that motivation.

Smartphone babies

More than one-third of babies are tapping on smartphones and tablets even before they learn to walk or talk, and by 1 year of age, one in seven toddlers is using devices for at least an hour a day.

The American Academy of Pediatrics discourages the use of entertainment media such as televisions, computers, smartphones and tablets by children under age 2. Little is known, however, about when youngsters actually start using mobile devices.

Researchers developed a 20-item survey to find out when young children are first exposed to mobile media and how they use devices. The questionnaire was adapted from the “Zero to Eight” Common Sense Media national survey on media use in children.

Parents of children ages 6 months to 4 years old who were at a hospital-based pediatric clinic that serves a low-income, minority community were recruited to fill out the survey. Participants were asked about what types of media devices they have in their household, children’s age at initial exposure to mobile media, frequency of use, types of activities and if their pediatrician had discussed media use with them.

Results from 370 parents showed that 74 percent were African-American, 14 percent were Hispanic and 13 percent had less than a high school education. Media devices were ubiquitous, with 97 percent having TVs, 83 percent having tablets, 77 percent having smartphones and 59 percent having Internet access.

Children younger than 1 year of age were exposed to media devices in surprisingly large numbers: 52 percent had watched TV shows, 36 percent had touched or scrolled a screen, 24 percent had called someone, 15 percent used apps and 12 percent played video games.

By 2 years of age, most children were using mobile devices.

Lead author Hilda Kabali, MD, a third-year resident in the Pediatrics Department at Einstein Healthcare Network, said the results surprised her.

“We didn’t expect children were using the devices from the age of 6 months,” she said. “Some children were on the screen for as long as 30 minutes.”

Results also showed 73 percent of parents let their children play with mobile devices while doing household chores, 60 percent while running errands, 65 percent to calm a child and 29 percent to put a child to sleep.

Time spent on devices increased with age, with 26 percent of 2-year-olds and 38 percent of 4-year-olds using devices for at least an hour a day.

Finally, only 30 percent of parents said their child’s pediatrician had discussed media use with them.

Calorie Restriction

Calorie restriction has long been studied as a way to extend lifespan in animals. It has been associated with the ability to reduce the risks of cardiovascular and other diseases and to improve overall health. Now, researchers at Chang Gung University in Taiwan have found that calorie restriction can also be beneficial to muscles, improving muscle metabolism and mass at an important time — during middle age. The article “Late-onset Caloric Restriction Alters Skeletal Muscle Metabolism by Modulating Pyruvate Metabolism” is published ahead of print in the American Journal of Physiology-Endocrinology and Metabolism.

“To date, caloric restriction (CR) is the only non-pharmaceutical and non-genetic strategy that increases the lifespan of animals and provides health benefits,” the research team wrote. “Regarding skeletal muscle, an organ that is critical for movement and fuel metabolism, studies have reported that CR attenuates age-related muscle loss.”

Calorie restriction is thought to have a protective effect on muscle cells and may help cells better use antioxidants, avoid damage caused by free radicals and function better. While studies that observed the effects of lifelong calorie restriction have shown mixed results in animals of different ages, recent studies have suggested that age may play a role in how CR affects individual animals. The research team hypothesized that because CR can help reprogram metabolism, the most benefit can be reaped from aging muscles in which cellular metabolism is impaired.

Researchers focused on two pathways that produce energy in muscles, glycolysis (sugar metabolism) and mitochondrial oxidative phosphorylation (OXPHOS) in both young and middle-aged rats that were fed a normal diet or a calorie-restricted diet. In the 14-week study, rats on the calorie-restricted diet received 10 percent calorie restriction in the first week, 25 percent restriction in the second and 40 percent restriction for the remaining 12 weeks. The control rats received no calorie restriction. After 14 weeks, the researchers studied changes in the rats’ muscles.

“We investigated whether CR reprogrammed muscle metabolism and whether this improvement was associated with the observed increase in muscle mass. In addition, we examined whether the CR-induced changes were age-dependent,” the researchers wrote.
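
To make the feeding schedule concrete, here is a minimal sketch of the staged restriction described above. Only the 10, 25 and 40 percent steps come from the study; the baseline intake value is a made-up placeholder, not a number from the paper.

```python
# Staged calorie restriction over the 14-week protocol described above:
# 10% restriction in week 1, 25% in week 2, 40% for the remaining 12 weeks.
# BASELINE_KCAL is a hypothetical ad libitum daily intake, used only for illustration.

BASELINE_KCAL = 60.0

def restricted_intake(week: int, baseline: float = BASELINE_KCAL) -> float:
    """Daily calorie target for a given week (1-14) of the CR protocol."""
    if week == 1:
        restriction = 0.10
    elif week == 2:
        restriction = 0.25
    else:
        restriction = 0.40
    return baseline * (1 - restriction)

for week in (1, 2, 3, 14):
    print(week, round(restricted_intake(week), 1))
```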

Not surprisingly, the middle-aged rats had less muscle mass than the young rats did. However, while 14 weeks of calorie restriction did not significantly affect the middle-aged rats, it reduced muscle mass in the young rats. Calorie restriction slowed the glycolytic rate in the muscles and increased the cells’ dependency for OXPHOS versus glycolysis in older rats, which was linked to improvement of normalized muscle mass. The team also found that “14 weeks of CR reprogrammed cellular metabolism, where the relative contribution of OXPHOS and glycolysis in muscles of middle-aged rats with CR was similar to that in muscles of young rats.”

How to approach your boss about a treadmill desk

New research shows the impact treadmill desks have on job performance and the adverse effects of sedentary office work.

If you happen to be interested in using a treadmill desk, your greatest challenge may be convincing your boss. Fortunately, two BYU researchers have good news: People on treadmill desks perform cognitive tasks nearly as well as those at sitting desks, despite the fact that they’re walking.

In a study published Wednesday in PLOS One, exercise science professor James LeCheminant and neuroscientist Michael Larson report their findings after putting treadmill desks to the test. The duo compared the cognitive performance of healthy adults sitting at a desk to those using treadmill desks while carrying out the same tasks.

While they found the walkers showed slight decreases in processing speed, attention and typing words per minute—tasks that require fine-motor skills or heavy concentration—the overall drop was not enough to warrant major concern.

“Though statistically significant, we are not talking about major differences between the treadmill walking and sitting conditions,” LeCheminant said. “Rather, these are very modest differences.”

In other words, the health benefits of a walking desk shown by previous research appear to outweigh the slight drop in productivity that comes with such a setup.

LeCheminant and Larson strongly support the use of treadmill desks, sit-stand desks, and any other efforts to safely improve physical activity in the workplace. In fact, Larson is planning on getting a treadmill desk himself in light of his research.

With that in mind, here’s a quick list of things treadmill desk hopefuls can tell their boss (based on previous research as well as LeCheminant and Larson’s findings):

  • Academic research supports them as a way to increase physical activity at work.
  • Thinking abilities only drop slightly when using one, but not below average performance marks.
  • Sitting at a desk all day could be shaving years off your life.
  • A neuroscience professor at a research institution is getting one after studying them.

“For health alone it’s great, but if the cognitive decline is small, then you bet it’s worth it,” Larson said. “The health benefits likely outweigh any slight performance dips you may get from implementing the treadmill desk.”

For the study, the duo and their BYU student team assessed 75 healthy adults, half using treadmill desks, half using regular desks, for a 45-minute period. The 37 people on treadmill desks walked at a speed of 1.5 mph. Walkers saw a 9 percent drop in cognitive processing speed, attention and working memory and a 13-word-per-minute drop in typing.

Their findings show that treadmill desks may be most appropriate for tasks that are less cognitively demanding (checking email) or that do not require a great deal of fine-motor skill (non-typing tasks).

Larson said professor friends who use treadmill desks step off when they have a task that requires high concentration, such as intense reading and editing. That said, researchers were surprised to find treadmill walkers retained their learning just as well as sitters, even if it took them longer to process it in the first place.

“They’re not going to get it as fast, but in the long run they’re going to get it,” Larson said. “While walkers don’t learn as fast as the sitters, they were able to retain the information as well later on.”

Researchers say there is room for much more exploration on the subject. One question their study did not address was whether individuals improve their cognitive performance or typing performance with practice on a treadmill desk over time.

“Simply put, we still need to have more data to determine the effectiveness of treadmill desks for a host of other outcomes related to cognition and productivity,” LeCheminant said.

Eight nutrients that may help keep your brain in good shape

As people age they can experience a range of cognitive issues from decreased critical thinking to dementia and Alzheimer’s disease.

1. Cocoa Flavanols: Cocoa flavanols have been linked to improved circulation and heart health, and preliminary research shows a possible connection to memory improvement as well. A study showed cocoa flavanols may improve the function of a specific part of the brain called the dentate gyrus, which is associated with age-related memory (Brickman, 2014).

2. Omega-3 Fatty Acids: Omega-3 fatty acids, long known to contribute to good heart health, are now thought to play a role in cognitive health as well. A study on mice found that omega-3 polyunsaturated fatty acid supplementation appeared to result in better object recognition memory, spatial and localizatory memory (memories that can be consciously recalled, such as facts and knowledge), and adverse response retention (Cutuli, 2014). Foods rich in omega-3s include salmon, flaxseed oil, and chia seeds.

3. Phosphatidylserine and Phosphatidic Acid: Two pilot studies showed that a combination of phosphatidylserine and phosphatidic acid can help benefit memory, mood, and cognitive function in the elderly (Lonza, 2014).

4. Walnuts: A diet supplemented with walnuts may have a beneficial effect in reducing the risk, delaying the onset, or slowing the progression of Alzheimer’s disease in mice (Muthaiyah, 2014).

5. Citicoline: Citicoline is a natural substance found in the body’s cells and helps in the development of brain tissue, which helps regulate memory and cognitive function, enhances communication between neurons, and protects neural structures from free radical damage. Clinical trials have shown citicoline supplements may help maintain normal cognitive function with aging and protect the brain from free radical damage. (Kyowa Hakko USA).

6. Choline: Choline, which is associated with liver health and women’s health, also helps with the communication systems for cells within the brain and the rest of the body. Choline may also support the brain during aging and help prevent changes in brain chemistry that result in cognitive decline and failure. Eggs are a major dietary source of choline.

7. Magnesium: Magnesium supplements are often recommended for those who experienced serious concussions. Magnesium-rich foods include avocado, soy beans, bananas and dark chocolate.

8. Blueberries: Blueberries are known to have antioxidant and anti-inflammatory activity because they boast a high concentration of anthocyanins, a flavonoid that enhances the health-promoting quality of foods. Moderate blueberry consumption could offer neurocognitive benefits such as increased neural signaling in the brain centers.

Institute of Food Technologists (IFT)

There is currently no genetic or epigenetic test available to assess autism risk.

In a small study, Johns Hopkins researchers found that DNA from the sperm of men whose children had early signs of autism shows distinct patterns of regulatory tags that could contribute to the condition.

Autism spectrum disorder (autism) affects one in 68 children in the U.S. Although studies have identified some culprit genes, most cases remain unexplained. But most experts agree that autism is usually inherited, since the condition tends to run in families. In this study, investigators looked for possible causes for the condition not in genes themselves, but in the “epigenetic tags” that help regulate genes’ activity.

“We wondered if we could learn what happens before someone gets autism,” says Andrew Feinberg, M.D., M.P.H., the King Fahd Professor of Molecular Medicine and director of the Center for Epigenetics at the Johns Hopkins University School of Medicine. “If epigenetic changes are being passed from fathers to their children, we should be able to detect them in sperm,” adds co-lead investigator Daniele Fallin, Ph.D., professor and chair of the Department of Mental Health in the Bloomberg School of Public Health and director of the Wendy Klag Center for Autism and Developmental Disabilities.

In addition to being easier to sample than egg cells from women, sperm are more susceptible to environmental influences that could alter the epigenetic tags on their DNA. Feinberg, Fallin and their team assessed the epigenetic tags on DNA from sperm from 44 dads. The men were part of an ongoing study to assess the factors that influence a child early on, before he or she is diagnosed with autism. The study enrolls pregnant mothers who already have a child with autism and collects information and biological samples from these mothers, the new baby’s father and the babies themselves after birth. Early in the pregnancy, a sperm sample was collected from fathers enrolled in the study. One year after the child was born, he or she was assessed for early signs of autism using the Autism Observation Scale for Infants (AOSI).

The researchers collected DNA from each sperm sample and looked for epigenetic tags at 450,000 different positions throughout the genome. They then compared the likelihood of a tag being in a particular site with the AOSI scores of each child. They found 193 different sites where the presence or absence of a tag was statistically related to the AOSI scores.

When they looked at which genes were near the identified sites, they found that many of them were close to genes involved in developmental processes, especially neural development. Of particular interest was that four of the 10 sites most strongly linked to the AOSI scores were located near genes linked to Prader-Willi syndrome, a genetic disorder that shares some behavioral symptoms with autism. Several of the altered epigenetic patterns were also found in the brains of individuals with autism, giving credence to the idea that they might be related to autism.

A detailed report of their findings will be published online in the International Journal of Epidemiology on April 15.

Going bald?

If there’s a cure for male pattern baldness, it might hurt a little. A team led by USC Stem Cell Principal Investigator Cheng-Ming Chuong has demonstrated that by plucking 200 hairs in a specific pattern and density, they can induce up to 1,200 replacement hairs to grow in a mouse. These results are published in the April 9 edition of the journal Cell.

The study began a couple of years ago when first author and visiting scholar Chih-Chiang Chen arrived at USC from National Yang-Ming University and Veterans General Hospital, Taiwan. As a dermatologist, Chen knew that hair follicle injury affects its adjacent environment, and the Chuong lab had already established that this environment in turn can influence hair regeneration. Based on this combined knowledge, they reasoned that they might be able to use the environment to activate more follicles.

To test this concept, Chen devised an elegant strategy to pluck 200 hair follicles, one by one, in different configurations on the back of a mouse. When plucking the hairs in a low-density pattern from an area exceeding six millimeters in diameter, no hairs regenerated. However, higher-density plucking from circular areas with diameters between three and five millimeters triggered the regeneration of between 450 and 1,300 hairs, including ones outside of the plucked region.

Working with Arthur D. Lander from the University of California, Irvine, the team showed that this regenerative process relies on the principle of “quorum sensing,” which defines how a system responds to stimuli that affect some, but not all members. In this case, quorum sensing underlies how the hair follicle system responds to the plucking of some, but not all hairs.

Through molecular analyses, the team showed that these plucked follicles signal distress by releasing inflammatory proteins, which recruit immune cells to rush to the site of the injury. These immune cells then secrete signaling molecules such as tumor necrosis factor alpha (TNF-α), which, at a certain concentration, communicate to both plucked and unplucked follicles that it’s time to grow hair.

Decisions in shopping

Say you’re out shopping for basic household goods — perhaps orange juice and soup. Or light bulbs. Or diapers for your young child. How do you choose the products you buy? Is it a complicated decision, or a simple one?

It could be complex: Factors like price, quality, and brand loyalty may run through your mind. Indeed, some scholars have developed complicated models of consumer decision-making, in which people accumulate substantial product knowledge, then weigh that knowledge against the opportunity to explore less-known products.

But in a new paper, MIT researchers suggest that your brain is making a simpler calculation when you shop: You are most likely deploying an “index strategy,” a straightforward ranking of products. It may not be an absolutely perfect calculation, given all the available information, but the study suggests that an index strategy comes very close to being optimal, and is a far easier way for consumers to make their choices.
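
The details of the model are in the paper, but the basic shape of an index strategy can be sketched in a few lines: give every product a single score, then simply buy the top-ranked item. The products, weights and scoring function below are invented for illustration and are not taken from the MIT study.

```python
# Illustrative index strategy: collapse each product to one number, pick the max.
# All data and weights here are hypothetical.

products = [
    {"name": "Brand A juice", "price": 3.50, "quality": 0.8, "familiarity": 0.9},
    {"name": "Brand B juice", "price": 2.75, "quality": 0.6, "familiarity": 0.4},
    {"name": "Store brand",   "price": 2.20, "quality": 0.5, "familiarity": 0.7},
]

def index_score(product: dict) -> float:
    """Hypothetical composite index of quality, familiarity and price."""
    return (0.5 * product["quality"]
            + 0.3 * product["familiarity"]
            - 0.2 * product["price"] / 5.0)

best = max(products, key=index_score)
print(best["name"])
```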

 

 

Add spinach, kale, collards and mustard greens to your diet

Increasing consumption of green leafy vegetables offers a simple, affordable and non-invasive way of protecting your brain from Alzheimer’s disease and dementia.

Researchers tracked the diets and cognitive abilities of more than 950 older adults for an average of five years and saw a significant decrease in the rate of cognitive decline for study participants who consumed greater amounts of green leafy vegetables. People who ate one to two servings per day had the cognitive ability of a person 11 years younger than those who consumed none.

When the researchers examined individual nutrients linked with slowing cognitive decline, they found that vitamin K, lutein, folate and beta-carotene were most likely helping to keep the brain healthy.

When to test your children’s eyesight

This is for everyone whose parents said, “Sitting too close to the TV is going to ruin your eyes.” In other words, pretty much all of us.

Sitting too close to the TV doesn’t predict nearsightedness, according to a study that tracked the vision of thousands of children over 20 years. Nor does doing a lot of close work.

Instead, as early as age 6 a child’s refractive error — the measurements used for an eyeglass prescription — best predicts the risk.

One-third of adults are nearsighted, and the problem typically develops between ages 8 and 12.

Children are not great about telling parents that they can’t see the board in class, and the letter-chart screening tests used by schools and pediatricians are less than ideal, according to Karla Zadnik, dean of the College of Optometry at Ohio State University and lead author of the study. It was published Thursday in JAMA Ophthalmology.

“Just measuring how well they can read the chart won’t capture that key piece of information,” she says.

Zadnik began the study in California back in 1989, expanding it to include almost 5,000 ethnically diverse children around the United States. The children’s eyes were measured regularly, and parents were quizzed on health and habits.

In this analysis, the researchers looked at 13 potential risk factors for nearsightedness, or myopia.

Having nearsighted parents increases the risk for myopia, the study found, but it’s not as strong a predictor as is refractive error. Doing close work or watching TV close up weren’t risk factors.

Earlier work by this group found that children who spent more time outdoors were less likely to be nearsighted, but it’s unclear why that would be. One theory is that being exposed to sunlight or higher vitamin D levels could make a difference. The study was funded by the National Institutes of Health.

In the study, children whose refractive error was less than +0.75 diopters (which is slightly farsighted) in first grade were most likely to become nearsighted. As a child gets older that number drops, so that by sixth grade a child with no refractive error is at risk. Myopia is defined as a refractive error of -0.75 diopters or more.
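
Those cut-offs are easy to turn into a quick check. The sketch below uses only the two anchor points quoted above (first grade and sixth grade) plus the -0.75 diopter definition of myopia; how the threshold moves for the grades in between is not spelled out in the article, so they are not handled.

```python
# Refractive-error cut-offs quoted in the article, in diopters (D):
# myopia is -0.75 D or worse; a first grader below +0.75 D is most likely
# to become nearsighted; by sixth grade a reading of 0 D already signals risk.

def eyesight_flag(refractive_error_d: float, grade: int) -> str:
    if refractive_error_d <= -0.75:
        return "already myopic"
    if grade == 1 and refractive_error_d < 0.75:
        return "elevated risk of myopia"
    if grade == 6 and refractive_error_d <= 0.0:
        return "elevated risk of myopia"
    return "no flag from these cut-offs"

print(eyesight_flag(0.25, 1))   # elevated risk in first grade
print(eyesight_flag(0.0, 6))    # elevated risk in sixth grade
print(eyesight_flag(-1.0, 6))   # already myopic
```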

Optometrists and ophthalmologists measure refractive error by changing lenses in front of a patient’s eye and asking, “Which is better, 1 or 2?”

There’s no way to prevent nearsightedness, but finding out whether children are at risk could make it more likely that they won’t go through a year of school squinting at the board, Zadnik says.

“A parent might say, ‘Oh, my child is at high risk, I want to be sure he’s getting regular eye examinations,’ ” she says.

Children’s eye exams are required to be covered under the Affordable Care Act, but the states differ on just what they’re covering, so it pays to check.

Talking with Zadnik, you get the sense that she’s happy to debunk some of those parental admonitions.

“I had a grandfather who was an optometrist,” she says. “He used to tell me that at the end of every page I should look up at something across the room. I am very nearsighted.”

Junk Food Addiction

Researchers have shown there are two critical windows during the developmental pathway to adulthood when exposure to junk food is most harmful, particularly for female offspring.

Mothers who eat junk food while pregnant are programming their babies to be addicted to a high fat, high sugar diet by the time they are weaned. However, these latest studies reveal there may be a chance to turn this junk food addiction around in two critical windows, equating to late pregnancy and adolescence in humans.

“Our research suggests that too much junk food consumed late in pregnancy for humans has the potential to be more harmful to the child than excess junk food early in the pregnancy,” says Dr Jessica Gugusheff, post-doctoral researcher in the School of Agriculture, Food and Wine.

“Importantly, it also indicates that if excess junk food was consumed by the mother in those early stages of pregnancy, there may be a chance to reduce those negative effects on the baby by eating a healthy diet in late pregnancy. “The second critical window is adolescence and we’ve found differences between males and females. Our experiments showed that eating a healthy diet during adolescence could reverse the junk-food preference in males but not females.”

The junk food preference is believed to result from a desensitisation of the normal reward system (the opioid and dopamine signalling pathway) fuelled by highly palatable high fat, high sugar diets. Offspring with less sensitive reward systems need more fat and sugar to get the same “good feeling.”

This reward circuitry grows at its fastest during these critical windows and is therefore most susceptible to alteration at these times.

Can the consumption of fatty foods change your behavior and your brain?

High-fat diets have long been known to increase the risk for medical problems, including heart disease and stroke, but there is growing concern that diets high in fat might also increase the risk for depression and other psychiatric disorders.

A new study published in the current issue of Biological Psychiatry raises the possibility that a high-fat diet produces changes in health and behavior, in part, by changing the mix of bacteria in the gut, also known as the gut microbiome.

The human microbiome consists of trillions of microorganisms, many of which reside in the intestinal tract. These microbiota are essential for normal physiological functioning. However, research has suggested that alterations in the microbiome may underlie the host’s susceptibility to illness, including neuropsychiatric impairment.

“This paper suggests that high-fat diets impair brain health, in part, by disrupting the symbiotic relationship between humans and the microorganisms that occupy our gastrointestinal tracts,” commented Dr. John Krystal, Editor of Biological Psychiatry.

Indeed, these findings provide evidence that diet-induced changes to the gut microbiome are sufficient to alter brain function even in the absence of obesity. This is consistent with prior research, which has established an association between numerous psychiatric conditions and gastrointestinal symptoms, but unfortunately, the mechanisms by which gut microbiota affect behavior are still not well understood.

Are smartphone users lazy?

Our smartphones help us find a phone number quickly, provide us with instant directions and recommend restaurants, but new research indicates that this convenience at our fingertips is making it easy for us to avoid thinking for ourselves.

The study, from researchers at the University of Waterloo and published in the journal Computers in Human Behavior, suggests that smartphone users who are intuitive thinkers — more prone to relying on gut feelings and instincts when making decisions — frequently use their device’s search engine rather than their own brainpower. Smartphones allow them to be even lazier than they would otherwise be.

“They may look up information that they actually know or could easily learn, but are unwilling to make the effort to actually think about it,” said Gordon Pennycook, co-lead author of the study, and a PhD candidate in the Department of Psychology at Waterloo.

In contrast, analytical thinkers second-guess themselves and analyze a problem in a more logical sort of way. Highly intelligent people are more analytical and less intuitive when solving problems.

“Decades of research has revealed that humans are eager to avoid expending effort when problem-solving and it seems likely that people will increasingly use their smartphones as an extended mind,” said Nathaniel Barr, the other lead author of the paper, and a postdoctoral researcher at Waterloo.

In three studies involving 660 participants, the researchers examined various measures including cognitive style ranging from intuitive to analytical, plus verbal and numeracy skills. Then they looked at the participants’ smartphone habits.

Participants in the study who demonstrated stronger cognitive skills and a greater willingness to think in an analytical way spent less time using their smartphones’ search-engine function.

“Our research provides support for an association between heavy smartphone use and lowered intelligence,” said Pennycook. “Whether smartphones actually decrease intelligence is still an open question that requires future research.”

The researchers say that avoiding using our own minds to problem-solve might have adverse consequences for aging.

“Our reliance on smartphones and other devices will likely only continue to rise,” said Barr. “It’s important to understand how smartphones affect and relate to human psychology before these technologies are so fully ingrained that it’s hard to recall what life was like without them. We may already be at that point.”

The results also indicate that use of social media and entertainment applications generally did not correlate to higher or lower cognitive abilities.

Epigenetics and our evolving genes

In the early 19th century, Jean-Baptiste Lamarck was run out of bio-town for daring to suggest that evolution can take place in one generation; he argued that if giraffes stretch their necks to reach the upper branches of trees, their necks will lengthen and this beneficial trait will be passed to their progeny.

In other words, Lamarck was saying that evolution isn’t the very slow and apparently haphazard process Darwin described. And today, of course, the AP Biology review (or any other relevant text) states something like, “We now know that Lamarck’s theory was wrong. This is because acquired changes (changes at a ‘macro’ level in somatic cells) cannot be passed on to germ cells.” Cut and dried, case closed … but not so fast.

The Case of the Voodoo Tomatoes

Until very recently, “transgenerational inheritance” was a concept typically banned from all polite geneticists’ conversations. But then doubts began to creep in when scientists performed experiments and observed the various nifty tricks and speed with which various bacteria adapted to new environments.

The experimenters realized two things: First, there was a very low likelihood that rapid adaptation was taking place due to random, beneficial mutations. Second, given how fast a trait like antibiotic resistance could spread within a species and across many species of microbes, there had to be some real-time evolutionary reset mechanism. So a few brave souls revived the term “epigenetics,” first coined in 1942 by Conrad H. Waddington, a British scientist.

Most early epigeneticists were ignored or written off as “voodoo biologists.” What they preached was such a radically different discipline from core genetics that as long as their experiments were confined to bacteria, the outcomes and modes of action could be considered a fluke.

But then came tomatoes, in which scientists observed and quantified transgenerational changes from mother to daughter to granddaughter tomato after exposure to drought, extreme cold, or great heat. The discoveries kept piling on; in 2013, a Cornell team demonstrated that epigenetics, not gene code, was a critical factor when trying to figure out when and why a tomato ripens.

Similar epigenetic effects were discovered in worms, fruit flies, and rodents; a creative and slightly mean-spirited experiment involved letting mice smell sweet almonds and then shocking their feet. Soon mice were terrified of the smell of almonds. When these mice reproduced, the kids were never shocked, but they were still quite afraid of the same smell. So were the grandkids. The brains of all three generations had modified M71 glomeruli, the specific neurons sensitive to that type of smell. We do not yet know how many generations epigenetic tags can survive for, but in rats the effects can last at least four generations. In worms, disrupting epigenetic control mechanisms can have consequences persisting for 70 generations.

This implies that an environmental stimulus (for example, famine, stress, toxins, affection) can be transmitted via the nervous, endocrine, or immune systems to the DNA in each cell, which in turn sets switches that express hereditary code to silence or activate in a particular situation.

Under siege by some invaders? Flip a few switches to cope. Fall harvest plentiful? Flip a few switches to store fat, procreate, and ramp up metabolism. A plague in the neighborhood? Flip a few switches to enhance resistance.

Your DNA genome has “on/off” chemical switches that collectively are known as your epigenome. So your epigenome is unique and changes every time a switch is flipped. Because your epigenome’s switches are considered reversible when they are passed from parent to child, many scientists view this to be “soft evolution,” i.e., not guaranteed to be as enduring as when a mutation arises in the core DNA genome.

The epigenome can be passed on, sometimes reversed, sometimes reinforced. Unlike in classic Mendelian genetics, it is hard to predict and quantify, so you can just imagine how this variation in experimental outcomes has driven many careful, traditional scientists who believed the DNA code was the be‑all and end-all of heredity completely crazy. They would try to eliminate all the variables, use genetically identical rats, and sometimes get completely different results. So it is no surprise that for decades epigenetics was ignored or pooh-poohed by funders, senior biologists, and science magazines. There was no reliable way to trace the precipitating event and no way to easily predict which individuals would be affected in future generations.

So how do our epigenomes become informed about life around us, particularly the epigenome of a fetus or a yet‑to‑be‑conceived child? Most of the science points to our neural, endocrine, and immune systems. Our brains, glands, and immune cells sense the outside world and secrete hormones, growth factors, neurotransmitters, and other biological signaling molecules to tell every organ in the body that it needs to adapt to a changing world.

As we experience stress, love, aging, fear, pleasure, infection, pain, exercise, or hunger, various hormones adjust various physical responses within our bodies. Hormones surge through our blood; changes in cortisol, testosterone, estrogen, interleukin, leptin, insulin, oxytocin, thyroid hormone, growth hormone, and adrenaline make us behave and develop in different ways. And they signal to our epigenomes, “Time to flip some switches!”

Genes get shut off or turned on as the world around us changes.

The Book of Life

Soft evolution is analogous to an annotated book. The basic text and argument of the book remain the same. But if the text is gradually surrounded by margin notes and comments, then those who read different annotations of the exact same book may end up with very different learning, depending on who annotated the particular copy they borrowed, how they treated the original text, how the reader decided to interpret the interplay between the original printed text and the annotations, and whether some of the annotations were erased or modified by other readers.

There are multiple ways to add in rapid, inherited epigenetic adaptations without any change in the core DNA code. One basic and common mechanism is DNA methylation: Enzymes in our cells attach a methyl group (CH3) to a cytosine (C) located next to a guanine (G) in our DNA, forming a methylated island. This tells the gene that follows next, “Shhh, do not express yourself.”
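
To make that mechanism concrete, here is a minimal sketch that scans a DNA string for cytosine-guanine (CpG) pairs, the positions where a methyl group could be attached. The sequence is invented for the example.

```python
# Find CpG sites: positions where a cytosine (C) is immediately followed by a
# guanine (G). These are the spots where DNA methylation, as described above,
# can silence the downstream gene. The sequence below is made up.

def cpg_sites(dna: str) -> list[int]:
    dna = dna.upper()
    return [i for i in range(len(dna) - 1) if dna[i:i + 2] == "CG"]

sequence = "ATCGGGCGTACGATCCGA"
print(cpg_sites(sequence))  # positions of the C in each CG pair
```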

One of the key reasons for human diversity is that about 70 percent, or roughly 14,000, of our genes have these “on/off” switches plus random mutations among them, so there are countless combinations of ways that these switches are flipped in the human population.

Sperm and eggs get a nearly fresh start: An estimated 90 percent of the switches are erased before conception occurs, which means most epigenetic memories are lost. But there is still a lot of recent data moving from generation to generation. (Those who described sperm as simple bags of DNA with a tail could never explain why sperm had so many receptors for so many hormones not directly related to reproduction, including leptin, one of the obesity genes, as well as 19 growth factors, cytokines, and neurotransmitters.)

Epigenetic switches can be flipped on and off in sperm, eggs, or embryos, so your kids and grandkids can share your environmental experiences and knowledge, and be better prepared for the environment they will soon be entering. For instance, if you were a male smoker, and your brother was not, 28 epigenetic signals in your sperm would be different from his. Sperm are listening.

At conception, your grandchildren listen to distant tales, and sometimes pass them on.

 

An excerpt from Evolving Ourselves by Juan Enriquez and Steve Gullans. 

Zinc Lozenges May Shorten the Duration of Colds

According to a meta-analysis published in BMC Family Practice, high dose zinc acetate lozenges shortened the duration of common-cold associated nasal discharge by 34%, nasal congestion by 37%, scratchy throat by 33%, and cough by 46%.

The common cold is an infection caused by over a hundred viruses, and it is a major cause of days off school or work and visits to a doctor. A previous meta-analysis of three randomized trials found that high dose zinc acetate lozenges shorten the duration of colds by 42%. Since all of the three studies reported the duration of diverse respiratory symptoms and of systemic symptoms such as muscle ache and headache, Harri Hemilä from Helsinki, Finland and Elizabeth Chalker from Sydney, Australia decided to investigate whether there are differences in the effect of zinc lozenges on different common-cold symptoms.

When zinc acetate lozenges dissolve in the mouth, zinc ions are released into the saliva of the pharyngeal region where the levels are consequently high. Therefore the effects of zinc lozenges might be greatest on symptoms of the pharyngeal region such as sore throat, and less on nasal symptoms. However, when Hemilä and Chalker pooled together the results of the three studies, they found no evidence that the effects of zinc lozenges are less for nasal symptoms compared with respiratory symptoms originating from lower anatomical regions.

According to the calculations by Hemilä and Chalker, high dose zinc acetate lozenges shortened the duration of nasal discharge by 34%, nasal congestion by 37%, sneezing by 22%, scratchy throat by 33%, sore throat by 18%, hoarseness by 43%, and cough by 46%. Furthermore, they found strong evidence that zinc lozenges also shortened the duration of muscle ache by 54%. On the other hand, there was no evidence of zinc effect on the duration of headache and fever. However, the latter two symptoms were infrequent in the three studies and therefore no definite conclusions can be drawn on headache and fever.
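
A small sketch shows how those percentages translate into days saved. The reduction figures are the ones reported in the meta-analysis; the seven-day untreated cold is an assumed baseline for illustration, not a number from the paper.

```python
# Apply the reported reductions to a hypothetical 7-day symptom duration.

REDUCTIONS = {
    "nasal discharge": 0.34,
    "nasal congestion": 0.37,
    "sneezing": 0.22,
    "scratchy throat": 0.33,
    "sore throat": 0.18,
    "hoarseness": 0.43,
    "cough": 0.46,
    "muscle ache": 0.54,
}

BASELINE_DAYS = 7.0  # assumed untreated duration, illustrative only

for symptom, cut in REDUCTIONS.items():
    print(f"{symptom}: about {BASELINE_DAYS * (1 - cut):.1f} days with zinc")
```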

Adverse effects of zinc were minor in the three studies. Therefore Hemilä and Chalker conclude from their research that “zinc acetate lozenges releasing zinc ions at doses of about 80 mg/day may be a useful treatment for the common cold, started within 24 hours, for a time period of less than two weeks.”

Classical music modulates genes that are responsible for brain functions

Although listening to music is common in all societies, the biological determinants of listening to music are largely unknown. According to a recent study, listening to classical music enhanced the activity of genes involved in dopamine secretion and transport, synaptic neurotransmission, learning and memory, and down-regulated the genes mediating neurodegeneration. Several of the up-regulated genes were known to be responsible for song learning and singing in songbirds, suggesting a common evolutionary background of sound perception across species.

Listening to music represents a complex cognitive function of the human brain, which is known to induce several neuronal and physiological changes. However, the molecular background underlying the effects of listening to music is largely unknown. A Finnish study group has investigated how listening to classical music affected the gene expression profiles of both musically experienced and inexperienced participants. All the participants listened to W.A. Mozart’s Violin Concerto No. 3 in G major, K. 216, which lasts 20 minutes.

Listening to music enhanced the activity of genes involved in dopamine secretion and transport, synaptic function, learning and memory. One of the most up-regulated genes, synuclein-alpha (SNCA) is a known risk gene for Parkinson’s disease that is located in the strongest linkage region of musical aptitude. SNCA is also known to contribute to song learning in songbirds.

“The up-regulation of several genes that are known to be responsible for song learning and singing in songbirds suggest a shared evolutionary background of sound perception between vocalizing birds and humans”, says Dr. Irma Järvelä, the leader of the study.

In contrast, listening to music down-regulated genes that are associated with neurodegeneration, referring to a neuroprotective role of music.

“The effect was only detectable in musically experienced participants, suggesting the importance of familiarity and experience in mediating music-induced effects”, researchers remark.

The findings give new information about the molecular genetic background of music perception and evolution, and may give further insights about the molecular mechanisms underlying music therapy.

Nix the late-night snacks

If you’re looking to improve your heart health by changing your diet, when you eat may be just as important as what you eat.

Researchers at San Diego State University and the Salk Institute for Biological Studies found that by limiting the time span during which fruit flies could eat, they could prevent aging- and diet-related heart problems. The researchers also discovered that genes responsible for the body’s circadian rhythm are integral to this process. What’s more, the benefits of a time-restricted diet weren’t exclusive to young flies: when the researchers introduced these dietary time restrictions to older flies, their hearts became healthier, too. Some degree of heart protection persisted even for flies that went back to eating whenever they wanted.

Previous research has found that people who tend to eat later in the day and into the night have a higher chance of developing heart disease than people who cut off their food consumption earlier. “So what’s happening when people eat late?” asked Girish Melkani, a biologist at SDSU whose research focuses on cardiovascular physiology. “They’re not changing their diet, just the time.”

This study shows the benefits of time-restricted feeding for obesity, metabolic diseases and type-2 diabetes.

Altogether, these results reinforce the idea that the daily eating pattern has a profound impact on both the body and the brain. Humans don’t consume the same food every day, and our lifestyle is a major determinant of when we can and cannot eat. But at the very minimum, this study offers some context in which we should be pursuing such questions in humans.

Melkani is optimistic that the results could one day translate into cardiac- and obesity-related health benefits for humans. “Time-restricted feeding would not require people to drastically change their lifestyles, just the times of day they eat,” Melkani said. “The take-home message then would be to cut down on the late-night snacks.”

Drug dramatically increases healthy lifespan

A research team from The Scripps Research Institute (TSRI), Mayo Clinic and other institutions has identified a new class of drugs that in animal models dramatically slows the aging process—alleviating symptoms of frailty, improving cardiac function and extending a healthy lifespan.

The new research was published March 9 online ahead of print by the journal Aging Cell.

The scientists coined the term “senolytics” for the new class of drugs.

“We view this study as a big, first step toward developing treatments that can be given safely to patients to extend healthspan or to treat age-related diseases and disorders,” said TSRI Professor Paul Robbins, PhD, who with Associate Professor Laura Niedernhofer, MD, PhD, led the research efforts for the paper at Scripps Florida. “When senolytic agents, like the combination we identified, are used clinically, the results could be transformative.”

“The prototypes of these senolytic agents have more than proven their ability to alleviate multiple characteristics associated with aging,” said Mayo Clinic Professor James Kirkland, MD, PhD, senior author of the new study. “It may eventually become feasible to delay, prevent, alleviate or even reverse multiple chronic diseases and disabilities as a group, instead of just one at a time.”

Finding the Target

Senescent cells—cells that have stopped dividing—accumulate with age and accelerate the aging process. Since the “healthspan” (time free of disease) in mice is enhanced by killing off these cells, the scientists reasoned that finding treatments that accomplish this in humans could have tremendous potential.

The scientists were faced with the question, though, of how to identify and target senescent cells without damaging other cells.

The team suspected that senescent cells’ resistance to death by stress and damage could provide a clue. Indeed, using transcript analysis, the researchers found that, like cancer cells, senescent cells have increased expression of “pro-survival networks” that help them resist apoptosis or programmed cell death. This finding provided key criteria to search for potential drug candidates.

Using these criteria, the team homed in on two available compounds—the cancer drug dasatinib (sold under the trade name Sprycel®) and quercetin, a natural compound sold as a supplement that acts as an antihistamine and anti-inflammatory.

Further testing in cell culture showed these compounds do indeed selectively induce death of senescent cells. The two compounds had different strong points. Dasatinib eliminated senescent human fat cell progenitors, while quercetin was more effective against senescent human endothelial cells and mouse bone marrow stem cells. A combination of the two was most effective overall.

Remarkable Results

Next, the team looked at how these drugs affected health and aging in mice.

“In animal models, the compounds improved cardiovascular function and exercise endurance, reduced osteoporosis and frailty, and extended healthspan,” said Niedernhofer, whose animal models of accelerated aging were used extensively in the study. “Remarkably, in some cases, these drugs did so with only a single course of treatment.”

In old mice, cardiovascular function was improved within five days of a single dose of the drugs. A single dose of a combination of the drugs led to improved exercise capacity in animals weakened by radiation therapy used for cancer. The effect lasted for at least seven months following treatment with the drugs. Periodic drug administration to mice with accelerated aging extended healthspan in the animals, delaying age-related symptoms, spine degeneration and osteoporosis.

The authors caution that more testing is needed before use in humans. They also note both drugs in the study have possible side effects, at least with long-term treatment.

The researchers, however, remain upbeat about their findings’ potential. “Senescence is involved in a number of diseases and pathologies so there could be any number of applications for these and similar compounds,” Robbins said. “Also, we anticipate that treatment with senolytic drugs to clear damaged cells would be infrequent, reducing the chance of side effects.”

Mediterranean diet cuts heart disease risk by nearly half

Adults who closely followed the Mediterranean diet were 47 percent less likely to develop heart disease over a 10-year period compared to similar adults who did not closely follow the diet, according to a study to be presented at the American College of Cardiology’s 64th Annual Scientific Session in San Diego.

Among the study’s participants, adherence to the Mediterranean diet was more protective than physical activity. The study, conducted in Greece, bolsters evidence from earlier studies pointing to the diet’s health benefits and is the first to track 10-year heart disease risk in a general population. Most previous studies have focused on middle-aged people.

“Our study shows that the Mediterranean diet is a beneficial intervention for all types of people–in both genders, in all age groups, and in both healthy people and those with health conditions,” said Ekavi Georgousopoulou, a Ph.D. candidate at Harokopio University in Athens, Greece, who conducted the study along with Demosthenes B. Panagiotakos, Ph.D., professor at Harokopio University. “It also reveals that the Mediterranean diet has direct benefits for heart health, in addition to its indirect benefits in managing diabetes, hypertension and inflammation.”

The study is based on data from a representative sample of more than 2,500 Greek adults, ages 18 to 89, who provided researchers with their health information each year from 2001 to 2012. Participants also completed in-depth surveys about their medical records, lifestyle and dietary habits at the start of the study, after five years and after 10 years.

Overall, nearly 20 percent of the men and 12 percent of the women who participated in the study developed or died from heart disease, a suite of conditions that includes stroke, coronary heart disease caused by the buildup of plaque in the heart’s arteries, acute coronary syndromes such as heart attack, and other diseases. Other studies have shown Greeks and Americans have similar rates of heart disease and its risk factors.

The researchers scored participants’ diets on a scale from 1 to 55 based on their self-reported frequency and level of intake for 11 food groups. Those who scored in the top-third in terms of adherence to the Mediterranean diet, indicating they closely followed the diet, were 47 percent less likely to develop heart disease over the 10-year follow-up period as compared to participants who scored in the bottom-third, indicating they did not closely follow the diet. Each one-point increase in the dietary score was associated with a 3 percent drop in heart disease risk.
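
As a back-of-the-envelope reading of that per-point figure, the sketch below assumes the 3 percent drop compounds multiplicatively with each extra point, which the article does not spell out; it is an illustration of the reported association, not a risk calculator.

```python
# Rough translation of "each one-point increase in the dietary score was
# associated with a 3 percent drop in heart disease risk", assuming the
# association compounds multiplicatively across points. Illustrative only.

PER_POINT_DROP = 0.03

def relative_risk(score_difference: int) -> float:
    """Risk relative to a reference diet score, given a point difference."""
    return (1 - PER_POINT_DROP) ** score_difference

for diff in (5, 10, 20):
    print(f"{diff}-point higher score: about {(1 - relative_risk(diff)) * 100:.0f}% lower risk")
```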

This difference was independent of other heart disease risk factors including age, gender, family history, education level, body mass index, smoking habits, hypertension, diabetes and high cholesterol, all of which the researchers adjusted for in their analysis.
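To put the per-point figure in perspective, here is a minimal back-of-envelope sketch in Python. It assumes, purely for illustration, that the 3 percent per-point association compounds multiplicatively across score points; the study reports associations from its own statistical models, so the numbers below are illustrative rather than the authors’ analysis.

```python
import math

# Illustrative assumption (not from the study): each one-point increase in the
# Mediterranean diet score multiplies 10-year heart disease risk by 0.97.
PER_POINT_REDUCTION = 0.03

def relative_risk(score_difference: float) -> float:
    """Relative risk for someone scoring `score_difference` points higher than
    a reference person, under the multiplicative-compounding assumption."""
    return (1 - PER_POINT_REDUCTION) ** score_difference

# Under this assumption, how many score points would separate groups whose
# risks differ by 47 percent (the top-third vs. bottom-third comparison)?
points = math.log(1 - 0.47) / math.log(1 - PER_POINT_REDUCTION)

print(f"relative risk at +10 points: {relative_risk(10):.2f}")
print(f"score gap implying a 47% lower risk: about {points:.0f} points")
```

Under that simplified reading, a gap of roughly 21 points between the most and least adherent thirds would be consistent with the 47 percent figure; the study’s own models may of course differ.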

The analysis also confirmed results of previous studies indicating that male gender, older age, diabetes and high C-reactive protein levels, a measure of inflammation, are associated with an increased risk for heart disease.

While there is no set Mediterranean diet, it commonly emphasizes fresh fruits and vegetables, whole grains, beans, nuts, fish, olive oil and even a glass of red wine. Earlier research has shown that following the traditional Mediterranean diet is linked to weight loss, reduced risk of diabetes, lower blood pressure and lower blood cholesterol levels, in addition to reduced risk of heart disease.

“Because the Mediterranean diet is based on food groups that are quite common or easy to find, people around the world could easily adopt this dietary pattern and help protect themselves against heart disease with very little cost,” Georgousopoulou said.

Among study participants, women tended to follow the Mediterranean diet more closely than did men. Despite the fact that Greece is the cradle of the Mediterranean diet, urbanization has led many Greeks to adopt a more Western diet over the past four decades, he said.

The study was limited to participants living in and around Athens, Greece, so the sample does not necessarily reflect the health conditions or dietary patterns of people in more rural areas or the rest of the world. However, previous studies have also linked the Mediterranean diet with reduced cardiovascular risks, including the Nurses’ Health Study, which included nearly 75,000 American nurses who were tracked over a 30-year period. Additional studies in other adult populations would further advance understanding of the diet’s influence on heart disease risk.

The study, “Adherence to Mediterranean is the Most Important Protector Against the Development of Fatal and Non-Fatal Cardiovascular Event: 10-Year Follow-up (2002-12) Of the Attica Study,” will be presented on March 15 at 9:30 a.m. PT/12:30 p.m. ET/4:30 p.m. UTC at the American College of Cardiology’s 64th Annual Scientific Session in San Diego. The meeting runs March 14-16.

Read more

Keep learning to get healthy.

Share


Several studies have indicated a connection between learning and health. In a recently published study from University West and Linnaeus University, the researchers found that the health of school teachers is related to their level of work-integrated learning.

A random sample of 229 teachers at 20 schools in Västra Götaland responded to a questionnaire that included previously tested measures of health, quality and work-integrated learning. The resulting data showed a highly significant statistical correlation between the measures.

This indicates that in order to be healthy, teachers need not only to teach but also to learn and develop themselves. An ultimate state of learning is characterised by a sensation of flow, which has been described by researchers as a state of complete immersion in an activity in a way that is maximally effective while at the same time highly enjoyable. The study also tested the relationship between an operationalised measure of flow and the teachers’ health, and again there was a strong correlation.

According to Yvonne Lagrosen, Associate Professor in Quality Management at University West, a sense of flow implies that the workload is perceived as lower. “Doing something that you are interested in gives you positive stimulation, and the workload seems lighter. At the same time, the challenge cannot be too big; there must be a balance between the demands and your own control of your work situation,” she said. “What this research indicates is that to be healthy, we need to constantly learn and develop, in our profession and as people. If we enjoy our work to the extent that we are completely absorbed in it, as in the state of flow, we should have the optimal possibilities for a healthy influence from our work. So find a job that you really enjoy and make sure that you learn and develop at it,” said Yvonne Lagrosen.

Read more

Binge-Eating Drug Vyvanse

Share


Response to February 24, 2015 article in the New York Times Business Section “Maker of a Drug to Treat Binge-Eating Marketed the Disease First”

Dear Editor:

Alarmingly, the FDA’s fast-track approval of Shire Pharma’s Vyvanse appears to have undercut the standard-of-care recommendations to physicians for treating obesity and overweight with at least two co-morbid conditions, and effectively puts the public’s safety at risk. In 2012, after extensive multicenter academic trials over 10 years, the US Preventive Services Task Force issued guideline recommendations for Intensive Behavioral Therapy (IBT) for obesity, consisting of intensive counseling on diet, physical activity and behavioral therapy prior to instituting pharmacotherapy, with bariatric surgery reserved for therapy failures. Trading one addictive disorder for another, without giving IBT a fair chance, is incompatible with the public’s best interests and best medical practice. When delivered properly, IBT can safely and effectively help obese patients, including binge eaters, to “unlearn” bad habits and acquire new, healthy habits for significant weight loss and optimal health for a lifetime.

Respectfully,
Samuel H. Sadow, M.D., FACS, FCCP

Read more

Suffering from depression? Serotonin-boosting medications make it harder for patients to recover.

Share


The science behind many anti-depressant medications appears to be backwards, say the authors of a paper that challenges the prevailing ideas about the nature of depression and some of the world’s most commonly prescribed medications.

The authors of the paper, posted by the journal Neuroscience & Biobehavioral Reviews, combed existing research for evidence to support the theory that has dominated nearly 50 years of depression research: that depression is related to low levels of serotonin in the gaps between cells in the brain.

The low-serotonin theory is the basis for commonly prescribed anti-depressant medications called selective serotonin re-uptake inhibitors, or SSRIs, which keep the neurotransmitter’s levels high by blocking its re-absorption into the cells that release it. Those serotonin-boosting medications actually make it harder for patients to recover, especially in the short term, says lead author Paul Andrews, an assistant professor of Psychology, Neuroscience & Behaviour at McMaster. “It’s time we rethink what we are doing,” Andrews says. “We are taking people who are suffering from the most common forms of depression, and instead of helping them, it appears we are putting an obstacle in their path to recovery.”

When depressed patients on SSRI medication do show improvement, it appears that their brains are actually overcoming the effects of anti-depressant medications, rather than being assisted directly by them. Instead of helping, the medications appear to be interfering with the brain’s own mechanisms of recovery.

“We’ve seen that people report feeling worse, not better, for their first two weeks on anti-depressants,” Andrews says. “This could explain why.”

It is currently impossible to measure exactly how the brain is releasing and using serotonin, the researchers write, because there is no safe way to measure it in a living human brain. Instead, scientists must rely on measuring evidence about levels of serotonin that the brain has already metabolized, and by extrapolating from studies using animals.

The best available evidence appears to show that there is more serotonin being released and used during depressive episodes, not less, the authors say. The paper suggests that serotonin helps the brain adapt to depression by re-allocating its resources, giving more to conscious thought and less to areas such as growth, development, reproduction, immune function, and the stress response.
Andrews, an evolutionary psychologist, has argued in previous research that anti-depressants leave patients in worse shape after they stop using them, and that most forms of depression, though painful, are natural and beneficial adaptations to stress.

Read more

Fast food addiction

Share


A new University of Michigan study confirms what has long been suspected: highly processed foods like chocolate, pizza and French fries are among the most addictive.

This is one of the first studies to examine specifically which foods may be implicated in “food addiction,” which has become of growing interest to scientists and consumers in light of the obesity epidemic.

Previous studies in animals conclude that highly processed foods, or foods with added fat or refined carbohydrates (like white flour and sugar), may be capable of triggering addictive-like eating behavior. Clinical studies in humans have observed that some individuals meet the criteria for substance dependence when the substance is food.

Although highly processed foods are generally known to be highly palatable and preferred, it was not known whether these types of foods can elicit addiction-like responses in humans, nor which specific foods produce these responses, said Ashley Gearhardt, U-M assistant professor of psychology.

Unprocessed foods with no added fat or refined carbohydrates, such as brown rice and salmon, were not associated with addictive-like eating behavior.

Individuals with symptoms of food addiction or with higher body mass indexes reported greater problems with highly processed foods, suggesting some may be particularly sensitive to the possible “rewarding” properties of these foods, said Erica Schulte, a U-M psychology doctoral student and the study’s lead author.

“If properties of some foods are associated with addictive eating for some people, this may impact nutrition guidelines, as well as public policy initiatives such as marketing these foods to children,” Schulte said.

Nicole Avena, assistant professor of pharmacology and systems therapeutics at Icahn School of Medicine at Mount Sinai in New York City, and a co-author on the study, explained the significance of the findings.

“This is a first step towards identifying specific foods, and properties of foods, which can trigger this addictive response,” she said. “This could help change the way we approach obesity treatment. It may not be a simple matter of ‘cutting back’ on certain foods, but rather, adopting methods used to curtail smoking, drinking and drug use.”

Future research should examine whether addictive foods are capable of triggering changes in brain circuitry and behavior like drugs of abuse, the researchers said.

Read more

For university students, walking beats sitting

Share

 


Walking classrooms are better not only for students’ physical health but also for classroom engagement, a study from Sweden’s KTH Royal Institute of Technology shows.

What began as a response to a physical-activity challenge for the computer science faculty at KTH has become a study in how education and fitness can be combined to improve both physical well-being and classroom discussions.

University lecturer Olle Bälter improvised his “walking seminar” in media technology at KTH during the spring of 2014, in response to a competition in which staff were recording the number of hours they and their students spent sitting, as opposed to being active.

Taking his group of 10 students for a stroll through a wooded park near the Stockholm campus, Bälter immediately began to see results.

“Students feel freer to talk when they are outdoors than when they are in the classroom,” Bälter says. His experience seemed consistent with a paper that he cites as an inspiration — a Stanford University study linking creativity with physical activity.

Now Bälter and his colleagues are adding their experience to the body of knowledge supporting more activity in education. In an article presented at the Lund Institute of Technology’s eighth pedagogical inspiration conference in December, Bälter and coauthors Björn Hedin and Helena Tobiasson reported that a significant majority of the students surveyed preferred the walking seminars over traditional seminars.

Notably, 21 of 23 students surveyed said that after the workshops they felt better than after typical, sedentary seminars; and no one thought they felt worse. Furthermore, 17 of the 23 students believed that communication was better.

“It is noticeable how much easier it is for individual students to express their views on these walking seminars, particularly when the class is split into smaller groups,” Bälter says.

Second-year student Frida Haugsbakk agrees. “Everyone chipped in, even those who were too shy to speak in larger groups,” she says. “On the walk, students can address another student directly, while the others simply listen and enter the discussion later on.”

Read more

Mindfulness training can improve your health

Share


Over the past decade, there have been many encouraging findings suggesting that mindfulness training can improve a broad range of mental and physical health problems. Yet, exactly how mindfulness positively impacts health is not clear.

Carnegie Mellon University’s J. David Creswell — whose cutting-edge work has shown how mindfulness meditation reduces loneliness in older adults and alleviates stress — and his graduate student Emily K. Lindsay have developed a model suggesting that mindfulness influences health via stress reduction pathways. Their work, published in “Current Directions in Psychological Science,” describes the biological pathways linking mindfulness training with reduced stress and stress-related disease outcomes.

“If mindfulness training is improving people’s health, how does it get under the skin to affect all kinds of outcomes?” asked Creswell, associate professor of psychology in CMU’s Dietrich College of Humanities and Social Sciences. “We offer one of the first evidence-based biological accounts of mindfulness training, stress reduction and health.”

Creswell and Lindsay highlight a body of work that depicts the biological mechanisms of mindfulness training’s stress reduction effects. When an individual experiences stress, activity in the prefrontal cortex — responsible for conscious thinking and planning — decreases, while activity in the amygdala, hypothalamus and anterior cingulate cortex — regions that quickly activate the body’s stress response — increases. Studies have suggested that mindfulness reverses these patterns during stress; it increases prefrontal activity, which can regulate and turn down the biological stress response.

Excessive activation of the biological stress response increases the risk of diseases impacted by stress (like depression, HIV and heart disease). By reducing individuals’ experiences of stress, mindfulness may help regulate the physical stress response and ultimately reduce the risk and severity of stress-related diseases.

Creswell believes by understanding how mindfulness training affects different diseases and disorders, researchers will be able to develop better interventions, know when certain treatments will work most effectively and identify people likely to benefit from mindfulness training.

Read more

Using more senses makes better sense

Share


“Atesi” – what sounds like a word from the Elven language of Lord of the Rings is actually a Vimmish word meaning “thought”. Scientists from the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig have used Vimmish, an artificial language specifically developed for scientific research, to study how people can best memorise foreign-language terms. According to the researchers, it is easier to learn vocabulary if the brain can link a given word with different sensory perceptions. The motor system in the brain appears to be especially important: When someone not only hears vocabulary in a foreign language, but expresses it using gestures, they will be more likely to remember it. Also helpful, although to a slightly lesser extent, is learning with images that correspond to the word. Learning methods that involve several senses, and in particular those that use gestures, are therefore superior to those based only on listening or reading.

For most students, the very thought of learning new vocabulary evokes a groan. Rote learning of long lists of words must surely be one of the most unpopular types of schoolwork. That said, many schools and language courses have now understood that learning outcomes improve if vocabulary, for example, is presented not just as a word, but also as an image. The multisensory learning theory states that the brain learns more easily when several senses are stimulated in parallel.

The results obtained by the Leipzig-based researchers confirm this. For their study the scientists used Vimmish, an artificial language they developed themselves, which follows similar phonetic rules to Italian. This ensured that the vocabulary was equally new to all participants. Over the course of a week, young women and men were to memorise the meaning of abstract and concrete Vimmi-nouns under different conditions. In the first experiment, the subjects heard the word and then observed a corresponding image or a gesture. In the second experiment, they symbolically drew the corresponding word in the air or expressed it with a gesture. The researchers then checked whether the participants could still recall the term at different times after the learning period.

“The subjects’ recollection was best in relation to terms they themselves had expressed using gestures. When they heard the term and its translation and also observed a corresponding image, they were also better able to remember the translation. By contrast, however, tracing a term or observing a gesture was no better than just hearing the term”, explains Katja Mayer of the Max Planck Institute for Human Cognitive and Brain Sciences. The way a term was learned was even reflected in the subjects’ brain activity: areas of the brain responsible for the motor system were active when a subject translated a term previously learned through gesture, while areas of the visual system were active for words learned with the help of images.

This suggests that the brain learns foreign words more easily when they are associated with information from different sensory organs. It may be that these associations are mutually reinforcing, imprinting the source-language term and its translation more deeply in the mind. “If for example we follow a new term with a gesture, we create additional input that facilitates the brain’s learning”, says Katharina von Kriegstein, head of the study at the Max Planck Institute for Human Cognitive and Brain Sciences. The scientists now want to discover whether the activity in the motor and visual centres is actually the cause of the improved learning outcomes. They plan to do this by activating the neurons in these regions using electrodes and measuring the impact on learning outcomes.

It is not only in learning vocabulary that the multisensory principle applies; other studies have shown that multisensory input also facilitates word recognition in the subject’s own language. “If we’re on the phone with someone we know, for example, the areas of the brain responsible for facial recognition are active during the phone call. It seems that the brain simulates the information not being captured by the eyes and creates it for itself”, explains von Kriegstein.

Thus, we learn with all our senses. Taste and smell also have a role in learning, and feelings play an important part too. But does multisensory learning work according to the principle: the more senses, the better? “That could well be so,” says von Kriegstein, “but we don’t know how much the learning outcomes improve with the addition of more senses. Ideally, however, the individual sensory impressions should match one another. In other words, to learn the Spanish word for apple, the subject should make an apple gesture, taste an apple or look at a picture of an apple.”

Read more

2015 will be defined by a return to basic exercises and the continued integration of technology. 

Share

 

 

 

If you’re sporting a heart-rate monitor while banging out your last set of push-ups, you’re on the cutting edge of fitness.

Workouts you can do anywhere


Gym class hero

Remember all of the pull-ups, sit-ups and push-ups you did in high school gym class? Those exercises that made up the Presidential Fitness Test remain the foundation of effective workouts, said Walter Thompson, a professor of kinesiology and health at Georgia State University. “There is nothing really new about body-weight training, but really smart people in clubs have been packaging body-weight programs, and the public is buying them,” said Thompson, author of the American College of Sports Medicine’s annual fitness forecast.

The report, which is based on survey responses from more than 3,400 health and fitness professionals, ranked bodyweight exercises as the top trend for this year. Kevin Mullins, a strength coach and personal trainer at Equinox, said boot camp classes and short, intense interval workouts helped body-weight exercises regain popularity. “It’s a good way for people to get comfortable with strength training,” he said. “A lot of people are going from those moves to deadlifts and squats.”


Tracking every step you take

An app on your smartphone can determine how many steps you take each day. Runners have long used heart-rate monitors to measure the intensity of their workouts, but these days people are wearing all sorts of devices to track their calorie intake or number of steps taken during the day.

Recording that sort of biometric data is likely to take off this year amid a wider selection of smartwatches and wearable technology that hit the market during the holiday season. The latest generation of smartwatches and fitness bands has sensors that can monitor just about anything.

But before you drop $200 on a device that tracks your steps, sleep, calories and everything else, focus on the functions you really want, said Anthony Wall, director of professional education at the American Council on Exercise. “GPS is a super-cool feature, but ask yourself if you’ll ever use it,” he said. “Pay attention to functionality and the battery life so you’re not charging the device all the time.”

 

 

Read more

Advice for working out in cold weather

Share


● Dress appropriately

Layer your clothing so you can remove items as you warm up. Ideally, the outer layer would be windproof (check the label to make sure it still “breathes” to let moisture out) and the inner layer would wick moisture away from your body. Cover your head, hands and feet. Mittens are warmer than gloves. For longer runs or windy bike rides, try layering thin gloves under some larger mittens. Wool or wool-blend socks will feel warmer than cotton when damp. Hats are great, but a headband or earmuffs might be more comfortable for some people.

● Stay hydrated

Drinking throughout the day is the best strategy in any season, but especially in winter because cold-weather exercise might make it harder to think about drinking cold water.

● Apply sunscreen

You can still get a burn in winter if you are outside long enough. Also wear UV-protective sunglasses in strong daylight and in snowy conditions.

● Make yourself visible

Shorter days mean more workouts in the dark. Wear reflectors or LED blinkers on your clothing or equipment. Brightly colored clothing can also enhance visibility during low-light or nighttime workouts.

● Beware of ice

Roads, trails, sidewalks and even grassy areas can have icy patches, so try to think about those surfaces if the temperature is below freezing.

● Warm up and cool down

In cold temperatures it is especially important to take time for the transition from low- to high-level activity and back again, but work quickly enough to avoid becoming chilled and uncomfortable. Five minutes of a low level of activity is usually enough, but for more intense exercise, a two-step warm-up might be smart.

When you’re finished, remove your cold, wet clothes in exchange for something warm and dry as soon as possible. A hot shower might be tempting, but a warm shower is a better idea. If your skin is chilled and a bit numb, you might not know that the shower is actually too hot.

● Be safe

In any extreme conditions, tell someone where you’re going, what you’re planning to do and when you expect to be back.

Read more

3 Tips for Healthy Eating

Share


1. Think ahead. “Boil up some eggs and prep vegetables to keep in the fridge. Take a weekend day to prep dishes that freeze well, such as soups, chilis and stews, and have them in containers in the freezer.”

2. Don’t forget to have staples on hand. “Keep a stocked pantry with quick-cooking whole grains, low-sodium canned tomatoes and beans, pouches of salmon and tuna. Keep frozen fruit and vegetables and frozen shrimp on hand.”

3. Purchase pre-cut produce. “Take advantage of healthy convenience foods, such as pre-washed lettuce, pre-cut mushrooms and cubed squash.”

Read more

Researchers map direct gut-brain connection.

Share


Scientists have mapped a cell-to-cell connection between the gut and the nervous system that may be a more direct route to signaling satiety than the release of hormones in the blood. The new system may change researchers’ understanding of how we sense being full, and how that sensation might be affected by gastric bypass surgery.

After each one of those big meals you ate over the holidays, the cells lining your stomach and intestines released hormones into the bloodstream to signal the brain that you were full and should stop eating.

Researchers at Duke University have now mapped out another system, a cell-to-cell connection between the gut and the nervous system, that may be more direct than the release of hormones in the blood.
The findings, which appeared Jan. 2, 2015 in the Journal of Clinical Investigation, also shed light on a potential new mechanism giving foodborne viruses access to the brain.
“The study supports the idea that there could be a real biology of gut feelings,” said Diego Bohórquez, an assistant professor of medicine at Duke. “As soon as food contacts the wall of the gut, the brain will know in real time what’s going on in the gut,” said Bohórquez, who conducted the study as a postdoctoral researcher in the lab of Dr. Rodger Liddle, a professor of gastroenterology.

Several years ago, Liddle’s team developed methods to visualize a type of cell found scattered throughout the lining of the mouse gut that is remarkably similar to a neuron. Although the cells have a normal shape on the gut’s surface, their underside bears a long arm.
“The question was, why would a cell that is supposed to just release hormones have a whole arm? There had to be a target on the other side,” Bohórquez said.
Dubbed ‘neuropods,’ these special arms are nurtured by support cells known as glia that work with neurons, which suggested at the time that they could be involved in a neuronal circuit.
In the new study, researchers traced the contacts of the neuropods in greater detail, finding that they came close to individual nerve fibers, but not blood vessels, in the small and large intestine. They found that about 60% of neuropods contacted sensory neurons, supporting the notion that they could be involved in gut sensation.
The group went a step further, showing that neuropods and neurons not only contact one another, but they connect. In a dish, single sensory neurons isolated from the brain reached out to contact a neuropod that was, on a cellular scale, about half a football field away.
“For us, that was a point of no return,” Bohórquez said. “It said that these cells know how to get closer to neurons,” though how exactly is unclear. The connection was especially surprising because no one had ever cultured these cells in complete isolation from their neighbors, he added.
Neuropods are so much like neurons — they contain much of the same machinery for sending and receiving signals — that the researchers then tried infecting the colons of mice with a disabled version of the rabies virus, which moves through the body initially by infecting neurons. The virus is routinely used as a laboratory tool for visualizing a single connection from one neuron to another.

“That provides a pathway where rabies can go from the lumen of the gut to the nervous system,” said Rodger Liddle, who is a member of the Duke Institute for Brain Sciences. “It implies you might be able to get rabies by eating rabies. Maybe this is a pathway whereby other viruses could infect the nervous system.”
The new study focused on connections between neuropods and neurons closest to the intestine, but the team is now working to trace the whole path from the gut to brain.



Other authors on the study include Duke University Medical Center’s Rafiq Shahid, Alan Erdmann, Alex Kreger, Yu Wang, Nicole Calakos and Fan Wang.

This work was supported by the National Institutes of Health (R01DK091946 and F32DK094704) and the Department of Veterans Affairs (I01BX002230).

Citation: “Neuroepithelial circuit formed by innervation of sensory enteroendocrine cells,” Diego V. Bohórquez, Rafiq A. Shahid, Alan Erdmann, Alex M. Kreger, Yu Wang, Nicole Calakos, Fan Wang. Journal of Clinical Investigation, January 2, 2015. DOI: 10.1172/JCI78361.

Read more

Make a New Year’s resolution to manage your diabetes

Share


MAYWOOD, Ill. (December 31, 2014) – José Rodriguez often skipped breakfast and lunch only to eat a large dinner at the end of the day. Despite his erratic eating habits, Mr. Rodriguez thought he was healthy until a routine blood test revealed he had type 2 diabetes.

“I was shocked. I didn’t expect to be a diabetic,” Mr. Rodriguez said. “I told my doctor that I would do whatever it takes to manage my disease.”

Mr. Rodriguez was referred to the diabetes education program at Loyola University Medical Center. A certified diabetes educator worked with him to make diet and lifestyle changes. This included cutting out soda, monitoring the food he eats, altering his portions, exercising more and checking his blood sugar.

“I didn’t like vegetables. That was a challenge for me,” Mr. Rodriguez said. “But once I learned more about eating well and portions, it helped me get my diabetes under control. I now eat well-balanced meals and exercise four to five times a week for an hour at a time.”

Approximately 29 million Americans have diabetes. Symptoms of diabetes include:

Urinating often
Feeling very thirsty
Feeling very hungry – even though you are eating
Extreme fatigue
Blurry vision
Cuts/bruises that are slow to heal
Tingling, pain or numbness in the hands/feet (type 2)

Early detection and treatment can decrease the risk of developing complications from diabetes. The new year is a good time to see a doctor if you think you have diabetes.

 

Read more

Trying to lose weight this New Year? Here are five strategies to avoid.

Share


MAYWOOD, Ill. (December 30, 2014) – Is your New Year’s resolution to lose weight? Here are five bad strategies to avoid, according to Aaron Michelfelder, MD, of Loyola University Health System:

Bad Strategy No. 1: I’ll lose weight at the gym. Working out is good for your health and can help to maintain your weight. But exercise alone is not very effective in shedding pounds. To lose weight, you will need to eat fewer calories.

Bad Strategy No. 2: I’ll have to dramatically change my diet. A radical change is not necessary. A more effective strategy is to simply cut back a few hundred calories a day. When going to a restaurant, for example, eat an apple before dinner to dull your appetite, then skip the bread before the main dish arrives. Eat smaller portions and ask for a to-go container.

Bad Strategy No. 3: Weight-loss supplements will make it easier. Supplements burn more muscle than fat. And when you stop taking them, you will gain back more fat than muscle, making you worse off than if you had never taken them in the first place.

Bad Strategy No. 4: I want to be like contestants on “The Biggest Loser” and shed pounds quickly. A more realistic – and healthy – strategy is to try to lose 1-2 pounds per week. If you cut back 500 calories a day (such as a bagel and cream cheese), you will lose a pound a week. If you cut back just 250 calories a day (one candy bar), you will lose 2 pounds a month; the arithmetic is sketched after this list. “This will provide the slow-and-steady type of weight loss that will be long-lasting,” Dr. Michelfelder said.

Bad Strategy No. 5: I give up. I’ll never get down to a normal weight, so why even try? Do not despair if you do not get down to a trim, normal weight (defined as a body mass index of between 18.5 and 24.9). If you are overweight or obese, losing 10 percent of your body weight will improve your appearance and have significant health benefits, such as lower blood pressure and a reduced risk of diabetes. Even losing as little as 5 pounds will be good for your joints.
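The calorie arithmetic in No. 4 and the BMI range in No. 5 can be checked with a short sketch. It relies on the commonly cited approximation that about 3,500 calories corresponds to one pound of body fat; that rule of thumb, the function names and the example numbers are illustrative assumptions, not anything from Dr. Michelfelder.

```python
# Rough sketch of the calorie-deficit and BMI arithmetic above.
# Assumes the common ~3,500 kcal per pound of body fat rule of thumb;
# real-world weight loss varies by individual, so treat this as illustration only.
KCAL_PER_POUND = 3500

def pounds_lost(daily_calorie_cut: float, days: int) -> float:
    """Estimated pounds lost from a steady daily calorie reduction."""
    return daily_calorie_cut * days / KCAL_PER_POUND

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

print(pounds_lost(500, 7))    # 1.0 lb in a week (the bagel-and-cream-cheese cut)
print(pounds_lost(250, 30))   # ~2.1 lb in a month (the candy-bar cut)
print(bmi(82, 1.75))          # ~26.8, just above the 18.5-24.9 "normal" range
```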

As a family physician, Dr. Michelfelder fields a lot of questions every January from patients who have resolved to lose weight. He advises them not to try to go it alone.

“When you tell other people you are trying to lose weight, they will give you their support, and stop shoving cake and candy your way,” Dr. Michelfelder said.

“For the New Year, most of us should add some weight loss to our resolutions,” Dr. Michelfelder said. “Obesity is now so common in the United States that it causes more disease and years of life lost than smoking.”

Read more

Weigh In

Share


Stepping on the scale is common among dieters, but how does the frequency of weigh-ins impact weight? A new study in PLOS ONE showed that the more frequently dieters weighed themselves, the more weight they lost, and that if participants went more than a week without weighing themselves, they gained weight.

The researchers analyzed 2,838 weight measurements (up to a year’s worth of weigh-ins) from 40 overweight individuals (with a body mass index of 25 and over) who indicated that weight loss was a personal goal or concern. The researchers found that weight loss was related to how often individuals weighed themselves. “The more often you weigh yourself, the more weight you lose,” says lead author Elina Helander from Tampere University of Technology in Finland. This observational study cannot prove causation – it may be that less serious dieters weigh themselves less, or that dieters who stop losing weight stop weighing themselves. The average time that participants could go between weigh-ins without gaining weight was 5.8 days, or about a weekly weigh-in.
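As a rough illustration of the quantities involved (and not the authors’ actual analysis), the average gap between weigh-ins and the overall weight change can be computed directly from a timestamped log; the dates and weights below are invented.

```python
from datetime import date

# Hypothetical weigh-in log for one person: (date, weight in kg).
weigh_ins = [
    (date(2015, 1, 1), 90.0),
    (date(2015, 1, 6), 89.4),
    (date(2015, 1, 13), 89.0),
    (date(2015, 1, 25), 89.5),  # a 12-day gap; weight creeps back up
]

# Average number of days between consecutive weigh-ins.
gaps = [(later[0] - earlier[0]).days
        for earlier, later in zip(weigh_ins, weigh_ins[1:])]
avg_gap_days = sum(gaps) / len(gaps)

# Overall weight change across the log.
total_change = weigh_ins[-1][1] - weigh_ins[0][1]

print(f"average gap: {avg_gap_days:.1f} days, change: {total_change:+.1f} kg")
```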

Weigh yourself at least once a week if you wish to lose weight; weighing yourself every day may help you stay on track. A previous study by the same research team found that your weight naturally fluctuates throughout the week and that most people weigh the least on Wednesday.

 

 

Read more

Hunter-gatherer past shows our fragile bones result from physical inactivity since invention of farming.

Share


The latest analysis of prehistoric bones shows there is no anatomical reason why a person born today could not develop the skeletal strength of a prehistoric forager or a modern orangutan. The findings support the idea that activity throughout life is the key to building bone strength and reducing the risk of osteoporosis in later years, say researchers.

Compared with other primates and our early human ancestors, we modern humans have skeletons that are relatively lightweight — and scientists say that basically may be because we got lazy.

Biological anthropologist Habiba Chirchir and her colleagues at the Smithsonian’s National Museum of Natural History were studying the bones of different primates including humans. When they looked at the ends of bones near the joints, where the inside of the bone looks almost like a sponge, they were struck by how much less dense this spongy bone was in humans compared with chimpanzees or orangutans.

“So the next step was, what about the fossil record? When did this feature evolve?” Chirchir wondered.

Their guess was that the less dense bones showed up a couple of million years ago, about when Homo erectus, a kind of proto-human, left Africa. Having lighter bones would have made it a lot easier to travel long distances, Chirchir speculated.

But after examining a bunch of early human fossils, she realized their guess was wrong. “This was absolutely surprising to us,” she says. “The change is occurring much later in our history.”

The lightweight bones don’t appear until about 12,000 years ago. That’s right when humans were becoming less physically active because they were leaving their nomadic hunter-gatherer life behind and settling down to pursue agriculture.

A report on the work appeared Monday in the Proceedings of the National Academy of Sciences, along with a study from a different research group that came to much the same conclusion.

Those researchers looked at the bones of people in more recent history who lived in farming villages nearly 1,000 years ago and compared them with the bones of people who had lived nearby, earlier, as foragers.

The bones of people from the farming communities were less strong and less dense than those of the foragers, whose measured bone strength was comparable to similar-size nonhuman primates.

“We see a similar shift, and we attribute it to lack of mobility and more sedentary populations,” says Timothy Ryan, an associate professor of anthropology at Penn State University. “Definitely physical activity and mobility is a critical component in building strong bones.”

http://health.wusf.usf.edu/post/when-humans-quit-hunting-and-gathering-their-bones-got-wimpy

Read more

How Healthy is your State?

Share


Americans are now living longer than ever — the average life expectancy hit a record high of 78.8 years this year — yet our nation still ranks 34th in terms of life expectancy, and we spend significantly more on health than any other country in the world.

http://www.beckershospitalreview.com/population-health/the-healthiest-unhealthiest-states-in-america-where-does-your-state-rank.html

Read more

BMI-related cancers are on the rise

Share


“Our findings add support for a global effort to address the rising trends in obesity. The global prevalence of obesity in adults has doubled since 1980. If this trend continues it will certainly boost the future burden of cancer, particularly in South America and North Africa, where the largest increases in the rate of obesity have been seen over the last 30 years.”

The Lancet Oncology: Overweight and obesity linked to nearly 500 000 new cancers in 2012 worldwide (25 November 2014)

Read Full Article Here

Read more

Obesity Epidemic Costs World $2 Trillion a Year, Study Says

Share


The global obesity epidemic is costing the world economy $2 trillion a year in health-care costs, investments to mitigate its impact and lost productivity, according to a new study published Wednesday by the McKinsey Global Institute (MGI).

The economic research arm of consulting firm McKinsey notes that figure is roughly equivalent to the gross domestic product of countries such as Italy and Russia.


Read more