TV again tied to poor sleep among kids

In another blow to kids’ pleas to watch more television before bed, a new study suggests increased TV time is linked to less sleep. What’s more, black, Latino and other minority children slept less when they had TV sets in their bedrooms.

“Inadequate sleep in childhood is associated with health outcomes, including attention problems, school performance and an increased risk of obesity,” Elizabeth Cespedes told Reuters Health. Cespedes is the study’s lead author from the Harvard School of Public Health in Boston.

“We wanted to know if television viewing may be associated with shorter sleep duration in children,” she said. For the new report, she and her colleagues used data from an existing study of mothers and children who lived in the Boston area. The study included 1,864 children who were born between 1999 and 2003. Mothers reported how much television their child watched at six months old and then every year until age seven.

Starting midway through the study, mothers also reported each year whether their child slept with a television in the bedroom.

The average time children slept each day decreased from about 12 hours at six months to about 10 hours at seven years, and total TV viewing increased from about one hour per day to 1.6 hours.

The proportion of children who slept with a TV in their bedroom increased from 17 percent to 23 percent between ages four and seven years, too. Children typically sleep less as they get older, the researchers noted. Still, each extra hour of TV watching added to their lifetime average was tied to a seven-minute decrease in daily sleep.

That association was stronger for boys than girls, according to findings published in Pediatrics. “I think in our case it’s possible that the content of the television watched may be different for boys than girls,” Cespedes said. “The content may be especially disruptive.”

She and her colleagues also found that sleeping with a TV in the bedroom was tied to 31 fewer minutes of sleep per day among racial and ethnic minority children. The effect of a TV in the bedroom was not as strong among white, non-Hispanic children.

Cespedes said it’s hard to know why minority children would be more affected by having a TV in the bedroom. “At all time points, racial and ethnic minority children in our study were sleeping a bit less and watching more television,” she said.

Dr. Heidi Connolly, a sleep specialist who was not involved with the new study, said the research is one of several recent papers that point toward a negative effect of TV on sleep.

“This doesn’t seem like very much, but if you think about it, seven minutes every night, by the time you get to the end of the week, you’re already a half hour short on sleep,” Connolly, from the University of Rochester Medicine’s Golisano Children’s Hospital in New York, said.
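To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The seven-minutes-per-day figure is the study’s estimate; the amount of extra viewing and the time spans are illustrative assumptions, not values reported in the paper.

# Back-of-the-envelope sleep-deficit arithmetic.
# The study ties each extra hour of daily TV to about 7 fewer minutes of sleep per day;
# the extra viewing hours and the time spans below are illustrative assumptions only.

MINUTES_LOST_PER_EXTRA_TV_HOUR = 7  # per day, per extra hour of TV (study estimate)

def sleep_deficit_minutes(extra_tv_hours_per_day: float, days: int) -> float:
    """Cumulative sleep lost, in minutes, over the given number of days."""
    return extra_tv_hours_per_day * MINUTES_LOST_PER_EXTRA_TV_HOUR * days

print(sleep_deficit_minutes(1.0, 7))    # one extra hour a day: 49 minutes lost per week
print(sleep_deficit_minutes(1.0, 30))   # 210 minutes, or 3.5 hours, lost per month

On those assumptions, a single extra hour of television a day adds up to roughly 49 minutes of lost sleep over a week, which is the point Connolly is making.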

The American Academy of Pediatrics recommends against children younger than two years old watching any television. It also recommends limiting older children’s screen time to no more than one or two hours per day.

“I think it’s unreasonable to expect that kids aren’t going to watch TV,” Connolly told Reuters Health. “It’s pervasive in our culture. But you do want to limit screen time to less than two hours per day.” Connolly also said people sometimes say their children need the TV on to sleep, but that’s not the case.

She said consistent bedtimes, regular bedtime routines and a TV-free comfortable sleeping environment are good sleep behaviors.

Source: Orlando Sentinel


Too much animal protein tied to higher diabetes risk

People who eat the most protein, especially from animal sources, are more likely to be diagnosed with type 2 diabetes, according to a study of European adults.

The new study did not randomly assign participants to eat different amounts of protein, which would have yielded the strongest evidence. Instead, it compared the diets of people who went on to develop diabetes and those who did not get the disease.

But the findings do align with other studies. “Several previous studies have found that higher intake of total protein, especially animal protein, is associated with long-term risk of developing diabetes,” said Dr. Frank Hu, from the Harvard School of Public Health in Boston. Hu, who was not involved in the new study, researches prevention of diabetes through diet and lifestyle.

“Substantial amounts of animal protein come from red meat and processed meat, which have been consistently associated with increased risk of diabetes,” he told Reuters Health in an email.

For the new report, researchers examined data from a large previous study of adults in eight European countries spanning 12 years. The study collected data on participants’ diet, physical activity, height, weight and waist circumference, then followed them to see who developed diabetes.

A team of researchers led by Monique van Nielen of Wageningen University in the Netherlands selected 11,000 people who developed type 2 diabetes from the data and 15,000 people without diabetes for comparison.

Overall, the adults in the study commonly ate about 90 grams of protein per day. Those who ate more tended to have a higher weight-to-height ratio and to eat more fiber and cholesterol than people who ate less protein.

After accounting for other diabetes risk factors, every additional 10 grams of protein people consumed each day was tied to a six percent higher chance that they would develop diabetes.

Dividing participants into five groups based on how much protein they ate, the researchers found those who ate the most, or around 111 grams per day, were 17 percent more likely to develop diabetes than those who ate the least, or around 72 grams per day.

Specifically, those who ate the most animal protein, or 78 grams per day, were 22 percent more likely to be diagnosed with diabetes than those who ate the least, around 36 grams per day, according to results published in Diabetes Care.

That’s only a modest increase on an individual level, Hu said.

People who ate the most protein got about 15 percent of their calories from red meat, processed meat, poultry, fish and dairy, which appears to be too much, Hu said.  “More importantly, higher intake of animal protein often comes along with other undesirable nutrients such as saturated fat, cholesterol and sodium,” he said.

The association between animal protein and diabetes risk appeared to be strongest among obese women. Plant protein, on the other hand, was not linked to diabetes.

“In other studies, plant protein sources such as nuts, legumes and whole grains have been associated with lower risk of diabetes,” Hu said. “Therefore, replacing red meat and processed meat with plant sources of protein is important for diabetes prevention.”

Generally people associate high-fat and high-carbohydrate diets with diabetes risk, but this study underscores that protein is an important nutrient to consider as well, Paolo Magni said. Magni, from the Institute of Endocrinology at the University of Milan in Italy, was not involved in the new study.

“As a general rule, I would suggest to eat normal portions of red meat not more than two times per week, poultry and fish three to four times per week, skimmed milk or yogurt maybe not every day,” Magni told Reuters Health.

Cheese, preserved meats and cold cuts should be minimized, he said. “Pay attention to both quantity and food sources of protein,” Hu said. It’s probably a good idea for people with a family history of diabetes to replace at least some red meat with nuts, legumes or whole grains, he said.

Source: Reuters


Low blood sugar tied to ‘hangry’ fights with spouse

We’ve all been “hangry,” so hungry that we become angry at the slightest frustration or provocation. But could low blood sugar make you so hangry you’d abuse your spouse?

In an effort to find out, scientists asked married couples to secretly stick pins into a voodoo doll representing their spouses, and blast noise in their spouses’ ears. The results, released Monday in the Proceedings of the National Academy of Sciences (PNAS), do appear to show a link between lower blood sugar and marital spats.

Led by Brad Bushman, a professor of communication and psychology at The Ohio State University, the experiment tested a hypothesis about self-control.

The researchers had 107 couples monitor their blood glucose levels with over-the-counter monitors once in the morning and once in the evening for 21 days. Every evening the partners were to privately stick needles into voodoo dolls to indicate how angry they were with their spouses, zero meaning not at all, up to a high of 51.

Even after controlling for a number of variables like overall relationship satisfaction, people with lower glucose levels stuck more pins in the dolls. There was no difference between men and women in how they were affected.

Then Bushman had couples come into his lab to play a simple computer game against each other while sitting in different rooms. In fact, they were playing against a computer and the results were rigged so they’d win and lose about the same number of rounds.

As a punishment for “losing” a round, the victor could play an obnoxious noise — a combination of fingernails on chalkboards and other irritating sounds like an airhorn — into the earphones of the loser at a volume the victor selected, up to about the level of a smoke alarm. (Actually, the computer controlled the noise level.)

Those people with lower glucose levels, and who stuck more pins into the dolls, also tended to blast the noise.

“Thus,” Bushman and colleagues wrote, “low glucose levels might be one factor that contributes to intimate partner violence.”

Many experts believe that self-control can be depleted like a battery, as illustrated in one famous 1998 study: Two groups of hungry people were placed in a room containing a plate of freshly-baked cookies and a plate of radishes. One group was told they could eat cookies, the other told they could eat only radishes. Both groups were then asked to complete a puzzle they didn’t know was unsolvable.

The cookie group worked twice as long on the puzzle. People in the radish group gave up sooner because they had to exercise more self-control to avoid eating the cookies. So there was less willpower left to work on a frustrating puzzle.

A number of factors can deplete self-control, said Brandon Schmeichel, a professor of psychology at Texas A&M University who studies this phenomenon. Performing a task that requires self-control — like not eating a cookie when you really want one, or doing math problems, or filling out your 1040 form — can do it, as can mood, alcohol and one’s ability to keep an eye on long-term goals rather than short-term impulses. “That can be difficult to do,” Schmeichel said.

Glucose is a more controversial factor. Proponents argue that the brain uses a lot of energy, especially the pre-frontal cortex that exerts control over our baser instincts and helps us reason. Low glucose can leave the brain low on gas. And being low on gas weakens self-control.

But others point out that some studies suggest self-control is not always limited, and that experiments trying to link low glucose to low self-control are contradictory. Some show an effect, some do not.

Bushman believes there is a cause-effect link and that “aggression starts when self-control stops.”

Professor Florian Lange, a neuroscientist at Hannover Medical School in Germany, praised some parts of the study, but via email said he’s not convinced there’s “a significant role for glucose in self-regulation/self-control.”

A number of other factors could explain the experiment’s results “equally well,” Lange said. For example, he asked, “who are these violent people having low blood sugar?”

“Maybe they eat healthier in order to be fit to do extreme sports, an activity they like to pursue because they are more risk-taking,” Lange suggested. “This latter variable could explain why they show more aggression.”

Whether or not low glucose specifically depletes self-control, though, most experts agree that hunger can. As Bushman said, “hungry people are cranky people.”

So, he said, “if you are having a discussion with your spouse about a conflict situation, make sure you’re not hungry.” He advised skipping candy bars and other high-sugar foods, which can spike glucose but lead to a crash. Instead, say, before that last minute tax return debate, eat something nutritious.

Source: Today


Antidepressant Use During Pregnancy May Increase Autism Risk

Women who take antidepressants known as selective serotonin reuptake inhibitors (SSRIs) during pregnancy may be at an increased risk of having a child with autism spectrum disorder, according to a recent study. Researchers found that children who were exposed to SSRIs the most had the highest incidence of autism.

“We found prenatal SSRI exposure was almost three times as likely in boys with autism spectrum disorders relative to typical development, with the greatest risk when exposure is during the first trimester,” study co-author Li-Ching Lee, an associate scientist in the department of epidemiology at Johns Hopkins Bloomberg School of Public Health in Maryland, told Counsel and Heal.

For the study, researchers collected data from 966 mother and child pairs to better understand how SSRIs affect pregnancy outcomes. Of the children studied, 800 were boys. Nearly 500 children were diagnosed with autism spectrum disorder, 154 had some form of developmental delay and 320 had developed typically.

The SSRIs examined in the study were Celexa, Lexapro, Paxil, Prozac and Zoloft.

Researchers found that in the autism group, 5.9 percent of the pregnancies were exposed to SSRIs. In the delayed developmental group, 5.2 percent of the pregnancies were exposed to SSRIs. They also found that exposure rate in the typically developing children group was 3.4 percent.

Investigators said that in terms of gender, boys were three times more likely to have autism if they were exposed to the antidepressants during the first trimester.

Given their findings, researchers said they hope expecting mothers consult with their doctors before taking antidepressants during pregnancy.

“It’s a complex decision whether to treat or not treat depression with medications during pregnancy,” Lee said. “There are so many factors to consider. We didn’t intend for our study to be used as a basis for clinical treatment decisions. Women should talk with their doctors about SSRI treatments.”

Source: University Herald


5 kitchen essentials for dieters

If you are hoping to shed some weight before summer, these kitchen basics can help:

Non-stick pan

Cooking with a non-stick pan makes it easy to cut excess calories and fat because you don’t have to grease it with oil or butter.

Measuring tools

Measure out a one-cup serving of cereal and you may realize you’ve been filling your usual cereal bowl with about three cups! Using a measuring cup and measuring spoons makes portion control simple.

Food scale

A kitchen scale lets you gauge the proper portion of foods such as meats, poultry and fish based on their weight.

Spritzer bottle

Though olive oil is heart-healthy, the fact remains that one tablespoon contains 120 calories and 14 grams of fat. Instead of pouring oil directly on your food, give it a spritz or two. You’ll still enjoy plenty of flavor while using a lot less oil at only 5 calories per spritz; a rough calculation of the savings appears after this list.

Tall glasses

Studies have shown that people tend to pour less into a tall, slender glass than they do using a short wide glass. A tall drink is apt to be more satisfying too, simply because it seems big.
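To put rough numbers on the spritzer tip above, here is a minimal sketch using only the figures quoted in that item, about 120 calories per tablespoon of olive oil and about 5 calories per spritz; the number of spritzes per meal and the meals per week are illustrative assumptions.

# Rough calorie-savings arithmetic for the oil spritzer tip.
# Figures from the list: ~120 calories per tablespoon of olive oil, ~5 calories per spritz.
# Spritzes per meal and meals per week are illustrative assumptions.

CALORIES_PER_TABLESPOON = 120
CALORIES_PER_SPRITZ = 5

def weekly_calories_saved(spritzes_per_meal: int = 2, meals_per_week: int = 7) -> int:
    """Calories saved per week by spritzing instead of pouring a tablespoon at each meal."""
    saved_per_meal = CALORIES_PER_TABLESPOON - spritzes_per_meal * CALORIES_PER_SPRITZ
    return saved_per_meal * meals_per_week

print(weekly_calories_saved())  # 110 calories saved per meal, about 770 per week

On those assumptions, swapping a poured tablespoon for two spritzes once a day saves about 110 calories per meal, or roughly 770 calories over a week.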

Source: Pick News

 


Michigan girl mauled by raccoon gets surgery to replace ear


A Michigan girl whose face was severely mauled by a pet raccoon when she was a baby is getting reconstructive surgery Tuesday to create a new ear.

Eleven-year-old Charlotte Ponce is a veteran of many previous surgeries to repair damage to her nose, lip and cheek.

For the latest operation, Dr. Kongkrit Chaiyasate of Beaumont Hospital in Royal Oak is taking cartilage from Charlotte’s ribs and carving it into the shape of an ear using a computer-generated template based on the girl’s other ear. Then he’ll implant the ear structure under the skin of Charlotte’s forearm. The skin will mold to the ear shape over a period of weeks.

Chaiyasate, the director of reconstructive microsurgery at Beaumont, performed a series of five operations in 2012 to rebuild Charlotte’s nose, which was also disfigured by the attack. At the time, the girl’s adoptive mother, Sharon Ponce, described it as “the best Christmas present she could ever get.”

The family’s blog, Healing Charlotte, tells the harrowing story of how the girl was attacked in her crib as a baby by a pet raccoon that gnawed away at her right ear, upper lip and cheeks. She barely survived and spent months in the hospital. Charlotte and her brother were removed from the home and later adopted by their great-aunt and great-uncle.

Before the series of operations began at Beaumont, Charlotte had already undergone half a dozen reconstructive surgeries and skin grafts. Some resulted in painful complications, and an earlier attempt at a prosthetic ear had to be removed after repeated infections, according to the blog. But despite the trauma, the family said Charlotte was “starting the road to emotional and mental recovery,” taking dance, art and gymnastics and making new friends at school.

Source: CBS News


Brain Tumor Risk Higher for Teens Who Stop Growing Later

Teens who take longer to reach their full height may be at increased risk for certain types of brain tumors later in life, a new study suggests.

The study involved nearly 2,600 people, including 1,045 people with glioma, a category of brain and spinal tumors that arise from cells known as glial cells; 274 people with meningioma, a type of tumor that forms in the lining of the brain; and 1,242 people without brain tumors. Participants, who were mostly in their 50s, reported how old they were when they stopped growing.

On average, men reached their full height at age 17, while women reached their full height at age 16.

For each additional year it took people to reach their full height, the risk of glioma increased by 14 percent for men and 11 percent for women, the study found.

People who stopped growing at age 19 or older had nearly twice the risk of glioma as those who stopped growing at age 15 or younger.

The reason for the possible link is not clear. It’s possible that teens who take longer to stop growing are exposed to growth hormones for longer periods, which may affect glioma risk, said study researcher Rebecca Little, a dietitian and doctoral student at the University of Alabama at Birmingham.

Previous studies have linked having a higher body mass index (BMI) in young adulthood, and taller stature overall, with glioma risk, but the new findings are a first, the researchers said.

No link was found between the age at which people stopped growing and their risk of meningioma, the researchers noted.

Because the study was conducted in a single region of the United States — the Southeast — additional research is needed to replicate the findings in other groups of people, Little said.

The study found only an association, and cannot prove that taking longer to stop growing causes brain tumors.

Interestingly, the risk of glioma was highest among people who took longer to reach their full height, but whose final height was on the short side. This finding could be due to chance, so more research is needed to confirm it, Little said. But it’s possible that these people’s bodies produce a lower level of growth hormones over a prolonged period, which may confer a higher risk of tumors than a higher level of growth hormones over a short period, Little said.

It’s possible that some people may not accurately recall the age at which they stopped growing, but Little noted that people are often good at remembering their height and weight at certain time points in life.

The study was presented this week at the meeting of the American Association for Cancer Research in San Diego.

Source: the needs


Study: Green tea boosts working memory

A beverage with multiple benefits, green tea has inspired a number of research projects in recent years. One of the latest studies on the subject, published in the journal Psychopharmacology, offers additional evidence on how the drink can improve working memory and cognitive performance.

According to various studies, the antioxidant-rich beverage may help in maintaining a healthy weight and fighting bad cholesterol, in addition to improving memory and preventing cognitive decline. The latter benefits in particular were the subject of a key study by Chinese researcher Bai Yun published in Food Science and Molecular Nutrition and Food Research in June 2012.

Eager to evaluate claims of green tea’s power to improve memory and to identify the mechanism behind it, researchers in Basel, Switzerland asked a group of healthy volunteers to consume a soft drink with green tea extract before solving a series of working memory tasks. The test subjects’ brain activity was analyzed using an MRI machine.

The researchers, led by Christoph Beglinger and Stefan Borgwardt, of the University Hospital of Basel and the Psychiatric University Clinics respectively, observed improved connectivity between the frontal and parietal brain regions in the test subjects who were given green tea extract.

This improved connectivity between the two brain regions correlated with enhanced performance of the memory tasks. “Our findings suggest that green tea might increase the short-term synaptic plasticity of the brain,” Borgwardt indicated.

In the future, the findings of the study could be used to assess the effectiveness of green tea extract in treating dementia and other neuropsychiatric illnesses, according to the researchers.

Source: FMT News


Degenerated organ fully restored in living animal

Scientists have made a breakthrough in regenerative medicine by fully restoring a degenerated organ in a living animal for the first time.

A team from the Medical Research Council (MRC) Centre for Regenerative Medicine, at the University of Edinburgh in the UK, reconstructed the thymus of aging mice, medicalnewstoday.com reported.

The thymus is a glandular structure that functions as part of the body’s immune system by creating T cells – the type of white blood cell that is essential for fighting infection.

Located in the front of the heart, the thymus is the first organ to deteriorate as we age.

Scientists have attempted to regenerate the thymus before, using sex hormones. But using this technique, the thymus only temporarily regenerated with limited functional recovery.

In the new experiment, however, the restored thymus was fully functioning and “very similar” to the thymus of a young mouse, say the researchers.

Although the researchers have not yet ascertained whether the immune systems of the mice with a restored thymus were strengthened by the process, they do know that mice receiving this treatment began to produce more T cells.

“One of the key goals in regenerative medicine is harnessing the body’s own repair mechanisms and manipulating these in a controlled way to treat disease. This interesting study suggests that organ regeneration in a mammal can be directed by manipulation of a single protein, which is likely to have broad implications for other areas of regenerative biology,” says a senior MRC researcher.

Source: Press TV


Blue Pill May Boost Risk of Deadly Skin Cancer, Study Finds

Men who use Viagra to get a boost in the bedroom could find that the little blue pill also increases the risk of developing melanoma, the deadliest form of skin cancer, a preliminary study finds.

Researchers found that men who took sildenafil, best known as Viagra, were about 84 percent more likely to develop melanoma than men who didn’t take the drug.

Because it’s just one early study, no one is suggesting that men stop taking Viagra to treat erectile dysfunction, said Dr. Abrar Qureshi, professor and chair of the dermatology department in the Warren Alpert Medical School at Brown University.

“But people who are on the medication and who have a high risk for developing melanoma may consider touching base with their primary care providers,” said Qureshi, co-author of the study of nearly 26,000 men published Monday in JAMA Internal Medicine.

Viagra may increase the risk of melanoma because it affects the same genetic pathway that allows the skin cancer to become more invasive, Qureshi said. Those who took the drug weren’t at higher risk of other, less-dangerous skin cancers, such as basal cell or squamous cell cancers.

About 76,100 new melanoma cases are expected to be diagnosed in the U.S. in 2014, and about 9,710 people will die, including about 6,470 men.

Qureshi and colleagues at several sites in the U.S. and China analyzed data about Viagra use and skin cancer from the Health Professionals’ Follow-up Study, a long-term study of male doctors and other health care workers.

The average age of men in the study was 65 and about 6 percent had taken Viagra to treat erectile dysfunction. If men had ever used Viagra, their risk of developing melanoma was about double that of men who never used the drug. That finding held true even when the researchers adjusted for a family history of skin cancer, ultraviolet light exposure in the states where the men lived, other kinds of cancer and major illnesses and other factors.

Primary care doctors who treat older men taking Viagra should check their patients for signs of skin cancer, said Dr. June Robinson of Northwestern University’s Feinberg School of Medicine, who wrote an accompanying editorial.

She noted that the rate of increase in new melanoma cases in men actually slowed after Viagra entered the market in 1998, raising a “cautionary note” about the impact of sildenafil on melanoma.

“But its role in the biological behavior of melanoma in older men warrants further study,” she said.

Source: NBC News