Moderate Hot Flashes May Increase Risk of Depression

A new study finds that moderate to severe hot flashes among menopausal women are a significant risk factor for depression.

Australian researchers studied more than 2,000 perimenopausal and menopausal women, focusing on the more severe forms of vasomotor symptoms – hot flashes or night sweats.

Although the link has been debated, investigators found that among a group of women ages 40-65, those with moderate to severe hot flashes were significantly more likely to have moderate to severe depression than women with no or mild vasomotor symptoms.

Study findings appear in the Journal of Women’s Health.

Roisin Worsley, M.B.B.S., Robin Bell, Ph.D., Pragya Gartoulla, Penelope Robinson, and Susan Davis, M.B.B.S., Monash University, Melbourne, Australia, found hot flashes, depressive symptoms, and use of antidepressant medication to be common in the age range of women included in the study.

The researchers also examined whether moderate to severe depression was associated with a greater likelihood of psychotropic medication use, smoking, or binge drinking at least once a week.

“The results of this study shed further light on therapeutic findings, with both anti-depressant medication and estrogen therapy having the potential to improve hot flashes and mood,” said Susan G. Kornstein, M.D., editor-in-chief of the Journal of Women’s Health.

Source: Journal of Women’s Health/Mary Ann Liebert

Posted by Patricia Adams in Default

Gaming Software Can Personalize Therapy Programs

Emerging research suggests software programs that use game features in non-game contexts can improve individual motivation to follow prescribed or recommended therapy regimens.

Penn State engineers used machine learning to train computers to develop personalized mental or physical therapy regimens — for example, to overcome anxiety or recover from a shoulder injury — so that each individual can use a tailor-made program.

“We want to understand the human and team behaviors that motivate learning to ultimately develop personalized methods of learning instead of the one-size-fits-all approach that is often taken,” said Dr. Conrad Tucker, an assistant professor of engineering design technology.

“Using people to individually evaluate others is not efficient or sustainable in time or human resources and does not scale up well to large numbers of people,” said Tucker.

“We need to train computers to read individual people. Gamification explores the idea that different people are motivated by different things.”

To begin creating computer models for therapy programs, the researchers tested how to most effectively make the completion of a physical task into a gamified application by incorporating game features like scoring, avatars, challenges and competition.

“We’re exploring here how gamification could be applied to health and wellness by focusing on physically interactive gamified applications,” said Christian Lopez, graduate student in industrial and manufacturing engineering, who helped conduct the tests using a virtual-reality game environment.

In the virtual-reality tests, researchers asked participants to physically avoid obstacles as they moved through a virtual environment. The game system recorded their actual body positions using motion sensors and then mirrored their movements with an avatar in virtual reality.

Participants had to bend, crouch, raise their arms, and jump to avoid obstacles. The participant successfully avoided a virtual obstacle if no part of their avatar touched the obstacle. If they made contact, the researchers rated the severity of the mistake by how much of the avatar touched the obstacle.

In one of the application designs, participants could earn more points by moving to collect virtual coins, which sometimes made them hit an obstacle.

“As task complexity increases, participants need more motivation to achieve the same level of results,” said Lopez. “No matter how engaging a particular feature is, it needs to move the participant towards completing the objective rather than backtracking or wasting time on a tangential task. Adding more features doesn’t necessarily enhance performance.”

Tucker and Lopez created a predictive algorithm to forecast how effectively a given game feature would motivate participants, and used it to rank each feature’s potential usefulness. They then tested how well each game feature motivated participants when completing the virtual-reality tasks.

They compared their test results to the algorithm’s predictions as a proof of concept and found that the formula correctly anticipated which game features best motivated people in the physically interactive tasks.

The researchers found that gamified applications with a scoring system, the ability to select an avatar, and in-game rewards led to significantly fewer mistakes and higher performance than those with a win-or-lose system, randomized gaming backgrounds and performance-based awards.

Sixty-eight participants tested two designs that differed only by the features used to complete the same set of tasks.

The researchers chose the tested game features from the top-ranked games in the Google Play app store, taking advantage of the features that make the games binge-worthy and re-playable, and then narrowed the selection based on available technology.

Their algorithm next ranked game features by how easily designers could implement them, the physical complexity of using the feature, and the impact of the feature on participant motivation and ability to complete the task.

Investigators discovered that if a game feature is too technologically difficult to incorporate into the game, too physically complex, does not offer enough incentive for the added effort, or works against the end goal of the game, then the feature has low potential usefulness.
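A ranking of this kind can be pictured as a simple weighted score over the criteria the researchers describe. The sketch below is purely illustrative: the paper's actual formula is not reproduced here, and the criteria weights and feature ratings are hypothetical.

```python
def feature_usefulness(implementation_ease, physical_simplicity,
                       motivation_impact, goal_alignment,
                       weights=(0.2, 0.2, 0.3, 0.3)):
    """Combine four criteria (each scored 0-1) into one potential-usefulness
    score; higher is better. Weights are made up for illustration."""
    criteria = (implementation_ease, physical_simplicity,
                motivation_impact, goal_alignment)
    return sum(w * c for w, c in zip(weights, criteria))

# Rank some hypothetical game features; note the coin-collection feature is
# penalized on goal alignment because chasing coins can work against the task.
features = {
    "scoring_system":  feature_usefulness(0.9, 0.9, 0.8, 0.9),
    "avatar_choice":   feature_usefulness(0.8, 1.0, 0.7, 0.8),
    "coin_collection": feature_usefulness(0.7, 0.5, 0.8, 0.3),
}
ranked = sorted(features, key=features.get, reverse=True)
print(ranked)  # → ['scoring_system', 'avatar_choice', 'coin_collection']
```

With any reasonable weighting, a feature that fights the end goal of the game scores poorly even if it is engaging on its own, which mirrors the study's observation that adding features does not automatically enhance performance.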

Study results appear in the journal Computers in Human Behavior. Researchers believe their findings may help to boost workplace performance and personalize virtual-reality classrooms for online education.

“Game culture has already explored and mastered the psychological aspects of games that make them engaging and motivating,” said Tucker. “We want to leverage that knowledge towards the goal of individualized optimization of workplace performance.”

To do this, Tucker and Lopez next want to connect performance with mental state during these gamified physical tasks. Heart rate, electroencephalogram signals and facial expressions will be used as proxies for mood and mental state while completing tasks to connect mood with game features that affect motivation.

Source: Penn State

Brain Adjusts Learning Rate Depending on Environment

Each time we get feedback, the brain updates its knowledge and behavior in response to changes in the environment. However, if there’s uncertainty or volatility in the environment, the entire process must be adjusted.

In a new study, Dartmouth researchers discovered that there’s not a single rate of learning for everything we do, as the brain can self-adjust its learning rates using a synaptic mechanism called metaplasticity.

The findings refute the theory that the brain always behaves optimally. How the brain adjusts learning has long been thought to be driven by the brain’s reward system and its goal of optimizing rewards obtained from the environment or by a more cognitive system responsible for learning the structure of the environment.

Study findings are published in Neuron.

Researchers explain that synapses are the connections between neurons in the brain and are responsible for transferring information from one neuron to the next.

When it comes to choice in evaluating potential rewards, your learned value of a particular option, reflecting how much you like something, is stored in certain synapses. If you get positive feedback after choosing a particular option, the brain increases the value of that option by making the associated synapses stronger.

In contrast, if the feedback is negative, those synapses become weaker. Synapses, however, can also undergo modifications without changing how they transmit information through a process called metaplasticity.

Previous studies have suggested that the brain relies on a dedicated system for monitoring uncertainty in the environment to adjust its rate of learning. The authors of this study found, however, that metaplasticity alone is sufficient to fine-tune learning according to the uncertainty about reward in a given environment.
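The idea that a learning rate can tune itself can be sketched with a simple value-update rule whose learning rate is itself updated by recent surprise. This is an illustrative analogy only, in the spirit of Pearce-Hall-style models, not the authors' synaptic metaplasticity model; the parameter values are arbitrary.

```python
def track_reward(rewards, base_lr=0.1, adapt=0.3):
    """Estimate reward probability from a sequence of 0/1 outcomes.

    The learning rate grows when recent prediction errors are large
    (a volatile environment) and shrinks when they are small (a stable
    environment), so no explicit uncertainty monitor is needed.
    """
    value, lr = 0.5, base_lr
    for r in rewards:
        error = r - value
        value += lr * error
        # meta-level update: surprise drives the learning rate itself
        lr = (1 - adapt) * lr + adapt * min(abs(error), 1.0)
    return value
```

With stable feedback the errors shrink and the learning rate decays, protecting the estimate from noise; after a sudden change, large errors push the learning rate back up so the estimate adapts quickly.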

“One of the most complex problems in learning is how to adjust to uncertainty and the rapid changes that take place in the environment. It is very exciting to find that synapses, the simplest computational elements in the brain, can provide a robust solution for such challenges,” said Dr. Alireza Soltani, an assistant professor of psychological and brain sciences.

“Of course, such simple elements may not provide an optimal solution but we found that a model based on metaplasticity can explain real behaviors better than models that are based on optimality,” he added.

This study demonstrates that learning can be self-adjusted and does not require explicit optimization or complete knowledge of the environment. The authors propose potential practical implications of their findings.

For behavioral anomalies such as addiction, where the synapses might not adapt flexibly, more carefully designed feedback may be required to make the system plastic again, illustrating how metaplasticity may have broader relevance.

Source: Dartmouth College/EurekAlert

Antidepressants in Early Pregnancy May Not Hike Risk of Autism, ADHD

A new study published in the Journal of the American Medical Association contradicts prior research, finding that antidepressant use during early pregnancy does not increase the risk of children developing autism or attention-deficit hyperactivity disorder.

The Indiana University study found significant evidence for only a slight increase in risk for premature birth in the infants of mothers who used antidepressants during the first trimester of pregnancy.

“To our knowledge, this is one of the strongest studies to show that exposure to antidepressants during early pregnancy is not associated with autism, ADHD or poor fetal growth when taking into account the factors that lead to medication use in the first place,” said study leader Dr. Brian D’Onofrio.

“Balancing the risks and benefits of using antidepressants during pregnancy is an extremely difficult decision that every woman should make in consultation with her doctor,” he said. “However, this study suggests use of these medications while pregnant may be safer than previously thought.”

Researchers called the study unique because its methodology included the review of an entire population rather than common techniques using smaller samples.

Researchers reported that after controlling for multiple other risk factors, they did not find any increased risk of autism, ADHD or reduced fetal growth among exposed offspring. The risk for premature birth was about 1.3 times higher for exposed offspring compared to unexposed offspring.


The analysis, conducted in collaboration with researchers at Karolinska Institute in Sweden and Harvard T.H. Chan School of Public Health, drew upon data on all live births in Sweden from 1996 to 2012.

It also incorporated data reporting the country’s antidepressant prescriptions in adults, autism and ADHD diagnoses in children, genetic relationships between parents and children, parents’ age and education levels, and other factors.

With over 1.5 million infants, the study comprises one of the largest and most comprehensive populations ever analyzed to understand the impact of antidepressant use during pregnancy.

The increased risk for premature birth was found after controlling for other factors that affect health, such as a mother’s age at childbearing, in siblings whose mothers used antidepressants during one pregnancy but not during another pregnancy.

“The ability to compare siblings who were differentially exposed to antidepressants in pregnancy is a major strength of this study,” D’Onofrio said.

“Most analyses rely upon statistical matching to control for differences in factors such as age, race and socioeconomic status. But it’s difficult to know if you’ve made a perfect match because you can’t be certain you have all the relevant measures to control for these differences.”

When comparing unrelated children and controlling for related risk factors, the researchers found a slightly higher risk for all four conditions: 1.4 times higher odds for premature birth, 1.1 times higher odds for low fetal growth and 1.6 times higher risk for autism and ADHD.

In an uncontrolled analysis — which did not take these factors into account — antidepressant use in early pregnancy was associated with 1.5 times higher odds for premature birth, 1.2 times higher odds for poor fetal growth, 2.0 times higher risk for autism and 2.2 times higher risk for ADHD.
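For readers unfamiliar with the statistic, an unadjusted odds ratio like those above is computed from a simple two-by-two table of counts. The numbers below are invented for illustration and are not the study's data; the study's adjusted estimates additionally control for confounders and sibling comparisons, which this sketch does not attempt.

```python
def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """Odds of the outcome in the exposed group divided by the odds
    of the outcome in the unexposed group."""
    odds_exposed = exposed_cases / exposed_noncases
    odds_unexposed = unexposed_cases / unexposed_noncases
    return odds_exposed / odds_unexposed

# Hypothetical counts: preterm births among antidepressant-exposed
# versus unexposed pregnancies.
print(round(odds_ratio(130, 870, 100, 900), 2))  # → 1.34
```

An odds ratio of 1.0 would mean no association; values above 1.0 indicate higher odds in the exposed group.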

The majority of the antidepressants examined in the study — 82 percent — were selective serotonin reuptake inhibitors, or SSRIs, the most common type of antidepressants. Commonly used SSRIs include fluoxetine (Prozac), sertraline (Zoloft) and citalopram (Celexa).

In addition to the use of these medications during early pregnancy, the study looked at concurrent antidepressant use in fathers, as well as mothers’ use of antidepressants before but not during pregnancy.

These uses were associated with increased risk for autism, ADHD and poor fetal growth, providing evidence that family factors, such as genetics or environment, influence these outcomes, as opposed to antidepressant use during pregnancy.

“The additional comparisons provide further evidence that other factors — not first-trimester exposure to antidepressants — explain why women who took these medications during early pregnancy were more likely to have offspring with these pregnancy and neurodevelopmental problems,” D’Onofrio said.

Source: Indiana University

Some Ties Found Between Social Media Activity, Narcissism

A new German study finds a weak to moderate link between a certain form of narcissism and social media activity.

The enormous popularity of social media sites such as Facebook, Instagram and Twitter has challenged researchers to explain their appeal, and one area of interest has been the link between social media and narcissism.

Narcissists think of themselves as being exceptionally talented, remarkable and successful. They love to compare themselves to other people and seek approval from them.

As such, various studies conducted over the past years have investigated to what extent the use of social media is associated with narcissistic tendencies, with contradictory results. Some studies supported a positive relationship between narcissism and the use of social network channels, whereas others found only weak or even negative effects.

The new study was led by Professor Markus Appel, chair of Media Communication at the University of Würzburg, and Dr. Timo Gnambs, head of the Educational Measurement section at the Leibniz Institute for Educational Trajectories, Bamberg.

The researchers performed a meta-analysis in which they summarized the results of 57 studies comprising more than 25,000 participants in total. Their findings appear in the Journal of Personality.
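Meta-analyses of correlational studies like these typically pool the per-study effect sizes on Fisher's z scale, weighting each study by its sample size. The sketch below is a minimal fixed-effect version with invented study values; the actual 57 studies' data and the authors' (likely more sophisticated) model are not reproduced here.

```python
import math

def pooled_correlation(studies):
    """studies: list of (r, n) pairs. Returns the inverse-variance-weighted
    mean correlation, computed on Fisher's z scale and transformed back."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)  # Fisher z-transform of the correlation
        w = n - 3          # inverse of z's sampling variance, 1/(n - 3)
        num += w * z
        den += w
    return math.tanh(num / den)

# Three hypothetical studies: (correlation, sample size).
print(round(pooled_correlation([(0.25, 200), (0.10, 500), (0.30, 150)]), 3))
```

Larger studies pull the pooled estimate toward their value, which is why a weak-to-moderate overall link can emerge even when individual studies disagree.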

Given the established definition of narcissism, social networks such as Facebook are believed to be an ideal platform for these people, Appel said. The network gives them easy access to a large audience and allows them to selectively post information for the purpose of self-promotion. Moreover, they can meticulously cultivate their image.

As such, researchers have suspected social networking sites to be an ideal breeding ground for narcissists from early on. However, the new meta-analysis shows that the situation does not seem to be as bad as feared.

In the study, scientists examined three hypotheses.

The first assumption suggests “grandiose narcissists” frequent social networking sites more often than representatives of another form of narcissism, the “vulnerable narcissists.” Vulnerable narcissism is associated with insecurity, fragile self-esteem, and social withdrawal.

Secondly, investigators reviewed the assumption that the link between narcissism and the number of friends and certain self-promoting activities is much more pronounced compared to other activities possible on social networking sites.

Thirdly, the researchers hypothesized that the link between narcissism and the social networking behavior is subject to cultural influences.

That is, in collectivistic cultures where the focus is on the community rather than the individual or where rigid roles prevail, social media give narcissists the opportunity to escape from prevalent constraints and present themselves in a way that would be impossible in public.

Results from the meta-analysis of the 57 studies largely confirmed the scientists’ assumptions.

Grandiose narcissists are encountered more frequently in social networks than vulnerable narcissists. Moreover, links have been found between traits associated with narcissism and both the number of friends a person has and the number of photos they upload.

The gender and age of users is not relevant in this respect. Typical narcissists spend more time in social networks than average users and they exhibit specific behavioral patterns.

A mixed result was found for the influence of the cultural background on the usage behavior. “In countries where distinct social hierarchies and unequal power division are generally more accepted such as India or Malaysia, there is a stronger correlation between narcissism and the behavior in social media than in countries like Austria or the USA,” said Appel.

However, the analysis of the data from 16 countries on four continents does not show a comparable influence of the “individualism” factor.

Researchers wondered whether the frequently cited “Generation Me” is a reflection or product of social media such as Facebook and Instagram because these sites promote narcissistic tendencies, or whether such sites simply provide an ideal environment for narcissists. The researchers were not able to definitively answer these questions.

“We suggest that the link between narcissism and the behavior in social media follows the pattern of a self-reinforcing spiral,” said Appel. And, the appeal of social media activities is dependent on an individual’s disposition.

Therefore, researchers say that more research has to be conducted over longer periods to resolve the questions.

Source: University of Würzburg

Eye Expressions Provide Insight Into Emotions

New research suggests we interpret a person’s emotions by analyzing the expression in their eyes.

Dr. Adam Anderson, professor of human development at Cornell University’s College of Human Ecology, believes this process began as a universal reaction to environmental stimuli and evolved to communicate our deepest emotions.

In other words, the eyes may indeed be the window into the soul.

Anderson’s new study found that people consistently associated narrowed eyes — which enhance our visual discrimination by blocking light and sharpening focus — with emotions related to discrimination, such as disgust and suspicion.

In contrast, people linked open eyes — which expand our field of vision — with emotions related to sensitivity, like fear and awe.

“When looking at the face, the eyes dominate emotional communication,” Anderson said.

“The eyes are windows to the soul likely because they are first conduits for sight. Emotional expressive changes around the eye influence how we see, and in turn, this communicates to others how we think and feel.”

These findings, published in Psychological Science, build on Anderson’s 2013 research, which demonstrated that human facial expressions, such as raising one’s eyebrows, arose from universal, adaptive reactions to one’s environment and did not originally signal social communication.

Both studies support Charles Darwin’s 19th-century theories on the evolution of emotion, which hypothesized that our expressions originated for sensory function rather than social communication.

“What our work is beginning to unravel,” said Anderson, “are the details of what Darwin theorized: why certain expressions look the way they do, how that helps the person perceive the world, and how others use those expressions to read our innermost emotions and intentions.”

Anderson and his co-author, Dr. Daniel H. Lee, professor of psychology and neuroscience at the University of Colorado, Boulder, created models of six expressions — sadness, disgust, anger, joy, fear and surprise — using photos of faces in widely used databases.

Study participants were shown a pair of eyes demonstrating one of the six expressions and one of 50 words describing a specific mental state, such as discriminating, curious, bored, etc. Participants then rated the extent to which the word described the eye expression. Each participant completed 600 trials.

Participants consistently matched the eye expressions with the corresponding basic emotion, accurately discerning all six basic emotions from the eyes alone.

Anderson then analyzed how these perceptions of mental states related to specific eye features. Those features included the openness of the eye, the distance from the eyebrow to the eye, the slope and curve of the eyebrow, and wrinkles around the nose, the temple and below the eye.

The study found that the openness of the eye was most closely related to our ability to read others’ mental states based on their eye expressions.

Narrow-eyed expressions reflected mental states related to enhanced visual discrimination, such as suspicion and disapproval, while open-eyed expressions related to visual sensitivity, such as curiosity. Other features around the eye also communicated whether a mental state is positive or negative.

Further, he ran more studies comparing how well study participants could read emotions from the eye region to how well they could read emotions in other areas of the face, such as the nose or mouth. Those studies found the eyes offered more robust indications of emotions.

This study, said Anderson, was the next step in Darwin’s theory, asking how expressions for sensory function ended up being used for communication function of complex mental states.

“The eyes evolved over 500 million years ago for the purposes of sight but now are essential for interpersonal insight,” Anderson said.

Source: Cornell University

Well-Kept Vacant Lots May Mean Less Crime in Urban Areas

Maintaining the yards of vacant properties, a movement known as “greening,” may help reduce crime rates in urban neighborhoods, according to a new study at Michigan State University. The findings show that higher levels of greening are tied to less crime in general, including victimless crimes, property crimes and even violent crimes.

Previous research has shown that greening and gardening programs are linked to less stress, depression and hopelessness for residents, as well as lower crime rates, including assaults, burglaries and robberies. But an in-depth space-and-time analysis of these correlations has not been explored until now, say the researchers.

For the study, the researchers analyzed nine years of crime statistics in Flint, Mich., using data from a greening program where thousands of abandoned lots in various neighborhoods were regularly mowed and maintained.

Today, more than 42 percent of the properties in Flint are either publicly owned or otherwise vacant.

Dr. Richard Sadler, an urban geographer and the study’s lead author, assigned each neighborhood a greening score based on how many vacant properties in the area were being kept up. Using a method called “emerging hot spot analysis,” which identifies patterns or trends of events over space and time, he applied crime data from 2005 through 2014.

“Generally speaking, I found that greening was more prevalent where violent crime, property crime and victimless crime were going down,” said Sadler, an assistant professor of public health in the College of Human Medicine.

The idea for the study was born when the Genesee County Land Bank Authority began its Clean and Green program 13 years ago to help maintain vacant properties throughout the city. They discovered that over the years, the program seemed to produce another benefit — crime appeared to be declining.

“We’ve always had a sense that maintaining these properties helps reduce crime and the perception of crime,” said Christina Kelly, the land bank’s planning and neighborhood revitalization director. “So we weren’t surprised to see the research back it up.”

Flint has one of the highest crime rates in the nation. The city’s population of slightly more than 100,000 is half what it was in the 1960s when it was the world headquarters for Buick. But once the auto industry pulled out of the city, Flint lost 41 percent of its jobs. This led to a concentration of poverty in the city as well as a decrease in the number of police officers.

Sadler said investments in eliminating blight and encouraging community buy-in can pay off in a number of ways for urban areas across the country and be less expensive to sustain.

He indicated that programs such as Clean and Green not only make the properties more attractive for development and stabilize neighborhoods, but also alert potential criminals that residents are keeping an eye on things.

“It’s people looking out for their own neighborhoods,” he said. “If you know somebody’s watching, you’re not going to go out and vandalize something. It’s the overall change in perception created by cleaning up blighted property.”

The study is published online in the journal Applied Geography.

Source: Michigan State University

Study Finds No Evidence Brain Games Improve Cognition

New research suggests that brain training may not protect older people from memory loss or help them think better.

Florida State researcher Dr. Neil Charness, professor of psychology and a leading authority on aging and cognition, teamed up with Dr. Wally Boot, associate professor of psychology, and graduate student Dustin Souders to investigate claims made by the booming brain-training industry.

“Our findings and previous studies confirm there’s very little evidence these types of games can improve your life in a meaningful way,” said Boot, an expert on age-related cognitive decline.

Their findings appear in the science journal Frontiers in Aging Neuroscience.

Charness, who’s also the director of FSU’s Institute for Successful Longevity, said an increasing number of people believe brain training helps protect them against memory loss or cognitive disorders.

“Brain challenges like crossword games are a popular approach, especially among baby boomers, as a way to try to protect cognition,” Charness said.

That popularity has turned the brain-training industry into a billion-dollar business. Brain games are available online and through mobile apps that typically sell for about $15 a month or $300 for lifetime memberships.

But advertising for this rapidly growing business sector has sometimes used inflated claims. The Federal Trade Commission fined one brain-training company $50 million for false advertising, a penalty that was later lowered to $2 million.

“More companies are beginning to be fined for these types of inflated claims and that’s a good thing,” Boot said. “These exaggerated claims are not consistent with the conclusions of our latest study.”

The FSU team’s study focused on whether brain games could boost the “working memory” needed for a variety of tasks. In their study, one group of people played a specially designed brain-training video game called “Mind Frontiers,” while another group performed crossword games or number puzzles.

All players were given lots of information they needed to juggle to solve problems. Researchers tested whether the games enhanced players’ working memory and consequently improved other mental abilities, such as reasoning, memory and processing speed.

That’s the theory behind many brain games: If you improve overall working memory, which is fundamental to so much of what we do every day, then you can enhance performance in many areas of your life.

The team examined whether improving working memory would translate to better performance on other tasks or as the researchers called it: “far transfer.”
In short, no.

“It’s possible to train people to become very good at tasks that you would normally consider general working memory tasks: memorizing 70, 80, even 100 digits,” Charness said. “But these skills tend to be very specific and not show a lot of transfer. The thing that seniors in particular should be concerned about is, if I can get very good at crossword puzzles, is that going to help me remember where my keys are? And the answer is probably no.”


Charness noted that other research finds aerobic exercise, rather than mental exercise, is great for your brain. Physical exercise can actually cause beneficial structural changes in the brain and boost its function. He predicts “exer-gaming,” which combines exercise with brain games, will increase in popularity in the 21st century.


Source: Florida State University/EurekAlert

Vets Show that Post-Traumatic Growth May Follow PTSD

New research finds that military veterans who went through trauma and related post-traumatic stress disorder (PTSD) are also more likely to experience “post-traumatic growth.”

Investigators discovered recovering veterans often experience an increased appreciation of life, awareness of new possibilities and enhanced inner strength.

“There’s been a lot of attention paid to PTSD in our military population, but very little research on post-traumatic growth,” says Sarah Desmarais, an associate professor of psychology at North Carolina State University and author of a paper on the new study.

“But these findings are important, because they show that the way veterans respond to trauma is not a zero-sum game.”

“Some Department of Defense (DoD) training implies that people are either resilient or they’re not, but we found that people can struggle with PTSD and experience emotional growth due to traumatic events,” says Jessica Morgan, a Ph.D. candidate at NC State and principal investigator on the study.

“In addition, growth can occur very quickly, or it can be a process that unfolds over years. In other words, while recovering from trauma can be a painful and difficult ordeal, veterans and their families can have hope, and the DoD should pay attention to this field of study.”

In the study, researchers conducted a survey of 197 veterans from all branches of the military. Approximately half of the study participants served in the Army, 72 percent were active duty, and 69.4 percent were male.

Study participants reported on a traumatic event that had occurred within the previous three years and were asked a series of questions designed to measure post-traumatic growth. Growth was measured on a scale from zero to 105.

The researchers found that study participants fell into four groups with respect to their post-traumatic growth.

The short-term moderate group, including 33.7 percent of participants, had post-traumatic growth scores typically between 40 and 60 and experienced that growth within about 6 months of the traumatic event.

The long-term moderate group made up 18.7 percent of participants, and reported similar levels of post-traumatic growth, but more than a year after the traumatic event.

The high-growth group, 20.7 percent of participants, had scores typically between 70 and 105 – and this growth could take anywhere from a few months to several years. The last group, made up of 26.9 percent of participants, experienced limited post-traumatic growth.

The researchers found that members of each group shared common characteristics.

For example, the group that experienced the greatest post-traumatic growth was made up of participants who were the most likely to report that their trauma fundamentally challenged the way they viewed the world.
They also spent the most time thinking about their traumatic event and had the highest rate of PTSD.

Those who experienced moderate growth very quickly had similar characteristics, placing second in all three categories: the extent to which the trauma challenged their worldview, the amount of time spent thinking about the trauma, and the rate of PTSD.

At the other end of the spectrum, those who experienced limited post-traumatic growth ranked last in all three categories.

“One of the key points here is that there can be real benefit from having military veterans think about their traumatic experiences,” Desmarais says.

“While it may be painful in the short term, it can contribute to their well-being in the long term.

“These findings also demonstrate that we need to do more research into post-traumatic growth, working with the veteran community,” Desmarais adds.

“The fact that we still know so little about post-traumatic growth, and that much of the existing work was not done with members of the military, is a significant oversight.”

Source: North Carolina State University


Tough Breaks in Life Can Fuel Extreme Political Views

A new study finds that stress, be it the result of losing a job or dealing with an illness, can lead people to adopt more extreme political views.

University of Toronto researchers discovered negative life events can have a profound impact on how people think about how the world should work.

“If people experience unexpected adversity in their lives they tend to adopt more rigid styles of thinking,” said Dr. Daniel Randles, a postdoctoral researcher in psychology at U of T Scarborough.

The study, which is published in the journal Social Psychological and Personality Science, drew on an existing survey of about 1,600 Americans who were repeatedly polled between 2006 and 2008.

Randles stresses that while he’s not a political scientist, the research could shed light on growing support for populist politics.

“Over the last few years there’s a general feeling that a more rigid form of politics is emerging. It’s possible that more extreme candidates are becoming popular because the people who support them have a growing number of challenges in their lives that they weren’t expecting.”

For the survey, participants were asked about their political attitudes as well as negative events they faced in their personal lives to see if their attitudes changed following adversity.

The unexpected negative life events ranged from divorce, illness, injury, and assault to the loss of a job.

Randles found that regardless of where people stand on the political spectrum – left or right – adverse life events hardened their existing leanings.

“After facing adversity, these respondents weren’t saying about an issue, ‘Maybe this is OK.’ They were either saying, ‘This is definitely OK,’ or, ‘This is definitely not OK,’” said Randles.

Randles, whose past research has looked at the behavioral consequences of uncertainty, said those who hold very black-and-white views are probably more vulnerable to moving toward the extremes.

“It’s not an on/off switch. It’s a slow movement towards either end of the spectrum based on negative experiences,” he says, adding there’s no exact number of events that can cause the effect.

As a possible explanation, Randles suggests the shift in perception occurs because people tend to hold expectations about how those around them will behave and how the natural world should work.

“If people believe that something about their world has suddenly changed, they will look for things in the world that are still intact,” he said.

Source: University of Toronto
