by Carolyn Gregoire, syndicated from huffingtonpost.com, Daily Good: http://www.dailygood.org/story/954/science-proves-that-hugs-can-boost-your-immune-system-carolyn-gregoire/
We know that hugs make us feel warm and fuzzy inside. And this
feeling, it turns out, could actually ward off stress and protect the
immune system, according to new research from Carnegie Mellon University.
It's a well-known fact that stress can weaken the immune system. In this study, the researchers sought to determine whether hugs - like social support more broadly - could protect individuals from the increased susceptibility to illness brought on by the particular stress that comes with interpersonal conflict.
"We know that people experiencing ongoing conflicts with others are
less able to fight off cold viruses. We also know that people who report
having social support are partly protected from the effects of stress
on psychological states, such as depression and anxiety," the study's lead author, psychologist Dr. Sheldon Cohen, said in a statement.
"We tested whether perceptions of social support are equally effective
in protecting us from stress-induced susceptibility to infection and
also whether receiving hugs might partially account for those feelings
of support and themselves protect a person against infection."
In the experiment, over 400 healthy adults filled out a questionnaire about their perceived social support and also participated in a nightly phone interview for two weeks. They were asked how frequently they engaged in interpersonal conflict and received hugs.
Then, the researchers exposed the participants to a common cold
virus, and monitored them to assess signs of infection. They found that
both perceived social support and more frequent hugs reduced the risk of
infection associated with experiencing interpersonal conflict.
Regardless of whether or not they experienced social conflicts, infected
participants with greater perceived social support and more frequent
hugs had less severe illness symptoms.
"This suggests that being hugged by a trusted person may act as an
effective means of conveying support and that increasing the frequency
of hugs might be an effective means of reducing the deleterious effects
of stress," Cohen said.
"The apparent protective effect of hugs may be attributable to the
physical contact itself or to hugging being a behavioral indicator of
support and intimacy ... either way, those who receive more hugs are
somewhat more protected from infection."
If you needed any more reason to go wrap your arms around someone special, consider this: Hugs also lower blood pressure, alleviate fears around death and dying, improve heart health and decrease feelings of loneliness.
The findings were published in the journal Psychological Science.
This article is reprinted with permission from the Huffington Post's Good News Channel
Saturday, January 31, 2015
Friday, January 30, 2015
The Thinker at Musée Rodin in Paris (Photo: Wikipedia)
by Jordan Bates, Refine the Mind: http://www.refinethemind.com/destructive-western-ideas/
Since the day you emerged into this bizarre, sparkling universe, you’ve been conditioned to think in certain ways.
And that’s damn wonderful a lot of the time. It’s arguably a blessing that our minds learn to auto-dismiss certain notions - like, say, walking off of that cliff or stabbing Uncle Melvin with a butter knife - and auto-accept others.
But it’s also problematic. Because, well, our minds are gullible - sufficiently gullible, at least, to spend the first decade or seven of our lives unconsciously internalizing the dominant ways of thinking of our culture.
From our earliest years, we are surrounded by the worldviews of our parents, our education systems, our economic systems, our governments, our media, our religious institutions, and myriad other sources of foundational values and ideas. Depending on one’s culture, this situation eventually results in significant suffering, as we’ll see. And arguably this would not be the case if, at a young age, we were instructed to view all of these sources as reliant on fallible, human ideologies, but this is almost never the case.
Unknown Title, Ewa Gorals. Photo Credit: Wiki Commons
Incentives for Claiming to Know the “Truth”
At least in the United States, my home country, most of these entities are portrayed, or portray themselves, as possessing and delivering the capital-T Truth. The media often portrays itself as a “No Spin Zone” and delivers only firm, declarative statements; the government purports to understand what is most beneficial “for the people”; religion has access to the “word of God”; schools teach “the facts”; mama “knows best,” etc. All of this contributes to a sense that those before us figured everything out, that we need not ask questions. The “Truth” has already been filed away.
There are strong incentives, for both individuals and institutions, to claim to hold or to genuinely believe they hold a monopoly on the “Truth.” One of those incentives is to protect themselves, their fellow citizens, and their loved ones from the unknown, or the unknowable. As Kurt Vonnegut once wrote:
“Tiger got to hunt, bird got to fly;
Man got to sit and wonder ‘why, why, why?’
Tiger got to sleep, bird got to land;
Man got to tell himself he understand.”
Culture is like an operating system that provides default answers to all of life’s questions to protect us from the inevitable insecurity and anxiety that arise when we admit that we know very little about what is actually going on - about what we are, why we’re here, what we should do, etc. I would not be so bold as to suggest that this is never a good thing. Culture can aid us, I think, by providing stabilizing structure where none would otherwise exist. And some cultural values are surely worthy, human-friendly ones.
However, in emphasizing a specific set of values, most cultures (particularly dominator cultures) seem to become dogmatic - i.e. seem to reach a point where they do not recognize other ways of seeing and thinking and living than their own predominant prescriptions. For anyone curious about the truth of our condition or about alternative modi operandi, this can become an absurdly limiting and oppressive state of affairs.
Another incentive for sources to claim to possess the “Truth” has to do with rhetoric, persuasion. In an age of expert opinions and fifty million channels of information, the “Truth” gets people to listen. And everyone wants an audience, wants to propagate their worldview for its own sake or for other ends. Convincing people to do and believe what you want them to do and believe is supremely dependent on presenting yourself or your organization as wise, enlightened, certain.
No One Has the Answers
But, as you are hopefully aware, the “Truth” changes, depending on who you ask. It differs wildly from continent to continent, culture to culture, institution to institution, family to family, individual to individual. There are infinite variations on the “Truth.”
It may follow that truth is utterly subjective, but I’ll save the nihilism discussion for another day. What’s imperative to realize, here and now, is that one culture’s truth is another culture’s fiction; and to consider, furthermore, that much of what has been presented to you as “True” might be an agenda-driven house of cards - a collection of misguided, propagandistic faux-facts, some of which lead to despair if left unchallenged.
I don’t mean to pull a wide-eyed “What if I told you?” stunt à la the Matrix Morpheus meme. This is just food for thought, and again, this is not to say that culturally inherited perspectives and structures can never be useful, meaningful, life-stabilizing, social-cohesion mechanisms. I think they often are. But I humbly submit to you that an idea of inconceivable purchase - especially in terms of un-learning certain insidious culturally inherited worldviews - is the idea that no one has the answers. If we’re honest (which might be a good idea sometimes), every person, every culture, and every institution tells a story, and everyone’s story is, at root, relative to their point of view and way of life. Including mine, so don’t take me too seriously.
When you start to think in this way, you become much more curious and critical. You begin to take personal responsibility for separating the strawberries from the smegma - for deciding which ideas are worthwhile and which are malignant. Presuming you see value in this undertaking, let me suggest to you (in cute, easily processable listicle-blurbs) that the following six ideas - ideas prevalent in (though not exclusive to) the West - are at best unsound and at worst, utterly destructive.
Idea #1: Uncertainty can and should be eliminated from existence.
“Only mystery allows us to live, only mystery.” - Federico García Lorca
From our earliest years of schooling, there is a rhetoric of certainty wafting in the air, a notion of control over knowledge and over events. Year after year, the system has already determined what’s best for us: first grade to second grade to third grade, et al. Life has an order, an obvious trajectory. Schools, experts, and scientists seem, conveniently, to have “the facts” - everything we need to know about the universe and everything we need to know to “succeed” (slippery linguistic sign, that one).
By the time we’re 15 or 16, we’re well-aware of a traditional life-path narrative in which people go to college, find a nice job, work their way up in the world, get married, have kids, buy a house, settle down, retire, die. At the same time, we’re beginning to be bombarded with terms like “career path,” “next step,” “plan your future,” etc. etc., all of which serve to further cement in us the notion that it is desirable and possible to “figure out” our lives, to schedule them down to the smallest detail, to eliminate uncertainty.
Eventually, though, this narrative breaks down. We eat some strange fungi or get fired or read too much philosophy or break our pelvis or miss a flight or gaze too long at the stars or our dog dies, and it becomes (perhaps distressingly) clear that things are not so certain. Existence forever evades compartmentalization, and countless events, both micro and macro in scale, occur despite our expectations otherwise. Innumerable questions defy our understanding. Our efforts to impose certainty and control onto the universe are attempts to avoid countenancing the inevitable fear that comes with acknowledging our own physical and perceptual limitations.
This fear seems to cause many people to need certainty - to insist that they or someone else (priests, scientists, etc.) understands How Things Are in some kind of absolute way. I see this as tragic. It’s as if these people want to take kaleidoscopic, inscrutable, inarticulable existence and put it in some kind of hermetically sealed canister in order to feel proud that it’s been contained and dissected thoroughly. From my perspective, this attitude eliminates or severely paralyzes one’s ability to really see the living mystery that surrounds us all the time and to feel what is, I think, the greatest of feelings - profound astonishment and awe at the fact of one’s own existence.
Ultimately, we cannot escape the unknown. It will eventually return to engulf, torment, and cripple us if we have long suppressed it. For some time, I’ve aimed to adopt an attitude I dub “dancing with uncertainty.” Rather than feeling that I need to know anything, I have aimed to embrace that I am immersed in and inseparable from a colossal, mysterious unfolding of being and don’t really know what will occur tomorrow, let alone in five years. I admit that I don’t understand this whole “life” thing, and from that admission emerges humility, wonder, curiosity, and an openness to novelty and possibility. I heed the poet Rilke’s words: “Live the questions.”
Idea #2: You deserve to feel great all the time, and you can.
“This is Bob. Bob is doing well. Very well indeed. That’s because not long ago with just a quick phone call Bob realized that he could have something better in his life.”
You may have recognized the above ad-copy as that of Enzyte, “Natural Male Enhancement,” and were unwittingly prompted to imagine “Bob,” that disturbingly perky boner-boosting poster boy and his eerily exaggerated inhuman perma-smile. “Bob” is a prime (if a bit extreme) example of an archetype that is widely propagated by advertisers, self-help “gurus,” and the hokey, think-positive people who you unfollowed on Facebook for incessantly posting “Just Be Happy Now” memes.
The archetype is that of the ever-smiling person - the complete and totally realized human who has discovered happiness and now just feels so damn good that they can hardly stand it. In a culture that places an inestimable premium on individual happiness, it’s comforting to believe that such an ideal can be realized. It’s also quite lucrative to sell people an image of themselves attaining the ideal.
But beneath the glamorous, ever-grinning archetype is a serpentine subtextual message, a message that can quite literally make you feel as if something is terribly wrong with you. The message is that happiness consists in never feeling depressed, lonely, scared, anxious, or downright shitty. That if we just buy the product or adopt the right mentality, perpetual Best Days Ever are within reach. But is such a vision really tenable or the substance of agenda-driven fantasy? My vote: the latter.
Perhaps “happiness,” if the word means anything at all, means accepting whatever circumstances or emotions arise. To be human is to experience a vast spectrum of emotion, and there will always be an ebb and flow. Remembering to approach all happenings with an attitude of “this too shall pass” is more valuable than a warehouse full of consumer quick-fixes.
Furthermore, as many a thinker has argued, our pain can be an essential catalyst toward resilience, self-knowledge, and compassion. Friedrich Nietzsche famously stated, “That which does not kill us makes us stronger.” Viktor Frankl, a renowned psychotherapist and Holocaust survivor, thought that in our suffering we could discover profound meaning. Dostoyevsky, Schopenhauer, Rilke, and other writers, poets, and philosophers have likewise held deep convictions about the metamorphoses and purposes to be found in misery. So, in sum, if you feel like hell, chill. You’re human. Be with those feelings; see what they might mean to you and know that they’re temporary.
Idea #3: You should be afraid.
“Nature loves courage. You make the commitment and nature will respond to that commitment by removing impossible obstacles. Dream the impossible dream and the world will not grind you under, it will lift you up. This is the trick. […] This is how magic is done. By hurling yourself into the abyss and discovering it’s a feather bed.” - Terence McKenna
In numerous ways, society implicitly communicates to us that the world is a harsh and frightening place. As mentioned above, we’re implored to meticulously design our future, to plan the “secure” career - the sure bet, the foolproof strategy. Risky, bold, or unpopular decisions are thus necessarily presented to us as avoidable traps. Caution is portrayed as the only way to ensure success. Anything that thousands or millions of other people aren’t already doing is likely to be met with unease from our parents, counselors, coaches, and advisors.
Moreover, the entirety of the outside world becomes something to fear if one invests in mass-media narratives (something all too easy to do, consciously or unconsciously). Every other news story is a sexual assault or homicide or bombing or school shooting or shocking accident. The news is over-saturated with exceptionally enraging, fear-inducing, and unsettling stories. This pattern communicates an inaccurate image of a world where death and danger lurk in every shadowed alley. That’s not to say that there isn’t a whole lot of frankly fucked up garbage happening in this world. There is, and we should keep our wits about us, have compassion for all, and care about helping to develop more effective systems and a sense of global solidarity.
But the point is that the media skews our perception, exaggerating stories for shock value and subtly conditioning us to expect highly improbable events to affect us. We’re taught to look out at the world and see tragedies rather than possibilities. This situation and other sources of fear become further reasons to take the cautious life-path that is advertised to us in schools and acted out all around us by the majority of citizens.
But when fear dictates our lives, we almost inevitably avoid things that beckon to us. Oftentimes, the things that you really want to do - the stand-up comedy bit, the backpacking trip abroad, the music project, etc. - are petrifying. God forbid, we might not get the result we want or be laughed at by other people. This fear prevents most of us from “doing our thing” - i.e. expressing ourselves openly and honestly in our lives/actions, flowing intelligently with our nature. I have my fair share of fears and anxieties, but I refuse to be reduced and controlled by them - I let them be and try always to do the personally meaningful things that arise organically within me. To invoke Vonnegut once more: “We have to continually be jumping off cliffs and developing our wings on the way down.”
Idea #4: The products can fix you.
“It did what all ads are supposed to do: create an anxiety relievable by purchase.” - David Foster Wallace
The subtext of 99% of advertising for consumer products is this: “You’re inadequate, ugly, uncool, no fun, average, predictable, prudish, and inessential, but our product can fix you.” As Wallace points out in the above quote, advertisements are designed to foster in us a sense of anxiety about some aspect of our lives, to give us a sense that something is missing, but that the emptiness can be dispelled “for just $19.95!”
This is the rhetoric of consumerist culture, and goddammit it is effective. I mean, look around you. People are working endless hours, week after week, year after year, just to throw it all away on a TV with imperceptibly higher resolution, an upgrade for their perfectly functional cell phone, silicone body parts and liposuction, a new wardrobe that won’t be “stylish” by next year, janky and extraneous gadgetry, plastic fruit, lawn ornaments, ShamWows, skin creams, throw pillows, diet pills, and infinite other bits of trivial bullshit.
The obvious point is that if any of this rubbish actually rectified the void that people try so desperately to remove from their lives, they’d stop buying. But they don’t - because making a purchase is simple and addictive and because the jolt of relief and satisfaction that arises from purchasing is as fleeting as it is liberating. Rather than being any sort of real solution, consumption is merely a cyclic distraction. No amount of external accumulation can resolve internal issues and conflicts. These things require time, reflection, introspection, and a willingness to countenance unsexy truths about ourselves/life. It’s no surprise that most people try to escape this oft-uncomfortable work.
Idea #5: There are “good” people and there are “bad” people.
We all remember hearing things like “Be a good girl, Wilma.” or “Don’t be a naughty little boy, Sigmund.” From a young age, we were introduced to a set of ethical rules - rules which determined whether a person was “good” or “bad.” Most of the stories we grew up with, whether in the form of movies or books, contained morally unambiguous characters - that is, heroes and villains, the “good guys” and the “bad guys.”
Furthermore, those of us who were indoctrinated into certain Western religious institutions were taught that the species Homo sapiens is inherently sinful, and that some “good” people go to a Heaven while other “bad” people burn for eternity in Hell. In the US, at least, criminals are typically viewed not as fallible humans who made a mistake, but as irrevocably evil men who deserve decades-long prison sentences or capital punishment, rather than a second chance or rehabilitation.
This dichotomy of “good” and “bad” is firmly established in Western culture. It burrows deep into our fundamental conception of the world, generating guilt, shame, and doubt regarding our actions. I don’t deny that people are shitheads at times. Some of us do downright hideous things, and all of us make regrettable errors and end up hurting ourselves and other people we would rather not have hurt. The problem with a cultural dichotomy of “good” and “bad” is that it suggests to us that any one ephemeral mistake could have lifelong consequences, could prove that we are just downright “bad” people.
Though I personally feel we should hold ourselves largely accountable for our actions (while viewing ourselves and others with compassion), we must recognize that numerous factors beyond our control contribute to our mental state and impulses at any given time. As humans, it is a given that we will falter at some point. “The deck is stacked against us,” I often say. We will fuck up, but we will also do generous, loving things. These two poles arise mutually within us, and one allows us to experience the other, and vice versa. We are neither purely “good” nor purely “bad.” Kahlil Gibran knew this when he wrote:
“You are good in countless ways, and you
are not evil when you are not good,
You are only loitering and sluggard.
Pity that the stags cannot teach swiftness
to the turtles.”
Judging someone else to be a “bad” person is an exercise in subtle self-aggrandizement, and conversely, dwelling in excessive guilt or shame over our mistakes is entirely counterproductive. As Albert Ellis once pointed out, the most effective way to express remorse for our actions is to acknowledge that what we did was misguided and then to focus on doing better now - in the moment we can still influence.
The other thing to consider is that both “good” and “bad” might be wholly native to the human experience and not objectively real. Existence may well be amoral, as Zen, Taoism, and other schools of thought suggest. This is one of those arguably unanswerable questions I mentioned earlier. Either way, I take the position that a basic moral compass - something as simple as compassion deriving from a recognition of mutual suffering - is an indispensable component of a meaningful human life. I humbly submit that we ought to aim for kindness and understanding, remembering that all people are human, just like us.
Idea #6: You are better than “them.”
We humans naturally form our identities by contrasting ourselves with that which appears different from us. We call ourselves “artists” or “athletes” because not every person is creatively expressive or adept at sports. If everyone was, the terms “artist” and “athlete” would lose their meaning and simply be subsumed into our “human” identity - the identity we form by contrasting our physiology with that of other animals. Curiously, all of our individual identities and even our collective “human” identity are rendered illusory when we examine ourselves at the atomic or subatomic level - at this fundamental level, everything is the same - but most of us don’t focus on this most of the time. Which is okay, because it’s fun and interesting to lose oneself in the game of human identity and social existence, wherein distinctions between people are indispensable for order and civil behavior. But I digress.
Groups - distinct parcels of people often set in opposition to one another - provide an ideal opportunity for the sorts of contrasts that we rely on for a sense of identity. For this reason and others, we in the Western world endlessly divide ourselves into in-groups and out-groups. We’re jocks, hipsters, nerds, feminists, Vikings fans, environmentalists, atheists, Republicans, stoners, frat guys, bikers, vegetarians, transcendental idealists, et al.
Unfortunately, this group-based social dynamic is a slippery slope to narcissism and animosity. A healthy amount of pride in a given community identity can silently morph into an elitist sense of superiority and a desire to spite relevant out-groups or people generally who are not “in the club.” These types of attitudes have historically been and remain a mainstay in religious organizations, political parties, races, and social classes, as well as in countless other niche-groups such as the “popular” kids in a high school or a segment of health enthusiasts on the Internet or NRA members or all of the professors in a given department at a university.
Ironically, if it weren’t for all of the people who are different from you or me, we would be rendered indistinguishable from one another, retain no shape to call our own, and the whole game of human identity would be kaput. As I said above, we create our identities by contrasting ourselves with that which we are not, so maybe we ought to be thankful for those unlike us. On a deeper level, we might realize that our personal differences are illusory and temporary. We invent endless differences in an ongoing game of human drama when, at root, we are all members of the same species living on a tiny rock in a mysterious void. Beyond that, we’re all sentient beings. And beyond that, we’re all star-stuff, energy, subatomic particles.
Deciding that “we” are better than “them” has been a fatal error throughout human history leading to innumerable wars, genocides, and other unspeakable acts of brutality. We ought to aim to define ourselves first and foremost as the same fundamental stuff in an unknowable existence and realize that everything else is a little human game that we’re playing.
In sum, beware of dysfunctional cultural operating systems. Cool? Cool.
I’ll leave you with this passage - which echoes my introductory sentiments and distills the crux of this piece - from the philosopher Robert Anton Wilson:
“It’s important to abolish the unconscious dogmatism that makes people think their way of looking at reality is the only sane way of viewing the world. My goal is to try to get people into a state of generalized agnosticism, not agnosticism about God alone, but agnosticism about everything. If one can only see things according to one’s own belief system, one is destined to become virtually deaf, dumb, and blind. It’s only possible to see people when one is able to see the world as others see it.”
Saturday, January 24, 2015
I Don’t Vaccinate My Child Because It’s My Right To Decide What Eliminated Diseases Come Roaring Back
As a mother, I put my parenting decisions above all else. Nobody knows my son better than me, and the choices I make about how to care for him are no one’s business but my own.
So, when other people tell me how they think I should be raising my child, I simply can’t tolerate it.
Regardless of what anyone else thinks, I fully stand behind my choices as a mom, including my choice not to vaccinate my son, because it is my fundamental right as a parent to decide which eradicated diseases come roaring back.
The decision to cause a full-blown, multi-state pandemic of a virus that was effectively eliminated from the national population generations ago is my choice alone, and regardless of your personal convictions, that right should never be taken away from a child’s parent. Never.
Say what you will about me, but I’ve read the information out there and weighed every option, so I am confident in my choice to revive a debilitating illness that was long ago declared dead and let it spread like wildfire from school to school, town to town, and state to state, until it reaches every corner of the country.
Leaving such a momentous decision to someone you haven’t even met and who doesn’t care about your child personally - now that’s absurd! Maybe I choose to bring back the mumps. Or maybe it’s diphtheria. Or maybe it’s some other potentially fatal disease that can easily pass among those too young or too medically unfit to be vaccinated themselves.
But whichever highly communicable and formerly wiped-out disease that I opt to resurrect with a vengeance, it is a highly personal decision that only I and my family have the liberty to make.
The bottom line is that I’m this child’s mother, and I know what’s best. End of story. Politicians, pharmaceutical companies - they don’t know the specific circumstances that made me decide to breathe new life into a viral infection that scientists and the nation at large celebrated stamping out roughly a century ago.
It seems like all they care about is following unexamined old rules, injecting chemicals into our kids, preventing ghastly illnesses that used to ravage millions and have since been erased from storming back and wreaking mass havoc on a national scale, and making a buck. Should we really be listening to them and not our own hearts?
I am by no means telling mothers and fathers out there what to do; I’m simply standing up for every parent’s right to make his or her own decision.
You may choose to follow the government-recommended immunization schedule for your child, and that’s your decision as a parent. And I might choose to unleash rubella on thousands upon thousands of helpless people, and that’s my decision as a parent.
It’s simple: You don’t tell me how to raise my kids to avoid reviving a horrific illness that hasn’t been seen on our shores since our grandparents were children, and I won’t tell you how to raise yours.
Look, I’ve done the research on these issues, I’ve read the statistics, and I’ve carefully considered the costs and benefits, and there’s simply no question in my mind that inciting a nationwide health emergency by unleashing a disease that can kill 20% or more of its victims is the right one for my child.
People need to respect that and move on.
Friday, January 23, 2015
by Karin Eckhard, European Wilderness Society: http://wilderness-society.org/shifting-baseline-syndrome-disease-forgetting-lost-nature/
I first read about it in a recent book by Jeremy Hance, called Life is Good: Conservation in the Age of Mass Extinction, but the syndrome was identified much earlier: Daniel Pauly first described its symptoms in 1995.
He observed a single symptom: the general population’s acceptance of the gradual disappearance of nature - in the specific case Pauly wrote about, the marine environment.
He observed that fishermen and marine scientists assumed that the current populations of fish species were the norm, and that each generation was gradually accepting less and less - fewer fish.
The syndrome is present in terrestrial landscapes as well: we accept fewer forests, fewer meadows, less species diversity. It is now an accepted ‘syndrome’, defined as a gradual change in our accepted norms for ecological conditions.
The syndrome manifests itself by incrementally lowering our standards for the natural world, so that each new generation lacks knowledge of the historical - and presumably more natural - condition of the environment known to previous generations. In simpler terms, we are forgetting what has already been lost in nature.
I alluded to this syndrome in a blog post from Nov, called the Silencing of European Wilderness in which it highlighted the rapid decline of bird species in Europe forests, not just rare species but common species as well.
Are we going to allow this disease to silence our forests, meadows and wilderness areas, will we not even notice what is happening? But it is happening, gradually as Daniel described. Will wilderness, natural areas and a diversity of species become a distance memory in the years to come?
As it stands, only 2% of Europe is considered wilderness but the European Wilderness Society is working towards more wilderness in Europe, with a goal of increasing that percentage to 5%. We are creating momentum about the importance of wilderness areas, educating about what constitute wilderness and why it needs protection.
We don’t just speak about wilderness protection; we also tell the story of large carnivores and their plight throughout Europe.
We are constantly trying to show the importance of top predators, bears, wolves and lynx, in the ecosystem: their importance for biodiversity conservation, how they form an integral part of the cultural and natural heritage of the European landscape, and how they are unjustly persecuted for perceived threats to human safety and livestock.
Join us in helping stop shifting baseline syndrome! We want More Wilderness in Europe.
Sunday, January 18, 2015
|Dr. Martin Luther King giving his "I Have a Dream" speech during the March on Washington in Washington, D.C., on 28 August 1963. (Photo credit: Wikipedia)|
In honor of Martin Luther King Jr.’s birthday this week, I bring to you the story behind his most famous speech, just in case, as the great Paul Harvey used to say, you don’t know the rest of the story.
I hope you enjoy.
Thursday, January 15, 2015
|Students need sleep in order to study (Photo: Wikipedia)|
You might think that you learned all you need to know about napping from those times you fell asleep in your high school calculus class, but there is a lot more to it than meets the eye.
You may be most familiar with the Spanish siesta. It is a cultural habit in Spain and, through Spanish influence, in other Hispanic countries and the Philippines; the word “siesta” derives from the Latin phrase hora sexta, or “sixth hour” (counting from dawn, this is around midday).
The concept also has a strong presence in Southern Italy, where museums, churches and shops close midday for riposo. In Japan, employees are often encouraged to take naps during the work day, not only to increase performance but also because the need for a nap supposedly shows that an employee is working hard.
Sleep itself is a vital necessity for our bodies and minds. Not getting enough sleep can cause physical health problems such as high blood pressure, irregular heart rhythm, weight gain, vulnerability to colds and flu, and even increased risk for more serious illnesses such as diabetes and heart disease.
Risks for your brain include irritability, trouble focusing, poor reflexes, forgetfulness, and decreased coordination and balance. Continuous sleep deprivation is a problem and needs to be treated by lifestyle changes or a visit to a doctor, but naps can help temporarily remedy some of the side effects.
Additionally, napping offers many benefits for those who more or less get a good amount of sleep but want a little boost during the day. Taking brief naps at a suitable time during the day has been shown to increase alertness, improve the ability to perform tasks, improve overall mood, increase creativity, and boost memory performance.
Think of it as a form of resetting your system. In fact, the idea that we are supposed to have one big sleep at night and stay awake until the following night is a relatively new one. Scientists now say we are actually hardwired to take naps or at least have more than one sleep per 24 hour cycle, and historians have found some evidence to back up this claim.
And while we’re talking about naps, here are some helpful tips to take a great one every day:
- Nap at a regular time: Studies show the best time to nap is in the middle of the day, between 1pm and 3pm.
- Don’t make it long: Set an alarm on your phone for between 20 and 30 minutes. Sleep any longer than 30 minutes and you will likely feel groggy for up to an hour after you wake, possibly for the rest of the day.
- Make sure to block out the light: Make sure the room you nap in is as dark as you can make it, or wear a sleeping mask. Blocking out light helps you fall asleep faster and have a more restful nap (you can even get blackout curtains for your room for optimal sleep/nap conditions).
- Keep yourself cozy: You sleep better when you’re comfortably warm, so keep a blanket on hand wherever you take your naps to keep out the chill.
Still feeling guilty about the possibility of a regular nap schedule? Here are some famous pro-nappers:
- Winston Churchill
- Thomas Edison
- John F. Kennedy
- Eleanor Roosevelt
- Napoleon Bonaparte
- Salvador Dali
- Albert Einstein
So go on, learn all about napping with this nifty infographic, then have yourself a nice siesta. Chances are, you need it.
Jane is finishing a Media Studies major at Scripps College in Southern California and hails from Seattle. They enjoy voraciously consuming articles and op-eds online, spending hard-earned funds on Steam sales, and inquiring if there is "wifi and/or wine here". A constant theorist as well as a (wannabe) Internet culture savant.
Wednesday, January 14, 2015
|Poverty concentrations in the Bronx (Photo: Wikipedia)|
We have become more aware that Americans’ chances of upward economic mobility have for decades been much lower than Americans imagined, and that being poor or rich can persist across generations.
Efforts to explain that lock-in have pointed to several patterns, from the intergenerational inheritance of assets (or debt, as the case may be) to intergenerational continuity in child-rearing styles (say, how much parents read to their children). In such ways, the past is not really past.
Increasingly, researchers have also identified the places - the communities, neighborhoods, blocks - where people live as a factor in slowing economic mobility.
In a post earlier this year, I noted a couple of 2008 studies showing that growing up in poor neighborhoods impaired children’s cognitive skills and reduced their chances to advance beyond their parents.
In this post, I report on further research by NYU sociologist Patrick Sharkey (here and here) suggesting that a bad environment can worsen the life chances not only of a child, but also of that child’s child, an unfortunate residential patrimony.
Consider the ways that the immediate environment shapes a child’s development. It does so physically. Air and soil pollution, noise, and traffic, for example, measurably affect children’s health, stress, and cognitive development. Local institutions and resources, such as policing, the quality of the schools, the availability of health services, food options, parks, and so on, matter as well. And the social environment may matter most of all.
Growing up in a community with gangs, dangerous streets, discouraging role models, confused social expectations, and few connections to outsiders commanding resources is a burden for any child. Just getting by day-to-day can be a struggle (in a pair of studies, Sharkey found that a violent crime occurring near black children’s homes in the days before they took a standardized test reduced their scores on the test, presumably because of anxiety and distraction).
In their research on historical effects, Sharkey and co-author Felix Elwert used a survey that has followed thousands of American families since 1968 (the PSID).
The researchers know much about the adults in the survey, including where they lived when they were around 16, about the children they had and where those children lived around the age of six. The researchers also have the results from cognitive tests administered to those children in 2002.
Sharkey and Elwert found that living in a neighborhood where 20% or more of the residents are poor - many other things being held constant (including the parents’ education, health, and attitudes) - seems to lower the test scores of children. And so does having a parent who grew up in such a neighborhood.
The effect on children of living in a poor neighborhood while having parents who did as well is substantially greater than the effect of only the second generation living in a poor neighborhood.
Moreover, the children of two generations of poor neighborhoods do much worse than those of two generations who managed to stay out of poor neighborhoods (over half a standard deviation worse). For technical reasons, these statistical results probably underestimate the real effect of neighborhood poverty on scores.
What appears to have happened is this: Survey respondents in the first generation who grew up in poor neighborhoods ran higher risks than other respondents, on average getting less education and worse jobs, if any, and bearing more physical, social, and psychological problems. Not surprisingly, they tended to end up in poor neighborhoods as adults.
When this first generation became parents, they commonly passed on some of their personal disadvantages, such as weak reading skills, to their own children. And they also passed on their places, raising the second generation in poor neighborhoods, which further hampered their children.
In this way, Sharkey and Elwert argue, neighborhood problems dragged down (at least) two generations.
No discussion of neighborhood effects can ignore the racial dimension, because the residential segregation of blacks has been and, though reduced, continues to be extreme: 41% of the African-American parent-child pairs in the study grew up in poor neighborhoods in both generations; only 2% of white families did.
Poor whites were less likely to live in concentrated areas of poverty and more likely to get out of them if they did. The weight of the past is much heavier for some than for others.
Claude Fischer, a sociologist at UC Berkeley, is the author of Made in America: A Social History of American Culture and Character. This post originally appeared at Made in America and was re-posted on the Boston Review BR Blog.
Tuesday, January 13, 2015
Fortunately, whatever he needed was available: food, water, a place to sleep, clothes to wear, family who loved him. But what he wanted, well, that’s a different story.
We battled with wanting every Lego set possible, staying up late even when tired, eating ten cookies instead of two. Imagine a toddler being allowed to do whatever they wanted: chaos would soon reign, with sleep deprivation and sugar overload.
Temporary pleasure quickly turns to unhappiness when there’s excess. Perhaps part of that is because we don’t appreciate things as much when they are in constant abundance right in front of us.
When an adult consistently prioritizes what they want (versus need) first, there is usually a problem. When a country does only what it wants, it can be a disaster. And when we have a world full of people, governments, corporations and large institutions doing what they want, it eventually leads to collapse. Yet this is what we face. Consider this quote:
The country of Canada existed before there were large-scale oil exports. We are a nation founded on resource extraction, always have been, but it’s now reaching proportions that are giving us the worst kind of international reputation, in mining and oil production in particular. Australia has its own struggles with a similar economy.
Why are things progressively getting worse? Because more money can be made, and that is a ‘want’ far more often than it is a ‘need’. It’s worth repeating the adage “you can’t eat money”.
We need a planet hospitable to human life in a temperature range that supports us, along with clean water and good soil. Yet we pursue the very opposite in the tar sands of Canada. The real issue appears to be the time scale: short-term wants versus longer-term needs. We have all enjoyed something in the short term and regretted the results further down the road.
When our needs are too easily provided for, we value them less. And teens are rapidly losing the skills required to take care of themselves. Most teenagers, or “screenagers” as Geoff Lawton calls them, spend more time interacting with a screen than they do sleeping.
I can only write about what I know, based on personal experience, but what I see are many teenagers turning into adults with few cooking or food-growing skills. This means teens are growing up without witnessing and learning the basic skills that allow them to provide for their needs; I have read that fewer than 10% of teenagers have chores that contribute to the household.
Teenagers can be directionless when they graduate, reluctant to take on debt at school, or not wanting to get on the treadmill of working to pay the bills. Escaping into an online world is so very tempting. So in much of the Western world we have the most privileged generation in history, and I would argue the most pampered and least prepared for the future.
Professor Jean Twenge, author of the book Generation Me, writes that:
A dialogue between the generations is more important than ever before. We all face an uncertain future and passivity and pessimism don’t work, nor do they make us feel good.
What about a program where teens devote a year to volunteering, with several categories and skill sets to choose from? After all, numerous countries still have the military draft (1-2 years of mandatory service before the age of 25).
Working on a cause larger than yourself, helping your community, and learning new skills is certainly a positive experience. Thinking about what we need and how to provide it for ourselves is far more empowering and satisfying than many of us realize. To graduate here in BC, students are required to have a certain number of volunteer hours. Below is a picture of teens planting pollinator plants:
It seems that we often conflate wants and needs, and this influences our actions. We may need to get groceries from the store, but we also want cheap prices, so this guides our choice of purchases.
We need to have a habitable planet yet want to have all the conveniences of a modern, energy-intensive lifestyle. That means humans have to find a way to want what we need, and fast. Permaculture helps with that process, offering support, ideas, and real-life examples of projects working in communities. Let’s make 2015 a year of continuing, much-needed action!
Monday, January 12, 2015
|Creative thought processes mindmap (Photo: Wikipedia)|
A long-term trend of declining creativity test scores has renewed interest in mechanisms to stimulate and foster the development of creativity - at home, in schools, universities and workplaces.
At the same time, there is a bewildering array of self-help books and popular advice woven into a framework of research studies and empirical evidence.
The result is that it can be hard to separate myth from fact. Teachers, parents and managers may find it difficult to know what to do to develop the creativity habit in a rigorous, systematic, and effective manner.
Perhaps the most harmful myth is that creativity is the exclusive domain of the arts. Of course creativity is found in music, poetry, writing and painting - but the myth is harmful because it creates a perception that creativity cannot be found in science, engineering, sport or cooking.
Another obstructive misconception is that creativity is simply unfettered thinking, divorced from practicality and reality. Nothing could be further from the truth. Creativity, in fact, is hard work.
Think of an analogy to food. Some characterisations of creativity are like fast food - attractively packaged, easy to consume, sugary and sweet - but lacking in nutrition and long-term benefit. Others are like spinach - more work to prepare, not so attractive to consume - but nourishing!
Good habits
Developing the creativity habit requires more than just good intentions. It takes more than superficial, fast food approaches that focus only on simple, short-term cognitive tricks. It requires a systematic, holistic approach such as American psychologist Robert Sternberg’s framework of opportunity, encouragement and reward.
But unless this framework is translated into concrete, actionable guidelines for parents, teachers and managers, we run the risk that a piecemeal, and ultimately ineffective, approach will prevail.
We may readily accept that children, for example, need to be given relevant and authentic opportunities to engage in creativity - but if the practical outcome of this is simply “40 minutes of art on Thursday mornings”, then we are unlikely to see any reversal of the creativity slump.
The key question then is “how” do we generate suitable opportunities around which a structure of encouragement and rewards can be built?
One answer lies in the 12 keys for developing the creativity habit that Sternberg spelled out when he elaborated his framework of opportunity, encouragement and reward. These 12 keys serve as a set of guidelines that inform how we should design learning to develop the creativity habit. A few examples will help to illustrate.
Thinking through definitions
The first of the 12 keys says that children - in fact, all those in whom we are trying to develop creativity - should be given the opportunity to redefine problems. Very often, we define the problem for the learner instead of letting them do so themselves.
The importance of this principle is more apparent if we look at what happens when we fail to do this. Failing to give learners - especially children - the opportunity to define and redefine their own problems means that they don’t learn how to identify problems, and how to make good choices in the process of problem solving.
One practical, holistic way to address this issue is to give learners problem statements that invite interpretation and judgement. State the problem in functional terms - what has to be achieved, not how it has to be achieved - so that the learner is not simply executing a predefined solution.
Another of the 12 keys says that we should encourage idea generation. This seems self-evident in a discussion of creativity; however, the value lies in how we do it. Asking learners to spend five minutes thinking of three different ways to solve a problem is one possible approach - but far better is to create a problem situation that is genuinely open-ended.
There should not be one single, correct solution, but a range of possible solutions. The learner should be presented with the means to try different solutions, and should be encouraged with phrases like “what if you tried xxx?” rather than “do xxx!”
A third example is that we should encourage a tolerance of ambiguity. People like clear, black-and-white instructions, but the problems we encounter in life are far less structured and deterministic. Many people, however, when faced with uncertainty say, “I don’t know what to do, therefore I’ll do nothing.”
What we need to develop is the mindset “I don’t know what to do, therefore I’ll try something!” To develop this aspect of the creativity habit means setting problems with deliberately vague rules and constraints. Give the learner a degree of uncertainty - and let them solve the problem in spite of it.
Developing meaningful opportunities to engage in creativity is a challenge. It is easy to do poorly, and hard to do well. One thing, however, is clear. If we accept that creativity - the ability to generate effective and novel solutions to problems - is a core competency, then developing the creativity habit is something that we must get right.
This article was originally published on The Conversation. Read the original article.
Friday, January 9, 2015
|Man thinking on a train journey (Photo credit: Wikipedia)|
by Tom Chi, Unreasonable: http://unreasonable.is/opinion/how-to-create-a-sense-of-purpose-in-your-life/
Why Give a Damn: Did you inherit your sense of purpose from others? Or have you cultivated a purpose in your life that is uniquely your own? Here’s a very simple litmus test to figure it out and a way to bring more purpose to your life.
The author of this post, Tom Chi, has pioneered a unique approach to rapid prototyping, visioning, and data-driven design that has allowed him to both get new things off the ground and move large organizations at unprecedented speeds.
There are many ways to formulate a sense of purpose in your life. There are top-down ways and bottom-up ways to do it. I think most people don’t consciously do either. When folks actually do end up working with purpose, it tends to be from the top-down. They might say: what would my religion want me to do? Or, what would my extended family want me to do?
But one thing to notice about this type of default is that these types of purpose originate in another person’s framework. I don’t think there’s anything wrong with that, per se. You might work with a different person’s framework and be extremely excited and feel called to live it fully because it feels right. And to some extent, all of us inherit some aspects of our purpose.
But I think much of really defining our purpose in life is to examine our inherited frameworks and question how much they truly speak to us, instead of simply living them by default.
In my case, I’m out here building products and inventing various technologies, but at times I’ll notice an underlying thought: “Hey, you want/need to make money.” Why is that? Why is that in there?
After all, while I wouldn’t consider myself fabulously rich, I’ve been relatively fortunate to work in an industry that has done well - so why even have a money-related drive? If I investigate it a level deeper: my parents were immigrants and worked long hours just to survive. Money became a fixture, and when there wasn’t enough, there was strife.
So even in a situation where I know on a logical level, if I didn’t work, I could support myself for a couple of years, I still have this impulse: “No - I’ve got to do it; I’ve got to keep pushing.”
So that’s a personal example of an inherited element of purpose. It motivates me, I live it, but I didn’t choose it. So the really interesting process is looking at it and saying: “Do I continue to agree with it or not?”
Many of us never examine our inherited elements of purpose, so they shape our lives unconsciously. Or, if we are aware of them, we’re afraid to investigate or question them with any depth. The result is that we never challenge them, and we miss out on a chance to make the purpose our own.
This gives you a sense of what it looks like to work with top-down purpose that you’ve inherited from family, religion, country, or really any ideas of purpose we have inherited from others.
However, there is also a bottom-up formula, which I call the A, B and C Levels of Energy. I’ve referenced this formula before, in an article about managing energy, not time; here is a quick way for you to use it to create more purpose in your life.
- Level A is energizing - you finish an activity with even more energy than you started. For me, doing a painting, writing a song or working on an interesting technological problem makes me feel that way. I do it and make progress and it’s: “Yes!” It’s amazing. It’s buzzing. You are excited and continue thinking about it and want to build off it. That’s Level A.
- Level B is neutral - imagine being in the Zone Out zone. You are sitting on the couch with a beer, kicking back, watching some TV - relaxing. That’s a situation that’s not giving you a ton of energy, so you don’t leave the couch totally energized, but it doesn’t drain you much, either.
- Level C is draining - every time you do these types of activities, you feel drained. Oftentimes you feel it even within the first 15 minutes - the sense of drudgery and dread. You leave these activities drained …
If you look at your relationship to Level B activities, how do you feel about sitting on the couch, watching some TV? How do you feel about surfing aimlessly on the Web? These are Level B activities. If you get home from work and you can’t wait to do those things, then you are probably living mostly a Level C life. It feels better to go back to neutral than it does to keep engaging in what’s draining.
Similarly, if your life is filled with more Level A activities, your relationship to Level B activities is: “This is a freaking waste of time. I can’t believe I just watched three sitcoms in a row. Two hours have just gone by. That’s nuts. I’ve got to stop.”
I don’t think anyone’s life becomes completely Level A activities or even Level C activities, although admittedly, some people have really tough lives. But in all cases, your relationship to Level B activities gives you that quick sense of whether you are relatively well aligned or not.
The more that you have a sense that you’re getting into better alignment, the more you can begin asking yourself: “What exactly is it that I’m doing throughout the day that puts me at Level A?” Then you can start breaking things down and understanding the intrinsic motivations that make your life meaningful to you. That is categorically different from top-down purpose.
I feel like nobody can define this other than you. It’s something that can only be discovered through a conscious process that includes a dash of mindfulness and honesty.
So my overall sense of how somebody develops a purpose in life is to bring a conscious process to bottom-up meaning, by using A, B, C evaluation or other reflection techniques.
As for the top-down stuff that you inherit (because you can’t help but inherit some things), you can look at them and say: “Here is some of the purpose that I’ve inherited from how my country works. Here is some of the purpose I’ve inherited from how my family works. Here’s some of the purpose I’ve inherited from how the Silicon Valley technology culture works.”
The point is just to be conscious about it, so that you know exactly what you’ve inherited, and then to ask yourself: “Do I want to keep these ideas or not? Do I want to consciously depart from them, or do I want to consciously say I’m into them?” This is in contrast to just chugging along and not knowing, twenty years from now, why you did what you did.
This is just one technique of many that can help you establish a sense of purpose in your life. What I like about it is how quickly the litmus test cuts through ambiguity and rationalization. Try it out for yourself and let me know how it turns out for you - or share a technique that’s worked well for you!