On Sept. 13, 1848, at around 4:30 p.m., the time of day when the mind might start wandering, a railroad foreman named Phineas Gage filled a drill hole with gunpowder and turned his head to check on his men. It was the last normal moment of his life.
Other victims in the annals of medicine are almost always referred to by initials or pseudonyms. Not Gage: His is the most famous name in neuroscience. How ironic, then, that we know so little else about the man—and that much of what we think we know, especially about his life unraveling after his accident, is probably bunk.
Here’s how Kean explains the continuing fascination with the case of Phineas Gage:
Most of us first encountered Gage in a neuroscience or psychology course, and the lesson of his story was both straightforward and stark: The frontal lobes house our highest faculties; they’re the essence of our humanity, the physical incarnation of our highest cognitive powers. So when Gage’s frontal lobes got pulped, he transformed from a clean-cut, virtuous foreman into a dirty, scary, sociopathic drifter. Simple as that. This story has had a huge influence on the scientific and popular understanding of the brain. Most uncomfortably, it implies that whenever people suffer grave damage to the frontal lobes—as soldiers might, or victims of strokes or Alzheimer’s disease—something essentially human can vanish.
Recent historical work, however, suggests that much of the canonical Gage story is hogwash, a mélange of scientific prejudice, artistic license, and outright fabrication. In truth each generation seems to remake Gage in its own image, and we know very few hard facts about his post-accident life and behavior. Some scientists now even argue that, far from turning toward the dark side, Gage recovered after his accident and resumed something like a normal life—a possibility that, if true, could transform our understanding of the brain’s ability to heal itself.
Despite Gage’s fame, Kean notes, very little is actually known about how the injury changed Gage. Malcolm Macmillan, a psychologist and historian now with the University of Melbourne, has spent the last 40 years researching the facts surrounding the accident and the body of legend that has grown up around it.
People butcher history all the time, of course, for various reasons. But something distinct seems to have happened with Gage. Macmillan calls it “scientific license.” “When you look at the stories told about Phineas,” he says, “you get the impression that [scientists] are indulging in something like poetic license—to make the story more vivid, to make it fit in with their preconceptions.” Science historian Douglas Allchin has noted the power of preconceptions as well: “While the stories [in science] are all about history—events that happened,” Allchin writes, “they sometimes drift into stories of what ‘should’ have happened.”
In addition to the story of Phineas Gage, this piece contains several illustrations of the man and the rod that went through his brain, along with pictures of Gage’s skull and modern visualizations of his brain damage.
The brain is Steven Pinker’s playground. A cognitive scientist and experimental psychologist, Pinker is fascinated by language, behavior, and the development of human nature. His work has ranged from a detailed analysis of how the mind works to a best-seller about the decline in violence from biblical times to today.
Harvard University offers an interview with one of its most famous teachers about his life as a researcher and college professor.
For most of us, daydreaming is a virtual world where we can rehearse the future, explore fearful scenarios or imagine new adventures without risk. It can help us devise creative solutions to problems or prompt us, while immersed in one task, with reminders of other important goals.
For others, however, the draw of an alternative reality borders on addiction, choking off other aspects of everyday life, including relationships and work. Starring as idealized versions of themselves—as royalty, raconteurs and saviors in a complex, ever-changing cast of characters—addictive daydreamers may feel enhanced confidence and validation. Their fantasies may be followed by feelings of dread and shame, and they may compare the habit to a drug or describe an experience akin to drowning in honey.
We all daydream, but for some people daydreaming becomes obsessive. Read how studying such people is helping scientists discover how daydreaming is “essential to generating our sense of self, suggesting that daydreaming plays a crucial role in who we are and how we integrate the outside world into our inner lives.”
Storytelling has become a major topic lately, with its own Twitter hashtag and books such as Jonathan Gottschall’s The Storytelling Animal examining why storytelling so appeals to our brains.
But Samuel McNerney, writing for Scientific American Blogs, argues that storytelling has its limitations and is one factor contributing to the persistence of some pop psychology beliefs:
This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or any one of the seven story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that’s important.
But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study highlight; people who take in narratives are often blinded to the whole story – rarely do we ask: “What more would I need to know before I can have a more informed and complete opinion?”
He encourages us to live life with a generous dose of critical thinking:
Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.
Memoir and family storytelling are about creating a legacy and a heritage, showing where you came from so you can know better where you are going — and even how you want to change that legacy. Every time I hear stories about someone’s family history, I’m inspired by the work memoirists do to create a historical record of how life is being lived now. In a short time, your today becomes your yesterday. What are you preserving for your family?
Read her advice here for creating a family memoir. She emphasizes writing both light and dark stories, and also addresses the issue of truth and secrets.
If you’re writing a memoir with an eye toward publication, Damian Barr, author of Maggie & Me: Coming Out and Coming of Age in 1980s Scotland, explains how to use tools of fiction writing to produce a book people will want to read.
Barr offers 10 pieces of advice. I think these three are the most basic:
- You Are Just A Character In Your Own Story
- In the Particular We Find the Universal
- Find Your Voice
Whether we’re writing about our memories or simply remembering them, the key to those memories is language. Don MacKay, professor of psychology at the University of California, Los Angeles, has spent much of his life researching the links between language and memory:
I’ve always looked at language from a psychological point of view, as very central to human cognition and memory. It’s built into our genes. We think in terms of language. Language dominates everything. Language can influence how we remember things—but that’s a whole field in itself.
In this interview MacKay discusses different kinds of memory and his work with one of medicine’s most famous patients:
In 1957 a man named Henry Molaison, who became one of psychology’s most famous patients, had his hippocampus removed in an attempt to control his life-threatening seizures. After the surgery, he was unable to form new memories without years or even decades of effort. Many at the time believed the damage was limited to his “episodic memory,” or memories of events. But research by Don MacKay, a professor of psychology at the University of California, Los Angeles, demonstrated that his imagination, language production and language memory were also decimated—at a time when most psychologists did not consider language a type of memory.
Writer Sam Kean says that focusing on the deficits of patients, like Henry Molaison, with damaged brains overlooks one crucial fact: “However glaring their deficits are, their brains still work like ours to a large extent.”
But, Kean reports, while working on his forthcoming book The Tale of the Dueling Neurosurgeons: And Other True Stories of Trauma, Madness, Affliction, and Recovery That Reveal the Surprising History of the Human Brain, he discovered an antidote to this tendency: stories.
When we read the full stories of people’s lives, fictional or non-, we have to put ourselves into the minds of the characters, even if those minds are damaged. Only then can we see that they want the same things, and endure the same disappointments, as the rest of us. They feel the same joys, and suffer the same bewilderment that life got away from them. Like an optical illusion, we can flip our focus. Tales about bizarre deficits become tales of resiliency and courage.
Steve Jobs is the American icon of design creativity and innovation. This article looks at how Jobs pushed at legal boundaries:
Mr. Jobs’s conduct is a reminder that the difference between genius and potentially criminal behavior can be a fine line. Mr. Jobs “always believed that the rules that applied to ordinary people didn’t apply to him,” Walter Isaacson, author of the best-selling biography “Steve Jobs,” told me this week. “That was Steve’s genius but also his oddness. He believed he could bend the laws of physics and distort reality. That allowed him to do some amazing things, but also led him to push the envelope.”
Perhaps this is yet another demonstration that creativity often involves challenging the status quo.
In a 2007 review paper, Gutman and Victoria Schindler surveyed the scientific literature that analyzes the neurological basis for how hobbies and activities relate to health and well-being. They found that engaging in such activities as arts and crafts, music, meditation, home repairs and reading stimulates the mind, reduces the effects of stress-related diseases and slows cognitive decline.
Amanda Mascarelli reports on recent research into how activities like knitting affect the brain to offer long-term health benefits.
How we seek and respond to those rewards is part of what determines our overall happiness. Aristotle famously said there were two basic types of joy: hedonia, or that keg-standing, Netflix binge-watching, Nutella-from-the-jar selfish kind of pleasure, and eudaimonia, or the pleasure that comes from helping others, doing meaningful work, and otherwise leading a life well-lived.
Recent psychological research has suggested that this second category is more likely to produce a lasting increase in happiness. Hedonic rewards may generate a short-term burst of glee, but it dissipates more quickly than the surge created by the more selfless eudaimonic rewards.
A recent report describes a study of adolescents, published in the Proceedings of the National Academy of Sciences, which found that those who derived joy from selfless deeds were less likely to be depressed over time.
Creative thinking improves while a person is walking and shortly thereafter, according to a study co-authored by Marily Oppezzo, a Stanford doctoral graduate in educational psychology, and Daniel Schwartz, a professor at Stanford Graduate School of Education.
I do a lot of my walking indoors, on a treadmill, so I was pleased to see that “The study found that walking indoors or outdoors similarly boosted creative inspiration. The act of walking itself, and not the environment, was the main factor.”
Memoir instructor Roberta Temes, Ph.D., explains why it’s all right for memoirists to break three standard writing guidelines. She advises:
- Don’t start at the beginning.
- Forget about grammar.
- Don’t put your best foot forward.
Among the most hotly debated topics in psychology is the question of gender differences. Are there really differences in the ways that men and women behave? Should we be studying those differences or examining ways in which men and women are similar? And how can researchers design studies that isolate gender from all other variables that influence human behavior?
Unfortunately, the actual science behind gender gaps . . . [is] a miasma of conflicting results, non-replicable studies, and varying effect sizes. And when you think about the complexities involved, it’s no wonder there’s a lot of confusion. Researchers studying gender differences must deal with genetics, physiology, behavior, culture, age, environment, race, and innumerable other variables. Behavior is also extremely sensitive to context, which muddies the waters further. Evaluating and interpreting the differences between men and women is, simply put, no easy task.
Stating that “gender research is coming a long way,” Ars Technica here evaluates “some of the more promising research into what—if anything—separates males from females when it comes to behavior.”
As anyone who’s read Gillian Flynn’s hugely popular novel Gone Girl can tell you, what we write creates a portrait—whether accurate or not—of who we are:
The journal PLOS ONE has just published the results of the largest study to date of personality and language. The findings offer an unprecedented look at how differences in age, gender and temperament affect how we communicate on social media.
Using over 15 million messages posted by 75,000 Facebook users, researchers at the University of Pennsylvania were able to find which words, phrases and topics correlated most closely with certain personalities and demographics.
The word clouds and graphs here are fascinating. Here are some of the study’s key findings:
- Use of “I” decreases with age, while use of “we” increases.
- Complaining decreases with age.
- “Introverts Post About The Internet, Extroverts Post About Parties”
- “The researchers’ model was able to predict an individual’s gender from their language patterns with 91.9 percent accuracy.”
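If you’re curious what “correlating words with demographics” can look like mechanically, here is a minimal Python sketch of one simple way such an analysis might be set up: count how often each word appears in messages from two groups, then rank words by how lopsided their usage is. The toy messages, the group labels, and the smoothed log-odds scoring are all invented for illustration; this is not the Penn researchers’ actual pipeline, which used millions of posts and much richer statistical models.

```python
# Toy sketch: rank words by how strongly they lean toward one of two groups.
# All data below is invented for demonstration purposes.
from collections import Counter
import math

# (message, label) pairs; labels 1 and 0 stand for two demographic groups
messages = [
    ("so excited for the party tonight", 1),
    ("great party with friends last night", 1),
    ("reading a fascinating article about the internet", 0),
    ("new software update broke my internet again", 0),
    ("amazing night out with friends", 1),
    ("spent the evening online reading forums", 0),
]

def word_rates(msgs):
    """Fraction of messages in which each word appears at least once."""
    counts = Counter()
    for text, _ in msgs:
        counts.update(set(text.split()))
    return {w: c / len(msgs) for w, c in counts.items()}

group1 = [m for m in messages if m[1] == 1]
group0 = [m for m in messages if m[1] == 0]
rates1, rates0 = word_rates(group1), word_rates(group0)

def log_odds(word):
    """Smoothed log-odds of a word appearing in group 1 vs. group 0."""
    p1 = rates1.get(word, 0) + 0.01
    p0 = rates0.get(word, 0) + 0.01
    return math.log(p1 / p0)

vocab = set(rates1) | set(rates0)
ranked = sorted(vocab, key=log_odds, reverse=True)
print("words leaning toward group 1:", ranked[:3])
print("words leaning toward group 0:", ranked[-3:])
```

Run on real data rather than this toy set, the top- and bottom-ranked words would be the raw material for word clouds like the ones shown in the study.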
Mind wandering can get us in trouble, particularly at school or at work, where we’re told to pay attention and to focus on the problem at hand. But both William James, the father of modern psychology, and Sigmund Freud recognized the creative function of daydreaming.
Here cognitive psychologist Scott Barry Kaufman advocates looking at mind wandering from a personal perspective:
Most recent studies depict mind wandering as a costly cognitive failure with relatively few benefits (Mooneyham and Schooler, 2013). This perspective makes sense when mind wandering is observed by a third party and when costs are measured against externally imposed standards such as speed or accuracy of processing, reading fluency or comprehension, sustained attention, and other external metrics.
There is, however, another way of looking at mind wandering, a personal perspective, if you will. For the individual, mind wandering offers the possibility of very real, personal reward, some immediate, some more distant.
These rewards include self-awareness, creative incubation, improvisation and evaluation, memory consolidation, autobiographical planning, goal-driven thought, future planning, retrieval of deeply personal memories, reflective consideration of the meaning of events and experiences, simulating the perspective of another person, evaluating the implications of self and others’ emotional reactions, moral reasoning, and reflective compassion (Singer and Schonbar, 1961; Singer, 1964b; Singer, 1966, 1974, 1975, 1993, 2009; Wang et al., 2009; Baars, 2010; Baird et al., 2011, 2012; Kaufman and Singer, 2011; Stawarczyk et al., 2011; Immordino-Yang et al., 2012; Kaufman, 2013).
Read his explanation of why people are “willing to invest nearly 50% of their waking hours engaged in” mind wandering “because it produces tangible reward when measured against goals and aspirations that are personally meaningful.”
Between 1914 and 1930 Carl Jung underwent a period of intense inner exploration that he documented in The Red Book. The book contains drawings of dreams and visions, and the text of Jung’s formulation of the concepts of archetypes, the collective unconscious, and the individuation of the psyche. The Red Book was finally made public with its publication in 2009.
In this article, which includes photos of some of Jung’s art work in The Red Book, Laura K Kerr, PhD, discusses this seminal work:
Through his meticulous design of The Red Book, CG Jung interwove his experience of madness with the collective suffering of his era. Such syntheses are rare — and just what the current mental health field desperately needs. In what follows, I look at how The Red Book became Jung’s journey out of madness as well as the foundation for his analytical psychology. Even today, over 50 years after his death, Jung’s analytical psychology is a relevant, non-pathologizing method for transcending madness, while also relating individual suffering to the larger collective.
Anyone who has ever taken an introductory psychology course has seen a graphic representation of Abraham Maslow’s hierarchy of needs. Maslow’s theory places our most basic needs at the bottom and holds that higher needs cannot be met until the ones below have first been taken care of.
This article examines the history of how Maslow’s hierarchy has been applied to management theory in the business world:
In the second half of the 20th Century, bosses began to realise that employees’ hopes, feelings and needs had an impact on performance. In 1960, Douglas McGregor published The Human Side of Enterprise, which contrasted traditional managerial styles with a people-centred approach inspired by Maslow. It became a best-seller.
For readers in search of tales that step outside familiar viewpoints, there is an abundance of fiction by women unraveling the big themes of conflict, religion, race and love — from new and different angles. The five novels I’m recommending offer up-close-and-personal engagement with characters who are often at odds with their communities or whose lives are so far on the periphery that we can be sure history books would pass them by. They make far-flung places and faraway lives feel immediate. This, for me, is the magic of good fiction: that outsiders — a child from the slums, an executed zealot, a reluctant immigrant, a guilty survivor and a suffering mother — can take center stage and make the world a bigger, yet more knowable, place.
Recommendations from Ellah Allfrey for National Public Radio (NPR).
I used to have a friend who loved to say, “You’re SO left-brain. You need me to collaborate on your project because I’m predominantly right-brain.”
The left side of the brain has been traditionally thought of as the area of logic and organization, while the right side was seen as the home of inspiration and creativity. So what my friend was always telling me was that I was too logical and had no creativity. I finally got tired of her attitude, and we are no longer friends.
And now I can feel vindicated:
Popular culture would have you believe that logical, methodical and analytical people are left-brain dominant, while the creative and artistic types are right-brain dominant. Trouble is, science never really supported this notion.
Now, scientists at the University of Utah have debunked the myth with an analysis of more than 1,000 brains. They found no evidence that people preferentially use their left or right brain. All of the study participants — and no doubt the scientists — were using their entire brain equally, throughout the course of the experiment.
Brain scans of people who say they have insomnia have shown differences in brain function compared with people who get a full night’s sleep.
“10 of the best psychology and neuro links from the past week (or so)”
CNN’s Jacque Wilson reports on the work of cognitive psychologist Elizabeth Loftus, an expert on the malleability of human memory. Loftus is famously controversial for her research into so-called “repressed memories,” which she argues are often actually false memories. But Loftus has also demonstrated the unreliability of eyewitness testimony and explored the use of memory manipulation to treat eating disorders and addictions such as alcoholism.
Roy Peter Clark knows that writers don’t merely look at things; they truly see:
I once heard of a clever writing prompt given to school children: “If you had a third eye, what could you see?” Writers, I would argue, already have a third eye. They use it to see life, language and literature in special ways.
This third eye has a number of different names. It’s called vision (and then revision), curiosity, inspiration, imagination, visitation of the muse. When an ordinary person says “I see,” she usually means “I understand.” If she’s a writer, she means that and much more. For the writer, seeing is a synecdochic and synesthetic gerund. It stands for all the senses, all the ways of knowing.
Take a look at his list of 50 “things I think writers see in life, language and literature.”
Fairy tales fascinate novelist Alison Littlewood:
Her second book Path of Needles was published last week and is a compelling read, focusing on a series of murders which, from the gruesome way in which the victims’ bodies are posed, appear to have a connection with fairytales. A young police officer, Cate Corbin, is part of the investigating team and on a hunch she calls in academic Alice Hyland, an expert in fairytales, to assist them on the case.
Fairy tales are enduring stories that deal with some of the more unsavory aspects of human nature. Says Littlewood, “I tend to write about things that personally scare me and I’m also fascinated by the fact that, despite all the technological advances we have made, there are still things we can’t explain.”
More than 40 million people globally take an SSRI antidepressant, among them many writers and musicians. But do these drugs hamper the creative process, extinguishing the spark that produces great art, or do they enhance artistic endeavour?
In The Guardian, novelist Alex Preston takes an in-depth look at the question of whether psychiatric drugs help or hinder artistic creativity.
Joseph Campbell, the great scholar of religion, hit the core of our problem when he wrote, “People say that what we’re all seeking is a meaning for life. I don’t think that’s what we’re really seeking. I think that what we’re seeking is an experience of being alive.”
Adam Frank has some advice for becoming more aware of the experience of living: take a walk in the woods, and look around like a scientist. And don’t retort that you’re not a scientist: “You already are a scientist. You have been since you were a kid playing with water in the tub, or screwing around in the backyard with dirt and sticks and stuff.”
Refining our capacity to notice is an act of reverence that we can bring to everywhere and everywhen. It’s an invitation, bringing the world’s most basic presence into view, opening our horizons and restoring our spirits. And that is what science is really there for.
Most teachers, when asked if they value creativity in their students, say that of course they do. But when asked, in a different context, what the characteristics are of students that they like best and least, characteristics of creative people fill their “least liked” list. This is because creative children are the least docile in the classroom. They tend to work better on their own than with others, they focus on what catches their interest to the exclusion of other things, and they see associations and relationships between objects and ideas that most other people do not see. In other words, creative children can be disruptive in the classroom, and they are often bored with the material being presented.
And that’s too bad, says Scott Barry Kaufman, author of Ungifted: Intelligence Redefined, which will come out this summer:
Since so much is at stake for increasing creativity in society, it’s essential that we continually question and attempt to improve the methods by which we identify, mentor, and cultivate those who are ready and capable of becoming our next generation of innovators. Tragically, we are failing these students, often unknowingly letting them fall between the cracks in an education system that rewards characteristics that dampen creativity, such as conformity, standardization, and efficiency.
Read his research on alternative ways to identify and nurture creativity in children.
Here’s an interesting article on the study of prescriptive stereotypes—those stereotypes that specify how certain groups should be—of older adults. This research comes from two scientists at Princeton University, Susan Fiske and Michael North:
The research by North and Fiske homes in on the idea that understanding intergenerational tension is key to understanding ageism. Ageism is the one kind of discrimination, North noted, in which those who are generally doing the discriminating—younger generations—will eventually become part of the targeted demographic.
A few years back I became aware that I was living in a story that I hadn’t intentionally or consciously written for myself. It was a default story. I was living out the storyline of “abandoned girl” who believed she could not count on others and that she must do it all herself. She must be self-sufficient, strong, and prove herself capable in all matters. The main character in my story was controlling, driven, unintentionally selfish and disempowering to others. And boy, was this story laden with victim-mindset beliefs and self-limitation.
This was not the story I had imagined for my life! I wanted to be graceful and compassionate. I wanted to do work that I loved and that makes a difference in the world. I wanted to be a loving, calm parent and spouse.
De Yarrison explains how she changed her life by changing her life story. And she offers some tips on how other people can do the same.
The first piece of advice given to anyone working on a memoir is to be honest, to tell the truth about one’s own life. But, Sarah Hampson asks, is it possible to be TOO honest? She thinks Jowita Bydlowska’s memoir Drunk Mom suggests the answer is yes:
She doesn’t want it to be about anything more than the words, the sentences, the writing, all of which are skillful, spare, lovely. But it is. Her memoir, Drunk Mom, is a terrifying journey about her relapse into alcohol abuse after her son was born. She had been sober for three and a half years. She started drinking again before he was born, then stopped during the pregnancy. To celebrate his birth, she had a glass of wine, and her addiction came back, full-grown and needy, like a long-lost, jealous child bent on taking her away from the innocent one, asleep in his crib.
This is a memoir that pushes at boundaries – what is private, what should perhaps be kept private, what we need to know, what we don’t, what is insightful or just exhibitionism. It is already one of the most talked about books of the season. Bydlowska is very honest in her writing. Let me be as well then. There’s self-harm in choosing to publish this memoir. It’s just like alcoholism: the recklessness of it; the abandonment of responsibility to her partner, to their relationship, to her child, now almost 4, and also, most painfully, to herself.
Yes, Facebook can be a time suck. But:
Researchers at the University of Arizona have found that the social media site isn’t just for uploading party photos and killing time. Facebook also boosts the cognitive abilities of older people and provides them with a stronger connection to their loved ones.
Using Facebook provided both social engagement and cognitive stimulation for study participants:
The adults who learned to use Facebook performed about 25 percent better on tasks that measured their cognitive abilities. They were also better able to do “updating,” a psychological term meaning that they could quickly add to or recall parts of their working memory as needed. Those who had used the private online diary site Penzu or who had not used Facebook at all saw no such cognitive gains.
The article provides tips on helping older relatives or friends learn how to use social media.
If you haven’t yet discovered Maria Popova’s Brain Pickings, here’s a start. In this entry she discusses The Ravenous Brain: How the New Science of Consciousness Explains Our Insatiable Search for Meaning by Cambridge neuroscientist Daniel Bor. Bor’s book examines “how our species’ penchant for pattern-recognition is essential to consciousness and our entire experience of life.” The basis for the ability to recognize patterns is working memory, where the brain temporarily stores individual items for further processing. And working memory can be improved by “a concept called chunking, which allows us to hack the limits of our working memory — a kind of cognitive compression mechanism wherein we parse information into chunks that are more memorable and easier to process than the seemingly random bits of which they’re composed.”
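As a loose programming analogy (my own, not from Bor’s book), here is a tiny Python sketch of what chunking buys you: a string of twenty digits is far beyond the handful of items working memory can hold, but regrouped into five familiar, year-like chunks it becomes manageable. The digits and groupings are invented for illustration.

```python
# A toy illustration of chunking as cognitive compression (invented example).
raw = "18121941200114921969"          # 20 separate digits: hard to hold in mind

# The same digits parsed into five familiar, year-like chunks.
chunks = ["1812", "1941", "2001", "1492", "1969"]
assert "".join(chunks) == raw          # nothing is lost, only regrouped

print(f"{len(raw)} items unchunked -> {len(chunks)} chunks to remember")
```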
Read on to find out how pattern recognition can be a hindrance as well as an advantage. But be warned: Popova’s site is so rich that, once you start reading, you won’t want to stop.
While it can be difficult for anyone to remember happier times, the task is especially difficult for people with depression. This article reports on a study of whether building a memory palace can help people conjure up happier times:
The method-of-loci technique, which relies on spatial memory, is remarkably simple to explain, but does require some mental effort to set up. What you do is think of a place that you know really well, like a house you lived in as a child or your route to work. Then you place all the things you want to remember around the house as you mentally move around it. Each stop on the journey should have one object relating to a memory. The more bizarre and surreal or vivid you can make these images, the better they will be remembered.
If you carried out this process for a series of good memories, you’d have what is called a ‘memory palace’ of happy times that you could return to in moments of stress.
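For readers who like to see the structure laid bare, here is a toy sketch (my own illustration, not from the study) of what the method of loci builds: a fixed, ordered route through a familiar place, with each stop holding one vivid image tied to a memory. Walking the route in the same order each time is what makes retrieval reliable.

```python
# A memory palace as data: an ordered route of loci, each paired with one
# vivid image standing for a happy memory. All entries are invented examples.
palace = [
    ("front door",  "graduation day, diploma glowing like a lantern"),
    ("hallway",     "first day at the new job, desk buried in balloons"),
    ("kitchen",     "birthday dinner, a cake the size of the table"),
    ("back garden", "beach holiday, sand spilling over the fence"),
]

# "Walking" the route retrieves the memories in the same fixed order each time.
for locus, memory in palace:
    print(f"{locus}: {memory}")
```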
The study found that people who had used the method-of-loci technique to construct their memory palace were later able to recall more happy memories than people who had rehearsed memories without the organizing technique.
Our identities become shaped by our life stories as we gradually incorporate the memories of the events in our lives into our sense of self (Whitbourne, 1985). The most important of these, the “self-defining memories,” are the ones that we remember most vividly and that contribute most heavily to our overall sense of self. A self-defining memory is also easily remembered, and emotionally intense. In some cases, these memories represent ongoing themes that we play out over and over again in our lives.
Learning to recognize your own self-defining memories can help you gain important insights about your identity. The easiest way to find out your own self-defining memories is by thinking about the events in your life that you are most likely to tell people about when they say “Tell me a little about yourself.”
Susan Krauss Whitbourne, Ph.D., discusses how to identify your self-defining memories, those memories of events that have most contributed to shaping your life story and, therefore, your sense of personal identity.
Our self-defining memories may very well change over time, as we have more experiences to incorporate into our life story. For this reason, identifying which memories have become self-defining for us can be a valuable exercise as we grow older.
You’re walking down the street, just like any other day, when suddenly a memory pops into your head from years ago. It’s about a person you haven’t thought of for years.
Just for a moment you’re transported back to a time and place you thought was long-forgotten. In a flash, though, the memory has vanished as quickly as it appeared.
This experience has been dubbed a ‘mind-pop’ and sometimes it is prompted by nothing your conscious mind is aware of.
There is, perhaps, an even weirder type of ‘mind-pop’. This is when all you get is a word or an image which seems to have no connection to anything at all. Like suddenly thinking of the word ‘orange’ or getting the image of a cheese grater. They seem weirder because they feel unconnected to any past experience, place or person—a thought without any autobiographical context.
Have you ever had an experience like this? If you have, don’t worry. These experiences don’t mean you’re weird.
Researchers have discovered that many people have these experiences, which are most likely to occur during routine, habitual activities. And although some such experiences can be traced back to a triggering event, many cannot:
The fact that many mind-pops could not be traced back to their source is probably the result of how much of our processing is carried out unconsciously.
The fascinating thing was that many of these mind-pops occurred weeks or months after exposure to the original trigger. This suggests that these words, images and ideas can lie in wait for a considerable period. Some even think that experiencing mind-pops could be associated with creativity as these apparently random associations can help to solve creative problems.
The aftermath of the 2012 presidential election has generated a moment of myth creation about what happened on Nov. 6 — why President Obama won, why Mitt Romney lost, and what roles real human beings played in the result. These myths are not only being repeated and set in stone by the media and pundits, but also by the campaigns themselves. Democrats and the Obama campaign as well as Republicans and the Romney campaign are repeating the same myths to explain the outcome.
I have often talked about how myths arise in politics in the aftermath of the election, and how these myths move from fiction to nonfiction. And it is because folks buy into the myths that mistakes are made in future campaigns, and wrong lessons are learned along the way. The winning campaign operatives and consultants usually disseminate the myths to justify their work, but in this election both the winners and losers are creating the same mythic narrative.
Political strategist Matthew Dowd discusses how stories become narratives that in turn develop into national myths. He defines three myths about the 2012 U.S. presidential election, then concludes:
All of this raises the question of whether campaigns and tactics matter. They do, but only in a very limited way, and they are insignificant compared with the overall political environment and the grand movements of the world and our country. The most successful people in life and in politics learn to recognize the big waves happening in the world and then surf them as best they can. President Obama, despite many flaws and vulnerabilities, had the traits and attributes that made him more able to surf the movements than Romney or the Republican Party.
Recommended Reading: Memoirs
The Minneapolis Star Tribune suggests five books. The page also includes a link to the newspaper’s complete annual holiday books roundup.
Book Riot has put together Behind-the-Scenes Memoirs: A Reading List from readers’ suggestions: “Here’s a list of books that satisfy your voyeuristic literary urges and provide a peek behind the curtain of an otherwise secretive culture or group.”
What I’m reading this week:
Since I’m fairly new to Twitter (@MDBrownPhD), I look for informative articles and blog posts about how to use Twitter effectively. Here are a couple of things I’ve looked at recently:
Writer and editor Meghan Ward has a good summary of both sides of this issue, and the many comments on her post also contribute to the conversation.
Twitter’s research into how journalists can best grow their followings uses data to confirm what you’ve probably been told at a dozen social media seminars: Be a firehose of information about your beat, use hashtags and @ mentions as much as you can, and share what you’re reading.
The experience of meaningful coincidences is universal. They are reported by people of every culture, every belief system, and every time period. Traditionally these synchronistic events are made acceptable by ascribing them to outside supernatural forces such as divinities or, in modern times, to impersonal archetypal influences.
Dr. Kirby Surprise demonstrates that synchronistic events, based on the activity of the mind, are actually caused by the person who perceives them, and reflect many levels of their consciousness.
His research reveals that what we believe and the way we look for patterns in the world generate synchronistic events that mirror our own assumptions. By decoding the science of synchronicity, Dr. Surprise shows how we actually create events and how we co-create our reality.
The term dementia is used broadly to describe a condition characterized by cognitive decline, but there are many different types of dementia. Although dementia is usually progressive, with a proper diagnosis some forms can be treated, reversed, and even cured completely by addressing the underlying cause. Dementia caused by incurable conditions such as Alzheimer’s disease, however, is irreversible.
In The American Scholar, Priscilla Long explains what happens when we drift off to dreamland.
If you were paying attention to news about successful treatments for opioid addiction, you probably fell out of your chair when you saw a stream of headlines one mid-month morning proclaiming that a way to “block” addiction to heroin had been found. All of the articles were based on one of two press releases, both of which were issued by the institutions where the lead authors of an animal study work (University of Colorado and University of Adelaide). The study was published in the August 15 issue of the Journal of Neuroscience.
In fact, the study was about how a promising new immune system component can contribute to pain relief with lower amounts of opioids. It was not about addiction.
Thanks to SciCurious, a Scientific American blogger, the truth got out pretty fast. But it was too late for the hundreds of articles that had already been published—with no retractions or corrections that we can find.
This is a cautionary tale about journalists who, instead of going to the easily findable study itself, rely on the press release.
We formulate stories about our own behavior and that of others all the time. If we’re not sure about the details, we make them up – or rather, our brain does, without so much as thinking about asking our permission.