A common metaphor for memory is a video recording: We think of our memory as a recording that we simply rewind and play back to look at specific events. But this metaphor is inaccurate because the sensory (such as sight, sound, and touch) components are stored in different areas of the brain than are emotional (such as fear or love) components. Every time we remember a specific event, our brains must reassemble the various parts of it.
Memories of traumatic events can cause post-traumatic stress disorder, or PTSD, a common condition in soldiers returning from war and in people who have experienced other traumatic events like carjacking. PTSD involves flashback memories of traumatic experiences. Symptoms of the condition include insomnia, nightmares, loss of concentration, anxiety, and panic attacks.
If our brains must reassemble the pieces of memory every time we remember an experience, might it not be possible to change how this reassembly occurs? This article reports on research into treating PTSD by altering patients’ memories with either drugs or behavioral therapy:
A paper recently published in the journal Biological Psychiatry … reviews a growing body of scientific literature on memory reconsolidation, a relatively new (and, in humans, still somewhat contentious) concept in which old information is called to mind, modified with the help of drugs or behavioral interventions, and then re-stored with new information incorporated—like a piece of metal that’s been melted down, remolded, and left to harden into a different shape.
But even if altering memories like this is possible, is it ethical?
Slippery philosophical quandaries abound: Does the act of taking a pill to change memory require different ethical considerations than something like psychotherapy or hypnotism? Could it pave the way for more ominous applications? And is the altering of memories a humane way of helping those who suffer, or is it some fundamental violation of what makes humans, well, human?
As one researcher asks:
“If you could edit your own memories, are there any memories you’d want to get rid of? If you have a memory of a painful event, do you lose some part of yourself if you get rid of it? Would that be worth the trade?”
This report in USA Today is a good companion to the article above, although it’s necessary to point out that the research was done on mice: “researchers say they’ve devised a technique to change a lab mouse’s bad memory of a particular place into a good memory, and vice versa.”
This research was possible because of the nature of memory:
Memory “is actually not like a tape recorder or camcorder of the past. It’s a reconstruction of the past every time you recall a memory,” says neuroscientist Steve Ramirez, a co-author of the new study and a graduate student at MIT. The new findings are “a testament to how unreliable certain memories can be, despite our convictions.”
This experiment demonstrates where mice store memories in their brains, but be sure to read what the preceding article says about extrapolating animal results to humans. And don’t forget the ethical implications:
There are potential ethical issues with changing a patient’s memories as a form of medical treatment, says Karen Rommelfanger, director of Emory University’s Neuroethics Program. For example, would a memory change have side effects, perhaps on a patient’s identity and interactions with others? But as long as the ethical issues are discussed early, Rommelfanger thinks it’s worth pursuing these results, especially for their possible application to those suffering from post-traumatic stress disorder.
Here’s the weekly dose of good news:
The world is not always fair. The bad are not always punished and the good do not always prevail.
But there are plenty of reasons, scientifically tested, to have hope and be positive about the future.
A leading neuroscientist who has spent decades studying creativity shares her research on where genius comes from, whether it is dependent on high IQ—and why it is so often accompanied by mental illness.
Nancy C. Andreasen began her career as an English professor interested in the high incidence of mental illness in highly creative people, particularly writers. After publishing a book about poet John Donne, she decided she wanted a career that might save people’s lives. She enrolled in medical school and began working with patients who had schizophrenia and other mood disorders. “I was drawn to psychiatry because at its core is the most interesting and complex organ in the human body: the brain”:
I have spent much of my career focusing on the neuroscience of mental illness, but in recent decades I’ve also focused on what we might call the science of genius, trying to discern what combination of elements tends to produce particularly creative brains. What, in short, is the essence of creativity? Over the course of my life, I’ve kept coming back to two more-specific questions: What differences in nature and nurture can explain why some people suffer from mental illness and some do not? And why are so many of the world’s most creative minds among the most afflicted? My latest study, for which I’ve been scanning the brains of some of today’s most illustrious scientists, mathematicians, artists, and writers, has come closer to answering this second question than any other research to date.
In this article Andreasen presents the history of the study of intelligence and creativity, and discusses the results and significance of her own work.
One of the most infamous behavioral experiments was the 1971 prison experiment by Philip Zimbardo. This is one of the experiments that led to the codification of informed consent that must be obtained from research participants and of the ethical guidelines that govern research design. I did not know until just recently that this experiment has its own web site.
I’ll let Philip Zimbardo himself describe the experiment for you:
Welcome to the Stanford Prison Experiment web site, which features an extensive slide show and information about this classic psychology experiment, including parallels with the abuse of prisoners at Abu Ghraib. What happens when you put good people in an evil place? Does humanity win over evil, or does evil triumph? These are some of the questions we posed in this dramatic simulation of prison life conducted in the summer of 1971 at Stanford University.
How we went about testing these questions and what we found may astound you. Our planned two-week investigation into the psychology of prison life had to be ended prematurely after only six days because of what the situation was doing to the college students who participated. In only a few days, our guards became sadistic and our prisoners became depressed and showed signs of extreme stress. Please join me on a slide tour describing this experiment and uncovering what it tells us about the nature of human nature.
Fake memoirs should not simply be dismissed and pulped by their publishers, according to an academic who argues that they can be great works of fiction.
The merits of the writing itself are often overlooked in the outrage and lawsuits that can follow the exposure of a supposed work of non-fiction or of an author who invents a bogus life story. So says Professor Sue Vice, author of the forthcoming book Textual Deceptions: False Memoirs and Literary Hoaxes in the Contemporary Era.
If you are interested in the writing and reading of memoir, you’ve heard about the fate of fake memoirs. Once a memoir has been found to be in any way embellished and therefore not strictly a factual account of the writer’s experience, the author is vilified and the book is denounced and often removed from bookstore shelves.
Here’s how Professor Sue Vice explains her position:
“Readers might feel angry or betrayed when they discover the truth [that a particular memoir is not completely factual]. But I wonder if very strict boundaries between different literary genres are partly to blame. If memoirs include even a small amount of fictional or reconstructed material, they may be judged as wholly worthless, even though they may have value in literary or psychological terms that exceeds their truth value.”
I even agree with her statement that stories that include some “amount of fictional or reconstructed material” may “have value in literary or psychological terms that exceeds their truth value.”
But let there be no misunderstanding: the difference between writing a memoir and writing a fictional account is more than “changing genres.” Of course all fiction is autobiographical in the most basic sense that it presents the author’s view of the world. But when authors call their book a memoir, they are using a word that readers will understand to mean something like “a truthful account of the writer’s personal experience.” Describing a book as a memoir is a tacit agreement with readers that you haven’t made any of this stuff up. When a so-called memoir is exposed as a not totally accurate narration of the writer’s actual experience, readers—and consumers who have purchased the book under false pretenses—have a right to feel cheated and betrayed.
I’m not saying that a fictionalized account can’t have literary or psychological value. I’m just saying that if your story is fictionalized, you should call it a novel, not a memoir.
In this excerpt from an interview with Salon’s Laura Miller, Grossman discusses magic as a metaphor for depression:
Including magic in a story always invites people to think of it as a metaphor for something. I remember after I read “The Magicians” I thought that Brakebills was like the Ivy League, this sought-after, elusive place that once you get out of it you hit a post-graduation slump into decadence and indecision about what you should do with these powers bestowed on you. What are some of the things other readers think it stands for?
The big ones that I get are definitely writing, which is completely fair. A useful skeleton key whenever you’re reading fantasy is if you’re wondering how the author feels about literature and writing, watch how they describe magic. But the other ones are addiction — Julia’s experience of magic has a lot to do with drug addiction and drug culture. But most of all, people identify magic as being about depression and the struggle to recover from it. If there’s a demographic I reliably do well with, it’s people who have dealt with depression. I hear from them about it a lot. I find that very gratifying because I was struggling with depression when I wrote “The Magicians.” Much less so now. But at the time I was really in the grip and looking for a way to write about it.
But the magic itself doesn’t represent depression surely? Depression seems so disempowering.
When you’re depressed, when you’re in bed and feel like you can’t get out, you can’t imagine doing work or accomplishing anything or anybody loving you. So when you look around you and you see these things happening to other people, they look like magic to you. They look that exotic, that strange, that impossible. And when you begin to crawl out of the pit and reengage with the world, it seems very magical. It felt as though getting out of bed yesterday was impossible, but now you’re doing it. Just by returning to daily life, you’re a magician.
Lauren Bacall, who died recently at age 89, wrote three memoirs: By Myself (1978), Now (1994), and By Myself and Then Some (2005).
After the publication of By Myself, she returned to Los Angeles for a book signing and was stunned when the line of people snaked around the block.
“Writing a book is the most complete experience I’ve ever had,” she told The [Los Angeles] Times. Bacall spent three years writing “By Myself” in longhand on yellow legal pads. “I’m happily stunned with the results and astonished by the reaction.”
One piece of advice writers often hear is not to talk too much about the project they’re working on. It’s not that someone else might steal their ideas (although this could be more crucial for nonfiction than fiction writers), but rather that talking through their ideas may substitute for writing through them, with the result that the piece never gets written.
Here Dani Shapiro, author of five novels and three memoirs, addresses this issue:
And ten years later, would I have been compelled to write a memoir about that time in my life? Or would I have felt that I’d already told the story by posting it as my status update?
Quoting poet Adrienne Rich, Shapiro explains how pressure builds in a writer to the point of explosion, and reaching that explosive point is the trigger for writing:
Literary memoir is born of this explosion. It is born of the powerful need to craft a story out of the chaos of one’s own history. One of literary memoir’s greatest satisfactions—both for writer and reader—is the slow, deliberate making of a story, of making sense, out of randomness and pain.
This is how memoir is written:
In order to write a memoir, I’ve sat still inside the swirling vortex of my own complicated history like a piece of old driftwood, battered by the sea. I’ve waited—sometimes patiently, sometimes in despair—for the story under pressure of concealment to reveal itself to me. I’ve been doing this work long enough to know that our feelings—that vast range of fear, joy, grief, sorrow, rage, you name it—are incoherent in the immediacy of the moment. It is only with distance that we are able to turn our powers of observation on ourselves, thus fashioning stories in which we are characters.
At the very peak of memoir-mania, Kathryn Harrison released The Kiss, a detailed account of her incestuous relationship with her father. As James Wolcott recounts below, her relationship was not one of “childhood exploitation, but a consensual act between two adults.” Harrison was twenty years old when the affair began, and yes, she was fully aware he was her father.
As one would expect, the book triggered an outpouring of responses from the literary community, mostly regarding the question of why Harrison would make public the details of such a taboo relationship. The memoir proved divisive among critics, with writers like Wolcott deriding Harrison’s decision to publish such spilled guts as “opportunism [which] oozes from every pore of The Kiss and its launch.” Meanwhile, The New York Times called it “beautifully written” and “a powerful piece of writing.” Now, seventeen years later, it’s worth reading Wolcott’s scathing critique and asking the same question he does: Is the publication of such intensely personal information necessary for catharsis, or simply irresponsible?
I admit that a visceral reaction has kept me from reading Harrison’s The Kiss, similar to the reaction that has kept me from reading Thomas Harris’s The Silence of the Lambs because of the cannibalism.
In this piece, which originally appeared in The New Republic on March 31, 1997, James Wolcott criticizes Harrison’s book for another reason:
Since the 1970s, the deluge of pop-psych bestsellers, celebrity confessionals and tabloid talk shows has made Freud’s intellectual heavy-lifting seem as antiquated as washing by hand. Even our deepest, darkest secrets seem shallow now—easy pickings. Our once-hidden shames have become publicity hounds. It’s as if our psyches are no longer labyrinths or flooded basements, but well-lit TV studios where we swivel in the guest chair, awaiting our cue.
Thanks to memoir, Wolcott wrote, we are approaching “saturation agony overload.” Further, he speculates on Harrison’s reason for publishing the book:
“Literary fiction” has become a dread phrase in publishing, and the sales of Harrison’s novels have only been so-so; but a juicy memoir is where the money is. It is certainly true that opportunism oozes from every pore of The Kiss and its launch.
But these are only excerpts from Wolcott’s piece. Read the whole thing for an interesting take on the nature of secrets and the reasons for writing about—and publishing—them.
There are many theories of creativity, most of which suppose that creative flowering requires much practice of technique and cultivation of talent. In this article in The Atlantic Cody C. Delistraty writes:
With these widely accepted theories of creativity in mind, it is rather jarring to see two brand-new studies, both of which suggest that creativity is closely linked with inherent neurological and personality traits rather than methodology or practice. The implication is that creativity can be learned, but only to a certain extent. To truly be an artistic great, the makeup of your brain is more important than the number of hours spent in your atelier.
Read to find out how openness to new experiences, willingness to experiment, and neural processing speed may contribute to creativity.
This is not a scientific study but an observation that meshes with many theories of creativity. Kevan Lee declares, “At any given time, I have a side project running.” This side project is something other than the main project he’s working on at the time. “Spending your time in this way can make you happier, healthier, and more productive,” he says.
If you’ve ever experienced a “Eureka!” moment when the answer or solution to some problem you’d been considering before hits you while you’re doing something else, you know what Lee is talking about. He looks at psychological research that suggests three characteristics of side projects:
- They don’t have to provide you with a living. You can still eat if they fail.
- They don’t have a deadline. And as there is no time pressure, you don’t revert to your usual formula. You try new things. You experiment. You take risks.
- This is a Labor of Love. You provide the ‘Labor’. And you provide the ‘Love’. So when you spend time on it, it is because you really want to. That keeps you coming back and pushing it on.
Lee suggests some side projects or creative hobbies you might cultivate and offers suggestions on how to keep a side project or hobby active while pursuing the work that pays your bills.
From a cognitive perspective, aging is typically associated with decline. As we age, it may get harder to remember names and dates, and it may take us longer to come up with the right answer to a question.
But the news isn’t all bad when it comes to cognitive aging, according to a set of three articles in the July 2014 issue of Perspectives on Psychological Science.
Plumbing the depths of the available scientific literature, the authors of the three articles show how several factors — including motivation and crystallized knowledge — can play important roles in supporting and maintaining cognitive function in the decades past middle age.
And here’s particularly good news: One of these studies suggests that, contrary to common belief, older adults are not more susceptible than younger people to consumer fraud because of cognitive decline.
In the 1998 movie The Truman Show, Truman Burbank, played by Jim Carrey, developed the feeling that the world revolved around him.
In this article Susannah Cahalan, author of Brain on Fire: My Month of Madness, the story of her neurological disorder, looks at a new book:
The Truman Show Delusion, first described in 2006, written up in academic journals in 2012, and now the subject of a fascinating new book called “Suspicious Minds” by NYU psychiatrist Joel Gold and his brother Ian Gold, a professor of philosophy and psychology at McGill University, reveals how intimately culture interacts with madness and mental health.
Among the book’s findings is the observation that “The content and type of our delusions are … culturally specific.”
“Reading “Suspicious Minds” offers lessons to anyone interested in the complexity of the mental health field’s future,” Cahalan concludes.
On Sept. 13, 1848, at around 4:30 p.m., the time of day when the mind might start wandering, a railroad foreman named Phineas Gage filled a drill hole with gunpowder and turned his head to check on his men. It was the last normal moment of his life.
Other victims in the annals of medicine are almost always referred to by initials or pseudonyms. Not Gage: His is the most famous name in neuroscience. How ironic, then, that we know so little else about the man—and that much of what we think we know, especially about his life unraveling after his accident, is probably bunk.
Here’s how Kean explains the continuing fascination with the case of Phineas Gage:
Most of us first encountered Gage in a neuroscience or psychology course, and the lesson of his story was both straightforward and stark: The frontal lobes house our highest faculties; they’re the essence of our humanity, the physical incarnation of our highest cognitive powers. So when Gage’s frontal lobes got pulped, he transformed from a clean-cut, virtuous foreman into a dirty, scary, sociopathic drifter. Simple as that. This story has had a huge influence on the scientific and popular understanding of the brain. Most uncomfortably, it implies that whenever people suffer grave damage to the frontal lobes—as soldiers might, or victims of strokes or Alzheimer’s disease—something essentially human can vanish.
Recent historical work, however, suggests that much of the canonical Gage story is hogwash, a mélange of scientific prejudice, artistic license, and outright fabrication. In truth each generation seems to remake Gage in its own image, and we know very few hard facts about his post-accident life and behavior. Some scientists now even argue that, far from turning toward the dark side, Gage recovered after his accident and resumed something like a normal life—a possibility that, if true, could transform our understanding of the brain’s ability to heal itself.
Despite Gage’s fame, Kean notes, very little is actually known about how the injury changed Gage. Malcolm Macmillan, a psychologist and historian now with the University of Melbourne, has spent the last 40 years researching the facts surrounding the accident and the body of legend that has grown up around it.
People butcher history all the time, of course, for various reasons. But something distinct seems to have happened with Gage. Macmillan calls it “scientific license.” “When you look at the stories told about Phineas,” he says, “you get the impression that [scientists] are indulging in something like poetic license—to make the story more vivid, to make it fit in with their preconceptions.” Science historian Douglas Allchin has noted the power of preconceptions as well: “While the stories [in science] are all about history—events that happened,” Allchin writes, “they sometimes drift into stories of what ‘should’ have happened.”
In addition to the story of Phineas Gage, this piece contains several illustrations of the man and the rod that went through his brain, along with pictures of Gage’s skull and modern visualizations of his brain damage.
The brain is Steven Pinker’s playground. A cognitive scientist and experimental psychologist, Pinker is fascinated by language, behavior, and the development of human nature. His work has ranged from a detailed analysis of how the mind works to a best-seller about the decline in violence from biblical times to today.
Harvard University offers an interview with one of its most famous teachers about his life as a researcher and college professor.
For most of us, daydreaming is a virtual world where we can rehearse the future, explore fearful scenarios or imagine new adventures without risk. It can help us devise creative solutions to problems or prompt us, while immersed in one task, with reminders of other important goals.
For others, however, the draw of an alternative reality borders on addiction, choking off other aspects of everyday life, including relationships and work. Starring as idealized versions of themselves—as royalty, raconteurs and saviors in a complex, ever changing cast of characters—addictive daydreamers may feel enhanced confidence and validation. Their fantasies may be followed by feelings of dread and shame, and they may compare the habit to a drug or describe an experience akin to drowning in honey.
We all daydream, but for some people daydreaming becomes obsessive. Read how studying such people is helping scientists discover how daydreaming is “essential to generating our sense of self, suggesting that daydreaming plays a crucial role in who we are and how we integrate the outside world into our inner lives.”
Storytelling has become a major topic lately, with its own Twitter hashtag and books such as Jonathan Gottschall’s The Storytelling Animal examining why storytelling so appeals to our brains.
But Samuel McNerney, writing for Scientific American Blogs, argues that storytelling has its limitations and is one factor contributing to the persistence of some pop psychology beliefs:
This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or any one of the seven story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that’s important.
But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study highlight; people who take in narratives are often blinded to the whole story – rarely do we ask: “What more would I need to know before I can have a more informed and complete opinion?”
He encourages us to live life with a generous dose of critical thinking:
Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.
Memoir and family storytelling is about creating a legacy and a heritage, showing where you came from so you can know better where you are going — and even how you want to change that legacy. Every time I hear stories about someone’s family history, I’m inspired about the work memoirists do to create a historical record of how life is being lived now. In a short time, your today becomes your yesterday. What are you preserving for your family?
Read her advice here for creating a family memoir. She emphasizes writing both light and dark stories, and also addresses the issue of truth and secrets.
If you’re writing a memoir with an eye toward publication, Damian Barr, author of Maggie & Me: Coming Out and Coming of Age in 1980s Scotland, explains how to use tools of fiction writing to produce a book people will want to read.
Barr offers 10 pieces of advice. I think these three are the most basic:
- You Are Just A Character In Your Own Story
- In the Particular We Find the Universal
- Find Your Voice
Whether we’re writing about our memories or simply remembering them, the key to those memories is language. Don MacKay, professor of psychology at the University of California, Los Angeles, has spent much of his life researching the links between language and memory:
I’ve always looked at language from a psychological point of view, as very central to human cognition and memory. It’s built into our genes. We think in terms of language. Language dominates everything. Language can influence how we remember things—but that’s a whole field in itself.
In this interview MacKay discusses different kinds of memory and his work with one of medicine’s most famous patients:
In 1957 a man named Henry Molaison, who became one of psychology’s most famous patients, had his hippocampus removed in an attempt to control his life-threatening seizures. After the surgery, he was unable to form new memories without years or even decades of effort. Many at the time believed the damage was limited to his “episodic memory,” or memories of events. But research by Don MacKay, a professor of psychology at the University of California, Los Angeles, demonstrated that his imagination, language production and language memory were also decimated—at a time when most psychologists did not consider language a type of memory.
Writer Sam Kean says that focusing on the deficits of patients, like Henry Molaison, with damaged brains overlooks one crucial fact: “However glaring their deficits are, their brains still work like ours to a large extent.”
But, Kean reports, while working on his forthcoming book The Tale of the Dueling Neurosurgeons: And Other True Stories of Trauma, Madness, Affliction, and Recovery That Reveal the Surprising History of the Human Brain, he discovered an antidote to this tendency: stories.
When we read the full stories of people’s lives, fictional or non-, we have to put ourselves into the minds of the characters, even if those minds are damaged. Only then can we see that they want the same things, and endure the same disappointments, as the rest of us. They feel the same joys, and suffer the same bewilderment that life got away from them. Like an optical illusion, we can flip our focus. Tales about bizarre deficits become tales of resiliency and courage.
Steve Jobs is the American icon of design creativity and innovation. This article looks at how Jobs pushed at legal boundaries:
Mr. Jobs’s conduct is a reminder that the difference between genius and potentially criminal behavior can be a fine line. Mr. Jobs “always believed that the rules that applied to ordinary people didn’t apply to him,” Walter Isaacson, author of the best-selling biography “Steve Jobs,” told me this week. “That was Steve’s genius but also his oddness. He believed he could bend the laws of physics and distort reality. That allowed him to do some amazing things, but also led him to push the envelope.”
Perhaps this is yet another demonstration that creativity often involves challenging the status quo.
In a 2007 review paper, Sharon Gutman and Victoria Schindler surveyed the scientific literature that analyzes the neurological basis for how hobbies and activities relate to health and well-being. They found that engaging in such activities as arts and crafts, music, meditation, home repairs and reading stimulates the mind, reduces the effects of stress-related diseases and slows cognitive decline.
Amanda Mascarelli reports on recent research into how activities like knitting affect the brain to offer long-term health benefits.
How we seek and respond to those rewards is part of what determines our overall happiness. Aristotle famously said there were two basic types of joy: hedonia, or that keg-standing, Netflix binge-watching, Nutella-from-the-jar selfish kind of pleasure, and eudaimonia, or the pleasure that comes from helping others, doing meaningful work, and otherwise leading a life well-lived.
Recent psychological research has suggested that this second category is more likely to produce a lasting increase in happiness. Hedonic rewards may generate a short-term burst of glee, but it dissipates more quickly than the surge created by the more selfless eudaimonic rewards.
A recent report on a study of adolescents published in the Proceedings of the National Academy of Sciences found that those who derive joy from selfless deeds were less likely to be depressed over time.
Creative thinking improves while a person is walking and shortly thereafter, according to a study co-authored by Marily Oppezzo, a Stanford doctoral graduate in educational psychology, and Daniel Schwartz, a professor at Stanford Graduate School of Education.
I do a lot of my walking indoors, on a treadmill, so I was pleased to see that “The study found that walking indoors or outdoors similarly boosted creative inspiration. The act of walking itself, and not the environment, was the main factor.”
Memoir instructor Roberta Temes, Ph.D., explains why it’s all right for memoirists to break three standard writing guidelines. She advises:
- Don’t start at the beginning.
- Forget about grammar.
- Don’t put your best foot forward.
Among the most hotly debated topics in psychology is the question of gender differences. Are there really differences in the ways that men and women behave? Should we be studying those differences or examining ways in which men and women are similar? And how can researchers design studies that isolate gender from all other variables that influence human behavior?
Unfortunately, the actual science behind gender gaps . . . [is] a miasma of conflicting results, non-replicable studies, and varying effect sizes. And when you think about the complexities involved, it’s no wonder there’s a lot of confusion. Researchers studying gender differences must deal with genetics, physiology, behavior, culture, age, environment, race, and innumerable other variables. Behavior is also extremely sensitive to context, which muddies the waters further. Evaluating and interpreting the differences between men and women is, simply put, no easy task.
Stating that “gender research is coming a long way,” Ars Technica here evaluates “some of the more promising research into what—if anything—separates males from females when it comes to behavior.”
As anyone who’s read Gillian Flynn’s hugely popular novel Gone Girl can tell you, what we write creates a portrait—whether accurate or not—of who we are:
The journal PLOS ONE has just published the results of the largest study to date of personality and language. The findings offer an unprecedented look at how differences in age, gender and temperament affect how we communicate on social media.
Using over 15 million messages posted by 75,000 Facebook users, researchers at the University of Pennsylvania were able to find which words, phrases and topics correlated most closely with certain personalities and demographics.
The word clouds and graphs here are fascinating. Here are some of the study’s key findings:
- Use of “I” decreases with age, while use of “we” increases.
- Complaining decreases with age.
- “Introverts Post About The Internet, Extroverts Post About Parties”
- “The researchers’ model was able to predict an individual’s gender from their language patterns with 91.9 percent accuracy.”
Mind wandering can get us in trouble, particularly at school or at work, where we’re told to pay attention and to focus on the problem at hand. But both William James, the father of modern psychology, and Sigmund Freud recognized the creative function of daydreaming.
Here cognitive psychologist Scott Barry Kaufman advocates looking at mind wandering from a personal perspective:
Most recent studies depict mind wandering as a costly cognitive failure with relatively few benefits (Mooneyham and Schooler, 2013). This perspective makes sense when mind wandering is observed by a third party and when costs are measured against externally imposed standards such as speed or accuracy of processing, reading fluency or comprehension, sustained attention, and other external metrics.
There is, however, another way of looking at mind wandering, a personal perspective, if you will. For the individual, mind wandering offers the possibility of very real, personal reward, some immediate, some more distant.
These rewards include self-awareness, creative incubation, improvisation and evaluation, memory consolidation, autobiographical planning, goal-driven thought, future planning, retrieval of deeply personal memories, reflective consideration of the meaning of events and experiences, simulating the perspective of another person, evaluating the implications of self and others’ emotional reactions, moral reasoning, and reflective compassion (Singer and Schonbar, 1961; Singer, 1964b; Singer, 1966, 1974, 1975, 1993, 2009; Wang et al., 2009; Baars, 2010; Baird et al., 2011, 2012; Kaufman and Singer, 2011; Stawarczyk et al., 2011; Immordino-Yang et al., 2012; Kaufman, 2013).
Read his explanation of why people are “willing to invest nearly 50% of their waking hours engaged in” mind wandering “because it produces tangible reward when measured against goals and aspirations that are personally meaningful.”
Between 1914 and 1930 Carl Jung underwent a period of intense inner exploration that he documented in The Red Book. The book contains drawings of dreams and visions, and the text of Jung’s formulation of the concepts of archetypes, the collective unconscious, and the individuation of the psyche. The Red Book was finally made public with its publication in 2009.
In this article, which includes photos of some of Jung’s artwork in The Red Book, Laura K Kerr, PhD, discusses this seminal work:
Through his meticulous design of The Red Book, CG Jung interwove his experience of madness with the collective suffering of his era. Such syntheses are rare — and just what the current mental health field desperately needs. In what follows, I look at how The Red Book became Jung’s journey out of madness as well as the foundation for his analytical psychology. Even today, over 50 years after his death, Jung’s analytical psychology is a relevant, non-pathologizing method for transcending madness, while also relating individual suffering to the larger collective.
Anyone who has ever taken an introductory psychology course has seen a graphic representation of Abraham Maslow’s hierarchy of needs. Maslow’s theory places our most basic needs at the bottom and holds that higher needs cannot be met until the ones below have first been taken care of.
This article examines the history of how Maslow’s hierarchy has been applied to management theory in the business world:
In the second half of the 20th Century, bosses began to realise that employees’ hopes, feelings and needs had an impact on performance. In 1960, Douglas McGregor published The Human Side of Enterprise, which contrasted traditional managerial styles with a people-centred approach inspired by Maslow. It became a best-seller.
For readers in search of tales that step outside familiar viewpoints, there is an abundance of fiction by women unraveling the big themes of conflict, religion, race and love — from new and different angles. The five novels I’m recommending offer up-close-and-personal engagement with characters who are often at odds with their communities or whose lives are so far on the periphery that we can be sure history books would pass them by. They make far-flung places and faraway lives feel immediate. This, for me, is the magic of good fiction: that outsiders — a child from the slums, an executed zealot, a reluctant immigrant, a guilty survivor and a suffering mother — can take center stage and make the world a bigger, yet more knowable, place.
Recommendations from Ellah Allfrey for National Public Radio (NPR).
I used to have a friend who loved to say, “You’re SO left-brain. You need me to collaborate on your project because I’m predominantly right-brain.”
The left side of the brain has been traditionally thought of as the area of logic and organization, while the right side was seen as the home of inspiration and creativity. So what my friend was always telling me was that I was too logical and had no creativity. I finally got tired of her attitude, and we are no longer friends.
And now I can feel vindicated:
Popular culture would have you believe that logical, methodical and analytical people are left-brain dominant, while the creative and artistic types are right-brain dominant. Trouble is, science never really supported this notion.
Now, scientists at the University of Utah have debunked the myth with an analysis of more than 1,000 brains. They found no evidence that people preferentially use their left or right brain. All of the study participants — and no doubt the scientists — were using their entire brain equally, throughout the course of the experiment.
Brain scans of people who say they have insomnia have shown differences in brain function compared with people who get a full night’s sleep.
“10 of the best psychology and neuro links from the past week (or so)”
CNN’s Jacque Wilson reports on the work of cognitive psychologist Elizabeth Loftus, an expert on the malleability of human memory. Loftus is famously controversial for her research into so-called “repressed memories,” which her critics say are actually false memories. But Loftus has also demonstrated the unreliability of eyewitness testimony and explored the use of memory manipulation to treat eating disorders and addictions such as alcoholism.
Roy Peter Clark knows that writers don’t merely look at things; they truly see:
I once heard of a clever writing prompt given to school children: “If you had a third eye, what could you see?” Writers, I would argue, already have a third eye. They use it to see life, language and literature in special ways.
This third eye has a number of different names. It’s called vision (and then revision), curiosity, inspiration, imagination, visitation of the muse. When an ordinary person says “I see,” she usually means “I understand.” If she’s a writer, she means that and much more. For the writer, seeing is a synecdochic and synesthetic gerund. It stands for all the senses, all the ways of knowing.
Take a look at his list of 50 “things I think writers see in life, language and literature.”
Fairy tales fascinate novelist Alison Littlewood:
Her second book Path of Needles was published last week and is a compelling read, focusing on a series of murders which, from the gruesome way in which the victims’ bodies are posed, appear to have a connection with fairytales. A young police officer, Cate Corbin, is part of the investigating team and on a hunch she calls in academic Alice Hyland, an expert in fairytales, to assist them on the case.
Fairy tales are enduring stories that deal with some of the more unsavory aspects of human nature. Says Littlewood, “I tend to write about things that personally scare me and I’m also fascinated by the fact that, despite all the technological advances we have made, there are still things we can’t explain.”
More than 40 million people globally take an SSRI antidepressant, among them many writers and musicians. But do they hamper the creative process, extinguishing the spark that produces great art, or do they enhance artistic endeavour?
In The Guardian, novelist Alex Preston takes an in-depth look at the question of whether psychiatric drugs help or hinder artistic creativity.
Joseph Campbell, the great scholar of religion, hit the core of our problem when he wrote, “People say that what we’re all seeking is a meaning for life. I don’t think that’s what we’re really seeking. I think that what we’re seeking is an experience of being alive.”
Adam Frank has some advice for becoming more aware of the experience of living: take a walk in the woods, and look around like a scientist. And don’t retort that you’re not a scientist: “You already are a scientist. You have been since you were a kid playing with water in the tub, or screwing around in the backyard with dirt and sticks and stuff.”
Refining our capacity to notice is an act of reverence that we can bring to everywhere and everywhen. It’s an invitation, bringing the world’s most basic presence into view, opening our horizons and restoring our spirits. And that is what science is really there for.
Most teachers, when asked if they value creativity in their students, say that of course they do. But when asked, in a different context, what the characteristics are of students that they like best and least, characteristics of creative people fill their “least liked” list. This is because creative children are the least docile in the classroom. They tend to work better on their own than with others, they focus on what catches their interest to the exclusion of other things, and they see associations and relationships between objects and ideas that most other people do not see. In other words, creative children can be disruptive in the classroom, and they are often bored with the material being presented.
And that’s too bad, says Scott Barry Kaufman, author of Ungifted: Intelligence Redefined, which will come out this summer:
Since so much is at stake for increasing creativity in society, it’s essential that we continually question and attempt to improve the methods by which we identify, mentor, and cultivate those who are ready and capable of becoming our next generation of innovators. Tragically, we are failing these students, often unknowingly letting them fall between the cracks in an education system that rewards characteristics that dampen creativity, such as conformity, standardization, and efficiency.
Read his research on alternative ways to identify and nurture creativity in children.
Here’s an interesting article on the study of prescriptive stereotypes—those stereotypes that specify how certain groups should be—of older adults. This research comes from two scientists at Princeton University, Susan Fiske and Michael North:
The research by North and Fiske homes in on the idea that understanding intergenerational tension is key to understanding ageism. Ageism is the one kind of discrimination, North noted, in which those who are generally doing the discriminating—younger generations—will eventually become part of the targeted demographic.
A few years back I became aware that I was living in a story that I hadn’t intentionally or consciously written for myself. It was a default story. I was living out the storyline of “abandoned girl” who believed she could not count on others and that she must do it all herself. She must be self-sufficient, strong, and prove herself capable in all matters. The main character in my story was controlling, driven, unintentionally selfish and disempowering to others. And boy, was this story laden with victim-mindset beliefs and self-limitation.
This was not the story I had imagined for my life! I wanted to be graceful and compassionate. I wanted to do work that I loved and that made a difference in the world. I wanted to be a loving, calm parent and spouse.
De Yarrison explains how she changed her life by changing her life story. And she offers some tips on how other people can do the same.
The first piece of advice given to anyone working on a memoir is to be honest, to tell the truth about one’s own life. But is it possible to be TOO honest, Sarah Hampson asks. She thinks Jowita Bydlowska’s memoir Drunk Mom suggests a positive answer to that question:
She doesn’t want it to be about anything more than the words, the sentences, the writing, all of which are skillful, spare, lovely. But it is. Her memoir, Drunk Mom, is a terrifying journey about her relapse into alcohol abuse after her son was born. She had been sober for three and a half years. She started drinking again before he was born, then stopped during the pregnancy. To celebrate his birth, she had a glass of wine, and her addiction came back, full-grown and needy, like a long-lost, jealous child bent on taking her away from the innocent one, asleep in his crib.
This is a memoir that pushes at boundaries – what is private, what should perhaps be kept private, what we need to know, what we don’t, what is insightful or just exhibitionism. It is already one of the most talked about books of the season. Bydlowska is very honest in her writing. Let me be as well then. There’s self-harm in choosing to publish this memoir. It’s just like alcoholism: the recklessness of it; the abandonment of responsibility to her partner, to their relationship, to her child, now almost 4, and also, most painfully, to herself.