Among the most hotly debated topics in psychology is gender differences. Are there really differences in the ways that men and women behave? Should we be studying those differences or examining ways in which men and women are similar? And how can researchers design studies that isolate gender from all other variables that influence human behavior?
Unfortunately, the actual science behind gender gaps . . . [is] a miasma of conflicting results, non-replicable studies, and varying effect sizes. And when you think about the complexities involved, it’s no wonder there’s a lot of confusion. Researchers studying gender differences must deal with genetics, physiology, behavior, culture, age, environment, race, and innumerable other variables. Behavior is also extremely sensitive to context, which muddies the waters further. Evaluating and interpreting the differences between men and women is, simply put, no easy task.
Stating that “gender research is coming a long way,” Ars Technica here evaluates “some of the more promising research into what—if anything—separates males from females when it comes to behavior.”
As anyone who’s read Gillian Flynn’s hugely popular novel Gone Girl can tell you, what we write creates a portrait—whether accurate or not—of who we are:
The journal PLOS ONE has just published the results of the largest study to date of personality and language. The findings offer an unprecedented look at how differences in age, gender and temperament affect how we communicate on social media.
Using over 15 million messages posted by 75,000 Facebook users, researchers at the University of Pennsylvania were able to find which words, phrases and topics correlated most closely with certain personalities and demographics.
The word clouds and graphs here are fascinating. Here are some of the study’s key findings:
- Use of “I” decreases with age, while use of “we” increases.
- Complaining decreases with age.
- “Introverts Post About The Internet, Extroverts Post About Parties”
- “The researchers’ model was able to predict an individual’s gender from their language patterns with 91.9 percent accuracy.”
Mind wandering can get us in trouble, particularly at school or at work, where we’re told to pay attention and to focus on the problem at hand. But both William James, the father of modern psychology, and Sigmund Freud recognized the creative function of daydreaming.
Here cognitive psychologist Scott Barry Kaufman advocates looking at mind wandering from a personal perspective:
Most recent studies depict mind wandering as a costly cognitive failure with relatively few benefits (Mooneyham and Schooler, 2013). This perspective makes sense when mind wandering is observed by a third party and when costs are measured against externally imposed standards such as speed or accuracy of processing, reading fluency or comprehension, sustained attention, and other external metrics.
There is, however, another way of looking at mind wandering, a personal perspective, if you will. For the individual, mind wandering offers the possibility of very real, personal reward, some immediate, some more distant.
These rewards include self-awareness, creative incubation, improvisation and evaluation, memory consolidation, autobiographical planning, goal-driven thought, future planning, retrieval of deeply personal memories, reflective consideration of the meaning of events and experiences, simulating the perspective of another person, evaluating the implications of self and others’ emotional reactions, moral reasoning, and reflective compassion (Singer and Schonbar, 1961; Singer, 1964b; Singer, 1966, 1974, 1975, 1993, 2009; Wang et al., 2009; Baars, 2010; Baird et al., 2011, 2012; Kaufman and Singer, 2011; Stawarczyk et al., 2011; Immordino-Yang et al., 2012; Kaufman, 2013).
Read his explanation of why people are “willing to invest nearly 50% of their waking hours engaged in” mind wandering “because it produces tangible reward when measured against goals and aspirations that are personally meaningful.”
Between 1914 and 1930 Carl Jung underwent a period of intense inner exploration that he documented in The Red Book. The book contains drawings of dreams and visions, and the text of Jung’s formulation of the concepts of archetypes, the collective unconscious, and the individuation of the psyche. The Red Book was finally made public with its publication in 2009.
In this article, which includes photos of some of Jung’s artwork in The Red Book, Laura K. Kerr, PhD, discusses this seminal work:
Through his meticulous design of The Red Book, CG Jung interwove his experience of madness with the collective suffering of his era. Such syntheses are rare — and just what the current mental health field desperately needs. In what follows, I look at how The Red Book became Jung’s journey out of madness as well as the foundation for his analytical psychology. Even today, over 50 years after his death, Jung’s analytical psychology is a relevant, non-pathologizing method for transcending madness, while also relating individual suffering to the larger collective.
Anyone who has ever taken an introductory psychology course has seen a graphic representation of Abraham Maslow’s hierarchy of needs. Maslow’s theory places our most basic needs at the bottom and holds that higher needs cannot be met until the ones below have first been taken care of.
This article examines the history of how Maslow’s hierarchy has been applied to management theory in the business world:
In the second half of the 20th Century, bosses began to realise that employees’ hopes, feelings and needs had an impact on performance. In 1960, Douglas McGregor published The Human Side of Enterprise, which contrasted traditional managerial styles with a people-centred approach inspired by Maslow. It became a best-seller.
For readers in search of tales that step outside familiar viewpoints, there is an abundance of fiction by women unraveling the big themes of conflict, religion, race and love — from new and different angles. The five novels I’m recommending offer up-close-and-personal engagement with characters who are often at odds with their communities or whose lives are so far on the periphery that we can be sure history books would pass them by. They make far-flung places and faraway lives feel immediate. This, for me, is the magic of good fiction: that outsiders — a child from the slums, an executed zealot, a reluctant immigrant, a guilty survivor and a suffering mother — can take center stage and make the world a bigger, yet more knowable, place.
Recommendations from Ellah Allfrey for National Public Radio (NPR).
I used to have a friend who loved to say, “You’re SO left-brain. You need me to collaborate on your project because I’m predominantly right-brain.”
The left side of the brain has been traditionally thought of as the area of logic and organization, while the right side was seen as the home of inspiration and creativity. So what my friend was always telling me was that I was too logical and had no creativity. I finally got tired of her attitude, and we are no longer friends.
And now I can feel vindicated:
Popular culture would have you believe that logical, methodical and analytical people are left-brain dominant, while the creative and artistic types are right-brain dominant. Trouble is, science never really supported this notion.
Now, scientists at the University of Utah have debunked the myth with an analysis of more than 1,000 brains. They found no evidence that people preferentially use their left or right brain. All of the study participants — and no doubt the scientists — were using their entire brain equally, throughout the course of the experiment.
Brain scans of people who say they have insomnia have shown differences in brain function compared with people who get a full night’s sleep.
“10 of the best psychology and neuro links from the past week (or so)”
CNN’s Jacque Wilson reports on the work of cognitive psychologist Elizabeth Loftus, an expert on the malleability of human memory. Loftus is famously controversial for her research into so-called “repressed memories,” which her critics say are actually false memories. But Loftus has also demonstrated the unreliability of eyewitness testimony and explored the use of memory manipulation to treat eating disorders and addictions such as alcoholism.
Roy Peter Clark knows that writers don’t merely look at things; they truly see:
I once heard of a clever writing prompt given to school children: “If you had a third eye, what could you see?” Writers, I would argue, already have a third eye. They use it to see life, language and literature in special ways.
This third eye has a number of different names. It’s called vision (and then revision), curiosity, inspiration, imagination, visitation of the muse. When an ordinary person says “I see,” she usually means “I understand.” If she’s a writer, she means that and much more. For the writer, seeing is a synecdochic and synesthetic gerund. It stands for all the senses, all the ways of knowing.
Take a look at his list of 50 “things I think writers see in life, language and literature.”
Fairy tales fascinate novelist Alison Littlewood:
Her second book Path of Needles was published last week and is a compelling read, focusing on a series of murders which, from the gruesome way in which the victims’ bodies are posed, appear to have a connection with fairytales. A young police officer, Cate Corbin, is part of the investigating team and on a hunch she calls in academic Alice Hyland, an expert in fairytales, to assist them on the case.
Fairy tales are enduring stories that deal with some of the more unsavory aspects of human nature. Says Littlewood, “I tend to write about things that personally scare me and I’m also fascinated by the fact that, despite all the technological advances we have made, there are still things we can’t explain.”
More than 40 million people globally take an SSRI antidepressant, among them many writers and musicians. But do these drugs hamper the creative process, extinguishing the spark that produces great art, or do they enhance artistic endeavour?
In The Guardian, novelist Alex Preston takes an in-depth look at the question of whether psychiatric drugs help or hinder artistic creativity.
Joseph Campbell, the great scholar of religion, hit the core of our problem when he wrote, “People say that what we’re all seeking is a meaning for life. I don’t think that’s what we’re really seeking. I think that what we’re seeking is an experience of being alive.”
Adam Frank has some advice for becoming more aware of the experience of living: take a walk in the woods, and look around like a scientist. And don’t retort that you’re not a scientist: “You already are a scientist. You have been since you were a kid playing with water in the tub, or screwing around in the backyard with dirt and sticks and stuff.”
Refining our capacity to notice is an act of reverence that we can bring to everywhere and everywhen. It’s an invitation, bringing the world’s most basic presence into view, opening our horizons and restoring our spirits. And that is what science is really there for.
Most teachers, when asked if they value creativity in their students, say that of course they do. But when asked, in a different context, what the characteristics are of students that they like best and least, characteristics of creative people fill their “least liked” list. This is because creative children are the least docile in the classroom. They tend to work better on their own than with others, they focus on what catches their interest to the exclusion of other things, and they see associations and relationships between objects and ideas that most other people do not see. In other words, creative children can be disruptive in the classroom, and they are often bored with the material being presented.
And that’s too bad, says Scott Barry Kaufman, author of Ungifted: Intelligence Redefined, which will come out this summer:
Since so much is at stake for increasing creativity in society, it’s essential that we continually question and attempt to improve the methods by which we identify, mentor, and cultivate those who are ready and capable of becoming our next generation of innovators. Tragically, we are failing these students, often unknowingly letting them fall between the cracks in an education system that rewards characteristics that dampen creativity, such as conformity, standardization, and efficiency.
Read his research on alternative ways to identify and nurture creativity in children.
Here’s an interesting article on the study of prescriptive stereotypes—those stereotypes that specify how certain groups should be—of older adults. This research comes from two scientists at Princeton University, Susan Fiske and Michael North:
The research by North and Fiske homes in on the idea that understanding intergenerational tension is key to understanding ageism. Ageism is the one kind of discrimination, North noted, in which those who are generally doing the discriminating—younger generations—will eventually become part of the targeted demographic.
A few years back I became aware that I was living in a story that I hadn’t intentionally or consciously written for myself. It was a default story. I was living out the storyline of “abandoned girl” who believed she could not count on others and that she must do it all herself. She must be self-sufficient, strong, and prove herself capable in all matters. The main character in my story was controlling, driven, unintentionally selfish and disempowering to others. And boy, was this story laden with victim-mindset beliefs and self-limitation.
This was not the story I had imagined for my life! I wanted to be graceful and compassionate. I wanted to do work that I loved and that makes a difference in the world. I wanted to be a loving, calm parent and spouse.
De Yarrison explains how she changed her life by changing her life story. And she offers some tips on how other people can do the same.
The first piece of advice given to anyone working on a memoir is to be honest, to tell the truth about one’s own life. But is it possible to be TOO honest, Sarah Hampson asks. She thinks Jowita Bydlowska’s memoir Drunk Mom suggests a positive answer to that question:
She doesn’t want it to be about anything more than the words, the sentences, the writing, all of which are skillful, spare, lovely. But it is. Her memoir, Drunk Mom, is a terrifying journey about her relapse into alcohol abuse after her son was born. She had been sober for three and a half years. She started drinking again before he was born, then stopped during the pregnancy. To celebrate his birth, she had a glass of wine, and her addiction came back, full-grown and needy, like a long-lost, jealous child bent on taking her away from the innocent one, asleep in his crib.
This is a memoir that pushes at boundaries – what is private, what should perhaps be kept private, what we need to know, what we don’t, what is insightful or just exhibitionism. It is already one of the most talked about books of the season. Bydlowska is very honest in her writing. Let me be as well then. There’s self-harm in choosing to publish this memoir. It’s just like alcoholism: the recklessness of it; the abandonment of responsibility to her partner, to their relationship, to her child, now almost 4, and also, most painfully, to herself.
Yes, Facebook can be a time suck. But:
Researchers at the University of Arizona have found that the social media site isn’t just for uploading party photos and killing time. Facebook also boosts the cognitive abilities of older people and provides them with a stronger connection to their loved ones.
Using Facebook provided both social engagement and cognitive stimulation for study participants:
The adults who learned to use Facebook performed about 25 percent better on tasks that measured their cognitive abilities. They were also better able to do “updating,” a psychological term meaning that they could quickly add to or recall parts of their working memory as needed. Those who had used the private online diary site Penzu or who had not used Facebook at all saw no such cognitive gains.
The article provides tips on helping older relatives or friends learn how to use social media.
If you haven’t yet discovered Maria Popova’s Brain Pickings, here’s a start. In this entry she discusses The Ravenous Brain: How the New Science of Consciousness Explains Our Insatiable Search for Meaning by Cambridge neuroscientist Daniel Bor. Bor’s book examines “how our species’ penchant for pattern-recognition is essential to consciousness and our entire experience of life.” The basis for the ability to recognize patterns is working memory, where the brain temporarily stores individual items for further processing. And working memory can be improved by “a concept called chunking, which allows us to hack the limits of our working memory — a kind of cognitive compression mechanism wherein we parse information into chunks that are more memorable and easier to process than the seemingly random bits of which they’re composed.”
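As a loose, hypothetical illustration (the grouping size and phone-number example are mine, not Bor’s or Popova’s), chunking amounts to repackaging many small items into a few larger, more memorable ones:

```python
# Chunking: trading many hard-to-hold items for a few larger ones.
def chunk(digits, size=3):
    """Split a digit string into fixed-size groups, phone-number style."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

raw = "4155552671"          # ten separate digits: near the working-memory limit
groups = chunk(raw)
print(groups)               # ['415', '555', '267', '1'] -- four items to hold
```

The information is unchanged; only its packaging differs, which is the sense in which chunking works as a “cognitive compression mechanism.”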
Read on to find out how pattern recognition can be a hindrance as well as an advantage. But be warned: Popova’s site is so rich that, once you start reading, you won’t want to stop.
While it can be difficult for anyone to remember happier times, the task is especially difficult for people with depression. This article reports on a study of whether building a memory palace can help people conjure up happier times:
The method-of-loci technique, which relies on spatial memory, is remarkably simple to explain, but does require some mental effort to set up. What you do is think of a place that you know really well, like a house you lived in as a child or your route to work. Then you place all the things you want to remember around the house as you mentally move around it. Each stop on the journey should have one object relating to a memory. The more bizarre and surreal or vivid you can make these images, the better they will be remembered.
If you carried out this process for a series of good memories, you’d have what is called a ‘memory palace’ of happy times that you could return to in moments of stress.
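The steps above can be sketched as a small data structure; the route, stops, and images below are hypothetical examples of my own, not taken from the study:

```python
# A minimal sketch of the method of loci: pick a familiar route, attach
# one vivid image per stop, then "walk" the route in order to recall.
class MemoryPalace:
    def __init__(self, loci):
        self.loci = list(loci)      # stops on a familiar route, in order
        self.images = {}            # locus -> vivid image for one memory

    def place(self, locus, image):
        """Attach one memorable image to a stop on the route."""
        if locus not in self.loci:
            raise ValueError(f"unknown locus: {locus}")
        self.images[locus] = image

    def walk(self):
        """Mentally revisit each stop in order, recalling its image (if any)."""
        return [(locus, self.images.get(locus)) for locus in self.loci]

palace = MemoryPalace(["front door", "hallway", "kitchen"])
palace.place("front door", "a giant orange balloon (birthday party, 2009)")
palace.place("kitchen", "grandmother singing over a steaming pot")
for locus, image in palace.walk():
    print(locus, "->", image)
```

The spatial ordering does the organizational work: each recall cue is anchored to a place rather than floating free, which is what distinguishes this from simply rehearsing a list of memories.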
The study found that people who had used the method-of-loci technique to construct their memory palace were later able to recall more happy memories than people who had rehearsed memories without the organizing technique.
Our identities become shaped by our life stories as we gradually incorporate the memories of the events in our lives into our sense of self (Whitbourne, 1985). The most important of these, the “self-defining memories,” are the ones that we remember most vividly and that contribute most heavily to our overall sense of self. A self-defining memory is also easily remembered, and emotionally intense. In some cases, these memories represent ongoing themes that we play out over and over again in our lives.
Learning to recognize your own self-defining memories can help you gain important insights about your identity. The easiest way to find out your own self-defining memories is by thinking about the events in your life that you are most likely to tell people about when they say “Tell me a little about yourself.”
Susan Krauss Whitbourne, Ph.D., discusses how to identify your self-defining memories, those memories of events that have most contributed to shaping your life story and, therefore, your sense of personal identity.
Our self-defining memories may very well change over time, as we have more experiences to incorporate into our life story. For this reason, identifying which memories have become self-defining for us can be a valuable exercise as we grow older.
You’re walking down the street, just like any other day, when suddenly a memory pops into your head from years ago. It’s about a person you haven’t thought of for years.
Just for a moment you’re transported back to a time and place you thought was long-forgotten. In a flash, though, the memory has vanished as quickly as it appeared.
This experience has been dubbed a ‘mind-pop’ and sometimes it is prompted by nothing your conscious mind is aware of.
There is, perhaps, an even weirder type of ‘mind-pop’. This is when all you get is a word or an image which seems to have no connection to anything at all. Like suddenly thinking of the word ‘orange’ or getting the image of a cheese grater. They seem weirder because they feel unconnected to any past experience, place or person—a thought without any autobiographical context.
Have you ever had an experience like this? If you have, don’t worry. These experiences don’t mean you’re weird.
Researchers have discovered that many people have these experiences, which are most likely to occur during routine, habitual activities. And although some such experiences can be traced back to a triggering event, many cannot:
The fact that many mind-pops could not be traced back to their source is probably the result of how much of our processing is carried out unconsciously.
The fascinating thing was that many of these mind-pops occurred weeks or months after exposure to the original trigger. This suggests that these words, images and ideas can lie in wait for a considerable period. Some even think that experiencing mind-pops could be associated with creativity as these apparently random associations can help to solve creative problems.
The aftermath of the 2012 presidential election has generated a moment of myth creation about what happened on Nov. 6 — why President Obama won, why Mitt Romney lost, and what roles real human beings played in the result. These myths are not only being repeated and set in stone by the media and pundits, but also by the campaigns themselves. Democrats and the Obama campaign as well as Republicans and the Romney campaign are repeating the same myths to explain the outcome.
I have often talked about how myths arise in politics in the aftermath of the election, and how these myths move from fiction to nonfiction. And it is because folks buy into the myths that mistakes are made in future campaigns, and wrong lessons are learned along the way. The winning campaign operatives and consultants usually disseminate the myths to justify their work, but in this election both the winners and losers are creating the same mythic narrative.
Political strategist Matthew Dowd discusses how stories become narratives that in turn develop into national myths. He defines three myths about the 2012 U. S. Presidential election, then concludes:
All of this raises the question of whether campaigns and tactics matter. They do, but only in a very limited way, and they are insignificant compared with the overall political environment and the grand movements of the world and our country. The most successful people in life and in politics learn to recognize the big waves happening in the world and then surf them as best they can. President Obama, despite many flaws and vulnerabilities, had the traits and attributes that made him more able to surf the movements than Romney or the Republican Party.
Recommended Reading: Memoirs
The Minneapolis Star Tribune suggests five books. The page also includes a link to the newspaper’s complete annual holiday books roundup.
Book Riot has put together Behind-the-Scenes Memoirs: A Reading List from readers’ suggestions: “Here’s a list of books that satisfy your voyeuristic literary urges and provide a peek behind the curtain of an otherwise secretive culture or group.”
What I’m reading this week:
Since I’m fairly new to Twitter (@MDBrownPhD), I look for informative articles and blog posts about how to use Twitter effectively. Here are a couple of things I’ve looked at recently:
Writer and editor Meghan Ward has a good summary of both sides of this issue, and many of the comments on her post contribute to the conversation as well.
Twitter’s research into how journalists can best grow their followings uses data to confirm what you’ve probably been told at a dozen social media seminars: Be a firehose of information about your beat, use hashtags and @ mentions as much as you can, and share what you’re reading.
The experience of meaningful coincidences is universal. They are reported by people of every culture, every belief system, and every time period. Traditionally these synchronistic events are made acceptable by ascribing them to outside supernatural forces such as divinities or, in modern times, impersonal archetypal influences.
Dr. Kirby Surprise demonstrates that synchronistic events, based on the activity of the mind, are actually caused by the person who perceives them, and reflect many levels of their consciousness.
His research reveals that what we believe and the way we look for patterns in the world generates synchronistic events that mirror our own assumptions. By decoding the science of synchronicity, Dr. Kirby uncovered how we actually create events and how we co‐create our reality.
The term dementia is used broadly to describe a condition characterized by cognitive decline, but there are many different types of dementia. Although dementia is usually progressive, some forms can be treated, and even cured completely, once the underlying cause is properly diagnosed and addressed. Dementia caused by incurable conditions such as Alzheimer’s disease, however, is irreversible.
In The American Scholar, Priscilla Long explains what happens when we drift off to dreamland.
If you were paying attention to news about successful treatments for opioid addiction you probably fell out of your chair when you saw a stream of headlines one mid-month morning proclaiming that a way to “block” addiction to heroin had been found. All of the articles were based on one of two press releases, both of which were issued by the institutions where the lead authors of an animal study work (University of Colorado and University of Adelaide). The study was published in the August 15 issue of the Journal of Neuroscience.
In fact, the study was about how a promising new immune system component can contribute to pain relief with lower amounts of opioids. It was not about addiction.
Thanks to SciCurious, a Scientific American blogger, the truth got out pretty fast. But it was too late for the hundreds of articles that had already been published—with no retractions or corrections that we can find.
This is a cautionary tale about journalists who rely on the press release instead of going to the easily findable study itself.
We formulate stories about our own behavior and that of others all the time. If we’re not sure about the details, we make them up – or rather, our brain does, without so much as thinking about asking our permission.
Feminist author Naomi Wolf’s new book is Vagina: A New Biography. On Goodreads she lists her five favorite books on gender:
New research suggests that letting your mind wander can pay off in creativity:
Writer Geoffrey Gray explains why authors need to do more than work their fingers on the keyboard:
Ann Marie Rasmussen finds a multitude of female archetypes in Game of Thrones:
Shirley Hershey Showalter offers a well researched and well presented explanation of the popularity of memoir writing:
- “The hyenas that solved the puzzle tested more potential solutions — including biting, flipping or pushing the box — than the ones that failed, the researchers said.”
- “those that quickly approached the foreign object were more likely to get the box open than the hesitant hyenas, suggesting that risk-taking has some benefits, the researchers said.”
We are all the main character in our own life story, but many other characters appear in those stories, too. At what point does a particular episode in our life stop being just about us and become the other characters’ story as well? And when that happens, is the episode ours to tell? How will our revelation of the episode affect the other people involved?
At some point all writers of memoir intended for publication—or even for distribution within a limited group such as a family—have to ask themselves such questions. John Eakin, a professor at Indiana University and one of the foremost authorities on autobiography and memoir, recently addressed this issue in the final installment of a speaker series at the Janet Prindle Institute for Ethics at DePauw University in Indiana on the ethics of life writing:
“Complicating my thinking about this question was my belief that our identities are relational, that is, my sense of my self as an individual is a function in no small part of my understanding of my relationships to other people, particularly the near and dear, siblings and friends,” Eakin said. “So if that’s the case, if our identities are relational and hence our privacies are shared, where does one life end and another begin?”
Despite such complexities, Eakin pointed out the value of life writing as a tool for self-discovery and identity formation:
“I can suggest three reasons why we engage in life writing. We are trained to do it, it answers a metaphysical need to know our place in the larger scheme of things, and self-narration promotes the well-being of the organisms that we are,” Eakin said.
He called life writing “a step towards preparing for the future, as people must ‘accept rather than disavow the lives that they’ve lived.’”
The Wall Street Journal recently published this adapted excerpt from Jonah Lehrer’s book Imagine: How Creativity Works, published by Houghton Mifflin Harcourt in March.
The image of the ‘creative type’ is a myth. Jonah Lehrer on why anyone can innovate—and why a hot shower, a cold beer or a trip to your colleague’s desk might be the key to your next big idea.
Lehrer cites research from the relatively new science of the study of creativity:
But creativity is not magic, and there’s no such thing as a creative type. Creativity is not a trait that we inherit in our genes or a blessing bestowed by the angels. It’s a skill. Anyone can learn to be creative and to get better at it. New research is shedding light on what allows people to develop world-changing products and to solve the toughest problems. A surprisingly concrete set of lessons has emerged about what creativity is and how to spark it in ourselves and our work.
Read about how inventions like the Post-It Note often come about in the unlikeliest of places and when people aren’t concentrating on a problem that needs solving. Particularly interesting is the list “10 Quick Creativity Hacks” at the end of the article. Also interesting are the many comments from readers.
This isn’t an article. It’s a collection of quotations about and references (some with links) to resources about how and why our brain allows us to use memory to relive our experiences. If that topic interests you, this is a good place to start your research.
Are we governed by unconscious processes? Neuroscience believes so – but isn’t the human condition more complicated than that? Two experts offer different views.
What makes us the person we are? Does our sense of “my self” refer to our minds, our bodies, or some combination of the two? How does the mass of gray matter within our skulls determine or discover who we are and why we think, feel, and act the way we do?
David Eagleman, neuroscientist at Baylor College of Medicine in Texas, and Raymond Tallis, former professor of geriatric medicine at Manchester University [U. K.], here debate questions such as these.
It is clear at this point that we are irrevocably tied to the 3 lb of strange computational material found within our skulls. The brain is utterly alien to us, and yet our personalities, hopes, fears and aspirations all depend on the integrity of this biological tissue. How do we know this? Because when the brain changes, we change. Our personality, decision-making, risk-aversion, the capacity to see colours or name animals – all these can change, in very specific ways, when the brain is altered by tumours, strokes, drugs, disease or trauma. As much as we like to think about the body and mind living separate existences, the mental is not separable from the physical.
This clarifies some aspects of our existence while deepening the mystery and the awe of others.
For example, take the vast, unconscious, automated processes that run under the hood of conscious awareness. We have discovered that the large majority of the brain’s activity takes place at this low level: the conscious part – the “me” that flickers to life when you wake up in the morning – is only a tiny bit of the operations. This has given us a better understanding of the complex multiplicity that makes a person. A person is not a single entity of a single mind: a human is built of several parts, all of which compete to steer the ship of state. As a consequence, people are nuanced, complicated, contradictory. We act in ways that are sometimes difficult to detect by simple introspection. To know ourselves increasingly requires careful studies of the neural substrate of which we are composed.
Yes, of course, everything about us, from the simplest sensation to the most elaborately constructed sense of self, requires a brain in some kind of working order. Remove your brain and bang goes your IQ. It does not follow that our brains are pretty well the whole story of us, nor that the best way to understand ourselves is to stare at “the neural substrate of which we are composed”.
This is because we are not stand-alone brains. We are part of a community of minds, a human world, that is remote in many respects from what can be observed in brains. Even if that community ultimately originated from brains, this was the work of trillions of brains over hundreds of thousands of years: individual, present-day brains are merely the entrance ticket to the drama of social life, not the drama itself. Trying to understand the community of minds in which we participate by imaging neural tissue is like trying to hear the whispering of woods by applying a stethoscope to an acorn.
Of course brain activity is automated and, as you say, runs “under the hood of conscious awareness”, but this doesn’t mean that we are automatons or that we are largely unconscious of the reasons we do things.
Read on to discover some of the newest hypotheses and discoveries from the burgeoning field of neuroscience. And take a look at all the comments this debate generated.