Why aren't the brains of most advanced life forms in the middle of their bodies?


This question is designed to be the successor to Why don't most animals have "heads" in the middle of their bodies?

The previous question was flawed because it failed to define precisely what constitutes a "head".

To stay on topic, this question focuses on the placement of brains and brain-like structures.

Common sense suggests keeping the brain roughly equidistant from everything it controls, reducing the number of potential failure points between the brain and its peripherals.

The standard argument in favor of putting the brain in the head rather than in the middle is dedicated, specialized visual processing. This argument is easy to counter: there is no reason you couldn't have a brain in the middle of the body and a smaller dedicated optical co-processor in the head (similar to how a certain premium laptop design from about seven years ago had only integrated video built in but came with a dock supplying a high-end external video card). The smaller head-brain would run all of the intensive processing and send a compact representation to the central brain, adding only a modest amount of extra reaction-time lag. Extra credit if the central brain can partially offload high-level or heavily spatial thought processes to the smaller head-brain (think of a CPU and a GPU working together to solve a problem more quickly than the CPU could on its own).
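
To put a rough number on that "modest amount of extra reaction-time lag," here is a back-of-the-envelope sketch in Python. The ~100 m/s conduction speed (a typical figure for fast myelinated fibers) and the nerve lengths are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope sketch of the reaction-time argument.
# Assumes a fast myelinated-fiber conduction speed of roughly 100 m/s
# and purely illustrative nerve lengths.
CONDUCTION_SPEED_M_PER_S = 100.0

def one_way_latency_ms(nerve_length_m: float) -> float:
    """One-way signal delay along a nerve of the given length."""
    return nerve_length_m / CONDUCTION_SPEED_M_PER_S * 1000.0

head_brain = one_way_latency_ms(0.05)   # brain in the head, ~5 cm from the eyes
torso_brain = one_way_latency_ms(0.50)  # brain mid-body, ~50 cm from the eyes

print(f"extra one-way lag: {torso_brain - head_brain:.1f} ms")  # ~4.5 ms
```

On these assumptions the mid-body placement costs only a few milliseconds each way, which is why the questioner calls the lag modest.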


Because the head is where most of the sensory organs are, and once the brain started developing there it was essentially impossible to move. So much depends on its placement that moving it would leave many nerves no longer directed to the right place, and there are myoskeletal constraints around it as well. Most of an animal's body plan forms around the central nervous system, and relocating major basal organ complexes simply doesn't happen in evolution. Moving the brain became effectively impossible once lineages like the vertebrates arose.

Smaller co-processors only work in systems with lots of redundancy, which is not how animals are built. Centralized processing also has its own advantage, because parts of the brain have to communicate with other parts a lot. You would essentially need an almost entire second brain, which defeats any advantage.

Lastly, there really is not much of an advantage in moving the brain into the chest cavity, as for most animals the brain is small enough to be just as safe in the head as anywhere else.

That said, there are animals with a brain in their midsection; they are called cephalopods.


Could Mitochondria Be the Key to a Healthy Brain?

Long before the earliest animals swam through the water-covered surface of Earth’s ancient past, one of the most important encounters in the history of life took place. A primitive bacterium was engulfed by our oldest ancestor, a solo, free-floating cell. The two fused to form a mutually beneficial relationship that has lasted more than a billion years, with the latter providing a safe, comfortable home and the former becoming a powerhouse, fueling the processes necessary to maintain life.

That’s the best hypothesis to date for how the cellular components, or organelles, known as mitochondria came to be. Today, trillions of these bacterial descendants live within our bodies, churning out ATP, the molecular energy source that sustains our cells. Despite being inextricably integrated into the machinery of the human body, mitochondria also carry remnants of their bacterial past, such as their own set of DNA.

The DNA that constitutes the human genome is contained within the nucleus of our cells. But mitochondria possess their own set of circular DNA, which is likely a remnant of their ancient bacterial past.

These features make mitochondria both a critical element of our cells and a potential source of problems. Like the DNA inside the nuclei of our cells that makes up the human genome, mitochondrial DNA can harbor mutations. Age, stress and other factors may disrupt mitochondria’s many functions. On top of that, mitochondrial injury can release molecules that, due to their similarities to those made by bacteria, can be mistaken by our immune system as foreign invaders, triggering a harmful inflammatory response against our own cells.

There is one organ that appears to be particularly vulnerable to mitochondrial damage: our power-hungry brains. “The more energetically demanding a cell is, the more mitochondria they have, and the more critical that mitochondria health is, so there’s more potential for things to go wrong,” says Andrew Moehlman, a postdoctoral researcher who studies neurodegeneration at the US National Institute of Neurological Disorders and Stroke (NINDS). According to some estimates, each neuron can have up to 2 million mitochondria.

A small but growing number of scientists are now turning their attention to the contributions of mitochondria in brain health. Studies in humans and lab animals, though much of the work is still preliminary, suggest these organelles could be key players in virtually every type of brain disorder, including neurodevelopmental conditions such as autism, psychiatric illnesses like depression and schizophrenia, and neurodegenerative diseases such as Parkinson’s. They may even be at the heart of an enduring mystery for researchers who study brain disorders: how genetic predispositions and environmental influences interact to put people at risk for developing these conditions.


Before reaching out for help on the diester molecule, D’Agostino looked Arnold up online and found his posts on bodybuilding message boards—the type of boards on which serious lifters debate the finer points of nutrition, supplements, and, sometimes, anabolic steroids. Arnold, who had become something of a celebrity on the boards, rarely missed a chance to show off his vast knowledge. D’Agostino says that he is “sort of indifferent to steroids.” Arnold, he reasoned, had made “a lot of different things over the years”—some legal and some not—to stretch the limits of human performance. While steroids had rocked the world of professional baseball, they seemed far less scandalous in the bodybuilding world with which D’Agostino was familiar. Besides, he says, “I had exhausted all my sources in academia. What did I have to lose?”

D’Agostino shot Arnold a message on Facebook. Hoping to spark Arnold’s interest in the project, he pointed out that, in addition to possible medical benefits, synthetic ketones might hold promise for performance enhancement. When the message arrived, Arnold was three years out of prison and back at his lab in Seymour, Illinois, a small town on the outskirts of Champaign. He had returned to the work he knew best: making and marketing his own lines of nutrition supplements.

But things weren’t going well. His new company was in disarray—and in the crosshairs of another federal investigation. D’Agostino’s out-of-the-blue note opened the possibility of reinventing Arnold’s business. Perhaps, he thought, it might even redeem his reputation.

A Ph.D. researcher and an ex-con without an advanced degree make for an unlikely pairing. But as they developed a working relationship, D’Agostino and Arnold saw that they had plenty in common. Over the years, Arnold has experimented with some of his own products, and he, too, once had the ripped physique of a competitive bodybuilder. Now 53, he still has the thick neck and boulder-like shoulders of someone who spends a lot of time at the gym.

In another life, Arnold might have gone down the same path as D’Agostino and become a successful academic or found a job with a pharmaceutical company. But differences between the two men were apparent from a young age. D’Agostino grew up in central New Jersey, baling hay and driving tractors on his neighbor’s 200-acre farm. After falling in love with biology in high school, he went to Rutgers, where he became a serious student. Arnold grew up in Guilford, Connecticut, a small coastal town. He was an erratic student, excelling in the classes he found interesting, particularly chemistry, and floundering in the ones he didn’t.

“I was antisocial,” says Arnold, who comes off in conversation as an unusual hybrid: one part musclehead bro, one part chemistry savant. “I got to school, I’m not smiling. Everyone else is smiling. Why aren’t you smiling? Fuck you. You know? Because you don’t understand.”

Arnold and his two older brothers spent a lot of time in their basement, where, after finding an old set of weights at their grandfather’s house, they set up a small gym. By high school, Arnold had already fallen into the role of nutrition and muscle-building guru he would assume later in life. His brother John had begun to compete in bodybuilding competitions, and Arnold’s job was to figure out what John should eat each day as he trained. Arnold would make his own supplements in the family kitchen, forming small rolls from milk, egg-protein powder, peanut butter, and honey, then freezing them for later use. “I won my first contest because of Patrick,” John says.

When Arnold read about performance-enhancing drugs as a kid, the warnings against using them only sparked his curiosity. He first tried steroids while working in construction after dropping out of the University of New Haven. He eventually finished his degree and later took several graduate-level courses in organic chemistry. But his real training in making steroids, he says, began in 1990, when he started working for a company in New Jersey that made chemicals for hair gels and conditioners, among other products. Arnold’s entry-level work—synthesizing simple molecules and then waiting around to check the temperature of the reactions—was mind-numbing for someone of his abilities. But the job came with valuable perks: access to a lab and a well-stocked chemistry library right on Arnold’s floor.

He tried synthesizing testosterone first, using extract from a chopped-up yam, he says, but only succeeded in making a huge mess. That experience led Arnold to an insight that would serve him well throughout his career: If you want to synthesize chemical compounds, don’t start from scratch. Instead, find the closest raw material that’s commercially available. “It’s like if you want to build a car,” Arnold told the self-help guru Tim Ferriss a few years ago. “You don’t make your own rubber or make your own steel.”

Over the years, as Arnold continued to read and absorb more of the scientific literature on steroids, his knowledge of commercially available compounds would prove useful again and again. He says he first realized the potential performance-enhancing benefits of andro, the drug later used by McGwire, by reading old patents from the former East Germany.

The Clear, which Arnold would develop years later, wasn’t so much a stroke of creative genius as a testament to the breadth of his knowledge. Arnold had read about a molecule called norbolethone in some of the early literature on anabolic steroids, and he knew it had the power to add muscle and mass. He didn’t have a way to make norbolethone from scratch, but he recognized that the steroid’s chemical structure was very similar to that of progestin, a molecule used in birth-control pills. He soon discovered that it was easy to order progestin from China, and that by adding hydrogen, he could turn it into norbolethone.

When officials from the U.S. Anti-Doping Agency caught on, Arnold looked up other forms of progestin and discovered one that could be transformed into a steroid in almost the same way. That one would ultimately end up in the hands of Barry Bonds and land Arnold in prison in 2006. Arnold’s troubles didn’t end there, however. A few years after he started his new nutritional-supplements company in Illinois—strictly legal this time, he insists—a relief pitcher for the Phillies tested positive for andro, which Major League Baseball had since banned. The pitcher had been using a then-allowed testosterone booster, 6-OXO, made by Arnold’s new company, which led federal investigators to search his lab, on the suspicion that Arnold had spiked the 6-OXO with andro.

Arnold adamantly denies this accusation, and no charges were ever filed against him. But the investigation made it impossible for him to continue selling 6-OXO, his most lucrative product, and all but wiped out the business he was trying to build. Arnold now describes the affair as a “huge nightmare.” “I’m not saying my life was completely ruined,” he once said of the raid, “but I’m saying that it was significantly downgraded.”

Not long after the 6-OXO ordeal, D’Agostino contacted Arnold and the two got to work making the ketone diester. They already had a recipe, or “synthesis,” to follow: The molecule had first been synthesized in the early 1970s by Henri Brunengraber, a biochemist at Case Western Reserve University, who was then working on compact foods that astronauts might be able to take on multiyear journeys to Mars. But making the molecule turned out to be far more complicated than Arnold had anticipated.

The challenge, as he explains it, was in finding a way to produce the diester efficiently so that he could supply D’Agostino with enough for a series of different studies. It was “kind of a bitch,” Arnold recalls. “I tried it and it didn’t work, and it didn’t work.” Getting a molecule that was almost right was relatively easy; perfecting it was something else. There was no flash of inspiration. Arnold describes it as “step-wise kind of a revelation,” involving “dozens and dozens and dozens of experiments.”

After six months, the first batch arrived at D’Agostino’s lab in a cardboard box. Inside, wrapped in tinfoil, was a cylindrical tube holding 10 milliliters of an amber-colored liquid. “I’m not even sure it had any markings on it,” D’Agostino says. It was only enough for a few small tests, but D’Agostino’s excitement grew with each one. He says he began going to the lab in the middle of the night just to see how his rats were doing. For the most part, they were doing remarkably well. The diester raised their ketone levels rapidly, regardless of what they were eating, and kept the levels unusually high for hours.

Since the first test in 2011, D’Agostino and other researchers have shown that the diester can extend the lives of rats with a particular type of brain cancer, even when the rats are following normal diets. In rodents, at least, the compound is also more effective than other ketone supplements at preventing the oxygen-toxicity seizures that D’Agostino initially set out to study. And it can reduce seizures and other symptoms in mouse models of Angelman syndrome, a devastating genetic disease with few treatment options.

D’Agostino says that he now receives a request for the molecule from other scientists at least once a week. The research is “exploding.” Most studies are still being done with animals, but human trials for several different neurological conditions are now in the early stages. Thus far, the only completed human trial of the diester was not especially promising: It tested the diester’s impact on elite cyclists and found that it slowed them down and upset their stomachs. But D’Agostino maintains that the molecule was never intended to improve athletic performance and that, in this case, the diester was given to the cyclists in a formulation that was bound to cause gastric distress. (In its purest form, he says, the diester is almost intolerable: “The taste itself actually causes most people to throw up.”)

Commercially, the availability of ketones is growing, as well. A ketone monoester created by Richard Veech, a pioneer of ketone research at the National Institutes of Health, and Kieran Clarke, of the University of Oxford, is now being sold by the startup HVMN at $33 per serving. When I spoke with Veech, he sounded all but convinced that the monoester can not only help treat Parkinson’s and other neurological diseases, but can also protect cells from radiation, which damages them in much the same way as oxygen toxicity. Clarke believes that the ketone monoester will eventually be “more or less a general tonic for the general population.”

It’s still far too soon to say with any certainty that ketones in a bottle or pill will emerge as a valuable therapeutic tool or athletic performance enhancer. What works for rats and mice often does not work for humans. “This is a 5-mile race,” Mike McCandless, a supplement maker who has funded some of D’Agostino’s research, says. “We’re literally at the 10-foot mark.”

Eugene Fine, of the Einstein College of Medicine, is studying the ketogenic diet for the treatment of cancer. He is convinced of the diet’s safety, noting that people have been successfully eating low-carbohydrate, Atkins-style diets for decades. He thinks it’s unlikely that ketone supplements will prove harmful, but he cautions that we don’t yet have long-term studies on the effects of consuming ketones and carbohydrates at the same time—as would happen if someone took a ketone supplement on a typical American diet.

Researchers still aren’t even certain about the fundamental reason ketogenic diets might benefit health. Gary Yellen, a Harvard researcher who studies how ketogenic diets prevent epileptic seizures, says that while there are almost certainly multiple mechanisms at play, his research suggests that when it comes to the brains of people with epilepsy, it’s the shift away from burning glucose that makes the difference. “I don’t believe that ketone bodies themselves are the key to the diet,” Yellen says.

D’Agostino, however, still suspects that the ketones may be playing an essential role. His own research suggests that ketone bodies function as signaling molecules inside of cells, altering gene expression in ways that are associated with life extension. In 2015, a paper in Nature Medicine that D’Agostino cowrote with a leading inflammation researcher at Yale and other scientists found that some ketone supplements, including the diester Arnold created, seem to have striking anti-inflammatory effects when tested on mice, which could help prevent disease. D’Agostino says most of his research is now devoted to studying this phenomenon and other ways in which ketones might affect gene expression.

Where Patrick Arnold will ultimately fall in the story of ketones is difficult to predict. He has continued to collaborate with D’Agostino, and has even launched his own foray into ketone supplements, but he’s still somewhat unknown in the ketone world. Brunengraber, the chemist who first synthesized the diester molecule, didn’t know who Arnold was before I asked about him. Clarke, the Oxford researcher who cocreated the ketone monoester, said she had heard of Arnold but had never met him. “He is probably a good chemist,” she said. “I don’t know about his principles.”

Arnold says that most of the people who know him through his ketone work haven’t heard of the BALCO scandal. “Occasionally, someone says, ‘Oh, don’t trust him. He’s a criminal,’” he says. But even more people, he claims, tell him that they “don’t give a damn.”

Arnold, it’s safe to say, is an imperfect vessel for any scientific advance. For many baseball fans, BALCO does remain a painful memory. Arnold’s contributions to ketone research can’t undo his role in that scandal or the damage it did to professional sports. But his checkered past is unlikely to matter to people with epilepsy or cancer if ketone supplements one day help them live longer, healthier lives.

Jason Karlawish, a professor of medicine, medical ethics, and health policy at the University of Pennsylvania, believes that Arnold’s history shouldn’t discourage mainstream scientists like D’Agostino from working with the diester, as long as Arnold’s scientific work itself is valid. “D’Agostino took an ethical risk only if there was a chance Arnold could use their work for dangerous ends,” Karlawish says.

D’Agostino maintains that there was “no risk” of Arnold doing harm. And Arnold says he has no interest in making any dangerous substance. He is eager, though, to make something profitable. Almost as soon as D’Agostino reached out to him in 2009, Arnold began dreaming up new ketone products that he could sell as safe and effective weight-loss or athletic-performance supplements. Today, Arnold sells products like ketone salts via KetoSports, a new company that occupies the same lab and facilities where Arnold made his more infamous creations.

As D’Agostino’s own research has progressed—in 2017, he spent 10 days living on ketone supplements on a NASA mission at the bottom of the Atlantic Ocean—he has kept Arnold up to date on his latest findings. Arnold’s name even appears as one of the coauthors on some of the papers D’Agostino has published in scientific journals, including an article in the International Journal of Cancer. Arnold, in turn, has highlighted developments in cancer and other health research on Facebook and other online platforms. “I don’t think I’ve seen him so focused in years, with this ketogenic thing,” says Arnold’s brother John. “This is his baby now.”

If ketone supplements do turn out to be more than yet another performance booster—and at least one brand marketing ketone-salt formulations first dreamed up by Arnold and D’Agostino already highlights testimonials from children with epilepsy—much of the credit will go to D’Agostino, Veech, Clarke, and the other scientists who have advanced the field of ketone research. But some of the credit will have to go to Patrick Arnold. With more funding from the Office of Naval Research, D’Agostino has turned to a new lab that can make the diester in greater quantities and according to standards necessary for human trials, but he has not forgotten about what Arnold achieved. “Patrick is the reason my whole research program exists right now,” D’Agostino says, adding that he might have given up on science altogether if Arnold hadn’t succeeded in making the ketones he needed.

In 2006, Arnold told Sports Illustrated that he didn’t want BALCO to be his legacy. He couldn’t say, though, what exactly he wanted his legacy to be. Now he has his answer. He would prefer to be remembered as “the ketone guy” who “also did that stuff,” he says. He has been fortunate to work with a scientist as generous as D’Agostino. If he makes it big again, baseball fans might be less forgiving.


Nothing is set in stone

Naturally occurring variations in sex chromosomes are many and varied. These variations can also affect the visible sexual characteristics, the genitals, where there are several gradations between a fully formed penis and the externally visible part of the clitoris.

Individuals who cannot clearly be assigned one of the binary sexes refer to themselves as intersex or inter*. The United Nations estimates that 1.7% of the world population belongs to this group. The number is comparable to that of red-haired people in the world.

Since 2018, such newborns can be registered as "diverse" in Germany. Other countries, such as Australia, Bangladesh and India, also recognize a third sex.

Sex can also change over a lifetime, or more precisely, the gonadal sex identity can. Chinese researchers found this out in a study on mice.

The genes responsible for this change are DMRT1 and FOXL2, which normally balance the development of ovaries and testes in a kind of yin-and-yang relationship. When there was a change in these genes, the gonadal sex phenotype could change even in adult animals.

Hijras are a recognized third gender in India


Use it or lose it

If positive experiences do not happen, the pathways needed for normal human experiences may be lost. This is often referred to as the ‘use it or lose it’ principle.[5] Tragic case studies of ‘feral’ children who have survived with minimal human contact illustrate the severe lack of language and emotional development in the absence of love, language and attention. In the same way, even though babies have a deep genetic predisposition to bond to a loving parent, this can be disrupted if a baby’s parents or caregivers are neglectful and inconsistent.

Indeed, longitudinal studies have reported that a child’s ability to form and maintain healthy relationships throughout life may be significantly impaired by having an insecure attachment to a primary caregiver.[6]

Teicher [7] has reported the following pathology in children who suffered neglect (an extreme form of insecure attachment) in their early years:

  • Reduced growth in the left hemisphere, which may lead to increased risk for depression.
  • Increased sensitivity in the limbic system, which can lead to anxiety disorders.
  • Reduced growth in the hippocampus, which could contribute to learning and memory impairments.

These findings have been backed up by cases of extreme neglect and outcomes of children raised in Romanian orphanages. Rutter et al. [8] studied the development of children adopted from Romanian orphanages who were adopted into loving families at different ages. When each child was 6 years old, the researchers assessed what proportion of these adopted children was functioning ‘normally’. They found that 69% of the children adopted before the age of 6 months, 43% of the children adopted between the ages of 7 months and 2 years, and only 22% of the children adopted between the ages of 2 years and 3½ years were functioning normally.


Is the human brain still evolving?

When we daydream about the future, we tend to focus on the fabulous belongings we're going to have. Jet packs, flying cars, weapons to kill aliens, cell phones that make today's sleek models look clunky -- you name it, we're going to have it. We don't tend to focus, however, on who we'll be in the future. Most of us probably picture ourselves exactly the same, though maybe thinner, as surely we'll all have robot personal trainers by then. While we see the world's technology evolving to meet our needs, we may not think about how we ourselves might be evolving.

The story of evolution up to this point explains how we became the upright-walking, tool-using Homo sapiens of today. The turning point of this story so far concerns cranial expansion. About 2.5 million years ago, hominids started out with a brain weighing approximately 400-450 grams (approximately 1 pound), but around 200,000 to 400,000 years ago, our brains became much bigger than those of other primates [source: Kouprina et al.]. Now, we humans walk around with brains tipping the scales at 1,350 to 1,450 grams (approximately 3 pounds) [source: Kouprina et al.].

As humans, we enjoy a much larger neocortex. This area of the brain is the key ingredient that separates us from other species -- it allows us to do our deep thinking, make decisions and form judgments. And while our brain has served us well so far, it certainly has a few defects we wouldn't mind eliminating, like disease, depression and the tendency to make drunken phone calls at 2 a.m. to an ex-boyfriend. But until recently, scientists thought that we were done evolving, that we had reached a sort of evolutionary apex. Now, though, some researchers think that we're not quite done.

Could our brains be evolving right now? Could we gain the intelligence to make our dreams of the future come true, or will we return to the hominid state of yesteryear? Go to the next page to find out if brain evolution is possible.

Genetic Evidence of Brain Evolution

One way to determine if brain evolution is in our future is to consider how our brain evolved in the past. Since scientists don't know exactly how we ended up with brains bigger than other primates, they're left looking at examples of when the brain doesn't grow to the expected size. One such condition is microcephaly, a disorder in which the brain is much smaller than normal; researchers believe that the size of a microcephalic brain is roughly similar to that of an early hominid [source: Kouprina et al.].

Microcephaly has been tied to at least two genes: ASPM and microcephalin. When mutations in these genes occur, brain size is affected. Since ASPM seems to have evolved faster in apes than in creatures such as mice, it's possible that it may have something to do with how our brains evolved. A 2004 study that compared ASPM in humans to other primates found that the sequence of the gene was roughly similar, which seems to suggest that ASPM alone wasn't responsible for differentiating humans from chimps [source: Kouprina et al.]. But ASPM could have facilitated something else in the human brain that caused our noggins to expand so dramatically.

The following year, a study led by Dr. Bruce Lahn of the University of Chicago continued tracking the presence of ASPM, as well as microcephalin, in human populations. But Lahn had noticed that these genes were changing slightly; these alternative forms of a gene are known as alleles. Lahn's group tracked the alleles in the DNA of several populations, including individuals from Europe, Africa, the Middle East and East Asia, to ensure diversity.

In the case of ASPM, a new allele emerged approximately 5,800 years ago, and is now present in about 50 percent of the populations of the Middle East and Europe [source: Wade]. It's found to a much lesser extent in the peoples of East Asia and Africa. The allele associated with microcephalin is believed to have developed about 37,000 years ago; about 70 percent of the European and East Asian populations exhibited this allele [source: Wade]. Lahn's team deemed the variations common enough to suggest that their presence was evidence of natural selection as opposed to an accidental mutation, suggesting that the brain may still be evolving [source: Associated Press].

Lahn's hypothesis that these genes have evolved as they conferred advantages to the brain comes with the same caveat as the earlier study. Scientists simply aren't sure what role ASPM plays in brain size, and it's a given that not all of the brain-size determining genes have been identified yet. African populations, who didn't appear to be carrying either gene in great frequencies, may have other genes at work on their brains, while it may turn out that ASPM and microcephalin have persisted in the other populations for some reason completely unrelated to the brain.

More work is needed on the role of ASPM, microcephalin and other genes involved in the growth of our brain, but one reason why scientists are so interested in brain size is that it has been linked with intelligence. Bigger brains might portend bigger IQs. So if the ASPM and microcephalin alleles are in fact causing our brains to evolve, what are the possible destinations? Will we be bigger-brained and smart enough to realize some amazing inventions? Or is mankind on a slippery slope down to Stupidtown? On the next page, we'll investigate what the fallout of all this evolution might be.

Possible Outcomes of Brain Evolution

So if it turns out that the alleles in ASPM and microcephalin are causing our brains to evolve, what might the outcome be? We might like to think that there's nothing but bigger and better things ahead of us, but British researchers have claimed that our brain is already operating at maximum capacity. After modeling how our brain works now, they concluded that we've reached our maximum ability to process information, or are probably within 20 percent of that limit [source: Ward]. If our brain did get bigger, other organs would have to grow as well, particularly the heart, which would have to work harder to power a bigger brain.

The researchers also found that we're facing a bit of a vicious circle in terms of increased intelligence. For the brain to take in more information, the connections between brain cells would have to become wider, so as to speed up the rate of the brain's information superhighway. But to support that, we'd need more insulation for those connections, as well as more blood flow to the brain to support the connections. That, in turn, leaves less room for the expanded connections. And if the brain became bigger, the messages would only have farther to go, slowing down our already efficient processing times [source: Ward]. Other research suggests that the metabolic demands necessary for evolution mirror genetic changes that occur in schizophrenia, perhaps indicating that neurological disorders accompany brain evolution [source: BioMed Central].

But no one wants to imagine a future in which we become dumber, right? That means the next step for our brains may not be a natural evolution so much as a genetic engineering to ensure that our brains are the best possible brains they can be. Think of how our society already relies on antidepressants and other drugs to correct brain malfunctions. Eventually, we may be able to engineer defects out of existence.

And if we wanted to improve our intelligence? Some are starting to say that if we want to do that, we may have to form an alliance with computers. Roboticists at Carnegie Mellon University estimated that computers will surpass our processing capacity by 2030 [source: Lavelle]. After we exhaust genetic engineering mechanisms to improve our brains, we may have to supplement our minds with a computer interface. A futurist named Ian Pearson has considered how an evolution with the aid of computer parts might proceed.

First, Pearson suggests, we'd become a species called Homo cyberneticus, a human species that's slightly assisted by some silicon enhancements. As this species proves successful, we'd use the practice more, to the point where our "brain" was entirely computer-based. This species would be known as Homo hybridus, as it would have a body similar to ours. But Pearson foresees one major flaw with Homo hybridus -- eventually, the organic parts of the individual would wear out and die. This will lead to the rise of Homo machinus; this species will be made entirely out of silicon and will essentially have immortality. The brain will be able to back itself up, and parts will be repaired or replaced.

The thought of Homo machinus may make you uncomfortable, particularly if you've seen a little film called "The Terminator." But you can already sense how our reliance on computers is growing; consider, for example, a job applicant who shows up without basic computer skills. That candidate likely doesn't stand a chance against applicants who could whip up PowerPoint presentations or Excel spreadsheets in their dreams. Similarly, humans that try to opt out of machine-based parts may find themselves unable to compete successfully with the new species.

And sure, there will probably be things we'll lose forever in this transition, some attributes that those computer brains can never have, like creativity. But really, one could argue that with the glut of reality shows that are already on the air, creativity may already have died.

So yes, the human brain could evolve and change. The question is, will we still be humans after it happens?


Think you’re not biased? Think again

Data show that most Americans have a pro-white, anti-black bias, even when they don’t think they do.

A little misbehavior at school can land kids in hot water. How much? In many cases, that depends on the color of a student’s skin. Black students more frequently get detention for being disruptive or loud. White students acting the same way are more likely to get off with a warning.

That doesn’t mean that teachers and administrators are racist. At least, most don’t intend to be unfair. Most want what’s best for all students, no matter what their race or ethnicity might be. And they usually believe that they treat all students equally.

But all people harbor beliefs and attitudes about groups of people based on their race or ethnicity, gender, body weight and other traits. Those beliefs and attitudes about social groups are known as biases. Biases are beliefs that are not founded on known facts about someone or about a particular group of individuals. For example, one common bias is that women are weak (despite many being very strong). Another is that blacks are dishonest (when most aren’t). Another is that obese people are lazy (when their weight may be due to any of a range of factors, including disease).

People often are not aware of their biases. That’s called an unconscious or implicit bias. And such implicit biases influence our decisions whether or not we mean for them to do so.

Having implicit biases doesn’t make someone good or not-so-good, says Cheryl Staats. She’s a race and ethnicity researcher at Ohio State University in Columbus. Rather, biases develop partly as our brains try to make sense of the world.

Our brains process 11 million bits of information every second. (A bit is a measure of information. The term is typically used for computers.) But we can only consciously process 16 to 40 bits. For every bit that we’re aware of, then, our brains are dealing with hundreds of thousands more behind the scenes. In other words, the vast majority of the work that our brains do is unconscious. For example, when a person notices a car stopping at a crosswalk, that person probably notices the car but is not consciously aware of the wind blowing, birds singing or other things happening nearby.

To help us quickly crunch through all that information, our brains look for shortcuts. One way to do this is to sort things into categories. A dog might be categorized as an animal. It might also be categorized as cuddly or dangerous, depending on the observers’ experiences or even stories that they have heard.

As a result, people’s minds wind up lumping different concepts together. For example, they might link the concept of “dog” with a sense of “good” or “bad.” That quick-and-dirty brain processing speeds up thinking so we can react more quickly. But it also can allow unfair biases to take root.

“Implicit biases develop over the course of one’s lifetime through exposure to messages,” Staats says. Those messages can be direct, such as when someone makes a sexist or racist comment during a family dinner. Or they can be indirect — stereotypes that we pick up from watching TV, movies or other media. Our own experiences will add to our biases.

The good news is that people can learn to recognize their implicit biases by taking a simple online test. Later, there are steps people can take to overcome their biases.

Can people be ‘colorblind’?

“People say that they don’t ‘see’ color, gender or other social categories,” says Amy Hillard. However, she notes, they’re mistaken. Hillard is a psychologist at Adrian College in Michigan. Studies support the idea that people can’t be truly “blind” to minority groups, she notes. Everyone’s brain automatically makes note of what social groups other people are part of. And it takes only minor cues for our minds to call up, or activate, cultural stereotypes about those groups. Those cues may be a person’s gender or skin color. Even something as simple as a person’s name can trigger stereotypes, Hillard says. This is true even in people who say they believe all people are equal.

Many people are not aware that stereotypes can spring to mind automatically, Hillard explains. When they don’t know, they are more likely to let those stereotypes guide their behaviors. What’s more, when people try to pretend that everyone is the same — to act as though they don’t have biases — it doesn’t work. Those efforts usually backfire. Instead of treating people more equally, people fall back even more strongly onto their implicit biases.

Young people demonstrate as part of the Black Lives Matter movement — a push to recognize and overcome racial bias in the United States. Gerry Lauzon/Flickr (CC-BY 2.0)

Race is one big area in which people may exhibit bias. Some people are explicitly biased against black people. That means they are knowingly racist. Most people are not. But even judges who dedicate their lives to being fair can show implicit bias against blacks. They have tended, for instance, to hand down harsher sentences to black men than to white men committing the same crime, research has shown.

And whites aren’t the only people who have a bias against blacks. Black people do, too — and not just in terms of punishment.

Consider this 2016 study: It found teachers expect white students to do better than black ones. Seth Gershenson is an education policy researcher at American University in Washington, D.C. He was part of a team that studied more than 8,000 students and two teachers of each of those students.

They looked at whether the teacher and student were the same race. And about one in every 16 white students had a non-white teacher. Six in every 16 black students had a teacher who was not black. Gershenson then asked whether the teachers expected their students to go to — and graduate from — college.

White teachers had much lower expectations for black students than black teachers did. White teachers said they thought a black student had a one-in-three chance of graduating from college, on average. Black teachers of those same students gave a much higher estimate: they thought nearly half might graduate. In comparison, nearly six in 10 teachers — both black and white — expected white students to complete a college degree, Gershenson says. In short, both sets of teachers showed some bias.

“We find that white teachers are significantly more biased than black teachers,” he notes. Yet the teachers were not aware they were biased in this way.

Does gender matter?

Implicit bias is a problem for women, as well. Take, for instance, the unfounded claim that women aren’t good at science, technology, engineering or math (STEM). Women can (and frequently do) excel in all of these areas. In fact, women earn 42 percent of science and engineering PhDs. Yet only 28 percent of people who get jobs in STEM fields are women. And women who do work in STEM tend to earn less than do men of equal rank. They also receive fewer honors and are promoted less frequently than the men they work with.

On average, women trained in the sciences have more difficulty than men in finding jobs and getting promotions. USAID Asia/Flickr (CC BY-NC 2.0)

This gender difference in hiring and promotion may be due partly to a bias in how recommendation letters are written. Such letters help employers know how well a person has done in a past job.

In one 2016 study, researchers at Columbia University in New York City probed what was said in those recommendations. The team examined 1,224 letters of recommendation written by professors in 54 different countries. Around the world, both men and women were more likely to describe male students as “excellent” or “brilliant.” In contrast, letters written for female students described them as “highly intelligent” or “very knowledgeable.” Unlike the terms used for men, these phrases do not set women apart from their competition, the researchers say.

Biases against women don’t only happen in the sciences. Research by Cecilia Hyunjung Mo finds that people are biased against women in leadership positions, too. Mo is a political scientist at Vanderbilt University in Nashville, Tenn.

Women make up 51 percent of the U.S. population. Yet they make up only 20 percent of people serving in the U.S. Congress. That’s a big difference. One reason for the gap may be that fewer women than men run for political office. But there’s more to it, Mo finds.

In one 2014 study, she asked 407 men and women to take a computerized test of implicit bias. It’s called the implicit association test, or IAT. This test measures how strongly people link certain concepts, such as “man” or “woman,” with stereotypes, such as “executive” or “assistant.”

During the test, people are asked to quickly sort words or pictures into categories. They sort the items by pressing two computer keys, one with their left hand and one with their right. For Mo’s test, participants had to press the correct key each time they saw a photo of a man or a woman. They had to choose from the same two keys each time they saw words having to do with leaders versus followers. Halfway through the tests, the researchers switched which concepts were paired together on the same key on the keyboard.
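
For readers curious how a reaction-time difference becomes a bias score, here is a minimal sketch in Python. It loosely follows the widely used D-score idea (mean latency difference between the two pairing conditions, divided by a pooled standard deviation); the reaction times below are invented for illustration, and real IAT scoring involves additional filtering and per-block steps.

```python
import statistics

# Simplified, illustrative IAT scoring: compare mean response latencies
# between the two pairing conditions, scaled by the pooled standard deviation.
# The millisecond values are made up for illustration only.
congruent_ms = [612, 580, 655, 540, 701, 623]    # e.g., "man" + leader words share a key
incongruent_ms = [734, 690, 810, 705, 760, 722]  # e.g., "woman" + leader words share a key

pooled_sd = statistics.stdev(congruent_ms + incongruent_ms)
d_score = (statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)) / pooled_sd

# A positive score means faster responses in the "congruent" condition,
# which is read as an implicit association between the paired concepts.
print(f"D-score: {d_score:.2f}")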


Cecilia Hyunjung Mo discusses how voters tend to prefer men unless it’s clear that a woman is more qualified.
Vanderbilt University

People tended to respond faster when photos of men and words having to do with leadership shared the same key, Mo found. When photos of women and leadership-related words were paired together, it took most people longer to respond. “People typically found it easier to pair words like ‘president,’ ‘governor’ and ‘executive’ with males, and words like ‘secretary,’ ‘assistant’ and ‘aide’ with females,” Mo says. “Many people had a lot more difficulty associating women with leadership.” It wasn’t only men who had trouble making that association. Women struggled, too.

Mo also wanted to know how those implicit biases might be related to how people behave. So she asked the study participants to vote for fictional candidates for a political office.

She gave each participant information about the candidates. In some, the male candidate and the female candidate were equally qualified for the position. In others, one candidate was more qualified than the other. Mo’s results showed that people’s implicit biases were linked to their voting behavior. People who showed stronger bias against women in the IAT were more likely to vote for the male candidate — even when the woman was better qualified.


A century ago, U.S. Congresswoman Jeannette Rankin of Montana (left) was the first woman elected to national office. In 2013, when the photo at right was taken, only 20 of 100 U.S. Senators were women. Although women are gaining ground in leadership positions, that progress has been slow. U.S. Library of Congress Wikimedia/Office of U.S. Sen. Barbara Mikulski

Size matters

One of the strongest social biases is against the obese. Chances are, you harbor a dislike for people who are severely overweight, says Maddalena Marini. She is a psychologist at Harvard University in Cambridge, Mass. Implicit weight bias seems universal, she says. “Everyone possesses it. Even people who are overweight or obese.”

To reach that conclusion, she and her team used data from Harvard’s Project Implicit website. This site allows people to take an IAT. There are currently 13 types of these tests of implicit bias on the site. Each probes for a different type of bias. More than 338,000 people from around the world completed the weight-bias test between May 2006 and October 2010, the time leading up to Marini’s study. This IAT was similar to the one for race. But it asked participants to categorize words and images that are associated with good and bad, and with thin and fat.

After taking the IAT, participants answered questions about their body mass index. This is a measure used to characterize whether someone is at a healthy weight.
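
As a side note, the BMI arithmetic itself is simple: weight in kilograms divided by the square of height in meters (see the Power Words list below). A minimal sketch, assuming the common WHO-style category cut-offs; the function names and example values are illustrative:

```python
# Minimal sketch of the BMI calculation (kg / m^2), with commonly used
# WHO-style cut-offs for the weight categories. Illustrative only.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "healthy weight"
    if value < 30:
        return "overweight"
    return "obese"

example = bmi(70, 1.75)  # 70 kg at 1.75 m
print(f"BMI {example:.1f}: {bmi_category(example)}")  # BMI 22.9: healthy weight
```

As the glossary notes, BMI does not distinguish muscle from fat, which is why it is only a rough screening measure.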


On this IAT test, when “good” shared a key with a thin person and “bad” with an obese person (the “congruent” condition, shown left), most people responded faster than they did when the pairings were switched (the “incongruent” condition, right). Taking longer to link “good” with obesity is a sign of implicit weight bias. Maddalena Marini

Marini found that heavier people have less bias against people who are overweight or obese. “But they still prefer thin people, on average,” she notes. They just don’t feel this way as strongly as thin people do. “Overweight and obese people tend to identify with and prefer their weight group,” Marini says. But they may be influenced by negativity at the national level that leads them to prefer thin people.

People from 71 nations took part in the study. That allowed Marini to examine whether an implicit bias against heavy people was linked in any way to whether weight problems were more common in their nation. To do this, she combed public databases for weight measurements from each country. And nations with high levels of obesity had the strongest bias against the obese, she found.

She’s not sure why obese nations have such a strong implicit bias against overweight people. It could be because those nations have more discussions about the health problems associated with obesity, Marini says. It may also come from people seeing more ads for “diet plans, healthy foods and gym memberships aimed at decreasing obesity,” she notes. Or perhaps people in these countries simply see that people with high social status, good health and beauty tend to be thin.

Weight bias seems to be more commonly accepted than race and gender bias. In other words, people tend to feel freer to verbally express their weight bias. That’s according to a 2013 study led by Sean Phelan. He is a policy researcher at the Mayo Clinic in Rochester, Minn. Medical students often express weight bias openly, he finds. And that can translate into poorer health care for people who are severely overweight. “Health-care providers display less respect for obese patients,” he reports. He also notes that research shows that “physicians spend less time educating obese patients about their health” than they do with patients who aren’t obese.

Embracing diversity breaks down bias

Antonya Gonzalez is a psychologist in Canada at the University of British Columbia in Vancouver. “We may think we treat everyone equally,” she says, but “unconscious biases can shape our behavior in ways we aren’t always aware of.” Knowing that you might be biased “is the first step to understanding how you treat other people — and trying to change your own behavior,” she says.

Gonzalez knows about changing behavior. In a 2016 study with 5- to 12-year-old children, she found that their implicit bias against black people could change. The children were told positive stories about people, such as a firefighter who works hard to protect his community. Some children saw a photo of a white man or woman while they heard the story. Others saw a photo of a black person. After the story, each child took a race IAT. Children who learned about a black person were less biased when they took the test, compared with children who had heard about a white person.

“Learning about people from different social groups who engage in positive behaviors can help you to unconsciously associate that group with positivity,” Gonzalez says. “That’s part of the reason why diversity in the media is so essential,” she notes. It helps us to “learn about people who defy traditional stereotypes.”

Hillard at Adrian College also found that diversity training can help adults counteract a bias against women. “The first step is awareness,” she says. Once we are aware of our biases, we can take steps to block them.

It also helps to step back and think about whether stereotypes could possibly provide good information to act on, she notes. Could a stereotype that is supposed to be true of a large portion of the population, such as “all women” or “all people of color,” truly be accurate?

The key is to embrace diversity, says Staats — not to pretend it doesn’t exist. One of the best ways to do this is to spend time with people who are different than you. That will help you see them as individuals, rather than part of a stereotypical group.

“The good news is that our brains are malleable,” she says. “We are able to change our associations.”

Power Words

average (in science) A term for the arithmetic mean, which is the sum of a group of numbers that is then divided by the size of the group.

behavior The way a person or other organism acts towards others, or conducts itself.

bias The tendency to hold a particular perspective or preference that favors some thing, some group or some choice. Scientists often “blind” subjects to the details of a test (don’t tell them what it is) so that their biases will not affect the results.

body mass index (BMI) A person’s weight in kilograms divided by the square of his or her height in meters. BMI can be used to evaluate if someone is overweight or obese. However, because BMI does not account for how much muscle or fat a person has, it is not an accurate measure.

Congress The part of the U.S. federal government charged with writing laws, setting the U.S. budget, and confirming many presidential appointments to the courts, to represent the U.S. government interests overseas and to run administrative agencies. The U.S. Congress is made of two parts: the Senate, consisting of two members from each state, and the House of Representatives, which consists of a total of 435 members, with at least one from each state (and dozens more for the states with the biggest populations).

diet The foods and liquids ingested by an animal to provide the nutrition it needs to grow and maintain health. (verb) To adopt a specific food-intake plan for the purpose of controlling body weight.

diversity (in biology) A range of different life forms.

engineering The field of research that uses math and science to solve practical problems.

ethnicity (adj. ethnic) The background of an individual based on cultural practices that tend to be associated with religion, country (or region) of origin, politics or some mix of these.

gender The attitudes, feelings, and behaviors that a given culture associates with a person’s biological sex.

high school A designation for grades 9 through 12 in the U.S. system of compulsory public education. High-school graduates may apply to colleges for further, advanced education.

implicit bias To unknowingly hold a particular perspective or preference that favors some thing, some group or some choice — or, conversely, holds some unrecognized prejudice against it.

malleable Something whose shape can be altered, usually by hammering or otherwise deforming with pressure. (in social science) Attitudes or behaviors that can be changed with social pressure or logic.

media (in the social sciences) A term for the ways information is delivered and shared within a society. It encompasses not only the traditional media — newspapers, magazines, radio and television — but also Internet- and smartphone-based outlets, such as blogs, Twitter, Facebook and more.

obesity (adj. obese) Extreme overweight. Obesity is associated with a wide range of health problems, including type 2 diabetes and high blood pressure.

online A term that refers to things that can be found or done on the internet.

overweight A medical condition where the body has accumulated too much body fat. People who weigh more than is normal for their age and height are not considered overweight if the extra weight comes from bone or muscle.

political scientist Someone who studies or deals with the governing of people, largely by elected officials and governments.

population (in biology) A group of individuals from the same species that lives in the same area.

psychologist A scientist or mental-health professional who studies the human mind, especially in relation to actions and behavior.

social (adj.) Relating to gatherings of people; a term for animals (or people) that prefer to exist in groups. (noun) A gathering of people, for instance those who belong to a club or other organization, for the purpose of enjoying each other’s company.

STEM An acronym (abbreviation made using the first letters of a term) for science, technology, engineering and math.

stereotype A widely held view or explanation for something, which often may be wrong because it has been overly simplified.

trait A characteristic feature of something. (in genetics) A quality or characteristic that can be inherited.




Conclusion

That the brain changes with increasing chronological age is clear; less clear are the rate of change, the biological age of the brain, and the processes involved. The brain changes that may affect cognition and behaviour occur at the levels of molecular ageing, intercellular and intracellular ageing, tissue ageing, and organ change. Many areas of research are under investigation to elucidate the mechanisms of ageing and to try to alleviate age-associated disorders, particularly the dementias, which have the biggest impact on the population. In terms of personal brain ageing, the studies suggest that a healthy lifestyle that reduces cardiovascular risk will also benefit the brain. Medical care in this area may even offer limited protection against cognitive decline, but this remains to be shown for antihypertensive, antiplatelet, and cholesterol-lowering agents.

It is also important to take note of the limitations in studies of the ageing brain. Many studies are cross-sectional in nature, have small numbers of participants with wide ranges in chronological age, lack controls for risk or protective factors, take no account of education (which may improve performance on cognitive tests), and lack assessment of depression (which may also affect performance). It must also be remembered that the brains of an elderly group may show cohort effects related to wider environmental influences, for example, lack of high-energy foods while growing up.2 Finally, it is extremely difficult to separate out and measure single cognitive processes to fully understand any changes.106

Future studies need to take full account of these factors, and “cross-sequential” designs, a combination of cross-sectional and longitudinal studies, have been proposed.3 It is clear that our understanding of the ageing brain continues to grow but still requires much research, which is especially important given the numbers of elderly people in society and their potential levels of cognitive impairment. Where appropriate, randomised controlled trials of therapeutic measures may, in future, lead the way to greater understanding.
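To make the cross-sectional limitation concrete, here is a minimal Python sketch; all numbers are invented for illustration. Each earlier-born cohort starts from a slightly lower baseline, so a naive regression of test score on age in a single-year snapshot attributes the cohort effect to ageing and overestimates the true decline:

import numpy as np

rng = np.random.default_rng(0)

beta_age = -0.05   # true within-person ageing effect (points per year)
cohort = 0.10      # later-born cohorts score higher (points per birth year)
T = 2000           # year the hypothetical cross-sectional study is run

age = rng.uniform(20, 90, 2000)
birth_year = T - age
score = 80 + beta_age * age + cohort * (birth_year - 1900) + rng.normal(0, 2, age.size)

# Naive cross-sectional estimate of the "ageing" slope:
slope = np.polyfit(age, score, 1)[0]
print(f"estimated slope: {slope:.3f} vs. true ageing slope: {beta_age}")
# Prints roughly -0.15: ageing and cohort effects are confounded.

A cross-sequential design breaks this confound by following several birth cohorts longitudinally, so within-person change can be estimated separately from between-cohort differences.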


The Marks of Maturity

You may have noticed a paradox among students today. Although there are exceptions, this generation is advanced intellectually, but behind emotionally. They are missing many of the marks of maturity they should possess.

From an intellectual perspective, students today have been exposed to so much more than I was growing up—and far sooner. They’ve consumed information on everything from cyberspace to sexual techniques before they graduate from middle school. Everything is coming at them sooner.

Sociologist Tony Campolo said, “I am convinced we don’t live in a generation of bad kids. We live in a generation of kids who know too much too soon.”

On the other hand, students have been stunted in their emotional maturity. They seem to require more time to actually “grow up” and prepare for the responsibility that comes with adulthood. This is a result of many factors, including well-intentioned parents who hover over kids, not allowing them to experience the pain of maturation. It’s like the child who tries to help a new butterfly break out of the cocoon, and realizes later that they have done it a disservice: That butterfly is not strong enough to fly once it is free.

There is another reason why teens struggle with maturation. Scientists are gaining new insights into remarkable changes in the brain that may explain why the teen years are so hard on young people and their parents. From ages 11-14, kids lose some of the connections between cells in the part of their brain that enables them to think clearly and make good decisions.

Pruning the Brain

What happens is that the brain prunes itself, going through changes that will allow a young person to move into adult life effectively. “Ineffective or weak brain connections are pruned in much the same way a gardener would prune a tree or bush, giving the plant a desired shape,” says Alison Gopnik, professor of child development at UC Berkeley.

Adolescents experiencing these brain changes can react emotionally, according to Ian Campbell, a neurologist at the UC Davis Sleep Research Laboratory. Mood swings and uncooperative and irresponsible attitudes can all be the result of these changes. Sometimes, students can’t explain why they feel the way they do. Their brain is changing from a child brain to an adult brain.

Regions that specialize in language, for example, grow rapidly until about age 13 and then stop. The frontal lobes of the brain, which are responsible for high-level reasoning and decision-making, aren’t fully mature until the early 20s, according to Deborah Yurgelun-Todd, a neuroscientist at Harvard’s Brain Imaging Center. There’s a portion of time when the child part of the brain has been pruned, but the adult portion is not fully formed. They are “in-between”: informed but not prepared.
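As a loose illustration of the pruning idea (a toy sketch in Python, not a model of real neurobiology; every number below is invented), connections that get used are reinforced, unused ones weaken, and the weakest are then removed:

import numpy as np

rng = np.random.default_rng(1)

# Toy "network": 200 connections with random starting strengths.
strengths = rng.random(200)

# Experience reinforces some connections and weakens the rest.
used = rng.random(200) < 0.4
strengths[used] += 0.3
strengths[~used] -= 0.2

# Pruning: drop the weakest connections, like a gardener shaping a bush.
threshold = np.percentile(strengths, 30)   # prune the weakest 30%
kept = strengths[strengths > threshold]

print(f"kept {kept.size} of {strengths.size} connections")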

The bottom line?

Students today are consuming information they aren’t completely ready to handle. The adult part of their brain is still forming and isn’t ready to apply all that society throws at it. Their mind takes it in and files it, but their will and emotions are not prepared to act on it in a healthy way. They can become paralyzed by all the content they consume.

They want so much to be able to experience the world they’ve seen on websites or heard on podcasts, but don’t realize that they are unprepared for that experience emotionally. They are truly in between a child and an adult. (This is the genius behind movie ratings and viewer discretion advisories on TV.) I believe a healthy, mature student is one who has developed intellectually, volitionally, emotionally, and spiritually. I also believe there are marks we can look for as we coach them into maturity.

Signs to Look For

What are the marks of maturity? We all love it when we see a young person who carries themselves well and shows signs of being mature. They interact with adults in an adult manner. Those students are downright refreshing.

At Growing Leaders we seek to build these marks in young people, ages 16-24, as we partner with schools. The following isn’t exhaustive, but it is a list of characteristics I notice in young people who are unusually mature — intellectually, emotionally, and spiritually. If you are a parent, this is a good list of qualities to begin developing in your child. If you are a coach, teacher, or dean, these are the signs we wish every student possessed when they graduate. For that matter, these are signs I wish every adult modeled for the generation coming behind them.

1. A mature person is able to keep long-term commitments. One key signal of maturity is the ability to delay gratification. Part of this means a student is able to keep commitments even when they are no longer new or novel. They can commit to continue doing what is right even when they don’t feel like it.

2. A mature person is unshaken by flattery or criticism. As people mature, they sooner or later understand that nothing is as good as it seems, and nothing is as bad as it seems. Mature people can receive compliments or criticism without letting it ruin them or sway them into a distorted view of themselves. They are secure in their identity.

3. A mature person possesses a spirit of humility. Humility parallels maturity. Humility isn’t thinking less of yourself; it is thinking of yourself less. Mature people aren’t consumed with drawing attention to themselves. They see how others have contributed to their success and can honor them. This is the opposite of arrogance.

4. A mature person’s decisions are based on character, not feelings. Mature people—students and adults—live by values. They have principles that guide their decisions. They are able to progress beyond merely reacting to life’s options and to live proactively. Their character is master over their emotions.

5. A mature person expresses gratitude consistently. I have found that the more I mature, the more grateful I am, for both big and little things. Immature children presume they deserve everything good that happens to them. Mature people see the big picture and realize how good they have it, compared to most of the world’s population.

6. A mature person knows how to prioritize others before themselves. A wise man once said: A mature person is one whose agenda revolves around others, not self. Certainly this can go to an extreme and be unhealthy, but I believe a pathway out of childishness is getting past your own desires and beginning to live to meet the needs of others less fortunate.

7. A mature person seeks wisdom before acting. Finally, a mature person is teachable. They don’t presume they have all the answers. The wiser they get, the more they realize they need more wisdom. They’re not ashamed of seeking counsel from adults (teachers, parents, coaches) or other sources. Only the wise seek wisdom.

In my book, Artificial Maturity, I offer practical solutions for parents to instill the marks of maturity in their kids. Susan Peters once said, "Children have a much better chance of growing up if their parents have done so first." Here's to modeling and developing authentic maturity in your kids.

Are you displaying the marks of maturity? How about your kids?


Self-controlled children tend to be healthier middle-aged adults

DURHAM, N.C. -- Self-control, the ability to contain one's own thoughts, feelings and behaviors, and to work toward goals with a plan, is one of the personality traits that makes a child ready for school. And, it turns out, ready for life as well.

In a large study that has tracked a thousand people from birth through age 45 in New Zealand, researchers have determined that people who had higher levels of self-control as children were aging more slowly than their peers at age 45. Their bodies and brains were healthier and biologically younger.

In interviews, the higher self-control group also showed they may be better equipped to handle the health, financial and social challenges of later life. The researchers used structured interviews and credit checks to assess financial preparedness. Participants with high childhood self-control expressed more positive views of aging and felt more satisfied with life in middle age.

"Our population is growing older, and living longer with age-related diseases," said Leah Richmond-Rakerd, an assistant professor of psychology at the University of Michigan, who is the first author on the study. "It's important to identify ways to help individuals prepare successfully for later-life challenges, and live more years free of disability. We found that self-control in early life may help set people up for healthy aging."

The children with better self-control tended to come from more financially secure families and to have higher IQs. However, the association between higher self-control and slower aging at age 45 held independently of childhood socio-economic status and IQ: the analyses showed that self-control was the factor that made a difference.
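Separating an effect from correlated background factors like this is typically done by covariate adjustment. Here is a minimal simulated sketch of the idea in Python (not the study's actual pipeline; all coefficients are invented):

import numpy as np

rng = np.random.default_rng(3)
n = 1000

# Simulated background factors and outcome (arbitrary units).
ses = rng.normal(0, 1, n)                          # childhood socio-economic status
iq = 100 + 10 * (0.4 * ses + rng.normal(0, 1, n))  # IQ, partly tied to SES
self_control = 0.3 * ses + rng.normal(0, 1, n)     # self-control, also tied to SES
pace = 1.0 - 0.3 * self_control - 0.1 * ses + rng.normal(0, 0.5, n)

# Multiple regression: pace ~ intercept + self_control + ses + iq.
X = np.column_stack([np.ones(n), self_control, ses, iq])
beta, *_ = np.linalg.lstsq(X, pace, rcond=None)
print(f"self-control effect after adjusting for SES and IQ: {beta[1]:.2f}")
# Recovers roughly -0.3, the simulated effect net of the confounders.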

And childhood is not destiny, the researchers are quick to point out. Some study participants had shifted their self-control levels as adults and had better health outcomes than their childhood assessments would have predicted.

Self-control also can be taught, and the researchers suggest that a societal investment in such training could improve life span and quality of life, not only in childhood, but also perhaps in midlife. There is ample evidence that changing behaviors in midlife (quitting smoking or taking up exercise) leads to improved outcomes.

"Everyone fears an old age that's sickly, poor, and lonely, so aging well requires us to get prepared, physically, financially, and socially," said Terrie Moffitt, the Nannerl O. Keohane Professor of Psychology & Neuroscience at Duke, and last author on the paper. "We found people who have used self-control since childhood are far more prepared for aging than their same-age peers."

The study appears the week of Jan. 4 in the Proceedings of the National Academy of Sciences.

The Dunedin Multidisciplinary Health and Development Study, based in New Zealand, has tracked these people since they were born in 1972 and 1973, putting them through a battery of psychological and health assessments at regular intervals since, the most recent being at age 45.

Childhood self-control was assessed by teachers, parents and the children themselves at ages 3, 5, 7, 9 and 11. The children were measured for impulsive aggression and other forms of impulsivity, over-activity, perseverance and inattention.

From ages 26 to 45, the participants also were measured for physiological signs of aging in several organ systems, including the brain. In all measures, higher childhood self-control correlated with slower aging.
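The study's statistical pipeline is far more elaborate, but its core step, averaging repeated childhood ratings into a single composite and relating it to a later pace-of-aging measure, can be sketched in a few lines of Python (all data below are simulated and every coefficient is invented):

import numpy as np

rng = np.random.default_rng(42)
n = 1000  # roughly the Dunedin cohort size

# Simulated ratings of self-control at ages 3, 5, 7, 9 and 11 (five waves).
true_self_control = rng.normal(0, 1, n)
ratings = true_self_control[:, None] + rng.normal(0, 0.8, (n, 5))

# Composite: z-score each wave, then average across waves.
z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)
composite = z.mean(axis=1)

# Simulated pace of aging at ages 26-45: faster (higher) with lower self-control.
pace_of_aging = 1.0 - 0.3 * true_self_control + rng.normal(0, 0.5, n)

r = np.corrcoef(composite, pace_of_aging)[0, 1]
print(f"childhood self-control vs. midlife pace of aging: r = {r:.2f}")  # negative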

The people with the highest self-control were found to walk faster and have younger-looking faces at age 45 as well.

"But if you aren't prepared for aging yet, your 50's is not too late to get ready," Moffitt added.

This research was supported by the U.S. National Institute on Aging (AG032282, AG049789) and National Institute of Child Health and Human Development (T32-HD007376), the U.K. Medical Research Council (P005918), the Jacobs Foundation, the U.S. National Science Foundation, and the Lundbeck Foundation. The Dunedin Multidisciplinary Health and Development Study is supported by the New Zealand Health Research Council and the New Zealand Ministry of Business, Innovation and Employment.

CITATION: "Childhood Self-Control Forecasts the Pace of Midlife Aging and Preparedness for Old Age," Leah S. Richmond-Rakerd, Avshalom Caspi, Antony Ambler, Tracy d'Arbeloff, Marieke de Bruine, Maxwell Elliott, HonaLee Harrington, Sean Hogan, Renate M. Houts, David Ireland, Ross Keenan, Annchen R. Knodt, Tracy R. Melzer, Sena Park, Richie Poulton, Sandhya Ramrakha, Line Jee Hartmann Rasmussen, Elizabeth Sack, Adam T. Schmidt, Maria L. Sison, Jasmin Wertz, Ahmad R. Hariri, and Terrie E. Moffitt. Proceedings of the National Academy of Sciences, Jan. 4, 2021. DOI: 10.1073/pnas.2010211118



Results

T1-weighted images of 169 females and 112 males (Table 1) were preprocessed and assessed for gray matter volume using voxel-based morphometry (VBM) (Fig. 1 A and B). Of the 116 regions of gray matter defined using the Automated Anatomical Labeling (AAL) atlas (11) (Fig. 1C), the 10 regions showing the largest sex/gender differences (|Cohen’s d| > 0.70; the largest |d| was 0.84; all P < 0.0001) were included in subsequent analyses (Table S1). Using the actual distributions of males and females in the sample, “male-end” and “female-end” zones were arbitrarily defined as the scores of the 33% most extreme males and females, respectively, and an “intermediate” zone was defined as the area in between these two (Fig. 1D; we use the terms “male-end”/“female-end” as a shorthand for “the end of the continuum in which males/females are more prevalent,” respectively; note that our method of categorizing the continuum into three discrete classes inherently places some females at the “male-end” and some males at the “female-end”).

Fig. 1E presents the gray matter volume of the 10 regions in each of the females (Left) and in each of the males (Right) using a color volume scale (as shown in Fig. 1D), and clearly illustrates the lack of internal consistency in most brains. The latter is also evident in Fig. 1F, which presents the number of “female-end” (x axis) and “male-end” (y axis) characteristics in females (red) and males (green). The circles at the (10,0), (0,10), and (0,0) coordinates represent individuals with only “female-end”, only “male-end”, or only “intermediate” characteristics, respectively. All other circles on the x and y axes represent individuals who have either “female-end” or “male-end” characteristics, as well as “intermediate” characteristics. The rest of the circles represent individuals with substantial variability, having both regions at the “male-end” and regions at the “female-end”.

Thirty-five percent of brains showed substantial variability, and only 6% of brains were internally consistent (see Table 1 for more details). Notably, additional definitions of the “male-end” and “female-end” zones (50%, 20%, and 10%) similarly revealed a much higher prevalence of brains showing substantial variability compared with brains showing internal consistency (Table S2). Importantly, substantial variability is not a result of the overlap between females and males in each of the brain regions, as evident in Fig. S1A, which depicts the results that would have been obtained for these data under perfect internal consistency (for comparison, Fig. S1 B–D depicts these data under internal consistency with different degrees of random noise, and Fig. S1E under no internal consistency).
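The zone-classification logic is straightforward to make concrete. Here is a minimal Python sketch on simulated data: the 33% cutoffs and sample sizes follow the description above, while everything else, including the simplifying assumption that males sit at the high end of every region, is invented for illustration:

import numpy as np

rng = np.random.default_rng(7)
n_f, n_m, n_regions = 169, 112, 10

# Simulated regional gray matter volumes, males shifted up in every region.
females = rng.normal(0.0, 1.0, (n_f, n_regions))
males = rng.normal(0.8, 1.0, (n_m, n_regions))

# Zone boundaries per region: "male-end" = scores of the 33% most extreme
# males, "female-end" = scores of the 33% most extreme females.
male_cut = np.percentile(males, 67, axis=0)      # above this: "male-end"
female_cut = np.percentile(females, 33, axis=0)  # below this: "female-end"

brains = np.vstack([females, males])
fe = (brains < female_cut).sum(axis=1)   # "female-end" regions per brain
me = (brains > male_cut).sum(axis=1)     # "male-end" regions per brain

# One reading of the definitions above: a brain is internally consistent if
# all 10 regions fall at the same end, and substantially variable if it has
# regions at both ends.
consistent = ((fe == n_regions) | (me == n_regions)).mean()
variable = ((fe > 0) & (me > 0)).mean()
print(f"consistent: {consistent:.0%}, substantially variable: {variable:.0%}")

Because this toy version treats the 10 regions as statistically independent, it actually produces even more mixed brains than the 35% reported in the paper; correlations between real brain regions pull that figure down, but the qualitative pattern, variability far outnumbering consistency, is the same.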

Internal consistency and substantial variability in human brain and behavior

