
Picasso on Success and Why You Should Never Compromise in Your Art

by Maria Popova
One must have the courage of one's vocation and the courage to make a living from one's vocation.
"Imagine immensities. Pick yourself up from rejection and plow ahead. Don't compromise," Debbie Millman advised in her magnificent meditation on what it takes to design a good life. But how does one resist compromising one's creative ideals when straining to meet the practical essentials of survival? An uncompromising answer comes from one of the greatest creators humanity has ever known.
In 1932, the famed Hungarian photographer Brassaï, nicknamed by Henry Miller "the eye of Paris," was asked to photograph Picasso's sculptures, which at the time were practically unknown, for the first issue of the pioneering surrealist art review Minotaure, edited by André Breton. Picasso had just turned fifty. While already an established artist, he was still on the cusp of achieving worldwide acclaim.
But when Brassaï arrived at 23 rue La Boétie and entered Picasso's studio, he quickly realized that beyond his modest photographic assignment lay a much greater reward: an invitation into Picasso's private world and the gift of intimate perspective into his singular mind. After each session, Brassaï would return home and carefully record his talks with Picasso on scraps of paper, which he'd then stuff into a giant vase, not with the intent of future publication, but with the intuition that Picasso's thoughts on life and art would be enormously valuable to posterity. This went on for thirty years, over the course of which the two got to know each other intellectually, creatively, spiritually, while they explored together such timelessly alluring subjects as the ego, the creative process, the role of romantic infatuation in art, and a universe more.
In 1964, Brassaï, who was as talented a writer as he was a photographer, reached into his vase and decided to make his affectionate records of these dimensional tête-à-têtes public in the remarkable volume Conversations with Picasso (public library).

Picasso by Brassaï
One of these conversations took place on May 3, 1944. Though Brassaï was by then a successful commercial photographer (the very reputation by which he had entered Picasso's life), he had dabbled in drawing twenty years prior, and had shown Picasso some of his early art. On that particular spring afternoon, Picasso expressed his admiration for Brassaï's gift for drawing, insisted that he must have an exhibition, and began probing the photographer about why he had abandoned the pencil. Despite Brassaï's success as a photographer, Picasso saw the relinquishing of any sort of talent, in this case drawing, as creative cowardice, as compromising, as selling oneself short of fulfillment. Never one to bite his lip, he gave Brassaï a piece of his mind. While unsolicited, his words ring with timeless advice to all struggling artists on the importance of long-run perseverance and faith in one's sense of purpose:
When you have something to say, to express, any submission becomes unbearable in the long run. One must have the courage of one's vocation and the courage to make a living from one's vocation. The "second career" is an illusion! I was often broke too, and I always resisted any temptation to live any other way than from my painting… In the beginning, I did not sell at a high price, but I sold. My drawings, my canvases went. That's what counts.
When Brassaï protests that few artists are gifted enough to be successful, citing something Matisse had once told him ("You have to be stronger than your gifts to protect them"), Picasso counters by bringing down the ivory tower and renouncing the myth that art suffers the moment other people start paying for it. Unlike those who maintain that commercial success is the enemy of creative integrity, including such well-meaning idealists as Sherwood Anderson, Picasso was sensitive to the layered, dissonant nature of the issue. He understood the fragility of the creative impulse as a serf of the human ego, an ego that thrives, much to our dismay and inner turmoil, on constant positive reinforcement. He tells Brassaï:
Well, success is an important thing! It's often been said that an artist ought to work for himself, for the love of art, that he ought to have contempt for success. Untrue! An artist needs success. And not only to live off it, but especially to produce his body of work. Even a rich painter has to have success. Few people understand anything about art, and not everyone is sensitive to painting. Most judge the world of art by success. Why, then, leave success to best-selling painters? Every generation has its own. But where is it written that success must always go to those who cater to the public's taste? For myself, I wanted to prove that you can have success in spite of everyone, without compromise. Do you know what? It's the success I had when I was young that became my wall of protection. The blue period, the rose period, they were screens that shielded me.
Picasso translates this ethos of not compromising from the ideological to the pragmatic as he sends Brassaï off with some practical advice on selling his drawings:
Don't price them too high. What matters is that you sell a large number of them. Your drawings must go out into the world.

Brassaï and Picasso
Conversations with Picasso is an absolute treasure in its entirety, the brilliance of which Henry Miller captures
in the preface:
In some inexplicable way it seems to me that the spirit which animates Picasso can never be fully accounted for by his
work, no matter how prodigious it may be. Not that I deny the greatness of his work, but that the man himself is and will
remain far greater than anything or everything which he accomplishes with his hands. He is so much more than the
painter, sculptor, or whatever he may choose to be while breathing is in him. He is outsized, a human phenomenon.


How to Find Your Purpose and Do What You Love
by Maria Popova
Why prestige is the enemy of passion, or how to master the balance of
setting boundaries and making friends.
"Find something more important than you are," philosopher Dan Dennett once said in discussing the secret of happiness, "and dedicate your life to it." But how, exactly, do we find that? Surely, it isn't by luck. I myself am a firm believer in the power of curiosity and choice as the engine of fulfillment, but precisely how you arrive at your true calling is an intricate and highly individual dance of discovery. Still, there are certain factors, certain choices, that make it easier. Gathered here are insights from seven thinkers who have contemplated the art-science of making your life's calling a living.

PAUL GRAHAM ON HOW TO DO WHAT YOU LOVE
Every few months, I rediscover and redevour Y Combinator founder Paul Graham's fantastic 2006 article, "How to Do What You Love." It's brilliant in its entirety, but the part I find of especial importance and urgency is his meditation on social validation and the false merit metric of prestige:
What you should not do, I think, is worry about the opinion of anyone beyond your friends. You shouldn't worry about prestige. Prestige is the opinion of the rest of the world.
[…]
Prestige is like a powerful magnet that warps even your beliefs about what you enjoy. It causes you to work not on what you like, but what you'd like to like.
[…]
Prestige is just fossilized inspiration. If you do anything well enough, you'll make it prestigious. Plenty of things we now consider prestigious were anything but at first. Jazz comes to mind, though almost any established art form would do. So just do what you like, and let prestige take care of itself.
Prestige is especially dangerous to the ambitious. If you want to make ambitious people waste their time on errands, the way to do it is to bait the hook with prestige. That's the recipe for getting people to give talks, write forewords, serve on committees, be department heads, and so on. It might be a good rule simply to avoid any prestigious task. If it didn't suck, they wouldn't have had to make it prestigious.
More of Graham's wisdom on how to find meaning and make wealth can be found in Hackers & Painters: Big Ideas from the Computer Age.
ALAIN DE BOTTON ON SUCCESS
Alain de Botton, modern philosopher and creator of the literary self-help genre, is a keen observer of the paradoxes and delusions of our cultural conceits.
In The Pleasures and Sorrows of Work, he takes his singular lens of wit and wisdom to the modern workplace and the ideological fallacies of success.
His terrific 2009 TED talk offers a taste:
One of the interesting things about success is that we think we know what it means. A lot of the time our ideas about what it would mean to live successfully are not our own. They're sucked in from other people. And we also suck in messages from everything from the television to advertising to marketing, etcetera. These are hugely powerful forces that define what we want and how we view ourselves. What I want to argue for is not that we should give up on our ideas of success, but that we should make sure that they are our own. We should focus in on our ideas and make sure that we own them, that we're truly the authors of our own ambitions. Because it's bad enough not getting what you want, but it's even worse to have an idea of what it is you want and find out at the end of the journey that it isn't, in fact, what you wanted all along.
HUGH MACLEOD ON SETTING BOUNDARIES
Cartoonist Hugh MacLeod is as well-known for his irreverent doodles as he is for his opinionated musings on creativity, culture, and the meaning of life.
In Ignore Everybody: And 39 Other Keys to Creativity, he gathers his most astute advice on the creative life. Particularly resonant with my own beliefs about the importance of choices is this insight about setting boundaries:
16. The most important thing a creative person can learn professionally is where to draw the red line that separates
what you are willing to do, and what you are not.
Art suffers the moment other people start paying for it. The more you need the money, the more people will tell you what
to do. The less control you will have. The more bullshit you will have to swallow. The less joy it will bring. Know this and
plan accordingly.
Later, MacLeod echoes Graham's point about prestige above:
28. The best way to get approval is not to need it.
This is equally true in art and business. And love. And sex. And just about everything else worth
having.
LEWIS HYDE ON WORK VS. LABOR
After last year's omnibus of 5 timeless books on fear and the creative process, a number of readers rightfully suggested an addition: Lewis Hyde's 1979 classic, The Gift: Creativity and the Artist in the Modern World, of which David Foster Wallace famously said, "No one who is invested in any kind of art can read The Gift and remain unchanged."
In this excerpt, originally featured here in January, Hyde articulates the essential difference between work and creative labor, understanding which takes us a little closer to the holy grail of vocational fulfillment:
Work is what we do by the hour. It begins and, if possible, we do it for money. Welding car bodies on an assembly line is work; washing dishes, computing taxes, walking the rounds in a psychiatric ward, picking asparagus, these are work. Labor, on the other hand, sets its own pace. We may get paid for it, but it's harder to quantify… Writing a poem, raising a child, developing a new calculus, resolving a neurosis, invention in all forms, these are labors.
Work is an intended activity that is accomplished through the will. A labor can be intended but only to the extent of doing the groundwork, or of not doing things that would clearly prevent the labor. Beyond that, labor has its own schedule.

There is no technology, no time-saving device that can alter the rhythms of creative labor. When the worth of labor is
expressed in terms of exchange value, therefore, creativity is automatically devalued every time there is an advance in
the technology of work.
Psychologist Mihaly Csikszentmihalyi has a term for the quality that sets labor apart from work: "flow," a kind of intense focus and crisp sense of clarity where you forget yourself, lose track of time, and feel like you're part of something larger. If you've ever pulled an all-nighter for a pet project, or even spent 20 consecutive hours composing a love letter, you've experienced flow, and you know creative labor.
STEVE JOBS ON NOT SETTLING
In his now-legendary 2005 Stanford commencement address, an absolute treasure in its entirety, Steve Jobs makes an eloquent case for not settling in the quest for finding your calling, a case that rests largely on his insistence upon the power of intuition:

Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don't settle.
ROBERT KRULWICH ON FRIENDS
Robert Krulwich, co-producer of WNYC's fantastic Radiolab, author of the ever-illuminating Krulwich Wonders and winner of a Peabody Award for broadcast excellence, is one of the finest journalists working today. In another great commencement address, he articulates the infinitely important social aspect of loving what you do, a kind of social connectedness far more meaningful and genuine than those notions of prestige and peer validation.

You will build a body of work, but you will also build a body of affection, with the people you've helped who've helped you back. This is the era of Friends in Low Places. The ones you meet now, who will notice you, challenge you, work with you, and watch your back. Maybe they will be your strength.

If you can fall in love, with the work, with people you work with, with your dreams and their dreams. Whatever it was that got you to this school, don't let it go. Whatever kept you here, don't let that go. Believe in your friends. Believe that what you and your friends have to say, that the way you're saying it, is something new in the world.
THE HOLSTEE MANIFESTO
You might recall The Holstee Manifesto as one of our 5 favorite manifestos for the creative life, an eloquent and beautifully written love letter to the life of purpose. (So beloved is the manifesto around here that it has earned itself a permanent spot in the Brain Pickings sidebar, a daily reminder to both myself and you, dear reader, of what matters most.)

This is your life. Do what you love, and do it often. If you don't like something, change it. If you don't like your job, quit. If you don't have enough time, stop watching TV. If you are looking for the love of your life, stop; they will be waiting for you when you start doing things you love.
The Holstee Manifesto is now available as a beautiful letterpress print, a 5×7 greeting card printed on handmade paper derived from 50% elephant poo and 50% recycled paper, and even a baby bib, because it's never too early to instill the values of living from passion.
The Secret to Learning Anything: Albert Einstein's Advice to His Son
by Maria Popova
That is the way to learn the most, that when you are doing something with such enjoyment that you don't notice that the time passes.
With Father's Day around the corner, here comes a fine addition to history's greatest letters of fatherly advice from none other than Albert Einstein, brilliant physicist, proponent of peace, debater of science and spirituality, champion of kindness, who was no stranger to dispensing epistolary empowerment to young minds.
In 1915, aged thirty-six, Einstein was living in war-torn Berlin, while his estranged wife, Mileva, and their two sons, Hans Albert Einstein and Eduard "Tete" Einstein, lived in comparatively safe Zurich. On November 4 of that year, having just completed the two-page masterpiece that would catapult him into international celebrity and historical glory, his theory of general relativity, Einstein sent 11-year-old Hans Albert the following letter, found in Posterity: Letters of Great Americans to Their Children (public library), the same wonderful anthology that gave us some of history's greatest motherly advice, Benjamin Rush's wisdom on travel and life, and Sherwood Anderson's counsel on the creative life. Einstein, who takes palpable pride in his intellectual accomplishments, speaks to the rhythms of creative absorption as the fuel for the internal engine of learning:
My dear Albert,
Yesterday I received your dear letter and was very happy with it. I was already afraid you wouldn't write to me at all any more. You told me when I was in Zurich, that it is awkward for you when I come to Zurich. Therefore I think it is better if we get together in a different place, where nobody will interfere with our comfort. I will in any case urge that each year we spend a whole month together, so that you see that you have a father who is fond of you and who loves you. You can also learn many good and beautiful things from me, something another cannot as easily offer you. What I have achieved through such a lot of strenuous work shall not only be there for strangers but especially for my own boys. These days I have completed one of the most beautiful works of my life; when you are bigger, I will tell you about it.
I am very pleased that you find joy with the piano. This and carpentry are in my opinion for your age the best pursuits, better even than school. Because those are things which fit a young person such as you very well. Mainly play the things on the piano which please you, even if the teacher does not assign those. That is the way to learn the most, that when you are doing something with such enjoyment that you don't notice that the time passes. I am sometimes so wrapped up in my work that I forget about the noon meal. . . .
Be with Tete kissed by your
Papa.
Regards to Mama.
Complement with more timeless advice from famous dads, including Ted Hughes, Charles Dickens, F. Scott
Fitzgerald, Sherwood Anderson, John Steinbeck, and much more wisdom found in Posterity.
5 Timeless Books of Insight on Fear and the Creative
Process
by Maria Popova
From Monet to Tiger Woods, or why creating rituals and breaking routines don't have to be conflicting notions.
"Creativity is like chasing chickens," Christoph Niemann once said. But sometimes it can feel like being chased by chickens: giant, angry, menacing chickens. Whether you're a writer, designer, artist or maker of anything in any medium, you know the creative process can be plagued by fear, often so paralyzing it makes it hard to actually create. Today, we turn to insights on fear and creativity from five favorite books on the creative process and the artist's way.
ART & FEAR
Despite our best-argued cases for incremental innovation and creativity via hard work, the myth of the genius and the muse perseveres in how we think about great artists. And yet most art, statistically speaking, is made by non-geniuses: people with passion and dedication who face daily challenges and doubts, both practical and psychological, in making their art. (And let's pause here to observe that "art" can encompass an incredible range of creative output, from painting to music to literature and everything in between.) In Art & Fear: Observations On the Perils (and Rewards) of Artmaking, working artists David Bayles and Ted Orland explore not only how art gets made, but also how it doesn't: what stands in the way of the creative process and how to overcome it. Fear, of course, is a cornerstone of those obstacles.
In the ideal (that is to say, real) artist, fears not only continue to exist, they exist side by side with the desires that complement them, perhaps drive them, certainly feed them. Naive passion, which promotes work done in ignorance of obstacles, becomes, with courage, informed passion, which promotes work done in full acceptance of those obstacles.
THE WAR OF ART
Steven Pressfield is a prolific champion of the creative process, with all its trials and tribulations. You might recall his most recent book, Do The Work, from our omnibus of five manifestos for life. But Pressfield is best known for The War of Art: Break Through the Blocks and Win Your Inner Creative Battles, which tackles our greatest forms of resistance (Resistance with a capital R, that is) to the creative process head-on.
Are you paralyzed with fear? That's a good sign. Fear is good. Like self-doubt, fear is an indicator. Fear tells us what we have to do. Remember our rule of thumb: The more scared we are of a work or calling, the more sure we can be that we have to do it.
Resistance is experienced as fear; the degree of fear equates the strength of Resistance. Therefore, the more fear we
feel about a specific enterprise, the more certain we can be that that enterprise is important to us and to the growth of
our soul.
THE CREATIVE HABIT
There's hardly a creative bibliophile who hasn't read, or at least heard of, Twyla Tharp's The Creative Habit: Learn It and Use It for Life. It frames creativity as the product of preparation, routine and persistent effort, which may at first seem counterintuitive in the context of the "Eureka!" myth and our notion of the genius suddenly struck by a brilliant idea, but Tharp demonstrates it's the foundation of how cultural icons and everyday creators alike, from Beethoven to professional athletes to ordinary artists, hone their craft, cultivate their genius and overcome their fears.
There's nothing wrong with fear; the only mistake is to let it stop you in your tracks.
Athletes know the power of triggering a ritual. A pro golfer may walk along the fairway chatting with his caddie, his playing partner, a friendly official or scorekeeper, but when he stands behind the ball and takes a deep breath, he has signaled to himself it's time to concentrate. A basketball player comes to the free-throw line, touches his socks, his shorts, receives the ball, bounces it exactly three times, and then he is ready to rise and shoot, exactly as he's done a hundred times a day in practice. By making the start of the sequence automatic, they replace doubt and fear with comfort and routine.
THE COURAGE TO CREATE
In 1975, six years after the great success of his wildly influential book Love & Will, existential psychologist Rollo May published The Courage to Create, an insightful and compelling case for art and creativity as the centripetal force, not a mere tangent, of human experience, and a foundation to science and logic. May draws on his extensive experience as a therapist to offer a blueprint for breaking out of our patterns of creative stagnation. Particularly interesting are his observations on fear, technology and irrationality, which might at first seem akin to the techno-paranoia propagated by Orson Welles and, more recently, Nicholas Carr, but are in fact considered counsel against our propensity for escapism, all the more relevant in the age of the digital convergence, nearly four decades after May's insights.
What people today do out of fear of irrational elements in themselves and in other people is to put tools and mechanisms between themselves and the unconscious world. This protects them from being grasped by the frightening and threatening aspects of irrational experience. I am saying nothing whatever, I am sure it will be understood, against technology or mechanics in themselves. What I am saying is that the danger always exists that our technology will serve as a buffer between us and nature, a block between us and the deeper dimensions of our experience. Tools and techniques ought to be an extension of consciousness, but they can just as easily be a protection against consciousness. […] This means that technology can be clung to, believed in, and depended on far beyond its legitimate sphere, since it also serves as a defense against our fears of irrational phenomena. Thus the very success of technological creativity […] is a threat to its own existence.
TRUST THE PROCESS
More than 13 years later, Shaun McNiff's Trust the Process: An Artist's Guide to Letting Go remains a cocoon of creative reassurance that unpacks the artist's process into small, simple, yet remarkably effective steps that together choreograph a productive and inspired sequence of creativity. Never preachy or patronizing, McNiff captures both the exhilaration and the terror of creating and, more importantly, how the two complement one another, even in the face of fear.
The empty space is the great horror and stimulant of creation. But there is also something predictable in the
way the fear and apathy encountered at the beginning are accountable for feelings of elation at the end. These
intensities of the creative process can stimulate desires of consistency and control, but history affirms that few
transformative experiences are generated by regularity.
When asked for advice on painting, Claude Monet told people not to fear mistakes. The discipline of art requires
constant experimentation, wherein errors are harbingers of original ideas because they introduce new directions for
expression. The mistake is outside the intended course of action, and it may present something that we never saw
before, something unexpected and contradictory, something that may be put to use.
9 Books on Reading and Writing
by Maria Popova
Dancing with the absurdity of life, or what symbolism has to do with the osmosis of trash and treasure.
Hardly anything does one's mental, spiritual, and creative health more good than resolving to read more and write better. Today's reading list addresses these parallel aspirations. And since the number of books written about reading and writing likely far exceeds the reading capacity of a single human lifetime, this omnibus couldn't be, shouldn't be, an exhaustive list. It is, instead, a collection of timeless texts bound to radically improve your relationship with the written word, from whichever side of the equation you approach it.
THE ELEMENTS OF STYLE
If anyone can make grammar fun, it's Maira Kalman. The Elements of Style Illustrated marries Kalman's signature whimsy with Strunk and White's indispensable style guide to create an instant classic.
The original Elements of Style was published in 1919 in-house at Cornell University for teaching use and reprinted in 1959 to become cultural canon, and Kalman's inimitable version is one of our 10 favorite masterpieces of graphic nonfiction.


On a related unmissable note, let the Elements of Style Rap make your day.
BIRD BY BIRD
Anne Lamott might be best known as a nonfiction writer, but Bird by Bird: Some Instructions on Writing and Life affirms her as a formidable modern philosopher as well. The 1994 classic is as much a practical guide to the writer's life as it is a profound wisdom-trove on the life of the heart and mind, with insight on everything from overcoming self-doubt to navigating the osmotic balance of intuition and rationality.
On the itch of writing, Lamott banters:
We are a species that needs and wants to understand who we are. Sheep lice do not seem to share this
longing, which is one reason why they write so little. But we do. We have so much we want to say and figure
out.
And on the grit that commits mind to paper, she counsels:
You begin to string words together like beads to tell a story. You are desperate to communicate, to edify or entertain, to
preserve moments of grace or joy or transcendence, to make real or imagined events come alive. But you cannot will
this to happen. It is a matter of persistence and faith and hard work. So you might as well just go ahead and get started.
On why we read and write:
Writing and reading decrease our sense of isolation. They deepen and widen and expand our sense of life: they feed the soul. When writers make us shake our heads with the exactness of their prose and their truths, and even make us laugh about ourselves or life, our buoyancy is restored. We are given a shot at dancing with, or at least clapping along with, the absurdity of life, instead of being squashed by it over and over again. It's like singing on a boat during a terrible storm at sea. You can't stop the raging storm, but singing can change the hearts and spirits of the people who are together on that ship.
ON WRITING
Hailed as one of the most successful writers alive, Stephen King has hundreds of books under his belt, most of them bestsellers. On Writing: A Memoir of the Craft is part master-blueprint, part memoir, part meditation on the writer's life, filtered through the lens of his near-fatal car crash and the newfound understanding of living it precipitated.
Though some have voiced skepticism regarding the capacity of a popular writer to be taken seriously as an oracle of good writing, Roger Ebert put it best: "After finding that his book On Writing had more useful and observant things to say about the craft than any book since Strunk and White's The Elements of Style, I have gotten over my own snobbery."
A few favorites from the book follow.
On open-endedness:
Description begins in the writer's imagination, but should finish in the reader's.
On feedback:
Write with the door closed, rewrite with the door open.
On the lifeblood of writing:
It starts with this: put your desk in the corner, and every time you sit down there to write, remind yourself why it isn't in the middle of the room. Life isn't a support system for art. It's the other way around.
On the relationship between reading and writing, which I wholeheartedly second:
Can I be blunt on this subject? If you don't have time to read, you don't have the time (or the tools) to write. Simple as that.
ZEN IN THE ART OF WRITING
In Zen in the Art of Writing: Releasing the Creative Genius Within You, Ray Bradbury, acclaimed author, dystopian novelist, hater of symbolism, shares not only his wisdom and experience in writing, but also his contagious excitement for the craft. Blending practical how-tos on everything from finding your voice to negotiating with editors with snippets and glimpses of the author's own career, the book is at once a manual and a manifesto, imbued with equal parts insight and enthusiasm.
On the key to creativity (cue in Elizabeth Gilbert's TED talk):
That's the great secret of creativity. You treat ideas like cats: you make them follow you.
On what to read:
In your reading, find books to improve your color sense, your sense of shape and size in the world.
On art and truth:
We have our Arts so we won't die of Truth.
On signal and noise, with an embedded message that you are a mashup of what you let into your life:
Ours is a culture and a time immensely rich in trash as it is in treasures.
THE WAR OF ART
Steven Pressfield is a prolific champion of the creative process, with all its trials and tribulations, best known for The War of Art: Break Through the Blocks and Win Your Inner Creative Battles, a personal defense system of sorts against our greatest forms of resistance. Resistance with a capital R, that is.
Are you paralyzed with fear? That's a good sign. Fear is good. Like self-doubt, fear is an indicator. Fear tells us what we have to do. Remember our rule of thumb: The more scared we are of a work or calling, the more sure we can be that we have to do it.
Resistance is experienced as fear; the degree of fear equates the strength of Resistance. Therefore, the more fear we feel about a specific enterprise, the more certain we can be that that enterprise is important to us and to the growth of our soul.
Also of note: Pressfield's recent companion guide to the text, Do The Work, one of our 5 favorite manifestos for the creative life.
ADVICE TO WRITERS
Advice to Writers is a compendium of quotes, anecdotes, and writerly wisdom from a dazzling
array of literary lights, originally published in 1999. From how to find a good agent to what makes
characters compelling, it spans the entire spectrum of the aspirational and the utilitarian, covering
grammar, genres, material, money, plot, plagiarism, and, of course, encouragement.
Here are a few favorites:
Finish each day before you begin the next, and interpose a solid wall of sleep between the two. This you cannot do
without temperance. ~ Ralph Waldo Emerson
Dont ever write a novel unless it hurts like a hot turd coming out. ~ Charles Bukowski
Breathe in experience, breathe out poetry. ~ Muriel Rukeyser
Begin with an individual and you find that you have created a type; begin with a type and you find that you have created
nothing. ~ F. Scott Fitzgerald
You never have to change anything you got up in the middle of the night to write. ~ Saul Bellow
Immature poets imitate; mature poets steal. ~ T. S. Eliot
Fiction is a lie, and good fiction is the truth inside the lie. ~ Stephen King
Good fiction is made of what is real, and reality is difficult to come by. ~ Ralph Ellison
Listen, then make up your own mind. ~ Gay Talese
Find a subject you care about and which you in your heart feel others should care about. It is this genuine caring, not
your games with language, which will be the most compelling and seductive element in your style. ~ Kurt Vonnegut
Write without pay until somebody offers pay; if nobody offers within three years, sawing wood is what you were intended
for. ~ Mark Twain
Originally featured, with more quotes, last December.
HOW TO WRITE A SENTENCE
Humbly titled yet incredibly ambitious, How to Write a Sentence: And How to Read
One by Stanley Fish isn't merely a prescriptive guide to the craft of writing; it's also a rich
and layered exploration of language as an evolving cultural organism. It belongs not on the shelf
of your home library but in your brain's most deep-seated amphibian sensemaking underbelly:
an insightful, rigorous manual on the art of language that may just be one of the best such tools
since The Elements of Style.
In fact, Fish offers an intelligent rebuttal of some of the cultish mandates of Strunk and White's bible, most
notably the blind insistence on brevity and sentence minimalism. To argue his case, he picks apart some of
historys most powerful sentences, from Shakespeare to Dickens to Lewis Carroll, using a kind of literary
forensics to excavate the essence of beautiful language. As Adam Haslett eloquently observes in his
excellent FT review:
[Pared-down prose] is a real loss, not because we necessarily need more Jamesian novels but because too often the
instruction to omit needless words (Rule 17) leads young writers to be cautious and dull; minimalist style becomes
minimalist thought, and that is a problem.
To dissect the Tetris-like quality of words, Fish examines the following Anthony Burgess sentence from his
1968 novel Enderby Outside:
And the words slide into the slots ordained by syntax, and glitter as with atmospheric dust with those impurities which
we call meaning.
Before the words slide into their slots, they are just discrete items, pointing everywhere and nowhere. Once the words
are nested in the places ordained for them (ordained is a wonderful word that points to the inexorable logic of
syntactic structures), they are tied by ligatures of relationships to one another. They are subjects or objects or actions
or descriptives or indications of manner, and as such they combine into a statement about the world, that is, into a
meaning that one can contemplate, admire, reject, or refine.
Originally featured here last January.
ERNEST HEMINGWAY ON WRITING
Ernest Hemingway famously maintained that it was bad luck to talk about writing. Yet, over the
course of his career, he frequently wrote about writing in his novels and short stories, his letters to
editors, friends, critics, and lovers, in interviews, and even in articles specifically commissioned on
the subject. In Ernest Hemingway on Writing, editor Larry W. Phillips culls the finest, wittiest,
most profound of Hemingway's reflections on writing, the nature of the writer, and the elements of
the writer's life. The slender volume packs insights on everything from work habits to mood
management to discipline to knowing what to leave out, delivered with Hemingway's unmistakable
personality and his signature zeal for integrity.
On what makes a great book:
All good books are alike in that they are truer than if they had really happened and after you are finished reading one
you will feel that all that happened to you and afterwards it all belongs to you: the good and the bad, the ecstasy, the
remorse and sorrow, the people and the places and how the weather was. If you can get so that you can give that to
people, then you are a writer.
On symbolism:
There isn't any symbolysm [sic]. The sea is the sea. The old man is an old man. The boy is a boy and the fish is a fish.
The sharks are all sharks no better and no worse. All the symbolism that people say is shit. What goes beyond is what
you see beyond when you know.
(Cue in other famous writers on symbolism, from Jack Kerouac to Ray Bradbury to Ayn Rand.)
On the qualities of a writer:
All my life I've looked at words as though I were seeing them for the first time.
First, there must be talent, much talent. Talent such as Kipling had. Then there must be discipline. The discipline of
Flaubert. Then there must be the conception of what it can be and an absolute conscience as unchanging as the
standard meter in Paris, to prevent faking. Then the writer must be intelligent and disinterested and above all he must
survive. Try to get all these things in one person and have him come through all the influences that press on a writer.
The hardest thing, because time is so short, is for him to survive and get his work done.
The most essential gift for a good writer is a built-in, shockproof shit detector. This is the writer's radar and all great writers
have had it.
HOW TO READ A BOOK
How to Read a Book, originally written by Mortimer Adler in 1940 and revised with Charles van
Doren in 1972, is the kind of book often described as a living classic: "classic" because it
deals with the fundamental and unchanging mesmerism of the written word, and "living" because
it does so in a way that divorces this mesmerism from its hard medium, allowing the essence to
evolve as our culture has evolved over the decades. From basic reading to systematic skimming
and inspectional reading to speed reading, Adlers how-tos apply as efficiently to practical
textbooks and science books as they do to poetry and fiction.
One of the book's finest points deals with the fundamental yin-yang of how ideas travel and permeate minds:
the intertwined acts of reading and writing. Marginalia, those fragments of thought and seeds of insight
we scribble in the margins of a book, have a social life all their own: just ask The New York Times' Sam
Anderson, who recently shared his year's worth of marginalia in a wonderful interactive feature. Hardly
anything captures both the utilitarian necessity and creative allure of marginalia better than this excerpt from
Adler's classic:
When you buy a book, you establish a property right in it, just as you do in clothes or furniture when you buy and pay for
them. But the act of purchase is actually only the prelude to possession in the case of a book. Full ownership of a book
only comes when you have made it a part of yourself, and the best way to make yourself a part of it (which comes to
the same thing) is by writing in it.
Why is marking a book indispensable to reading it? First, it keeps you awake: not merely conscious, but wide awake.
Second, reading, if it is active, is thinking, and thinking tends to express itself in words, spoken or written. The person
who says he knows what he thinks but cannot express it usually does not know what he thinks. Third, writing your
reactions down helps you to remember the thoughts of the author.
Reading a book should be a conversation between you and the author. Presumably he knows more about the subject
than you do; if not, you probably should not be bothering with his book. But understanding is a two-way operation; the
learner has to question himself and question the teacher, once he understands what the teacher is saying. Marking a
book is literally an expression of your differences or your agreements with the author. It is the highest respect you can
pay him.
First featured here, along with a meditation on modern marginalia, in December.
Anatomy of Anagrammatic Pseudonyms: The Many
Incarnations of Edward Gorey
by Maria Popova
An infant poet, a postcard-writer, a movie reviewer, a girl detective, and
a spirit control walk into a bar…
A master of the subversive and the darkly delightful, Edward Gorey is among
the most celebrated illustrators of the past century. His creations, ranging
from irreverent children's books to paperback covers for literary
classics to naughty delights for grownups to his illustrated envelopes, are as
singularly distinctive as they are timelessly enchanting. But Gorey, a
darkly enigmatic character, was himself a curious creation, so much so that
people have regularly questioned his very name. While many were surprised
to learn it was real, Gorey did indeed have a number of pseudonyms.
In Who's Writing This?: Notations on the Authorial I with Self-
Portraits (public library), the same fantastic 1996 volume that gave
us famous authors' illustrated self-portraits, Gorey draws his self-portrait
and tells the story of his name and his pseudonyms.
What's particularly interesting is that while the question of whether Gorey was gay has been the subject of
much speculation (speculation he gladly played into by stating, "I'm neither one thing nor the other
particularly. I've never said that I was gay and I've never said that I wasn't."), most of his imaginary pen-
name personae are female:
About the time the first book was published, over forty years ago, I found my name lent itself to an edifying number of
anagrams, some of which I've used as pen names, as imaginary authors, and as characters in their or my books. A
selection of examples follows.
Ogdred Weary has written The Curious Sofa, a porno-graphic work, and The Beastly Baby, a book no one wanted to
publish.
Mrs. Regera Dowdy, who lived in the nineteenth century, is the author of The Pious Infant and such unwritten works
as The Rivulets of Gore and Nets to Subdue the Deranged; she also translated The Evil Garden by Eduard Blutig, the
pictures for which were drawn by O. Müde.
Madame Groeda Weyrd devised the Fantod Pack of fortune-telling cards.
Miss D. Awdrey-Gore was a celebrated and prolific mystery writer. . . . Her detective is Waredo Dyrge, whose favorite
reading is the Dreary Rwedgo Series for Intrepid Young Ladies. . . .
Dogear Wryde's work appears only on postcards.
Addée Gorrwy is known as the Postcard Poetess.
Wardore Edgy wrote movie reviews for a few months.
Wee Graddory was an Infant Poet of an earlier century.
Dora Greydew, Girl Detective, is the heroine of a series (The Creaking Knot, The Curse on the Sagwood Estate, etc.)
by Edgar E. Wordy.
Garrod Weedy is the author of The Pointless Book.
Agowy Erderd is a spirit control.
However, I am still taken aback whenever someone asks me if that indeed is my real name.
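Gorey's claim that his name lent itself to an edifying number of anagrams can be checked mechanically. Here is a quick illustrative sketch (the helper function and the alias list are constructed for this example from the pseudonyms quoted above, not taken from the book):

```python
from collections import Counter

def is_anagram(a: str, b: str) -> bool:
    """True if the two strings use exactly the same letters,
    ignoring case, spaces, and punctuation."""
    letters = lambda s: Counter(c for c in s.lower() if c.isalpha())
    return letters(a) == letters(b)

# Pseudonyms quoted in the passage above
aliases = ["Ogdred Weary", "Regera Dowdy", "Groeda Weyrd", "D. Awdrey-Gore",
           "Waredo Dyrge", "Dogear Wryde", "Wardore Edgy", "Dora Greydew",
           "Edgar E. Wordy", "Garrod Weedy", "Agowy Erderd"]

for alias in aliases:
    print(alias, is_anagram(alias, "Edward Gorey"))  # each prints True
```

Note that Eduard Blutig, by contrast, is not an anagram at all but a German rendering of the name's sense, which the same check correctly rejects.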
Edward Gorey
You can see, and support, more of Gorey's work and legacy at Edward Gorey House. Meanwhile, Who's
Writing This?, which features contributions from such beloved authors as John Updike, Susan Sontag, Mark
Helprin, Diane Ackerman, Edward Albee, Arthur Miller, and Margaret Atwood, remains well worth the full
read.
Carl Sagan on Science and Spirituality
by Maria Popova
The notion that science and spirituality are somehow mutually
exclusive does a disservice to both.
The friction between science and religion stretches from Galileo's famous
letter to today's leading thinkers. And yet we're seeing that, for all its capacity
for ignorance, religion might have some valuable lessons for secular thought, and
the two need not be regarded as opposites.
In 1996, mere months before his death, the great Carl Sagan (cosmic
sage, voracious reader, hopeless romantic) explored the relationship between
the scientific and the spiritual in The Demon-Haunted World: Science as a
Candle in the Dark (public library). He writes:
Plainly there is no way back. Like it or not, we are stuck with science. We had better
make the best of it. When we finally come to terms with it and fully recognize its beauty
and its power, we will find, in spiritual as well as in practical matters, that we have made
a bargain strongly in our favor.
But superstition and pseudoscience keep getting in the way, distracting us, providing easy answers, dodging skeptical
scrutiny, casually pressing our awe buttons and cheapening the experience, making us routine and comfortable
practitioners as well as victims of credulity.
And yet science, Sagan argues, isn't diametrically opposed to spirituality. He echoes Ptolemy's timeless awe at
the cosmos and reflects on what Richard Dawkins has called the magic of reality, noting the intense spiritual
elevation that science is capable of producing:
In its encounter with Nature, science invariably elicits a sense of reverence and awe. The very act of understanding is a
celebration of joining, merging, even if on a very modest scale, with the magnificence of the Cosmos. And the
cumulative worldwide build-up of knowledge over time converts science into something only a little short of a trans-
national, trans-generational meta-mind.
"Spirit" comes from the Latin word "to breathe." What we breathe is air, which is certainly matter, however thin. Despite
usage to the contrary, there is no necessary implication in the word spiritual that we are talking of anything other than
matter (including the matter of which the brain is made), or anything outside the realm of science. On occasion, I will feel
free to use the word. Science is not only compatible with spirituality; it is a profound source of spirituality. When we
recognize our place in an immensity of light years and in the passage of ages, when we grasp the intricacy, beauty and
subtlety of life, then that soaring feeling, that sense of elation and humility combined, is surely spiritual. So are our
emotions in the presence of great art or music or literature, or of acts of exemplary selfless courage such as those of
Mohandas Gandhi or Martin Luther King Jr. The notion that science and spirituality are somehow mutually exclusive
does a disservice to both.
Reminding us once again of his timeless wisdom on the vital balance between skepticism and openness and the
importance of evidence, Sagan goes on to juxtapose the accuracy of science with the unfounded prophecies of
religion:
Not every branch of science can foretell the future (paleontology can't) but many can, and with stunning accuracy. If
you want to know when the next eclipse of the Sun will be, you might try magicians or mystics, but you'll do much better
with scientists. They will tell you where on Earth to stand, when you have to be there, and whether it will be a partial
eclipse, a total eclipse, or an annular eclipse. They can routinely predict a solar eclipse, to the minute, a millennium in
advance. You can go to the witch doctor to lift the spell that causes your pernicious anaemia, or you can take vitamin
B12. If you want to save your child from polio, you can pray or you can inoculate. If you're interested in the sex of your
unborn child, you can consult plumb-bob danglers all you want (left-right, a boy; forward-back, a girl, or maybe it's the
other way around), but they'll be right, on average, only one time in two. If you want real accuracy (here, 99 per cent
accuracy), try amniocentesis and sonograms. Try science.
Think of how many religions attempt to validate themselves with prophecy. Think of how many people rely on these
prophecies, however vague, however unfulfilled, to support or prop up their beliefs. Yet has there ever been a religion
with the prophetic accuracy and reliability of science? There isn't a religion on the planet that doesn't long for a
comparable ability, precise and repeatedly demonstrated before committed skeptics, to foretell future events. No
other human institution comes close.
Nearly two decades after The Demon-Haunted World, Sagan's son, Dorion, made a similar and similarly
eloquent case for why science and philosophy need each other. Complement it with this meditation on science
vs. scripture and the difference between curiosity and wonder.
The 13 Best Books of 2013: The Definitive Annual
Reading List of Overall Favorites
by Maria Popova
Soul-stirring, brain-expanding reads on intuition, love, grief, attention,
education, and the meaning of life.
All gratifying things must come to an end: The season's subjective selection of best-of
reading lists, which covered writing and creativity, photography, psychology and
philosophy, art and design, history and biography, science and technology, children's
literature, and pets and animals, comes full-circle with this final omnibus of the
year's most indiscriminately wonderful reads, a set of overall favorites that spill across
multiple disciplines, cross-pollinate subjects, and defy categorization in the most
stimulating of ways. (Revisit last year's selection here.)
1. ON LOOKING
"How we spend our days," Annie Dillard wrote in her timelessly beautiful
meditation on presence over productivity, "is, of course, how we spend our
lives." And nowhere do we fail at the art of presence more miserably and more
tragically than in urban life: in the city, high on the cult of productivity,
where we float past each other, past the buildings and trees and the little boy
in the purple pants, past life itself, cut off from the breathing of the world by
iPhone earbuds and solipsism. And yet: "The art of seeing has to be
learned," Marguerite Duras reverberates, and it can be learned, as cognitive
scientist Alexandra Horowitz invites us to believe in her breathlessly
wonderful On Looking: Eleven Walks with Expert Eyes (public library), also
among the year's best psychology books: a record of her quest to walk
around a city block with eleven different experts, from an artist to a
geologist to a dog, and emerge with fresh eyes, mesmerized by the previously unseen fascinations of a familiar
world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most
enchanting thing I've read in ages. In a way, it's the opposite but equally delightful mirror image of Christoph
Niemann's Abstract City: a concrete, immersive examination of urbanity blending the mindfulness of
Sherlock Holmes with the expansive sensitivity of Thoreau.
Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently
call reality:
Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in
your body, in the distance, and right in front of you.
By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an
unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights,
the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the
roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your
body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your
peripheral vision, a chirp of a bug or whine of a kitchen appliance.
This adaptive ignorance, she argues, is there for a reason: we celebrate it as concentration and welcome its
way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the
stimuli of immediate and vital importance, and to dismiss or entirely miss all else. ("Attention is an intentional,
unapologetic discriminator," Horowitz tells us. "It asks what is relevant right now, and gears us up to notice
only that.") But while this might make us more efficient in our goal-oriented day-to-day, it also makes us
inhabit a largely unlived and unremembered life, day in and day out.
For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her
curly haired, sage mixed breed (who also inspired Horowitz's first book, the excellent Inside of a Dog: What
Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more
and more aware of the dramatically different experiences she and her canine companion were having along the
exact same route:
Minor clashes between my dog's preferences as to where and how a walk should proceed and my own indicated that I
was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right
before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to
see; what my dog showed me was that my attention invited along attention's companion: inattention to everything else.
The book was her answer to the disconnect, an effort to attend to that inattention. It is not, she warns us, "about
how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse."
Rather, it is an invitation to the art of observation:
Together, we became investigators of the ordinary, considering the block (the street and everything on it) as a living
being that could be observed.
In this way, the familiar becomes unfamiliar, and the old the new.
Horowitz's approach is based on two osmotic human tendencies: our shared capacity to truly see what is in
front of us, despite our conditioned concentration that obscures it, and the power of individual bias in
perception (or what we call expertise, acquired by passion or training or both) in bringing attention to
elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive
bias as Horowitz, with her archetypal New Yorker's special fascination with the humming life-form that is an
urban street, and her diverse companions take to the city.
Art by Maira Kalman from 'On Looking: Eleven Walks with Expert Eyes'
First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides
her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. ("The walkers trod
silently; the dogs said nothing. The only sound was the hum of air conditioners," she beholds her own block;
passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, "how does a Q-tip escape?";
turning her final corner, she gazes at the entrance of a mansion and its pair of stone lions "waiting patiently for
royalty that never arrives." Stunning.)
But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best,
most Sherlockian efforts, she was missing pretty much everything. She arrives at a newfound, profound
understanding of what William James meant when he wrote, "My experience is what I agree to attend to. Only
those items which I notice shape my mind.":
I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is
that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing,
frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.
Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is
easily among my top three favorite books of the past decade, learn how to do the step-and-slide.
2. ADVICE TO LITTLE GIRLS
In 1865, when he was only thirty, Mark Twain penned a playful short story
mischievously encouraging girls to think independently rather than blindly
obey rules and social mores. In the summer of 2011, I chanced upon and fell
in love with a lovely Italian edition of this little-known gem with Victorian-
scrapbook-inspired artwork by celebrated Russian-born children's book
illustrator Vladimir Radunsky. I knew the book had to come to life in
English, so I partnered with the wonderful Claudia Zoe Bedrick of
Brooklyn-based indie publishing house Enchanted Lion, maker
of extraordinarily beautiful picture-books, and we spent the next two years
bringing Advice to Little Girls (public library) to life in America a true
labor-of-love project full of so much delight for readers of all ages. (And
how joyous to learn that it was also selected among NPR's best books of 2013!)
While frolicsome in tone and full of wink, the story is colored with subtle hues of grown-up philosophy on the
human condition, exploring all the deft ways in which we creatively rationalize our wrongdoing and reconcile
the good and evil we each embody.
Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be
resorted to under peculiarly aggravated circumstances.
If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly
China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible
swap with her unless your conscience would justify you in it, and you know you are able to do it.
One can't help but wonder whether this particular bit may have in part inspired the irreverent 1964
anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:
If at any time you find it necessary to correct your brother, do not correct him with mud; never, on any account, throw
mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You
secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a
tendency to remove impurities from his person, and possibly the skin, in spots.
If your mother tells you to do a thing, it is wrong to reply that you won't. It is better and more becoming to intimate that
you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.
Good little girls always show marked deference for the aged. You ought never to sass old people unless they sass
you first.
Originally featured in April see more spreads, as well as the story behind the project, here.
3. THIS EXPLAINS EVERYTHING
Every year since 1998, intellectual impresario and Edge editor John
Brockman has been posing a single grand question to some of our time's
greatest thinkers across a wide spectrum of disciplines, then collecting the
answers in an annual anthology. Last year's answers to the question "What
scientific concept will improve everybody's cognitive toolkit?" were
released in This Will Make You Smarter: New Scientific Concepts to Improve
Your Thinking, one of the year's best psychology and philosophy books.
In 2012, the question Brockman posed, proposed by none other than Steven
Pinker, was "What is your favorite deep, elegant, or beautiful
explanation?" The answers, representing an eclectic mix of 192 (alas,
overwhelmingly male) minds spanning psychology, quantum physics, social
science, political theory, philosophy, and more, are collected in the edited compendium This Explains
Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are
also available online.
In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective,
adding to history's most timeless definitions of science:
The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology,
genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these
contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call
into question many of our basic assumptions.
[…]
Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a
small set of simple principles in a surprising way. These explanations are called beautiful or elegant.
[…]
The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining
knowledge about anything including such fields of inquiry as philosophy, mathematics, economics, history, language,
and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a
diverse and complicated set of phenomena.
Puffer fish with Akule by photographer Wayne Levin.
Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the
emergence of swarm intelligence:
Observe a single ant, and it doesn't make much sense, walking in one direction, suddenly
careening in another for no obvious reason, doubling back on itself. Thoroughly
unpredictable.
The same happens with two ants, a handful of ants. But a colony of ants makes fantastic
sense. Specialized jobs, efficient means of exploiting new food sources, complex
underground nests with temperature regulated within a few degrees. And critically, there's
no blueprint or central source of command; each individual ant has algorithms for its
behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single
expert. The ants aren't reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of
a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient
colony.
Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple
rules about when to lay down a pheromone trail and what to do when encountering someone else's trail:
approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In ant-based
routing, simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a
network, something of great interest to telecommunications companies. It applies to the developing brain, which must
wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of
connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based
routing.
A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple
molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the
requirement of bolts of lightning to catalyze the formation of complex molecules.
And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don't require a
blueprint, they don't require a blueprint maker. If they don't require lightning bolts, they don't require Someone hurling
lightning bolts.
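The ant-trail mechanism Sapolsky describes is the basis of a real family of algorithms known as ant colony optimization. Purely as an illustrative sketch (none of this code comes from the book; the function name, the graph, and all parameter values are invented for the example), here are the two local rules he mentions, pheromone-weighted path choice plus evaporation and deposit, applied to finding a short route through a tiny graph:

```python
import random

def ant_colony_shortest_path(graph, start, end, n_ants=10, n_iters=20,
                             alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Toy ant colony optimization: each ant picks its next node with
    probability proportional to pheromone**alpha * (1/distance)**beta;
    completed trails deposit pheromone in proportion to their quality."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone per edge
    best_path, best_len = None, float("inf")
    for _ in range(n_iters):
        trails = []
        for _ in range(n_ants):
            node, path, visited = start, [start], {start}
            while node != end:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:          # dead end: abandon this ant
                    path = None
                    break
                weights = [tau[(node, v)] ** alpha *
                           (1.0 / graph[node][v]) ** beta for v in choices]
                node = rng.choices(choices, weights=weights)[0]
                path.append(node)
                visited.add(node)
            if path is not None:
                length = sum(graph[a][b] for a, b in zip(path, path[1:]))
                trails.append((path, length))
                if length < best_len:
                    best_path, best_len = path, length
        for edge in tau:                 # evaporation...
            tau[edge] *= (1.0 - rho)
        for path, length in trails:      # ...then deposit
            for a, b in zip(path, path[1:]):
                tau[(a, b)] += 1.0 / length
    return best_path, best_len
```

On a small diamond-shaped graph such as `{"A": {"B": 1, "C": 2}, "B": {"D": 1}, "C": {"D": 2}, "D": {}}`, the colony reliably surfaces the shorter branch A, B, D: no ant knows the big picture, yet the weighted local choices concentrate pheromone on the better route.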
Developmental psychologist Howard Gardner, who famously coined the seminal theory of multiple
intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the
impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as
a welcome antidote to a question that suffers the danger of being inherently
reductionist:
In a planet occupied now by seven billion inhabitants, I am amazed by the difference that
one human being can make. Think of classical music without Mozart or Stravinsky; of
painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible
contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs
(or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.
[…]
Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of
single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these
instances under the investigative rug. We should bear in mind anthropologist Margaret Mead's famous injunction: "Never doubt that a small group of thoughtful, committed citizens can change the world. It is the only thing that ever has."
Uber-curator Hans Ulrich Obrist, who also contributed to last year's volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:
In art, the title of a work can often be its first explanation. And in this context I am thinking
especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in
Cologne, he had just finished a group of six corresponding abstract paintings which he
gave the title "Cage."
There are many relations between Richter's painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter's attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.
[…]
Richter's concise title, "Cage," can be unfolded into an extensive interpretation of these abstract paintings (and of other works), but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.
Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:
I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition, and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn't matter if you're Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe: in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, "Wow, that was a coincidence."
The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible; the universe hates coincidence. I don't know why; it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There's a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.
What's both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.
The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn't tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.
Originally featured in January; read more here.
4. TIME WARPED
Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory "is never a precise duplicate of the original [but] a continuing act of creation" and how flawed our perception of time is: almost everything that occurred a year ago appears as having taken place either significantly further in the past ("a different lifetime," I'd often marvel at this time-illusion) or significantly more recently ("this feels like just last month!"). Rather than a personal deficiency of those of us afflicted by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.
That's precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library), a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call "mind time" are created, and also among the year's best psychology books. As disorienting as the concept might seem (after all, we've been nursed on the belief that time is one of those few utterly reliable and objective things in life), it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:
We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling, whether it's trying to stop the years racing past, or speeding up time when we're stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.
Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time.
Among the most intriguing illustrations of "mind time" is the incredible elasticity of how we experience time. ("Where is it, this present?" William James famously wondered. "It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.") For instance, Hammond points out, we slow time down when gripped by mortal fear: the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren't life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders, the very object of their intense fear, for 45 seconds, and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers' falls as short, whereas their own, from the same altitude, were deemed longer.
Inversely, time seems to speed up as we get older, a phenomenon that competing theories have attempted to explain. One, known as the "proportionality theory," uses pure mathematics, holding that a year feels faster when you're 40 than when you're 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:
The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don't judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they've had so far. It should be fleeting and inconsequential, yet if you have nothing to do, or an enforced wait at an airport for example, a day at 40 can still feel long and boring, and surely longer than a fun day at the seaside packed with adventure for a child. It ignores attention and emotion, which can have a considerable impact on time perception.
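Hammond's objection is easy to see with a little arithmetic. Under the proportionality model, a stretch of time "feels" as long as the share of your lived life it occupies; the toy sketch below (my own illustration, not from the book) computes both the ratio the theory predicts and the vanishing day-share she uses as a counterexample:

```python
# Toy version of the "proportionality theory": a duration feels as long
# as the fraction of your life lived so far that it represents.

def felt_share(duration_years, age):
    """Fraction of a person's lived time that a given duration occupies."""
    return duration_years / age

# A year at 8 occupies five times the share of life it does at 40,
# which is the theory's explanation for time "speeding up" with age.
ratio = felt_share(1, 8) / felt_share(1, 40)

# Hammond's counterexample: by the same logic, a single day at 40 is a
# negligible sliver of life (one 14,600th), yet a dull day still drags,
# so attention and emotion must matter as much as proportion.
day_share = felt_share(1 / 365, 40)
```

The model predicts every adult day should feel fleeting; experience says otherwise, which is exactly the gap Hammond highlights.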
Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear slower, including the passage of time itself.
But one definite change does take place with age: As we grow older, we tend to feel like the previous decade
elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to
think of events that took place in the past 10 years as having happened more recently than they actually did.
(Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely,
we perceive events that took place more than a decade ago as having happened even longer ago. (When did
Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as "forward telescoping":
It is as though time has been compressed and, as if looking through a telescope, things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as "time expansion." This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.
[…]
The most straightforward explanation for it is called the "clarity of memory hypothesis," proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.
Originally featured in July, with a deeper dive into the psychology of why time slows down when we're afraid, speeds up as we age, and gets warped when we're on vacation.
5. SELF-PORTRAIT AS YOUR TRAITOR
"Still this childish fascination with my handwriting," young Susan Sontag wrote in her diary in 1949. "To think that I always have this sensuous potentiality glowing within my fingers." This is the sort of sensuous potentiality that comes aglow in Self-Portrait as Your Traitor (public library), the magnificent collection of hand-lettered poems and illustrated essays by friend-of-Brain-Pickings and frequent contributor Debbie Millman. In the introduction, design legend Paula Scher aptly describes this singular visual form as "a 21st-century illuminated manuscript." Personal bias aside, these moving, lovingly crafted poems and essays, some handwritten, some drawn with colored pencils, some typeset in felt on felt, vibrate at that fertile intersection of the deeply personal and the universally profound.


In "Fail Safe," her widely read essay-turned-commencement-address on creative courage and embracing the unknown from the 2009 anthology Look Both Ways, Millman wrote:
John Maeda once explained, "The computer will do anything within its abilities, but it will do nothing unless commanded to do so." I think people are the same: we like to operate within our abilities. But whereas the computer has a fixed code, our abilities are limited only by our perceptions. Two decades since determining my code, and after 15 years of working in the world of branding, I am now in the process of rewriting the possibilities of what comes next. I don't know exactly what I will become; it is not something I can describe scientifically or artistically. Perhaps it is a code in progress.
Self-Portrait as Your Traitor, a glorious large-format tome full of textured colors to which the screen does absolutely no justice, is the result of this progress: a brave and heartening embodiment of what it truly means, as Rilke put it, to "live the questions"; the stunning record of one woman's personal and artistic code-rewriting, brimming with wisdom on life and art for all.



Originally featured in November. See an exclusive excerpt here, then take a peek at Debbie's creative process here.
6. SUSAN SONTAG: THE COMPLETE ROLLING STONE
INTERVIEW
In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag's death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library), a rare glimpse of one of modern history's greatest minds in her element.
Cott marvels at what made the dialogue especially extraordinary:
Unlike almost any other person whom I've ever interviewed (the pianist Glenn Gould is the one other exception), Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and "moral and linguistic fine-tuning" (as she once described Henry James's writing style) with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words ("sometimes," "occasionally," "usually," "for the most part," "in almost all cases"), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours, an inebriation with the spoken word. "I am hooked on talk as a creative dialogue," she once remarked in her journals, and added: "For me, it's the principal medium of my salvation."
In one segment of the conversation, Sontag discusses how the false divide between "high" and pop culture impoverishes our lives. In another, she makes a beautiful case for the value of history:
I really believe in history, and that's something people don't believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots, specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period, and we're essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I've read Nietzsche.
In another meditation, she argues for the existential and creative value of presence:
What I want is to be fully present in my life, to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you're in it and paying attention to it. That's what a writer does: a writer pays attention to the world. Because I'm very against this solipsistic notion that you find it all in your head. You don't, there really is a world that's there whether you're in it or not.
In another passage, she considers how taking responsibility empowers rather than disempowers us:
I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it's possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I'm eager to take responsibility for both the good and the bad things. I don't want this attitude of "I was so wonderful and that person did me in." Even when it's sometimes true, I've managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.
The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some
misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:
Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I'm just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.
In one of her most poignant insights, Sontag admonishes against our culture's dangerous polarities and imprisoning stereotypes:
A lot of our ideas about what we can do at different ages and what age means are so arbitrary, as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They're embarrassed to be old. What you can do when you're young and what you can do when you're old is as arbitrary and without much basis as what you can do if you're a woman or what you can do if you're a man.
Originally featured in November; take a closer look here and here.
7. LETTERS OF NOTE
As a hopeless lover of letters, I was thrilled for the release of Letters of Note: Correspondence Deserving of a Wider Audience (public library), the aptly titled, superb collection based on Shaun Usher's indispensable website of the same name, which stands as a heartening echelon of independent online scholarship and journalism at the intersection of the editorial and the curatorial and features timeless treasures from such diverse icons and Brain Pickings favorites as E. B. White, Virginia Woolf, Ursula Nordstrom, Nick Cave, Ray Bradbury, Amelia Earhart, Galileo Galilei, and more.
One of the most beautiful letters in the collection comes from Hunter S. Thompson: gonzo journalism godfather, pundit of media politics, dark philosopher. The letter, which Thompson sent to his friend Hume Logan in 1958, makes for an exquisite addition to luminaries' reflections on the meaning of life, speaking to what it really means to find your purpose.
Cautious that all advice can only be a product of the man who gives it, a caveat other literary legends have stressed with varying degrees of irreverence, Thompson begins with a necessary disclaimer about the very notion of advice-giving:
To give advice to a man who asks what to do with his life implies something very close to egomania. To presume to point a man to the right and ultimate goal, to point with a trembling finger in the RIGHT direction, is something only a fool would take upon himself.

And yet he honors his friend's request, turning to Shakespeare for an anchor of his own advice:
To be, or not to be: that is the question: Whether 'tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles …
And indeed, that IS the question: whether to float with the tide, or to swim for a goal. It is a choice we must all make consciously or unconsciously at one time in our lives. So few people understand this! Think of any decision you've ever made which had a bearing on your future: I may be wrong, but I don't see how it could have been anything but a choice, however indirect, between the two things I've mentioned: the floating or the swimming.
He acknowledges the obvious question of why not take the path of least resistance and float aimlessly, then
counters it:
The answer (and, in a sense, the tragedy of life) is that we seek to understand the goal and not the man. We set up a goal which demands of us certain things: and we do these things. We adjust to the demands of a concept which CANNOT be valid. When you were young, let us say that you wanted to be a fireman. I feel reasonably safe in saying that you no longer want to be a fireman. Why? Because your perspective has changed. It's not the fireman who has changed, but you.
Touching on the same notion that William Gibson termed "personal micro-culture," Austin Kleon captured in asserting that "you are the mashup of what you let into your life," and Paula Scher articulated so succinctly in speaking of the combinatorial nature of our creativity, Thompson writes:
Every man is the sum total of his reactions to experience. As your experiences differ and multiply, you become a
different man, and hence your perspective changes. This goes on and on. Every reaction is a learning process; every
significant experience alters your perspective.
So it would seem foolish, would it not, to adjust our lives to the demands of a goal we see from a different angle every
day? How could we ever hope to accomplish anything other than galloping neurosis?
The answer, then, must not deal with goals at all, or not with tangible goals, anyway. It would take reams of paper to develop this subject to fulfillment. God only knows how many books have been written on "the meaning of man" and that sort of thing, and god only knows how many people have pondered the subject. (I use the term "god only knows" purely as an expression.) There's very little sense in my trying to give it up to you in the proverbial nutshell, because I'm the first to admit my absolute lack of qualifications for reducing the meaning of life to one or two paragraphs.
Resolving to steer clear of the word "existentialism," Thompson nonetheless strongly urges his friend to read Sartre's Nothingness and the anthology Existentialism: From Dostoyevsky to Sartre, then admonishes against succumbing to faulty definitions of success at the expense of finding one's own purpose:
To put our faith in tangible goals would seem to be, at best, unwise. So we do not strive to be firemen, we do not strive to be bankers, nor policemen, nor doctors. WE STRIVE TO BE OURSELVES.
But don't misunderstand me. I don't mean that we can't BE firemen, bankers, or doctors, but that we must make the goal conform to the individual, rather than make the individual conform to the goal. In every man, heredity and environment have combined to produce a creature of certain abilities and desires, including a deeply ingrained need to function in such a way that his life will be MEANINGFUL. A man has to BE something; he has to matter.
As I see it then, the formula runs something like this: a man must choose a path which will let his ABILITIES function at maximum efficiency toward the gratification of his DESIRES. In doing this, he is fulfilling a need (giving himself identity by functioning in a set pattern toward a set goal), he avoids frustrating his potential (choosing a path which puts no limit on his self-development), and he avoids the terror of seeing his goal wilt or lose its charm as he draws closer to it (rather than bending himself to meet the demands of that which he seeks, he has bent his goal to conform to his own abilities and desires).
In short, he has not dedicated his life to reaching a pre-defined goal, but he has rather chosen a way of life he KNOWS he will enjoy. The goal is absolutely secondary: it is the functioning toward the goal which is important. And it seems almost ridiculous to say that a man MUST function in a pattern of his own choosing; for to let another man define your own goals is to give up one of the most meaningful aspects of life: the definitive act of will which makes a man an individual.
Noting that his friend had thus far lived "a vertical rather than horizontal existence," Thompson acknowledges the challenge of this choice but admonishes that however difficult, the choice must be made or else it melts away into those default modes of society:
A man who procrastinates in his CHOOSING will inevitably have his choice made for him by circumstance. So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, "I don't know where to look; I don't know what to look for."
And there's the crux. Is it worth giving up what I have to look for something better? I don't know; is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice.
He ends by returning to his original disclaimer by reiterating that rather than a prescription for living, his advice is merely a reminder that how and what we choose, choices we're in danger of forgetting even exist, shapes the course and experience of our lives:
I'm not trying to send you out on the road in search of Valhalla, but merely pointing out that it is not necessary to accept the choices handed down to you by life as you know it. There is more to it than that: no one HAS to do something he doesn't want to do for the rest of his life.
Originally featured in November.
8. INTUITION PUMPS
"If you are not making mistakes, you're not taking enough risks," Debbie Millman counseled. "Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody's ever made before," Neil Gaiman advised young creators. In Intuition Pumps and Other Tools for Thinking (public library), also among the year's best psychology books, the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools, "handy prosthetic imagination-extenders and focus holders" that allow us "to think reliably and even gracefully about really hard questions," to enhance your cognitive toolkit. He calls these tools "intuition pumps": thought experiments designed to stir "a heartfelt, table-thumping intuition" (which we know is a pillar of even the most rational of science) about the question at hand, a kind of persuasion tool the reverse-engineering of which enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods, but that's precisely Dennett's point, and his task is to help us hone it.
Though most of his 77 intuition pumps address concrete questions, a dozen are general-purpose tools that
apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most
useful yet most uncomfortable: making mistakes.
Echoing Dorion Sagan's case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:
The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don't know the history, you are doomed to making the same darn mistakes all over again. There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.
He speaks for the generative potential of mistakes and their usefulness as an empirical tool:
Sometimes you don't just want to risk making mistakes; you actually want to make them, if only to give you something clear and detailed to fix.
Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, "living the questions" and thus advancing knowledge in a way that certainty cannot; for, as Richard Feynman memorably noted, the scientist's job is to remain unsure, and so, it seems, is the philosopher's. Dennett writes:
We philosophers are mistake specialists. While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy, in every field of inquiry, is what you have to do until you figure out what questions you should have been asking in the first place.
[…]
Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error, and without the errors the trials wouldn't accomplish anything.
Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the
power of chance-opportunism:
Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps, foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don't look down your nose at random guesses; among its wonderful products is you!
And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes
a case for understanding evolution as a key to understanding everything else we humans value:
Evolution is the central, enabling process not only of life but also of knowledge and learning and understanding. If you
attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy
itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. For
evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying errors
in DNA.
Dennett echoes Dostoyevsky ("Above all, don't lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.") and offers the key to making productive mistakes:
The chief trick to making good mistakes is not to hide them, especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. The trick is to take advantage of the particular details of the mess you've made, so that your next attempt will be informed by it and not just another blind stab in the dark.
We have all heard the forlorn refrain "Well, it seemed like a good idea at the time!" This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, "Well, it seemed like a good idea at the time!" is standing on the threshold of brilliance.
Originally featured in May; read the full article here.
9. LOST CAT
"Dogs are not about something else. Dogs are about dogs," Malcolm
Gladwell asserted indignantly in the introduction to The Big New Yorker
Book of Dogs. Though hailed as memetic rulers of the internet, cats have
also enjoyed a long history as artistic and literary muses, but never have they been at once more about cats and
more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS
Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy
MacNaughton, she of many wonderful collaborations: a tender, imaginative memoir, among the year's best
biographies and memoirs and best books on pets and animals, infused with equal parts humor and humanity.
Though about a cat, this heartwarming and heartbreaking tale is really about what it means to be human:
about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless
affection and paralyzing fear of abandonment, the unfair promise of loss implicit to every possibility of love.

After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling
into the depths of depression. It both helps and doesn't that Caroline and Wendy have just fallen in love,
soaring in the butterfly heights of new romance, the phase of love that didn't obey any known rules of
physics, until the crash pulls them into a place that would challenge even the most seasoned and grounded of
relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies, the shy,
anxious Tibby (short for Tibia, affectionately and, in these circumstances, ironically named after the
shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia),
are, short of Wendy, her only joy and comfort:
Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were
delighted; suddenly I had become a human who didn't shout into a small rectangle of lights and plastic in her hand, peer
at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I
was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for
ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred,
affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the
rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.
Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in
their desperation, enlist the help of a psychic who specializes in lost pets, but to no avail. Heartbroken, they
begin to mourn Tibby's loss.

And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off,
Caroline begins to wonder where he'd been and why he'd left. He is now no longer eating at home and
regularly leaves the house for extended periods of time; Tibby clearly has a secret place he now returns to.
Even more worrisomely, he's no longer the shy, anxious tabby he'd been for thirteen years. Instead, he's a
half pound heavier, chirpy, with a youthful spring in his step. But why would a happy cat abandon his loving
lifelong companion and find comfort (find himself, even) elsewhere?
When the relief that my cat was safe began to fade, and the joy of his prone, snoring form, sprawled like an athlete
after a celebratory night of boozing, started to wear thin, I was left with darker emotions. Confusion. Jealousy.
Betrayal. I thought I'd known my cat of thirteen years. But that cat had been anxious and shy. This cat was a
swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to
this gilded place, with its overflowing food bowls and endless treats?

There was only one obvious thing left to do: track Tibby on his escapades. So Caroline, despite Wendy's lovingly
suppressed skepticism, heads to a spy store (yes, those exist) and purchases a real-time GPS tracker,
complete with a camera that they program to take snapshots every few minutes, which they then attach to
Tibby's collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the
obsessive quest is the subtle yet palpable subplot of Wendy and Caroline's growing love for each other, the
deepening of trust and affection that happens when two people share in a special kind of insanity.

"Every quest is a journey, every journey a story. Every story, in turn, has a moral," writes Caroline in the final
chapter, then offers several possible morals for the story, the last two of which embody everything that
makes Lost Cat an absolute treat from cover to cover:
6. You can never know your cat. In fact, you can never know anyone as completely as you want.
7. But that's okay, love is better.
Take a closer look here, then hear MacNaughton and Paul in conversation about combining creative
collaboration with a romantic relationship.
10. CODEX SERAPHINIANUS
In 1976, Italian artist, architect, and designer Luigi Serafini, only 27 at the
time, set out to create an elaborate encyclopedia of imaginary objects and
creatures that fell somewhere between Edward Gorey's cryptic
alphabets, Albertus Seba's cabinet of curiosities, the book of surrealist games,
and Alice in Wonderland. What's more, it wasn't written in any ordinary
language but in an unintelligible alphabet that appeared to be a conlang, an
undertaking so complex it constitutes one of the highest feats of cryptography.
It took him nearly three years to complete the project, and three more to
publish it, but when it was finally released, the book, a weird and wonderful
masterpiece of art and philosophical provocation on the precipice of the
information age, attracted a growing following that continued to gather momentum even as the original
edition went out of print.
Now, for the first time in more than thirty years, Codex Seraphinianus (public library) is resurrected in a
lavish new edition by Rizzoli (who have a penchant for excavating forgotten gems), featuring a new
chapter by Serafini, now in his 60s, and a gorgeous signed print with each deluxe tome. Besides a visual
masterwork, it's also a timeless meditation on what reality really is, one all the timelier in today's age of
such seemingly surrealist feats as bioengineering whole new lifeforms, hurling subatomic particles at each
other at nearly the speed of light, and encoding an entire book onto a DNA molecule.




In an interview for Wired Italy, Serafini aptly captures the subtle similarity to children's books in how
the Codex bewitches our grown-up fancy with its bizarre beauty:
What I want my alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet
understand. I used it to describe analytically an imaginary world and give a coherent framework. The images originate
from the clash between this fantasy vocabulary and the real world. The Codex became so popular because it makes
you feel more comfortable with your fantasies. Another world is not possible, but a fantasy one maybe is.


Playfully addressing the book's towering price point, Serafini makes a more serious point about how it
bespeaks the seductive selectiveness of our attention:
The [new] edition is very rich and also pricey, I know, but it's just like psychoanalysis: Money matters and the fee is part
of the process of healing. At the end of the day, the Codex is similar to the Rorschach inkblot test. You see what you
want to see. You might think it's speaking to you, but it's just your imagination.

Originally featured in October; see more here.
11. MY BROTHERS BOOK
For those of us who loved legendary children's book author Maurice
Sendak, famed creator of wild things, little-known illustrator of velveteen
rabbits, infinitely warm heart, infinitely witty mind, his death in 2012
was one of the year's greatest heartaches. Now, half a century after his
iconic Where The Wild Things Are, comes My Brother's Book (public library),
one of the year's best children's books, a bittersweet posthumous farewell to
the world, illustrated in vibrant, dreamsome watercolors and written in verse
inspired by some of Sendak's lifelong influences: Shakespeare, Blake, Keats,
and the music of Mozart. In fact, a foreword by Shakespeare scholar Stephen
Greenblatt reveals the book is based on the Bard's The Winter's Tale.


It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on
the surface about the beloved author's own brother Jack, who died 18 years ago, the story is also about the love
of Sendak's life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and
eventual loss in 2007 devastated Sendak; the character of Guy reads like a poetic fusion of Sendak and
Glynn. And while the story might be a universal love letter to those who have gone before, as NPR's Renee
Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed
away the day before President Obama announced his support for same-sex marriage, but Sendak fans were
quick to honor both historic moments with a bittersweet homage.)
Indeed, the theme of all-consuming love manifests viscerally in Sendak's books. Playwright Tony Kushner, a
longtime close friend of Sendak's and one of his most heartfelt mourners, tells NPR:
There's a lot of consuming and devouring and eating in Maurice's books. And I think that when people play with kids,
there's a lot of fake ferocity and threats of, you know, devouring because love is so enormous, the only thing you can
think of doing is swallowing the person that you love entirely.



My Brother's Book ends on a soul-stirring note, tender and poignant in its posthumous light:
And Jack slept safe
Enfolded in his brother's arms
And Guy whispered "Good night
And you will dream of me."
Originally featured in February.
12. EIGHTY DAYS
"Anything one man can imagine, other men can make real," science fiction
godfather Jules Verne famously proclaimed. He was right about the general
sentiment but oh how very wrong about its gendered language: Sixteen years
after Verne's classic novel Around the World in Eighty Days, his vision for
speed-circumnavigation would be made real, but by a woman. On the
morning of November 14, 1889, Nellie Bly, an audacious newspaper
reporter, set out to outpace Verne's fictional itinerary by circumnavigating
the globe in seventy-five days, thus setting the real-world record for the
fastest trip around the world. In Eighty Days: Nellie Bly and Elizabeth
Bisland's History-Making Race Around the World (public library), also
among the year's best history books, Matthew Goodman traces the
groundbreaking adventure, beginning with a backdrop of Bly's remarkable
journalistic fortitude and contribution to defying our stubbornly enduring biases about women writers:
No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a
story. In her first exposé for The World, Bly had gone undercover, feigning insanity, so that she might report firsthand
on the mistreatment of the female patients of the Blackwell's Island Insane Asylum. Bly trained with the boxing
champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of
Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf,
dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York's
white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some
meant to edify and some merely to entertain, but all were shot through with Bly's unmistakable passion for a good story
and her uncanny ability to capture the public's imagination, the sheer force of her personality demanding that attention
be paid to the plight of the unfortunate, and, not incidentally, to herself.
For all her extraordinary talent and work ethic, Bly's appearance was decidedly unremarkable, a fact that
shouldn't matter, but one that would be repeatedly remarked upon by her critics and commentators, something
we've made sad little progress on in discussing women's professional, intellectual, and creative merit more
than a century later. Goodman paints a portrait of Bly:
She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a
head: the sort of woman who could, if necessary, lose herself in a crowd.
[…]
Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of
her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had
piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base
and delicately upturned at the end (the papers liked to refer to it as a retroussé nose) and it was the only feature
about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of
those who knew her considered her pretty, although this was a subject that in the coming months would be hotly
debated in the press.
But, as if the ambitious adventure weren't scintillating enough, the story takes an unexpected turn: That fateful
November morning, as Bly was making her way to the journey's outset at the Hoboken docks, a man named
John Brisben Walker passed her on a ferry in the opposite direction, traveling from Jersey City to Lower
Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that
decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially
low-brow. On his ferry ride, Walker skimmed that morning's edition of The World and paused over the front-
page feature announcing Bly's planned adventure around the world. A seasoned media manipulator of the
public's voracious appetite for drama, he instantly birthed an idea that would seize upon a unique publicity
opportunity: The Cosmopolitan would send another circumnavigator to race against Bly. To keep things
equal, it would have to be a woman. To keep them interesting, she'd travel in the opposite direction.
And so it went:
Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a
job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled "In
the Library." Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she
had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines
and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant,
almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a
low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in
the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of
New York's creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland's
particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.

But Bisland was no literary bombshell. Wary of beauty's fleeting and superficial nature (she once
lamented, "After the period of sex-attraction has passed, women have no power in America"), she
blended Edison's circadian relentlessness and Tchaikovsky's work ethic:
She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of
dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch,
she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than
anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and
Cervantes that she found in the library of her family's plantation house. (She taught herself French while she churned
butter, so that she might read Rousseau's Confessions in the original, a book, as it turned out, that she hated.) She
cared nothing for fame, and indeed found the prospect of it distasteful.
And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater
bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of
their eras normative biases:
On the surface the two women were about as different as could be: one woman a Northerner, the other from the
South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most
sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as "a wild,
crooked, shrieking hodge-podge, a caricature of life." Elizabeth Bisland hosted tea parties; Nellie Bly was known to
frequent O'Rourke's saloon on the Bowery. But each of them was acutely conscious of the unequal position of women
in America. Each had grown up without much money and had come to New York to make a place for herself in big-city
journalism, achieving a hard-won success in what was still, unquestionably, a man's world.
Originally featured in May; read the full article, including Bly's entertaining illustrated packing list, here.
13. DONT GO BACK TO SCHOOL
"The present education system is the trampling of the herd," legendary
architect Frank Lloyd Wright lamented in 1956. Half a century later, I
started Brain Pickings in large part out of frustration and disappointment with
my "trampling" experience of our culturally fetishized Ivy League education. I
found myself intellectually and creatively unstimulated by the industrialized
model of the large lecture hall, the PowerPoint presentations, the standardized
tests assessing my rote memorization of facts rather than my ability to
transmute that factual knowledge into a pattern-recognition mechanism that
connects different disciplines to cultivate wisdom about how the world works
and a moral lens on how it should work. So Brain Pickings became the record
of my alternative learning, of that cross-disciplinary curiosity that took me
from art to psychology to history to science, by way of the myriad pieces of knowledge I discovered and
connected on my own. I didn't live up to the entrepreneurial ideal of the college drop-out and begrudgingly
graduated with honors, but refused to go to my own graduation and decided never to go back to school.
Years later, I've learned more in the course of writing and researching the thousands of articles to date than in
all the years of my formal education combined.
So, in 2012, when I found out that writer Kio Stark was crowdfunding a book that would serve as a manifesto
for learning outside formal education, I eagerly chipped in. Now, Don't Go Back to School: A Handbook for
Learning Anything is out and is everything I could've wished for when I was in college, an essential piece of
cultural literacy, at once tantalizing and practically grounded assurance that success doesn't lie at the end of a
single highway but is sprinkled along a thousand alternative paths. Stark describes it as "a radical project, the
opposite of reform," one "not about fixing school [but] about transforming learning" and about making traditional
school one among many options rather than the only option. Through a series of interviews with independent
learners who have reached success and happiness in fields as diverse as journalism, illustration, and molecular
biology, Stark, who herself dropped out of a graduate program at Yale despite being offered a prestigious
fellowship, cracks open the secret to defining your own success and finding your purpose outside the factory
model of formal education. She notes the patterns that emerge:
People who forgo school build their own infrastructures. They create and borrow and reinvent the best that formal
schooling has to offer, and they leave the worst behind. That buys them the freedom to learn on their own terms.
[…]
From their stories, you'll see that when you step away from the prepackaged structure of traditional education, you'll
discover that there are many more ways to learn outside school than within.
Reflecting on her own exit from academia, Stark articulates a much more broadly applicable insight:
A gracefully executed quit is a beautiful thing, opening up more doors than it closes.
But despite discovering in dismay that liberal arts graduate school is "professional school for professors,"
a profession she had no interest in, Stark did learn something immensely valuable from her third year of
independent study, during which she read about 200 books of her own choosing:
I learned how to teach myself. I had to make my own reading lists for the exams, which meant I learned how to take a
subject I was interested in and make myself a map for learning it.

The interviews revealed four key common tangents: learning is collaborative rather than done alone; the
importance of academic credentials in many professions is declining; the most fulfilling learning tends to take
place outside of school; and those happiest about learning are those who learn out of intrinsic motivation rather
than in pursuit of extrinsic rewards. The first of these insights, of course, appears on the surface to contradict
the very notion of independent learning, but Stark offers an eloquent semantic caveat:
"Independent learning" suggests ideas such as "self-taught" or "autodidact." These imply that independence means
working solo. But that's just not how it happens. People don't learn in isolation. When I talk about independent learners, I
don't mean people learning alone. I'm talking about learning that happens independent of schools.
[…]
Anyone who really wants to learn without school has to find other people to learn with and from. That's the open secret
of learning outside of school. It's a social act. Learning is something we do together.
Independent learners are interdependent learners.

Much of the argument for formal education rests on statistics indicating that people with college and graduate
degrees earn more. But those statistics, Stark notes, suffer an important and rarely heeded bias:
The problem is that this statistic is based on long-term data, gathered from a period of moderate loan debt, easy
employability, and annual increases in the value of a college degree. These conditions have been the case for college
grads for decades. Given the dramatically changed circumstances grads today face, we already know that the trends
for debt, employability, and the value of a degree have all degraded, and we cannot assume the trend toward greater
lifetime earnings will hold true for the current generation. This is a critical omission from media coverage. The fact is we
do not know. There's absolutely no guarantee it will hold true.
Some heartening evidence suggests the blind reliance on degrees might be beginning to change. Stark cites
Zappos CEO Tony Hsieh:
I haven't looked at a résumé in years. I hire people based on their skills and whether or not they are going to fit our
culture.

Another common argument for formal education extols the alleged advantages of its structure, proposing that
homework assignments, reading schedules, and regular standardized testing would motivate you to learn with
greater rigor. But, as Daniel Pink has written about the psychology of motivation, in school, as in work,
intrinsic drives far outweigh extrinsic carrot-and-stick paradigms of reward and punishment, rendering this
argument unsound. Stark writes:
Learning outside school is necessarily driven by an internal engine. [I]ndependent learners stick with the reading,
thinking, making, and experimenting by which they learn because they do it for love, to scratch an itch, to satisfy
curiosity, following the compass of passion and wonder about the world.
So how can you best fuel that internal engine of learning outside the depot of formal education? Stark offers an
essential insight, which places self-discovery at the heart of acquiring external knowledge:
Learning your own way means finding the methods that work best for you and creating conditions that support
sustained motivation. Perseverance, pleasure, and the ability to retain what you learn are among the wonderful
byproducts of getting to learn using methods that suit you best and in contexts that keep you going. Figuring out your
personal approach to each of these takes trial and error.
[…]
For independent learners, its essential to find the process and methods that match your instinctual tendencies as a
learner. Everyone I talked to went through a period of experimenting and sorting out what works for them, and they've
become highly aware of their own preferences. They're clear that learning by methods that don't suit them shuts down
their drive and diminishes their enjoyment of learning. Independent learners also find that their preferred methods are
different for different areas. So one of the keys to success and enjoyment as an independent learner is to discover how
you learn.
[…]
School isn't very good at dealing with the multiplicity of individual learning preferences, and it's not very good at helping
you figure out what works for you.

Echoing Neil deGrasse Tyson, who has argued that every child is a scientist since curiosity is coded into our
DNA, and Sir Ken Robinson, who has lamented that the industrial model of education schools us out of our
inborn curiosity, Stark observes:
Any young child you observe displays these traits. But passion and curiosity can be easily lost. School itself can be a
primary cause; arbitrary motivators such as grades leave little room for variation in students' abilities and interests, and
fail to reward curiosity itself. There are also significant social factors working against children's natural curiosity and
capacity for learning, such as family support or the lack of it, or a degree of poverty that puts families in survival mode
with little room to nurture curiosity.

Stark returns to the question of motivators that do work, once again calling to mind Pink's advocacy
of autonomy, mastery, and purpose as the trifecta of success. She writes:
[T]hree broadly defined elements of the learning experience support internal motivation and the persistence it enables.
Internal motivation relies on learners having autonomy in their learning, a progressing sense of competence in their skills
and knowledge, and the ability to learn in a concrete or real-world context rather than in the abstract. These are mostly
absent from classroom learning. Autonomy is rare, useful context is absent, and schools' means for affirming
competence often feel so arbitrary as to be almost without use and are sometimes actively demotivating. . . .
[A]utonomy means that you follow your own path. You learn what you want to learn, when and how you want to learn it,
for your own reasons. Your impetus to learn comes from within because you control the conditions of your learning
rather than working within a structure that's pre-made and inflexible.
The second thing you need to stick with learning independently is to set your own goals toward an increasing sense of
competence. You need to create a feedback loop that confirms your work is worth it and keeps you moving forward. In
school this is provided by advancing through the steps of the linear path within an individual class or a set curriculum, as
well as from feedback from grades and praise.
But Stark found that outside of school, those most successful at learning sought their sense of competence
through alternative sources. Many, as James Mangan advised in his 1936 blueprint to acquiring knowledge,
solidified their learning by teaching it to other people, increasing their own sense of mastery and deepening
their understanding. Others centered their learning around specific projects, which enabled them to make
progress more modular and thus more attainable. Another cohort cited failure as an essential part of the road to
mastery. Stark continues:
The third thing [that] can make or break your ability to sustain internal motivation is to situate what youre learning in a
context that matters to you. In some cases, the context is a specific project you want to accomplish, which also
functions to support your sense of progress.
She sums up the failings of the establishment:
School is not designed to offer these three conditions; autonomy and context are sorely lacking in classrooms. School
can provide a sense of increasing mastery, via grades and moving from introductory classes to harder ones. But a
sense of true competence is harder to come by in a school environment. Fortunately, there are professors in higher
education who are working to change the motivational structures that underlie their curricula.

The interviews, to be sure, offer a remarkably diverse array of callings, underpinned by a number of shared
values and common characteristics. Computational biologist Florian Wagner, for instance, echoes Steve
Jobs's famous words on the secret of life in articulating a sentiment shared by many of the other interviewees:
There is something really special about when you first realize you can figure out really cool things completely on your
own. That alone is a valuable lesson in life.
Investigative journalist Quinn Norton subscribes to Mangan's prescription for learning by teaching:
I ended up teaching [my] knowledge to others at the school. That's one of my most effective ways to learn, by teaching;
you just have to stay a week ahead of your students. Everything I learned, I immediately turned around and taught to
others.
She also used the gift of ignorance to proactively drive her knowledge forward:
When I wanted to learn something new as a professional writer, I'd pitch a story on it. I was interested in neurology, and I
figured, why don't I start interviewing neurologists? The great thing about being a journalist is that you can pick up the
phone and talk to anybody. It was just like what I found out about learning from experts on mailing lists. People like to
talk about what they know.
Norton speaks to the usefulness of useless knowledge, not only in one's own intellectual development but also
as social currency:
I'm stuffed with trivial, useless knowledge, on a panoply of bizarre topics, so I can find something that they're interested
in that I know something about. Being able to do that is tremendously socially valuable. The exchange of knowledge is a
very human way to learn. I try never to walk into a room where I want to get information without knowing what I'm
bringing to the other person.
[…]
I think part of the problem with the usual mindset of the student is that it's like being a sponge. It's passive. It's not about
having something to bring to the interaction. People who are experts in things are experts because they like learning.

Software engineer, artist, and University of Texas molecular biologist Zack Booth Simpson speaks to the value
of cultivating what William Gibson has called a "personal micro-culture" and learning from the people with
whom you surround yourself:
In a way, the best education you can get is just talking with people who are really smart and interested in things, and you
can get that for the cost of lunch.
Artist Molly Crabapple, who inked this beautiful illustration of Salvador Dalí's creative credo and live-sketched Susan Cain's talk on the power of introverts, recalls how self-initiated reading shaped her life:
I was a constant reader. At home, I lived next to this thrift store that sold paperbacks for 10¢ apiece, so I would go and buy massive stacks of paperback books on everything. Everything from trashy 1970s romance novels to Plato. When I went to Europe, I brought with me every single book that I didn't think I would read voluntarily, because I figured if I was on a bus ride, I would read them. So I read Plato and Dante's Inferno, and all types of literature. I got my education on the bus.
Originally featured in May; read more here.
* * *
For a subject-specific lens on the year's finest reading, revisit the best-of reading lists, which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children's literature, and pets and animals.
7 Essential Books on Music, Emotion, and the Brain
by Maria Popova
What Freud has to do with auditory cheesecake, European opera and
world peace.
Last year, Horizon's fascinating documentary on how music works was one of our most-liked pickings of 2010. But perhaps even more fascinating than the subject of how music works is the question of why it makes us feel the way it does. Today, we try to answer it with seven essential books that bridge music, emotion and cognition, peeling away at that tender intersection where your brain ends and your soul begins.
MUSICOPHILIA
We love the work of neuroscientist and prolific author Oliver Sacks, whose latest book, The Mind's Eye, was one of our favorite brain books last year. But some of his most compelling work has to do with the neuropsychology of how music can transform our cognition, our behavior, and our very selves. In Musicophilia: Tales of Music and the Brain, Revised and Expanded Edition, Sacks explores the most extreme of these transformations and how simple harmonies can profoundly change lives. From clinical studies to examples from pop culture (did you know that Ray Charles believed he was "born with the music inside [him]"?), Sacks delivers a fascinating yet remarkably readable tale that tells the story, our story, of humanity as a truly musical species.
THIS IS YOUR BRAIN ON MUSIC
Why music makes us feel the way it does is on par with questions about the nature of divinity or the origin of love. In This Is Your Brain on Music: The Science of a Human Obsession, Daniel Levitin sets out to answer it, an ambitious task he tackles through a range of lenses, from a digestible explanation of key technical constructs like scale, tone and timbre to compelling cross-disciplinary reflections spanning neurobiology, philosophy, cognitive psychology, memory theory, behavioral science, Gestalt psychology and more. He illuminates diverse subjects like what accounts for the diversity of musical tastes and what makes a music expert, framing music processing as a fundamental cognitive function embedded in human nature. Most impressively, however, Levitin manages to do this without subtracting from the intuitive, intangible magic of powerful music, dissecting its elements with the rigor of a researcher while preserving its magnetism with the tenderness of a music lover.
Never ones to pass up a good ol' fashioned erudite throw-down, we can't resist pointing out that the book's final chapter, The Music Instinct, may be the juiciest: It's a direct response to Harvard psycholinguist Steven Pinker, who in a 1997 talk famously called music "auditory cheesecake" and dismissed it as evolutionarily useless, displacing demands from areas of the brain that should be handling more important functions like language. (Obviously, as much as we love Pinker, we think he's dead wrong.) Levitin debunks this contention with a mighty arsenal of research across anthropology, history and cognitive science, alongside chuckle-worthy pop culture examples. (It's safe to assume that it was musical talent, rather than any other, erm, evolutionary advantage, that helped Mick Jagger propagate his genes.)
MUSIC, LANGUAGE, AND THE BRAIN
As if to drive a stake through the heart of Levitin and Pinker's debate, Music, Language, and the Brain by Aniruddh Patel, both a musician himself and one of the greatest living neuroscientists, dissects the unique neuropsychological relationship between two of the most distinctive hallmarks of our species. Rigorously researched and absorbingly narrated, the book traces the origins of humanity's understanding of this correlation, dating as far back as the philosophical debates of Ancient Greece, and challenges the scientific community's longstanding assumption that music and language evolved independently of one another. It's the kind of read that will leave you at once astounded by how much you've learned about its subject and keenly aware of how little you, how little we as a culture, know about it.
Patel also offers this beautiful definition of what music is:
Sound organized in time, intended for, or perceived as, aesthetic experience.
It's worth noting that Music, Language, and the Brain makes a fine addition to our list of 5 must-read books about language.
LISTEN TO THIS
In 2008, New Yorker music critic Alex Ross published The Rest Is Noise: Listening to the Twentieth Century, a remarkable historical and social context for contemporary music, which went on to become one of the most influential music history books ever written. Last fall, Ross released his highly anticipated sequel: Listen to This, an outstanding effort to explain and understand the world through its musical proclivities, from European opera to Chinese classical music to Björk. Though the book, an anthology of the author's most acclaimed essays with a deeper focus on classical music, is further removed from neuroscience than the rest on this list, Ross's astute observations on the emotional and social experience of music make it an indispensable addition nonetheless.
MUSIC, THE BRAIN AND ECSTASY
If the human voice is the greatest instrument, as the widespread music-teacher preaching goes, then the brain is the greatest composer. Every time we perform, compose or merely listen to music, the brain plays high-level Tetris with a range of devices, harmonies and patterns, creating emotional meaning out of the elements of sound and often extracting intense pleasure. In Music, The Brain, And Ecstasy: How Music Captures Our Imagination, composer Robert Jourdain examines music's unusual emotive power through little-known facts, physiological phenomena and historical anecdotes. Perhaps most fascinatingly, he pins down the origin of pleasure in music as a consequence of a series of tonal deviations that create a conflict in the brain, resolved with a return to the tonal center, which gives us a sensation of bliss. This sequence of conflict and resolution, he explains, can come from the four key elements of music: rhythm, melody, phrase, and harmony. Ecstasy is the result of a resolution that comes once a conflict has reached the limit of the listener's comprehension ability in tonal space-time.
THE TAO OF MUSIC
Traditional self-help books are the pesky cold sore swapped between the lips of legitimate literature and serious psychology. And then there are the books that actually help the self in smart, non-pedantic ways involving no worksheets or mirror nodding. That's exactly what John Ortiz does in The Tao of Music: Sound Psychology, blending the extraordinary power of music with the principles of Taoist philosophy to deliver an unusual yet captivating proposition: You can enlist your music library in improving your performance and state of mind across everyday challenges like keeping anger at bay, breaking the spell of procrastination, learning to be fully present in romantic relationships, and mastering the art of true relaxation. Through cognitive-behavioral exercises, meditative techniques and melodic visualizations, Ortiz offers a powerful music-driven toolkit for navigating life's obstacles, and even curates specific musical menus of songs and melodies that target specific emotional states and psychological dispositions.
MUSIC AND THE MIND
Nearly two decades after its original publication, Anthony Storr's Music and the Mind remains an essential and timeless prism for looking at one of humanity's greatest treasures. From the biological basis of cognition to a thoughtful analysis of the views held by history's greatest philosophers to the evolution of the Western tonal system, Storr addresses some of the most fundamental questions about music, like why a minor scale always sounds sad and a major scale happy, and offers an evidence-backed yet comfortingly human grand theory for the very purpose of music: "Peace, resolution and serenity of spirit."
Words on Words: 5 Timelessly Stimulating Books
About Language
by Maria Popova
What single Chinese men have to do with evolution and insults from
Virginia Woolf.
We love, love, love words and language. And what better way to celebrate them than through the written word
itself? Today, we turn to five of our favorite books on language, spanning the entire spectrum from serious
science to serious entertainment value.
THE STUFF OF THOUGHT
Harvard's Steven Pinker is easily the world's most prominent and prolific psycholinguist, whose multi-faceted work draws on visual cognition, evolutionary science, developmental psychology and computational theory of mind to explain the origin and function of language. The Stuff of Thought: Language as a Window into Human Nature reverse-engineers our relationship with language, exploring what the words we use reveal about the way we think. The book is structured into chapters, each looking at a different tool we use to manage information flow, from naming to swearing and politeness to metaphor and euphemism. From Shakespeare to pop songs, Pinker uses a potent blend of digestible examples and empirical evidence to distill the fundamental fascination of language: What we mean when we say.
Sample The Stuff of Thought with Pinker's fantastic 2007 TED talk:
THE SNARK HANDBOOK
In 2009, The Snark Handbook: A Reference Guide to Verbal Sparring became an instant favorite with its enlightening and entertaining compendium of history's greatest masterpieces in the art of mockery, contextualizing today's era of snark-humor and equipping us with the shiniest verbal armor to thrive as victor knights in it. Last year, author Lawrence Dorfman released a worthy sequel: The Snark Handbook: Insult Edition: Comebacks, Taunts, and Effronteries, a linguistic arsenal full of strategic instructions on how and when to throw the jabs of well-timed snark alongside a well-curated collection of history's most skilled literary insult-maestros.
Every time I read Pride and Prejudice, I want to dig her up and hit her over the skull with her own shin-bone. ~ Mark Twain on Jane Austen
It's a new low for actresses when you have to wonder what's between her ears instead of her legs. ~ Katharine Hepburn on Sharon Stone
I am reading Henry James and feel myself as one entombed in a block of smooth amber. ~ Virginia Woolf on Henry James
He was a great friend of mine. Well, as much as you could be a friend of his, unless you were a fourteen-year-old nymphet. ~ Capote on Faulkner
Ultimately, the book is the yellow brick road to what, deep down, you know you always knew you were:
Better than everybody else. (Read our full review here.)
KEYWORDS
Originally published in 1976 by legendary Welsh novelist and critic Raymond Williams, Keywords: A Vocabulary of Culture and Society offers a fascinating and timeless lens on language from a cultural rather than etymological standpoint, examining the history of over 100 familiar yet misunderstood or ambiguous words, from "art" to "nature" to "welfare" to "originality."
The book begins with an essay on "culture" itself, dissecting the historical development and social appropriation of this ubiquitous and far-reaching semantic construct. It paints a living portrait of the constant transformation of culture as reflected in natural language. So seminal was Williams' work that in 2005, Blackwell attempted an ambitious update to his text in New Keywords: A Revised Vocabulary of Culture and Society.
IN OTHER WORDS
As beautiful as the English language may be, it isn't without insufficiencies. C. J. Moore curates the most poetic of them, rich words and phrases from other languages that don't have an exact translation in English, but convey powerful, deeply human concepts, often unique to the experience of the culture from which they came. (For instance, in Tierra del Fuego there is a specific word, mamihlapinatapei, for that expressive, meaningful romantic silence between two people. And in China, gagung literally means "bare sticks" but signifies the growing population of men who will remain unmarried because China's one-child policy and unabashed preference for male progeny have reduced the proportion of women.)
Witty and illuminating, the book covers 10 different types of languages spanning various eras and locales, from ancient and classical to indigenous to African to Scandinavian, digging to find the precious meanings lost in translation.
IM NOT HANGING NOODLES ON YOUR EARS
From researcher Jag Bhalla comes I'm Not Hanging Noodles on Your Ears and Other Intriguing Idioms From Around the World, an entertaining piece of linguistic tourism, exploring how different cultures construct their worldview through the nuances of language.
The book is divided into different themes, from food to love to just about everything in between, that reveal specific cultural dispositions towards these subjects through the language in which they are framed.
And on a semi-aside, @hangingnoodles is a must-follow on Twitter, a treasure trove of interestingness at the intersection of science and culture.
J. R. R. Tolkien on Fairy Tales, Language, the Psychology of Fantasy, and Why There's No Such Thing as Writing for Children
by Maria Popova
Creative fantasy, because it is mainly trying to do something else … may open your hoard and let all the locked things fly away like cage-birds.
"I do not believe that I have ever written a children's book," the great Maurice Sendak once said in an interview. "I don't write for children," he told Colbert. "I write and somebody says, 'That's for children!'" This sentiment, the idea that designating certain types of literature as children's is a choice entirely arbitrary and entirely made by adults, has since been eloquently echoed by Neil Gaiman, but isn't, in fact, a new idea.
On March 8, 1939, J. R. R. Tolkien, celebrated as one of the greatest fantasy writers in history, gave a lecture titled Fairy Stories, eventually adapted into an essay retitled On Fairy-Stories and included in the appendix to Tales from the Perilous Realm (public library). At the crux of his argument, which explores the nature of fantasy and the cultural role of fairy tales, is the same profound conviction that there is no such thing as writing for children.
J. R. R. Tolkien's original illustrations for the first edition of The Hobbit, 1936. Click image for details.
Tolkien begins at the beginning, by defining what a fairy tale is:
A fairy-story is one which touches on or uses Faerie, whatever its own main purpose may be: satire, adventure, morality, fantasy. Faerie itself may perhaps most nearly be translated by Magic, but it is magic of a peculiar mood and power, at the furthest pole from the vulgar devices of the laborious, scientific, magician. There is one proviso: if there is any satire present in the tale, one thing must not be made fun of, the magic itself. That must in that story be taken seriously, neither laughed at nor explained away.
Illustration for Hans Christian Andersen's 'The Snow Queen' by Katharine Beverley and Elizabeth
Ellender, 1929. Click image for details.
He then explores the relationship between fairy tales and language, denouncing Max Müller's view of mythology as a "disease of language":
Mythology is not a disease at all, though it may like all human things become diseased. You might as well say that
thinking is a disease of the mind. It would be more near the truth to say that languages, especially modern European
languages, are a disease of mythology. But Language cannot, all the same, be dismissed. The incarnate mind, the
tongue, and the tale are in our world coeval. The human mind, endowed with the powers of generalization and
abstraction, sees not only green-grass, discriminating it from other things (and finding it fair to look upon), but sees that it
is green as well as being grass. But how powerful, how stimulating to the very faculty that produced it, was the invention
of the adjective: no spell or incantation in Faerie is more potent. And that is not surprising: such incantations might
indeed be said to be only another view of adjectives, a part of speech in a mythical grammar. The mind that thought of
light, heavy, grey, yellow, still, swift, also conceived of magic that would make heavy things light and able to fly, turn grey
lead into yellow gold, and the still rock into a swift water. If it could do the one, it could do the other; it inevitably did both.
When we can take green from grass, blue from heaven, and red from blood, we have already an enchanter's power upon one plane; and the desire to wield that power in the world external to our minds awakes. It does not follow that we shall use that power well upon any plane. We may put a deadly green upon a man's face and produce a horror; we
may make the rare and terrible blue moon to shine; or we may cause woods to spring with silver leaves and rams to
wear fleeces of gold, and put hot fire into the belly of the cold worm. But in such fantasy, as it is called, new form is
made; Faerie begins; Man becomes a sub-creator.
Illustration for Hans Christian Andersen's 'The Darning Needle' by Maurice Sendak, 1959. Click image
for details.
Like Sendak and Gaiman, Tolkien insists that fairy tales aren't inherently for children but that we, as adults, simply decide that they are, based on a series of misconceptions about both the nature of this literature and the nature of children:
It is usually assumed that children are the natural or the specially appropriate audience for fairy-stories. In describing a fairy-story which they think adults might possibly read for their own entertainment, reviewers frequently indulge in such waggeries as: "this book is for children from the ages of six to sixty." But I have never yet seen the puff of a new motor-model that began thus: "this toy will amuse infants from seventeen to seventy"; though that to my mind would be much more appropriate. Is there any essential connexion between children and fairy-stories? Is there any call for comment, if an adult reads them for himself? Reads them as tales, that is, not studies them as curios. Adults are allowed to collect and study anything, even old theatre programmes or paper bags.
[…]
Among those who still have enough wisdom not to think fairy-stories pernicious, the common opinion seems to be that
there is a natural connexion between the minds of children and fairy-stories, of the same order as the connexion
between childrens bodies and milk. I think this is an error; at best an error of false sentiment, and one that is therefore
most often made by those who, for whatever private reason (such as childlessness), tend to think of children as a
special kind of creature, almost a different race, rather than as normal, if immature, members of a particular family, and
of the human family at large.

Illustration for Howard Pyle's 'The Swan Maiden' by Alice and Martin Provensen, 1971. Click image for
details.
He argues, instead, that the stereotype of fairy tales being associated with children and native to their world is
an accident of our domestic history:
Fairy-stories have in the modern lettered world been relegated to the nursery, as shabby or old-fashioned furniture is
relegated to the play-room, primarily because the adults do not want it, and do not mind if it is misused. It is not the
choice of the children which decides this. Children as a class (except in a common lack of experience they are not one) neither like fairy-stories more, nor understand them better than adults do; and no more than they like many other
things. They are young and growing, and normally have keen appetites, so the fairy-stories as a rule go down well
enough. But in fact only some children, and some adults, have any special taste for them; and when they have it, it is not
exclusive, nor even necessarily dominant. It is a taste, too, that would not appear, I think, very early in childhood without
artificial stimulus; it is certainly one that does not decrease but increases with age, if it is innate.
[…]
The nursery and schoolroom are merely given such tastes and glimpses of the adult thing as seem fit for them in adult
opinion (often much mistaken). Any one of these things would, if left altogether in the nursery, become gravely impaired.
So would a beautiful table, a good picture, or a useful machine (such as a microscope), be defaced or broken, if it were
left long unregarded in a schoolroom. Fairy-stories banished in this way, cut off from a full adult art, would in the end be
ruined; indeed in so far as they have been so banished, they have been ruined.
Tolkien then moves on to the subject of fantasy, a frequently misunderstood faculty of the imagination:
The mental power of image-making is one thing, or aspect; and it should appropriately be called Imagination. The
perception of the image, the grasp of its implications, and the control, which are necessary to a successful expression,
may vary in vividness and strength: but this is a difference of degree in Imagination, not a difference in kind. The
achievement of the expression, which gives (or seems to give) the inner consistency of reality, is indeed another thing,
or aspect, needing another name: Art, the operative link between Imagination and the final result, Sub-creation. For my
present purpose I require a word which shall embrace both the Sub-creative Art in itself and a quality of strangeness
and wonder in the Expression, derived from the Image: a quality essential to fairy-story. I propose, therefore, to arrogate
to myself the powers of Humpty-Dumpty, and to use Fantasy for this purpose: in a sense, that is, which combines with
its older and higher use as an equivalent of Imagination the derived notions of unreality (that is, of unlikeness to the
Primary World), of freedom from the domination of observed fact, in short of the fantastic. I am thus not only aware but
glad of the etymological and semantic connexions of fantasy with fantastic: with images of things that are not only not
actually present, but which are indeed not to be found in our primary world at all, or are generally believed not to be
found there. But while admitting that, I do not assent to the depreciative tone. That the images are of things not in the
primary world (if that indeed is possible) is a virtue, not a vice. Fantasy (in this sense) is, I think, not a lower but a higher
form of Art, indeed the most nearly pure form, and so (when achieved) the most potent.
Illustration for the fairy tales of Hans Christian Andersen by Japanese artist Takeo Takei, 1928. Click
image for details.
He goes on to argue that, despite the many misconceptions that envelop it, fantasy is far more challenging an
art than nonfiction, for it necessitates the creation of an elaborate, immersive world from scratch, without the
crutch of reality:
Fantasy is difficult to achieve. Fantasy may be, as I think, not less but more sub-creative; but at any rate it is found in
practice that the inner consistency of reality is more difficult to produce, the more unlike are the images and the
rearrangements of primary material to the actual arrangements of the Primary World. It is easier to produce this kind of
reality with more sober material. Fantasy thus, too often, remains undeveloped; it is and has been used frivolously, or
only half-seriously, or merely for decoration: it remains merely fanciful. Anyone inheriting the fantastic device of human
language can say the green sun. Many can then imagine or picture it. But that is not enough, though it may already be a more potent thing than many a thumbnail sketch or transcript of life that receives literary praise.
To make a Secondary World inside which the green sun will be credible, commanding Secondary Belief, will probably
require labour and thought, and will certainly demand a special skill, a kind of elvish craft. Few attempt such difficult
tasks. But when they are attempted and in any degree accomplished then we have a rare achievement of Art: indeed
narrative art, story-making in its primary and most potent mode.
Scandinavian fairy tale illustration by Kay Nielsen, 1914. Click image for details.
Tolkien makes a curious argument about the oil-and-water relationship between fantasy and drama, managing
to slip in a subtle dig at none other than The Bard:
In human art Fantasy is a thing best left to words, to true literature. It is a misfortune that Drama, an art fundamentally
distinct from Literature, should so commonly be considered together with it, or as a branch of it. Among these
misfortunes we may reckon the depreciation of Fantasy. For in part at least this depreciation is due to the natural desire
of critics to cry up the forms of literature or imagination that they themselves, innately or by training, prefer. And criticism
in a country that has produced so great a Drama, and possesses the works of William Shakespeare, tends to be far too
dramatic. But Drama is naturally hostile to Fantasy. Fantasy, even of the simplest kind, hardly ever succeeds in Drama,
when that is presented as it should be, visibly and audibly acted. Fantastic forms are not to be counterfeited. Men
dressed up as talking animals may achieve buffoonery or mimicry, but they do not achieve Fantasy. . . .
In Macbeth, when it is read, I find the witches tolerable: they have a narrative function and some hint of dark significance;
though they are vulgarized, poor things of their kind. They are almost intolerable in the play. They would be quite
intolerable, if I were not fortified by some memory of them as they are in the story as read. I am told that I should feel
differently if I had the mind of the period, with its witch-hunts and witch-trials. But that is to say: if I regarded the witches as
possible, indeed likely, in the Primary World; in other words, if they ceased to be Fantasy. That argument concedes the
point. To be dissolved, or to be degraded, is the likely fate of Fantasy when a dramatist tries to use it, even such a
dramatist as Shakespeare. Macbeth is indeed a work by a playwright who ought, at least on this occasion, to have
written a story, if he had the skill or patience for that art.
Illustration for The Fairy Tales of E. E. Cummings by John Eaton, 1965. Click image for details.
Another misconception Tolkien debunks, speaking to Susan Sontag's conviction that polarities only rob life of dimension, is the notion that the fantastical is somehow diametrically opposed to the rational:
Fantasy is a natural human activity. It certainly does not destroy or even insult Reason; and it does not either blunt the
appetite for, nor obscure the perception of, scientific verity. On the contrary. The keener and the clearer is the reason,
the better fantasy will it make. If men were ever in a state in which they did not want to know or could not perceive truth
(facts or evidence), then Fantasy would languish until they were cured. If they ever get into that state (it would not seem
at all impossible), Fantasy will perish, and become Morbid Delusion.
For creative Fantasy is founded upon the hard recognition that things are so in the world as it appears under the sun; on
a recognition of fact, but not a slavery to it. So upon logic was founded the nonsense that displays itself in the tales and
rhymes of Lewis Carroll. If men really could not distinguish between frogs and men, fairy-stories about frog-kings would
not have arisen.
Returning to his notion of the Secondary World driven by Secondary Belief, Tolkien contributes to history's greatest definitions of art:
Art is the human process that produces by the way (it is not its only or ultimate object) Secondary Belief.
Illustration for Seamus MacManus's 'Feather O' My Wing' by Alice and Martin Provensen, 1971. Click
image for details.
He then adds to the psychological functions of art by exploring the psychological functions of fairy tales, chief
among which is their capacity for rebooting our chronically blunted attention:
Recovery (which includes return and renewal of health) is a re-gaining, a regaining of a clear view. I do not say "seeing things as they are" and involve myself with the philosophers, though I might venture to say "seeing things as we are (or were) meant to see them", as things apart from ourselves. We need, in any case, to clean our windows; so that the things seen clearly may be freed from the drab blur of triteness or familiarity, from possessiveness. Of all faces those of our familiares are the ones both most difficult to play fantastic tricks with, and most difficult really to see with fresh attention, perceiving their likeness and unlikeness: that they are faces, and yet unique faces. This triteness is really the penalty of appropriation: the things that are trite, or (in a bad sense) familiar, are the things that we have appropriated, legally or mentally. We say we know them. They have become like the things which once attracted us by their glitter, or their colour, or their shape, and we laid hands on them, and then locked them in our hoard, acquired them, and acquiring ceased to look at them.
[…]
Creative fantasy, because it is mainly trying to do something else (make something new), may open your hoard and let
all the locked things fly away like cage-birds.
Illustration for The Fairy Tales of E. E. Cummings by John Eaton, 1965. Click image for details.
The full fifteen-page essay, as well as the rest of Tales from the Perilous Realm, is well worth a read. Complement it with Tolkien's little-known illustrations for the first edition of The Hobbit.
The 11 Best Psychology and Philosophy Books of
2011
by Maria Popova
What it means to be human, how pronouns are secretly shaping our
lives, and why we believe.
After the year's best children's books, art and design books, photography books, science books, history books, and food books, the 2011 best-of series continues with the most compelling, provocative and thought-provoking psychology and philosophy books featured here this year.
YOU ARE NOT SO SMART
We spend most of our lives going around believing we are rational, logical beings who make carefully weighted decisions based on objective facts in stable circumstances. Of course, as both a growing body of research and our own retrospective experience demonstrate, this couldn't be further from the truth. For the past three years, David McRaney's cheekily titled yet infinitely intelligent You Are Not So Smart has been one of my favorite smart blogs, tirelessly debunking the many ways in which our minds play tricks on us and the false interpretations we have of those trickeries. This month, YANSS joins my favorite blog-turned-book success stories with You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You're Deluding Yourself, an illuminating, just-the-right-magnitude-of-uncomfortable almanac of some of the most prevalent and enduring lies we tell ourselves.
The original trailer for the book deals with something the psychology of which we've previously explored: procrastination:
And this excellent alternative trailer is a straight shot to our favorite brilliant book trailers:
From confirmation bias (our tendency to seek out information, whether or not it's true, that confirms our existing beliefs, something all the more perilous in the age of the filter bubble) to Dunbar's Number (our evolution-imposed upper limit of 150 friends, which calls into question those common multi-hundred Facebook friendships), McRaney blends the rigor of his career as a journalist with his remarkable penchant for synthesis, humanizing some of the most important psychology research of the past century and framing it in the context of our daily lives.
Despite his second-person directive narrative, McRaney manages to keep his tone from being preachy or patronizing, instead weaving an implicit 'we' into his 'you' to encompass all our shared human fallibility.
From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived
notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company. No matter
who your idols and mentors are, they too are prone to spurious speculation. ~ David McRaney
And in the age of Books That Should've Stayed Articles, it's refreshing to see McRaney distill each of these complex phenomena into articulate, lucid narratives just the right length to be stimulating without being tediously prolix.
Originally featured in November.
MONOCULTURE
"The universe is made of stories, not atoms," poet Muriel Rukeyser famously proclaimed. The stories we tell ourselves and each other are how we make sense of the world and our place in it. Some stories become so sticky, so pervasive that we internalize them to a point where we no longer see their story-ness: they become not one of many lenses on reality, but reality itself. And breaking through them becomes exponentially difficult because part of our shared human downfall is our ego's blind conviction that we're autonomous agents acting solely on our own volition, rolling our eyes at any insinuation we might be influenced by something external to our selves. Yet we are, and were, infinitely influenced by these stories we've come to internalize, stories we've heard and repeated so many times they've become the invisible underpinning of our entire lived experience.
That's exactly what F. S. Michaels explores in Monoculture: How One Story Is Changing Everything, a provocative investigation of the dominant story of our time and how it's shaping six key areas of our lives: our work, our relationships with others and the natural world, our education, our physical and mental health, our communities, and our creativity.
The governing pattern a culture obeys is a master story: one narrative in society that takes over the others, shrinking diversity and forming a monoculture. When you're inside a master story at a particular time in history, you tend to accept its definition of reality. You unconsciously believe and act on certain things, and disbelieve and fail to act on other things. That's the power of the monoculture; it's able to direct us without us knowing too much about it. ~ F. S. Michaels
During the Middle Ages, the dominant monoculture was one of religion and superstition. When Galileo challenged the Catholic Church's geocentricity with his heliocentric model of the universe, he was accused of heresy and punished accordingly, but he did spark the dawn of the next monoculture, which reached a tipping point in the seventeenth century as humanity came to believe the world was fully knowable and discoverable through science, machines and mathematics: the scientific monoculture was born.

Ours, Michaels demonstrates, is a monoculture shaped by economic values and assumptions, and it shapes everything from the obvious things (our consumer habits, the music we listen to, the clothes we wear) to the less obvious ones we are more uncomfortable relinquishing the belief of autonomy over (our relationships, our religion, our appreciation of art).
A monoculture doesn't mean that everyone believes exactly the same thing or acts in exactly the same way, but that we end up sharing key beliefs and assumptions that direct our lives. Because a monoculture is mostly left unarticulated until it has been displaced years later, we learn its boundaries by trial and error. We somehow come to know how the master story goes, though no one tells us exactly what the story is or what its rules are. We develop a strong sense of what's expected of us at work, in our families and communities, even if we sometimes choose not to meet those expectations. We usually don't ask ourselves where those expectations came from in the first place. They just exist, or they do until we find ourselves wishing things were different somehow, though we can't say exactly what we would change, or how. ~ F. S. Michaels
Neither a dreary observation of all the ways in which our economic monoculture has thwarted our ability to
live life fully and authentically nor a blindly optimistic sticking-it-to-the-man kumbaya, Michaels offers a
smart and realistic guide to first recognizing the monoculture and the challenges of transcending its limitations,
then considering ways in which we, as sentient and autonomous individuals, can move past its confines to live
a more authentic life within a broader spectrum of human values.

The independent life begins with discovering what it means to live alongside the monoculture, given your particular
circumstances, in your particular life and time, which will not be duplicated for anyone else. Out of your own struggle to
live an independent life, a parallel structure may eventually be birthed. But the development and visibility of that parallel
structure is not the goal the goal is to live many stories, within a wider spectrum of human values. ~ F. S. Michaels
We've previously examined various aspects of this dominant story (why we choose what we choose, how the media's filter bubble shapes our worldview, why we love whom and how we love, how money came to rule the world), but Monoculture, which comes from the lovely Red Clover, weaves these threads and many more into a single lucid narrative that's bound to first make you somewhat uncomfortable and insecure, then give you the kind of pause from which you can step back and move forward with more autonomy, authenticity and mindfulness than ever.
The book's epilogue captures Michaels' central premise in the most poetic and beautiful way possible:
Once we've thrown off our habitual paths, we think all is lost; but it's only here that the new and the good begins. ~ Leo Tolstoy
Originally featured in September.
THINKING, FAST AND SLOW
Legendary Israeli-American psychologist Daniel Kahneman is one of the most influential thinkers of our time. A Nobel laureate and founding father of modern behavioral economics, Kahneman has shaped how we think about human error, risk, judgment, decision-making, happiness, and more. For the past half-century, he has profoundly impacted the academy and the C-suite, but it wasn't until this year's highly anticipated release of his intellectual memoir, Thinking, Fast and Slow, that Kahneman's extraordinary contribution to humanity's cerebral growth reached the mainstream in the best way possible.
Absorbingly articulate and infinitely intelligent, the book introduces what Kahneman calls the machinery of the mind: the dual processor of the brain, divided into two distinct systems that dictate how we think and make decisions. One is fast, intuitive, reactive, and emotional. (If you've read Jonathan Haidt's excellent The Happiness Hypothesis, as you should have, this system maps roughly to the metaphor of the elephant.) The other is slow, deliberate, methodical, and rational. (That's Haidt's rider.)

The mind functions thanks to a delicate, intricate, sometimes difficult osmotic balance between the two
systems, a push and pull responsible for both our most remarkable capabilities and our enduring flaws. From
the role of optimism in entrepreneurship to the heuristics of happiness to our propensity for error, Kahneman
covers an extraordinary scope of cognitive phenomena to reveal a complex and fallible yet, somehow
comfortingly so, understandable machine we call consciousness.
Much of the discussion in this book is about biases of intuition. However, the focus on error does not denigrate human intelligence, any more than the attention to diseases in medical texts denies good health. [My aim is to] improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them. ~ Daniel Kahneman
Among the book's most fascinating facets are the notions of the experiencing self and the remembering self, underpinning the fundamental duality of the human condition: one voiceless and immersed in the moment, the other occupied with keeping score and learning from experience.
I am my remembering self, and the experiencing self, who does my living, is like a stranger to me. ~ Daniel
Kahneman
Kahneman spoke of these two selves and the cognitive traps around them in his fantastic 2010 TED talk:
The word 'happiness' is just not a useful word anymore because we apply it to too many different things.
What's most enjoyable and compelling about Thinking, Fast and Slow is that it's so utterly, refreshingly anti-Gladwellian. There is nothing pop about Kahneman's psychology, no formulaic story arc, no beating you over the head with an artificial, buzzword-encrusted Big Idea. It's just the wisdom that comes from five decades of honest, rigorous scientific work, delivered humbly yet brilliantly, in a way that will forever change the way you think about thinking.
Originally featured in October.
THE SECRET LIFE OF PRONOUNS
We're social beings wired for communicating with one another, and as new modes and platforms of communication become available to us, so do new ways of understanding the complex patterns, motivations and psychosocial phenomena that underpin that communication. That's exactly what social psychologist and language expert James W. Pennebaker explores in The Secret Life of Pronouns: What Our Words Say About Us, a fascinating look at what Pennebaker's groundbreaking research in computational linguistics reveals about our emotions, our sense of self, and our perception of our belonging in society. Analyzing the subtle linguistic patterns in everything from Craigslist ads to college admission essays to political speeches to Lady Gaga lyrics, Pennebaker offers hard evidence for the insight that our most unmemorable words (pronouns, prepositions, prefixes) can be most telling of true sentiment and intention.
Both a fascinating slice of human psychology and a practical toolkit for deciphering our everyday email exchanges, tweets and Facebook statuses, the research looks at what our choice of words like 'I,' 'she,' 'mine' and 'who' reveals about our deeper thoughts, emotions and motivations, and those of the people with whom we communicate.
One of the most interesting results was part of a study my students and I conducted dealing with status in email
correspondence. Basically, we discovered that in any interaction, the person with the higher status uses I-words less
(yes, less) than people who are low in status. ~ James Pennebaker

Like much of scientific discovery, Pennebaker's interest in pronouns began as a complete fluke: in the 1980s, he and his students discovered that when asked to write about emotional upheavals, people's physical health improved, indicating that putting emotional experiences into language changed the ways people thought about their upheavals. They eventually developed a computerized text analysis program to examine how language use might predict later health improvements, trying to find out whether there was a healthy way to write. To his surprise, the greatest predictor of health was people's choice of pronouns.
Scientific American has an excellent interview with Pennebaker:
As I pondered these findings, I started looking at how people used pronouns in other texts: blogs, emails, speeches, class writing assignments, and natural conversation. Remarkably, how people used pronouns was correlated with almost everything I studied. For example, use of first-person singular pronouns (I, me, my) was consistently related to gender, age, social class, honesty, status, personality, and much more. Although the findings were often robust, people in daily life were unable to pick them up when reading or listening to others. It was almost as if there was a secret world of pronouns that existed outside our awareness. ~ James Pennebaker
From gender differences that turn everything you know on its head to an analysis of the language of suicidal
vs. non-suicidal poets to unexpected insights into famous historical documents, The Secret Life of
Pronouns gleans insights with infinite applications, from government-level lie-detection to your everyday
email inbox, and makes a fine addition to these 5 essential books on language.
Originally featured in September.
INCOGNITO
Sum: Forty Tales from the Afterlives by neuroscientist David Eagleman is one of my favorite books of the past few years, so I was thrilled for the release of Eagleman's latest gem, Incognito: The Secret Lives of the Brain, a fascinating, dynamic, faceted look under the hood of the conscious mind to reveal the complex machinery of the subconscious. Equal parts entertaining and illuminating, the book's case studies, examples, and insight are more than mere talking points to impress with at the next dinner party, poised instead to radically shift your understanding of the world, other people, and your own mind.
Bringing a storyteller's articulate and fluid narrative to a scientist's quest, Eagleman dances across an incredible spectrum of issues (brain damage, dating, drugs, beauty, synesthesia, criminal justice, artificial intelligence, optical illusions and much more) to reveal that things we take as passive givens, from our capacity for seeing a rainbow to our ability to overhear our name in a conversation we weren't paying attention to, are the function of remarkable neural circuitry, biological wiring and cognitive conditioning.
The three-pound organ in your skull, with its pink consistency of Jell-o, is an alien kind of computational material. It is composed of miniaturized, self-configuring parts, and it vastly outstrips anything we've dreamt of building. So if you ever feel lazy or dull, take heart: you're the busiest, brightest thing on the planet. ~ David Eagleman
Sample some of Eagleman's fascinating areas of study with this excellent talk from TEDxAlamo:
Originally featured in June.
WHAT DOES IT MEAN TO BE HUMAN?
Last year, we explored what it means to be human from the perspectives of three different disciplines (philosophy, neuroscience, and evolutionary biology) and that omnibus went on to become one of the most-read articles in Brain Pickings history. But the question at its heart is among the most fundamental inquiries of existence, one that has puzzled, tormented, and inspired humanity for centuries. That is exactly what Joanna Bourke (of Fear: A Cultural History fame) explores in What It Means to Be Human: Historical Reflections from the 1800s to the Present.
Decades before women sought liberation in the bicycle or their biceps, a more rudimentary liberation was at stake. The book opens with a letter penned in 1872 by an anonymous author identified simply as 'An Earnest Englishwoman,' a letter titled 'Are Women Animals?' by the newspaper editor who printed it:
Sir,
Whether women are the equals of men has been endlessly debated; whether they have souls has been a moot point; but can it be too much to ask [for] a definitive acknowledgement that at least they are animals? Many hon. members may object to the proposed Bill enacting that, in statutes respecting the suffrage, 'wherever words occur which import the masculine gender they shall be held to include women;' but could any object to the insertion of a clause in another Act that 'whenever the word "animal" occurs it shall be held to include women?' Suffer me, through your columns, to appeal to our 650 [parliamentary] representatives, and ask: Is there not one among you then who will introduce such a motion? There would then be at least an equal interdict on wanton barbarity to cat, dog, or woman.
Yours respectfully,
AN EARNEST ENGLISHWOMAN
The broader question at the heart of the Earnest Englishwoman's outrage, of course, isn't merely about gender: women could have just as easily been any other marginalized group, from non-white Europeans to non-Westerners to even children, or a delegitimized majority politically treated as a minority more appropriate to our time, such as 'the 99 percent.' The question, really, is what entitles one to humanness.

But seeking an answer in the ideology of humanism, Bourke is careful to point out, is hasty and incomplete:
The humanist insistence on an autonomous, willful human subject capable of acting independently in the world was
based on a very particular type of human. Human civilization had been forged in the image of the male, white, well-off,
educated human. Humanism installed only some humans at the centre of the universe. It disparaged the woman, the
subaltern and the non-European even more than the animal. As a result, it is hardly surprising that many of these
groups rejected the idea of a universal and straightforward essence of the human, substituting something much more
contingent, outward-facing and complex. To rephrase Simone de Beauvoir's inspired conclusion about women, one is not born, but made, a human.
Bourke also admonishes against seeing the historical trend in paradigms about humanness as linear, as shifting from the theological towards the rationalist and scientific, or from humanist to post-humanist. How, then, are we to examine the porous boundary between the human and the animal?
In complex and sometimes contradictory ways, the ideas, values and practices used to justify the sovereignty of a
particular understanding of the human over the rest of sentient life are what create society and social life. Perhaps the
very concept of culture is an attempt to differentiate ourselves from our creatureliness, our fleshly vulnerability.
(Cue in 15 years of leading scientists' meditations on culture.)
Bourke goes on to explore history's varied definitions of what it means to be human, which have used a wide range of imperfect, incomplete criteria: intellectual ability, self-consciousness, private property, tool-making, language, the possession of a soul, and many more.
For Aristotle, writing in the 4th century B.C., it meant having a telos (an appropriate end or goal) and belonging to a polis where man could truly speak:
the power of speech is intended to set forth the expedient and inexpedient, and therefore likewise the just and the
unjust. And it is a characteristic of man that he alone has any sense of good and evil, or just and unjust, and the like, and
the association of living beings who have this sense makes a family and a state.
In the early 17th century, René Descartes, whose famous statement 'Cogito ergo sum' ('I think, therefore I am') implied only humans possess minds, argued animals were automata: moving machines, driven by instinct alone:
Nature which acts in them according to the disposition of their organs, as one sees that a clock, which is made up of only wheels and springs, can count the hours and measure time more exactly than we can with all our art.
For late 18th-century German philosopher Immanuel Kant, rationality was the litmus test for humanity, embedded in his categorical claim that the human being was an animal endowed with the capacity of reason:
[The human is] markedly distinguished from all other living beings by his technical predisposition for manipulating things (mechanically joined with consciousness), by his pragmatic predisposition (to use other human beings skillfully for his purposes), and by the moral predisposition in his being (to treat himself and others according to the principle of freedom under the laws).

In The Descent of Man, Darwin reflected:
The difference in mind between man and the higher animals, great as it is, is certainly one of degree and not of kind. We
have seen that the senses and intuitions, the various emotions and faculties, such as love, memory, attention, curiosity,
imitation, reason, etc., of which man boasts, may be found in an incipient, or even sometimes in a well-developed
condition, in the lower animals.
(For more on Darwin's fascinating studies of emotion, don't forget Darwin's Camera.)
Darwin's concern was echoed quantitatively by Jared Diamond in the 1990s when, in The Third Chimpanzee, he wondered how the 2.9% genetic difference between two kinds of birds or the 2.2% difference between two gibbons made for a different species, while the 1.6% difference between humans and chimpanzees makes for a different genus.
In the 1930s, Bertrand Lloyd, who penned Humanitarianism and Freedom, observed a difficult paradox of
any definition:
Deny reason to animals, and you must equally deny it to infants; affirm the existence of an immortal soul in your baby or
yourself, and you must at least have the grace to allow something of the kind to your dog.
In 2001, Jacques Derrida articulated a similar concern:
None of the traits by which the most authorized philosophy or culture has thought it possible to recognize this 'proper of man', none of them is, in all rigor, the exclusive reserve of what we humans call human. Either because some animals also possess such traits, or because man does not possess it as surely as is claimed.

A Möbius strip, from a 1963 poster of the woodcut by M. C. Escher: 'Which side of the strip are the ants walking on?' (M. C. Escher's 'Möbius Strip II,' © The M. C. Escher Company, Holland)
Curiously, Bourke uses the Möbius strip as the perfect metaphor for deconstructing the human vs. animal dilemma. Just as the one-sided surface of the strip has no inside or outside, no beginning or end, no single point of entry or exit, no hierarchical ladder to clamber up or slide down, so the boundaries of the human and the animal turn out to be as entwined and indistinguishable as the inner and outer sides of a Möbius strip. Bourke points to Derrida's definition as the most rewarding, calling him 'the philosopher of the Möbius strip.'
Ultimately, What It Means to Be Human is less an answer than it is an invitation to a series of questions, questions about who and what we are as a species, as souls, and as nodes in a larger complex ecosystem of sentient beings. As Bourke poetically puts it,
Erasing the awe-inspiring variety of sentient life impoverishes all our lives.
And whether this lens applies to animals or social stereotypes, one thing is certain: At a time when the need to
celebrate both our shared humanity and our meaningful differences is all the more painfully evident, the
question of what makes us human becomes not one of philosophy alone but also of politics, justice, identity,
and every fiber of existence that lies between.
Originally featured earlier this month. For a related read that missed the cut by a hair, see Christian Smith's excellent What Is a Person?
THE EGO TRICK
How 'you' are you, really? Character is something we tend to think of as a static, enduring quality, and yet we glorify stories of personal transformation. In reality, our essence oscillates between a set of hard-wired patterns and a fluid spectrum of tendencies that shift over time and in reaction to circumstances. This is exactly what journalist Julian Baggini, co-founder of The Philosophers' Magazine, tries to reconcile in The Ego Trick: In Search of the Self, an absorbing journey across philosophy, anthropology, sociology, neuroscience, religion and psychology, painting 'I' as a dynamic verb rather than a static noun, a concept in conflict with much of common sense and, certainly, with the ideals of Romantic individualism we examined this morning.
In his illuminating recent talk at The RSA, Baggini probes deeper into the theory of self-creation and the
essence of our identity.
The topic of personal identity is strictly speaking nonexistent. It's important to recognize that we are not the kind of things that simply popped into existence at birth, continue to exist, the same thing, then die off the cliff edge or go into another realm. We are these very remarkably ordered collections of things. It is because we're so ordered that we are able to think of ourselves as being singular persons. But there is no singular person there, which means we're forever changing. ~ Julian Baggini
For a great companion read, you won't go wrong with Antonio Damasio's excellent Self Comes to Mind: Constructing the Conscious Brain.
Originally featured in June.
FLOURISH
Back in the day, I had the pleasure of studying under Dr. Martin Seligman, father of the thriving positive psychology movement, a potent antidote to the traditional disease model of psychology, which focuses on how to relieve suffering rather than how to amplify well-being. His seminal book, Authentic Happiness, was among the 7 essential books on the art and science of happiness, and this year marked the release of his highly anticipated follow-up. Flourish: A Visionary New Understanding of Happiness and Well-being is a rather radical departure from Seligman's prior conception of happiness, which he now frames as overly simplistic and inferior to the higher ideal of lasting well-being.
Flourish is definitely not a self-help book, though it does offer insightful techniques to optimize yourself, your relationships and your business for well-being. If anything, it can read a bit wonky at times, as Seligman delves into fascinating empirical evidence culled from years of rigorous research. But I find this remarkably refreshing and stimulating amidst the sea of dumbed-down psycho-fluff.
Relieving the states that make life miserable has made building the states that make life worth living less of a priority. The time has finally arrived for a science that seeks to understand positive emotion, build strength and virtue, and provide guideposts for finding what Aristotle called 'the good life.' ~ Martin Seligman
Seligman identifies five endeavors crucial to human flourishing (positive emotion, engagement, good relationships, meaning and purpose in life, and accomplishment) and examines each in detail, ultimately proposing that public policy have flourishing as its central goal.
The content itself (happiness, flow, meaning, love, gratitude, accomplishment, growth, better relationships) constitutes human flourishing. Learning that you can have more of these things is life changing. Glimpsing the vision of a flourishing human future is life changing. ~ Martin Seligman
Seligman's work over the years has taken him inside the brains of British lords, Australian school kids, billionaire philanthropists, Army generals, artists, educators, scientists and countless more of humanity's most interesting and inspired specimens. The insights gleaned from these clinical cases are both sage and surprising, inviting you to look at the pillars of your own happiness with new eyes.
Originally featured in April.
THE TELL-TALE BRAIN
V. S. Ramachandran is one of the most influential neuroscientists of our time, whose work has not only made seminal contributions to the understanding of autism, phantom limbs and synesthesia, among other fascinating phenomena, but has also helped introduce neuroscience to popular culture. The fact that he is better known as Rama (you know, like Prince or Madonna or Che) is a fitting reflection of his cultural cachet. This year, in furthering the inquiry into what it means to be human, Rama released his highly anticipated new book, The Tell-Tale Brain: A Neuroscientist's Quest for What Makes Us Human, an ambitious exploration of everything from the origins of language to our relationship with art to the very mental foundation of civilization. Both empirically rooted in specific patient cases and philosophically speculative in an intelligent, grounded way, with a healthy dose of humor thrown in for good measure, it's an absolute masterpiece of cognitive science and a living manifesto for the study of the brain.
As heady as our progress [in the sciences of the mind] has been, we need to stay completely honest with ourselves and acknowledge that we have only discovered a tiny fraction of what there is to know about the human brain. But the modest amount that we have discovered makes for a story more exciting than any Sherlock Holmes novel. I feel certain that as progress continues through the coming decades, the conceptual twists and technological turns we are in for are going to be at least as mind-bending, at least as intuition-shaking, and as simultaneously humbling and exalting to the human spirit as the conceptual revolutions that upended physics a century ago. The adage that fact is stranger than fiction seems to be especially true for the workings of the brain. ~ V. S. Ramachandran
You can sample Rama's remarkable quest to illuminate the brain with his excellent 2007 TED talk:
Originally featured in January.
THE BELIEF INSTINCT
We're deeply fascinated by how the human mind makes sense of the world, and religion is one of the primary sensemaking mechanisms humanity has created to explain reality. On the heels of our recent explorations of the relationship between science and religion, the neuroscience of being human and the nature of reality comes The Belief Instinct: The Psychology of Souls, Destiny, and the Meaning of Life, an ambitious new investigation by evolutionary psychologist Jesse Bering, exploring one of the most important questions of human existence. Eloquently argued and engagingly written, it provides a compelling missing link between theory of mind and the need for God.
If humans are really natural rather than supernatural beings, what accounts for our beliefs about souls, immortality, a
moral eye in the sky that judges us, and so forth?
A leading scholar of religious cognition, Bering, who heads Oxford's Explaining Religion Project, proposes a powerful new hypothesis for the nature, origin and cognitive function of spirituality. Far from merely regurgitating existing thinking on the subject, he connects dots across different disciplines, ideologies and materials, from neuroscience to Buddhist scriptures to The Wizard of Oz. Blending empirical evidence from seminal research with literary allusions and cultural critique, Bering examines the central tenets of spirituality, from life's purpose to the notion of afterlife, in a sociotheological context underlined by the rigor of a serious scientist.
Originally featured in February, and one of our 7 fundamental meditations on faith.
OUT OF CHARACTER
The dichotomy of good and evil is as old as the story of the world, and timeless in its relevance to just about everything we do in life, from our political and spiritual views to our taste in music, art and literature to how we think about our simple dietary choices. But while most of us recognize that these concepts of good and bad aren't always black-and-white categories, we never cease to be surprised when someone or something we've perceived as good does or becomes something we perceive as bad, from an esteemed politician's transgression to a beloved celebrity's slip into addiction or Scientology or otherwise socially undesirable behavior.
In Out of Character: Surprising Truths About the Liar, Cheat, Sinner (and Saint) Lurking in All of Us, researchers David DeSteno and Piercarlo Valdesolo explore this curious disconnect through the rigorous lens of science. Drawing on their research at the Social Emotions Lab at Northeastern University, the authors offer a fascinating yet highly readable perspective on the psychology of the hero/villain spectrum of human character, inviting us to reconceive personality, both our own and that of others, with a more balanced moral view that reflects the fluidity of human psychology.
The derivation of the word character comes from an ancient Greek term referring to the indelible marks stamped on
coins. Once character was pressed into your mind or soul, people assumed it was fixed. But what modern science
repeatedly shows is that this just isn't the case. As we discuss in our book, everyone's moral behavior is much more
variable than any of us would have initially predicted. ~ David DeSteno
In this excellent talk from Northeastern's Insights series, DeSteno reveals some of the fascinating research
behind the book and the illuminating insights that came from it.
The analogy of color is an interesting way to think about [character]. Most of us think that colors are very discrete things:
something's red, it's got redness; something's blue, it's got blueness. But we are creating these categories. They're
not natural kinds, they're not given in ways that represent fundamentally distinct things. Ultimately, what determines
what colors we see are the frequencies of light waves entering our eyes, so it's along a continuum. It's kind of the same
with character. Things blend. We assume that if someone is good, that we've characterized them as good, that's a
discrete category, they can't be bad. And when they are, our categories shatter. That's because we have this illusory,
arbitrary idea of what vice and virtue mean. ~ David DeSteno
Ultimately, Out of Character: Surprising Truths About the Liar, Cheat, Sinner (and Saint) Lurking in All of
Us makes a compelling case for seeing human character as a grayscale continuum, not a black-and-white
dichotomy of good and bad, enlisting neuroscience and cognitive psychology to reaffirm the age-old
Aristotelian view of virtue and vice as fluid, interlaced existential capacities.
Originally featured in May.
Babel No More: Inside the Secrets of Superhuman
Language-Learners
by Maria Popova
What a Chilean YouTube disaster and a busy Manhattan restaurant
have to do with the limits of the human brain.
Nineteenth-century Italian cardinal Giuseppe Mezzofanti, a legend in his day,
was said to speak 72 languages. Hungarian hyperpolyglot Lomb Kató, who
taught herself Russian by reading Russian romance novels, insisted that one
learns grammar from language, not language from grammar. Legendary MIT
linguist Ken Hale, who passed away in 2001, had an arsenal of 50 languages
and was rumored to have once learned the notoriously difficult Finnish while
on a flight to Helsinki. Just like extraordinary feats of memory, extraordinary
feats of language serve as a natural experiment probing the limits of the
human brain. Mezzofanti maintained that God had given him this
particular power, but did these linguistic superlearners really possess some
significant structural advantage over the rest of us in how their brains were
wired? That's precisely what journalist and self-described "metaphor
designer" Michael Erard explores in Babel No More: The Search for the World's Most Extraordinary
Language Learners, the first serious investigation into the phenomenon of seemingly superhuman
multilingual dexterity and those who have, or claim to have, mastered it, and a fine addition to our favorite
books about language.
To understand the cognitive machinery of such feats, Erard set out to find modern-day Mezzofantis, from an
eccentric Berkeley-based language-learning guru and hyperpolyglot (hyperglottery, Erard notes, begins at 11
languages) to the Lebanese-born, Brazil-based one-time Guinness record holder for 58 languages, who
proceeded to embarrass himself on Chilean national television by not understanding a simple question from a
native speaker. In the process, Erard scrutinizes the very nature of language, its cultural role, and where it
resides in the brain, weaving a fascinating story about our most fundamental storytelling currency.
To grasp the power of language learning as a social facilitator, one need only stroll into a busy Manhattan
restaurant, where mapping the native origin of the staff and patrons might produce a near-complete world atlas.
Erard marvels:
It's amazing that the world runs so well, given that people use languages that they didn't grow up using, haven't studied
in schools, and in which they've never been tested or certified. Yet it does.
(For some related fascination, see David Bellos's Is That a Fish in Your Ear?, which delves into what
translation reveals about the human condition.)

And in an age when geography and nationality have been shuffled by the forces of globalization, ubiquitous
connectivity, cheap travel, and the Internet, understanding how language lubricates our social interactions is
crucial to making sense of our place in a global world. Erard observes:
Ideas, information, goods, and people are flowing more easily through space, and this is creating a sensibility about
language learning that's rooted more in the trajectories of an individual's life than in one's citizenship or nationality. It's
embedded in economic demands, not the standards of schools or governments. That means that our brains also have
to flow, to remain plastic and open to new skills and information. One of these skills is learning new ways to
communicate.
(The Daily Beast has an excerpt to give you a taste of Erard's signature blend of absorbing storytelling and
rigorous research.)
Captivating and illuminating, Babel No More is as much an absorbing piece of investigative voyeurism into
superhuman feats as it is an intelligent invitation to visit the outer limits of our own cerebral potential.
Fail Safe: Debbie Millman's Advice on Courage and
the Creative Life
by Maria Popova
Imagine immensities, don't compromise, and don't waste time.
The seasonal trope of the commencement address is upon us as wisdom on
life is being dispensed from graduation podiums around the world.
After Greil Marcus's meditation on the essence of art and Neil Gaiman's
counsel on the creative life, here comes a heartening speech
by artist, strategist, and interviewer extraordinaire Debbie Millman,
delivered to the graduating class at San Jose State University. The talk is
based on an essay titled "Fail Safe" from her fantastic 2009
anthology Look Both Ways: Illustrated Essays on the Intersection of Life
and Design (public library), which has previously appeared on Literary
Jukebox. The essay, which explores such existential skills as living with
uncertainty, embracing the unfamiliar, allowing for not knowing, and
cultivating what John Keats famously termed "negative capability," is
reproduced below with the artist's permission.
If you imagine less, less will be what you undoubtedly deserve. Do what you love, and don't stop until you get what you
love. Work as hard as you can, imagine immensities, don't compromise, and don't waste time. Start now. Not 20 years
from now, not two weeks from now. Now.
Look Both Ways: Illustrated Essays on the Intersection of Life and Design is an absolute treasure in its
entirety, the kind of read you revisit again and again, only to discover new meaning and new access to yourself
each time. It was preceded by How to Think Like a Great Graphic Designer and followed by the recent Brand
Thinking and Other Noble Pursuits, both excellent in very different but invariably stimulating ways.
Images and audio courtesy Debbie Millman
7 (More) Obscure Children's Books by Famous
Adult Lit Authors
by Maria Popova
What a magical car engine has to do with social justice, a parrot named
Arturo and the history of jazz.
A week ago, we featured 7 little-known children's books by famous authors of grown-up literature, on the
trails of some favorite children's books with timeless philosophy for grown-ups. The response has been so
fantastic that, today, we're back with seven more, based on reader suggestions and belated findings from the
rabbit hole of research surrounding the first installment.
ALDOUS HUXLEY
Aldous Huxley may be best known for his iconic 1932 novel Brave New
World, one of the most important meditations on futurism and how
technology is changing society ever published, but he was also deeply
fascinated by children's fiction. In 1967, three years after Huxley's death, Random
House released a posthumous volume of the only children's book he ever wrote, some
23 years earlier. The Crows of Pearblossom tells the story of Mr. and Mrs. Crow, whose
eggs never hatch because the Rattlesnake living at the base of their tree keeps eating
them. After the 297th eaten egg, the hopeful parents set out to kill the snake and enlist
the help of their friend, Mr. Owl, who bakes mud into two stone eggs and paints them
to resemble the Crows' eggs. Upon eating them, the Rattlesnake is in so much pain that
he begins to thrash about, tying himself in knots around the branches. Mrs. Crow goes merrily on to hatch
four families of 17 children each, using the snake as a clothesline on which to hang the little crows'
diapers.

The original volume was illustrated by the late Barbara Cooney, but a new edition published this spring
features artwork by Sophie Blackall, one of my favorite artists, whose utterly lovely illustrations of Craigslist
missed connections you might recall.
GERTRUDE STEIN
Writer, poet and art collector Gertrude Stein is one of the most beloved
and quoted luminaries of the early 20th century. In 1938, author
Margaret Wise Brown of the freshly founded Young Scott Books became obsessed with convincing leading
adult authors to try their hands at a children's book. She sent letters to Ernest Hemingway, John Steinbeck, and
Gertrude Stein. Hemingway and Steinbeck expressed no interest, but Stein surprised Brown by saying she
already had a near-complete children's manuscript titled The World Is Round, and would be happy to have
Young Scott bring it to life. Which they did, though not without drama. Stein demanded that the pages be pink,
the ink blue, and the artwork by illustrator Francis Rose. Young Scott were able to meet the first two demands
despite the technical difficulties, but they didn't want Rose to illustrate the book and asked Stein to instead
choose from several Young Scott illustrators. Reluctantly, she settled on Clement Hurd, whose first illustrated
book had appeared just that year. The World Is Round was eventually published, featuring a mix of
unpunctuated prose and poetry, with a single illustration for each chapter. The original release included a
special edition of 350 slipcased copies autographed by Stein and Hurd.
The wonderful We Too Were Children has the backstory.
JAMES THURBER
In the 1940s and 1950s, celebrated American author and cartoonist James
Thurber, best known for his contributions to The New Yorker, penned a number of book-length
fairy tales, some illustrated by acclaimed French-American artist and political cartoonist Marc
Simont. The most famous of them was The 13 Clocks, a fantasy tale Thurber wrote in Bermuda in 1950,
telling the story of a mysterious prince who must complete a seemingly impossible challenge to free a maiden,
Princess Saralinda, from the grip of the evil Duke of Coffin Castle. The eccentric book is riddled with
Thurber's famous wordplay and written in a unique cadenced style, making it a fascinating object of linguistic
appreciation and a structural treat for language-lovers of all ages.

For a cherry on top, the current edition features an introduction by none other than Neil Gaiman.
Thanks, stormagnet
CARL SANDBURG
In 1922, nearly two decades before the first of his three Pulitzer Prizes, poet Carl
Sandburg wrote a children's book titled Rootabaga Stories for his three daughters, Margaret,
Janet and Helga, nicknamed Spink, Skabootch and Swipes, respectively. Their nicknames occur
repeatedly in some of the volume's whimsical interrelated short stories.
The book arose from Sandburg's desire to create the then-nonexistent American fairy tales, which he saw as
integral to American childhood, so he set out to replace the incongruous imagery of European fairy tales with
the fictionalized world of the American Midwest, which he called the Rootabaga country, substituting farms,
trains, and corn fairies for castles, knights and royalty. Equal parts fantastical and thoughtful, the stories
captured Sandburg's romantic, hopeful vision of childhood.

In 1923, Sandburg followed up with a sequel, Rootabaga Pigeons, telling tales of "Big People Now" and
"Little People Long Ago."
Thanks, Rachel
SALMAN RUSHDIE
Indian-British novelist Salman Rushdie has had his share of acclaim and controversy, but one
thing that has remained constant over his prolific career is his penchant for the written word. In
1990, he turned his talents to children's literature with the release of Haroun and the Sea of
Stories, a phantasmagorical allegory for a handful of timely social and social-justice problems,
particularly in India, explored through the young protagonist, Haroun, and his father's
storytelling. The book received a Writers' Guild Award for Best Children's Book that year.
One of the book's unexpected treats is the breakdown of the meanings and symbolism of the ample
cast of characters' names, an intriguing linguistic and semantic bridge to Indian culture.
Twenty years later, just last winter, Rushdie followed up with his highly anticipated second children's
book, Luka and the Fire of Life: A Novel.
Thanks, SaVen
IAN FLEMING
Ian Fleming is best known as the creator of one of the best-selling literary works of all time: the
James Bond series. A few years after the birth of his son Caspar in 1952, Fleming decided to
write a children's book for him, but Chitty Chitty Bang Bang didn't see the light of day until 1964,
the year Fleming died. It tells the story of the Potts family and the father figure, Caractacus, who
uses money from the invention of a special candy to buy and repair a unique, magical former race
car, which the family affectionately names Chitty Chitty Bang Bang. Fleming's inspiration came from a series
of aero engines built by racing driver and engineer Count Louis Zborowski in the early 1920s, whose first six-
cylinder Maybach aero engine was called Chitty Bang Bang.


The original book was beautifully illustrated in black-and-white by John Burningham and was soon adapted
into the 1968 classic film of the same name starring Dick Van Dyke.
LANGSTON HUGHES
Prolific poet, social activist, novelist, playwright, and columnist Langston Hughes is
considered one of the fathers of jazz poetry, a literary art form that emerged in the 1920s
and eventually became the foundation for modern hip-hop. In 1954, the 42-year-old
Hughes decided to channel his love of jazz into a sort-of-children's book that educated
young readers about the culture he so loved. The First Book of Jazz was born, taking on
the ambitious task of being the first-ever children's book to review American music, and
to this day arguably the best. Hughes covered every notable aspect of jazz, from the
evolution of its eras to its most celebrated icons to its geography and sub-genres, and made a special point of
highlighting the essential role of African-American musicians in the genre's coming of age. Hughes even
covered the technicalities of jazz (rhythm, percussion, improvisation, syncopation, "blue notes," harmony)
with remarkable eloquence that, rather than overwhelming the young reader, exudes the genuine joy of
playing.
Alongside the book, Hughes released a companion record, The Story of Jazz, featuring Hughes's lively, vivid
narration of jazz history in three tracks, each focusing on a distinct element of the genre. You can hear
them here.
For more on rare and out-of-print children's books by famous 20th-century adult authors, I really can't
recommend Ariel S. Winter's beautifully written, rigorously researched We Too Were Children enough.
How the Invention of the Alphabet Usurped Female Power in
Society and Sparked the Rise of Patriarchy in Human Culture
by Maria Popova
A brief history of gender dynamics from page to screen.
The Rosetta Stone may be one of the 100 diagrams that changed the world and language
may have propelled our evolution, but the invention of the written word was not without
its costs. As Sophocles wisely observed, "nothing vast enters the life of mortals without a
curse." That curse is what Leonard Shlain explores in The Alphabet Versus the
Goddess: The Conflict Between Word and Image (public library), a pause-giving look
at the relationship between literacy and patriarchy. Without denying the vastness of the
benefits literacy bestowed upon humanity, Shlain uses Marshall McLuhan's famous
dictum, "the medium is the message," to examine how the advent of the written
word and our ability to read reconfigured the human brain, effecting profound changes in
the cultural dynamics of gender roles.
"By profession, I am a surgeon... I am by nature a storyteller," Shlain tells us, and it is
through this dual lens of critical thinking and enchantment that he examines his
iconoclastic subject, a subject whose kernel was born while Shlain was touring
Mediterranean archeological sites in the early 1990s and realized that the majority of
shrines had been originally consecrated to female deities, only to be converted to male-
deity worship later, for unknown reasons. (Beyond the broader cultural appeal such an
observation might hold for a mind as inquisitive as Shlain's, it's worth noting that he had just sent off his own young daughter,
one very special daughter, to college and into a world still very much shaped by gender dynamics.) A major culprit in the shift,
Shlain argues, was the invention of the alphabet. (He takes great care to avoid the trap of correlation vs. causation and offers a
wonderfully poetic formulation of the danger of conflating the two: "Correlation does not prove causality: the
disappearance of the stars at dawn does not cause the sun to rise.")

Illustration by Giselle Potter for Gertrude Stein's posthumously published 'To Do: A Book of Alphabets and Birthdays.' Click
image for details.
Shlain frames the premise:
Of all the sacred cows allowed to roam unimpeded in our culture, few are as revered as literacy. Its benefits have been so incontestable that
in the five millennia since the advent of the written word numerous poets and writers have extolled its virtues. Few paused to consider its
costs. . . . One pernicious effect of literacy has gone largely unnoticed: writing subliminally fosters a patriarchal outlook. Writing of any kind,
but especially its alphabetic form, diminishes feminine values and with them, women's power in the culture.
He defines the feminine outlook as a holistic, simultaneous, synthetic, and concrete view of the world, and the masculine as a
linear, sequential, reductionist one characterized by abstract thinking, while recognizing, as Susan Sontag did decades earlier
in condemning our culture's artificial polarities, that every individual is generously endowed with all the features of both.
Shlain writes:
They coexist as two closely overlapping bell-shaped curves with no feature superior to its reciprocal. These complementary methods of
comprehending reality resemble the ancient Taoist circle symbol of integration and symmetry in which the tension between the energy of
the feminine yin and the masculine yang is exactly balanced. One side without the other is incomplete; together, they form a unified whole
that is stronger than either half. First writing, and then the alphabet, upset this balance. Affected cultures, especially in the West, acquired a
strong yang thrust.

The Rosetta Stone, one of 100 diagrams that changed the world. Click image for details.
The invention of the alphabet, Shlain argues, is what tilted the balance of power toward the masculine, a shift that took place
eons ago, but one that is also evidenced by isolated indigenous cultures of the present and recent past:
Anthropological studies of non-literate agricultural societies show that, for the majority, relations between men and women have been more
egalitarian than in more developed societies. Researchers have never proven beyond dispute that there were ever societies in which
women had power and influence greater than or even equal to that of men. Yet, a diverse variety of preliterate agrarian cultures (the
Iroquois and the Hopi in North America, the inhabitants of Polynesia, the African !Kung, and numerous others around the world) had and
continue to have considerable harmony between the sexes.
He cites the work of legendary anthropologist Claude Lévi-Strauss, who was among the first to examine the dark side of literacy
in 1969:
There is one fact that can be established: the only phenomenon which, always and in all parts of the world, seems to be linked with the
appearance of writing is the establishment of hierarchical societies, consisting of masters and slaves, and where one part of the
population is made to work for the other part.
Shlain puts it in even less uncertain terms than Lévi-Strauss:
Literacy has promoted the subjugation of women by men throughout all but the very recent history of the West. Misogyny and patriarchy rise
and fall with the fortunes of the alphabetic written word.
Written language, Shlain argues, shaped both the development of the human nervous system and the social dynamics of gender
relations, affecting both sides of the nature/nurture equation profoundly:
Although each of us is born with a unique set of genetic instructions, we enter the world as a work-in-progress and await the deft hand of the
ambient culture to sculpt the finishing touches. Among the two most important influences on a child are the emotional constellation of his or
her immediate family and the configuration of his or her culture. Trailing a close third is the principal medium with which the child learns to
perceive and integrate his or her culture's information. This medium will play a role in determining which neuronal pathways of the child's
developing brain will be reinforced.

Artwork from 'Shapes for Sounds,' a visual history of the alphabet. Click image for details.
To illustrate the mesmerism of the written word, Shlain urges us to observe an enthralled four-year-old mastering the letters of
the alphabet, an invocation that calls to mind an anecdote my own grandmother likes to tell: One day, when I was in the first
grade and we had just had our first lesson in writing the letters of the alphabet, grandma picked me up from school and made a
quick stop at the supermarket on the way home. She left me with a kindly cashier while she ran inside to grab whatever she
needed to buy. Upon returning, she found me perched up atop the counter, having filled an entire lined notebook with dutifully
drawn letter-curves. She uses this anecdote as evidence of my hunger for learning, but if Shlain is correct, it might be more
indicative of just how early children latch onto the inescapable hegemony of the alphabet. Shlain contemplates this duck-to-water
uptake:
Literacy, once firmly rooted, will eclipse and supplant speech as the principal source of culture-changing information. Adults, for so long
enmeshed in the alphabet's visual skein, cannot easily disentangle themselves to assess its effect on culture. One could safely assume that
fish have not yet discovered water.
He juxtaposes the written word with the visual processing of images, exploring the gender implications of this dichotomy:
Images are primarily mental reproductions of the sensual world of vision. Nature and human artifacts both provide the raw material from the
outside that the brain replicates in the inner sanctum of consciousness. Because of their close connection to the world of appearances,
images approximate reality: they are concrete. The brain simultaneously perceives all parts of the whole, integrating the parts synthetically
into a gestalt. The majority of images are perceived in an all-at-once manner.
Reading words is a different process. When the eye scans distinctive individual letters arranged in a certain linear sequence, a word with
meaning emerges. The meaning of a sentence, such as the one you are now reading, progresses word by word. Comprehension depends
on the sentence's syntax, the particular horizontal sequence in which its grammatical elements appear. The use of analysis to break each
sentence down into its component words, or each word down into its component letters, is a prime example of reductionism. This process
occurs at a speed so rapid that it is below awareness. An alphabet by definition consists of fewer than thirty meaningless symbols that do not
represent the images of anything in particular, a feature that makes them abstract. Although some groupings of words can be grasped in an
all-at-once manner, in the main, the comprehension of written words emerges in a one-at-a-time fashion.
To perceive things such as trees and buildings through images delivered to the eye, the brain uses wholeness, simultaneity, and synthesis.
To ferret out the meaning of alphabetic writing, the brain relies instead on sequence, analysis, and abstraction. Custom and language
associate the former characteristics with the feminine, the latter, with the masculine. As we examine the myths of different cultures, we will
see that these linkages are consistent.
Beyond the biological, Shlain argues, this divergence also manifests in the spiritual aspect of human culture. Returning to the
historical roots of the phenomenon, he points out that hunter-gatherer societies tend to worship a mixture of male and
female deities: hunting societies prioritize virile spirits, while cultures where gathering is the primary method of survival
instead place greater value on nurturing, the female domain. The parts of the world we often refer to as the "cradle of
civilization" (generally, Mesopotamia, Egypt, China, and Greece) were populated primarily by gathering-based cultures and
originally worshipped female deities. But by the fifth century A.D., these objects of worship were almost entirely replaced by
masculine ones, to a point where women were prohibited from conducting a single major Western sacrament.
While Shlain points to influences like foreign invaders, the invention of private property, the formation of archaic states, the
creation of surplus wealth, and the educational disadvantaging of women as partially responsible, he argues that the single most
important factor was the invention of writing:
The introduction of the written word, and then the alphabet, into the social intercourse of humans initiated a fundamental change in the way
newly literate cultures understood their reality. It was this dramatic change in mindset that was primarily responsible for fostering
patriarchy.

Illustration by Sir Quentin Blake from 'Quentin Blake's ABC.' Click image for details.
He turns to the worlds major religions for evidence of the pattern:
The Old Testament was the first alphabetic written work to influence future ages. Attesting to its gravitas, multitudes still read it three
thousand years later. The words on its pages anchor three powerful religions: Judaism, Christianity, and Islam. Each is an exemplar of
patriarchy. Each monotheistic religion features an imageless Father deity whose authority shines through His revealed Word, sanctified in its
written form. Conceiving of a deity who has no concrete image prepares the way for the kind of abstract thinking that inevitably leads to law
codes, dualistic philosophy, and objective science, the signature triad of Western culture. I propose that the profound impact these ancient
scriptures had upon the development of the West depended as much on their being written in an alphabet as on the moral lessons they
contained.
Goddess worship, feminine values, and women's power depend on the ubiquity of the image. God worship, masculine values, and men's
domination of women are bound to the written word. Word and image, like masculine and feminine, are complementary opposites.
Whenever a culture elevates the written word at the expense of the image, patriarchy dominates. When the importance of the image
supersedes the written word, feminine values and egalitarianism flourish.
What is especially interesting is that Shlain was writing in 1998, when the internet as we know it, a medium that lends text and
image seemingly equal gravitas, was in its infant stage. The golden age of web video was nearly a decade away, as was the
invention of the smartphone camera and its constant connection to the web. Could it be that the world wide web, especially the
image-heavy ecosystem of social sharing, would emerge as an equalizer of gender dynamics? To be sure, the cultural and
biological changes Shlain examines in relation to the invention of the alphabet unfolded over millennia, so whatever equalizing
effects the web might have, they wouldn't be fully detected for many generations.
Indeed, Shlain acknowledges that certain developments in the history of modern media challenged the dominance of the written
word:
World War II was a firestorm for modern civilization, but the conflict also marked the beginning of yet another massive shift in global
consciousness. The combining of two feminine influences, photography and electromagnetism, was chiefly responsible for this change. In
1939, Philo T. Farnsworth invented television. After the war ended, television spread rapidly, literally house to house. One after another,
living rooms were illuminated by the glow of fuzzy electronic pictures. The tube was an overnight sensation, and soon the amount of time
people spent watching images flit on and off the front of the glowing box began to surpass the amount of time people spent reading linear
rows of black letters.

Artwork by Shepard Fairey for Marshall McLuhan's 'The Medium Is The Massage.' Click image for details.
With this new narrative form came new modes of cognitive processing:
Comprehending television required an entirely different hemispheric strategy than that used in reading. Viewers called forth their pattern-
recognition skills to decipher the screen's low-definition flickering mosaic mesh. The retina's cones need bright light to scan a static page of
print, but television brings the eye's rods into play. They see best in dim surroundings and can detect the slightest movements. As people
watched more and more television, the supremacy of the left hemisphere dimmed as the right's use increased. For 750,000 years, families
had gathered around lit hearths whose flames supplied warmth, illuminated darkness, encouraged camaraderie, and encouraged
storytelling. Campfires had been an essential ingredient for the evolution of oral epics. In 1950, a new kind of fire replaced the hearth; and it
encouraged a different set of social qualities.
Shlain points out that when a person reads a book, his or her electroencephalogram (EEG) brain wave patterns differ significantly
from those registered when that person is watching television, a finding made all the more remarkable by the fact that these
patterns deviate negligibly when the content of the book or TV program is varied. Watching television generates the same slow
alpha and theta waves as meditating (patterns representing a passive, receptive, and contemplative state of mind), while
reading generates beta waves, typically registered when the mind is concentrating on a task. Shlain ties this back to the question
of balance in the human spirit:
Task-oriented beta waves activate the hunter/killer side of the brain as alpha and theta waves emanate more from the gatherer/nurturer
side. Perhaps Western civilization has for far too long been stuck in a beta mode due to literacy, and striking a balance with a little more
alpha and theta, regardless of the source, will serve to soothe humankind's savage beast.
[…]
Television, being a flickering image-based medium, derails the masculine-left-linear strategy, just as in parallel, the written word had earlier
disoriented the gestalt-feminine-right one.
In one of the final chapters, Shlain does consider how the invention of the computer, if not the internet, plays into these
male/female modalities:
The computer converted the television screen from a monologue to a dialogue by making it interactive. And features peculiar to
computers shifted the collective cultural consciousness of the men and women who used them toward a right-hemispheric mode, which in
turn has further diminished male dominance.
The computer was originally designed to aid scientists, most of whom were male. Since the 1970s, therefore, males have rushed in droves
to learn what their fathers and grandfathers contemptuously dismissed as a skill for women and sissies – typing. Unlike all the scribes of past cultures, men now routinely write using both hands instead of only the dominant one. The entry into the communication equation of millions of men's left hands, directed by millions of male right brains tapping out one half of every computer-generated written message, is, I
believe, an unrecognized factor in the diminution of patriarchy.
Illustration by Edward Gorey from his alphabet book 'The Gashlycrumb Tinies.'
One particularly curious phenomenon Shlain points to as evidence of this shift is the seemingly sudden rise of dyslexia:
Dyslexic children, predominantly male (9:1), have difficulty deciphering the alphabet. One credible theory proposes that it is due to a failure of
hemispheric dominance. Ninety percent of the language centers traditionally reside in the left hemisphere of right-handed people. In the
right-handed dyslexic, the distribution of language centers may be more on the order of 80/20 or 70/30. Although we cannot be sure that
dyslexia was not always among us, it seems to have erupted at the very moment that an entire generation was devaluing the left
hemispheric mode of knowing. Perhaps television is the agent equilibrating the human brains two differing modes of perception.
And yet such theories highlight our culture's toxic polarity between intellect and intuition. Shlain makes the same argument for dyslexia that Temple Grandin has been championing about autism – that rather than a disease producing an abnormal or lesser mind, it is an evolution producing a different mind:
The very concept of brain dominance is presently under scrutiny, as many dyslexics are talented artists, architects, musicians, composers,
dancers, and surgeons. The idea that logical, linear thinking is better than intuition and holistic perception was a script written by left-brainers
in the first place. Our culture has classified dyslexia as a disability. But as culture becomes more comfortable with its reliance on images, it
may turn out that dyslexia will be reassessed as another of the many harbingers that announced the arrival of the Iconic Revolution.
The Alphabet Versus the Goddess is a fascinating read in its entirety, certain to call into question a great many of our cultural assumptions and perceived givens.
Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week's best articles. Here's what to expect. Like? Sign up.
TAGS: BOOKS CULTURE GENDER HISTORY LANGUAGE LEONARD SHLAIN WOMEN
13 NOVEMBER, 2013
The History of the English Language, Animated
By: Maria Popova
The Sun never sets on the English language.
The history of language, that peculiar human faculty that Darwin believed was "half art and half instinct," is intricately intertwined with the evolution of our species, our capacity for invention, our understanding of human biology, and even the progress of our gender politics. From the fine folks at Open University, who previously gave us these delightful 60-second animated syntheses of the world's major religions, philosophy's greatest thought experiments, and the major creative movements in design, comes this infinitely entertaining and illuminating animated history of the English language in 10 minutes:
Complement with these 5 essential reads on language and the only surviving
recording of Virginia Woolf's voice, in which she explores the beauty of the
English language.
TAGS: ANIMATION HISTORY LANGUAGE
02 OCTOBER, 2013
How Richard Dawkins Coined the Word "Meme": The Legendary Atheist's Surprising Religious Inspiration
By: Maria Popova
Just as genes propagate themselves in the gene pool by leaping from body to body via
sperms or eggs, so memes propagate themselves in the meme pool by leaping from
brain to brain.
Most people know that the word "meme" was coined by legendary evolutionary biologist Richard Dawkins in his seminal 1976 book The Selfish Gene. What few realize, however, is that the vocal atheist and champion of evidence as the holy grail of life, who even penned a children's book rebutting religious mythology with science, had his first experience of a true meme, decades before he had the word for it, in a religious context.
In his altogether fantastic new memoir, An Appetite for Wonder: The Making of a
Scientist (public library), Dawkins describes his largely unhappy days at boarding
school, where he was sent away at the age of seven:
Every night in the dormitory we had to kneel on our beds, facing the wall at the head, and take
turns on successive evenings to say the goodnight prayer:
Lighten our darkness, we beseech thee, O Lord; and by thy great mercy defend us from all perils
and dangers of this night. Amen.
None of us had ever seen it written down, and we didn't know what it meant. We copied it parrot
fashion from each other on successive evenings, and consequently the words evolved towards
garbled meaninglessness. Quite an interesting test case in meme theory. . . . If we had understood the words of that prayer, we would not
have garbled them, because their meaning would have had a normalizing effect, similar to the proofreading of DNA. It is such
normalization that makes it possible for memes to survive through enough generations to fulfill the analogy with genes. But because many
of the words of the prayer were unfamiliar to us, all we could do was imitate their sound, phonetically, and the result was a very high
mutation rate as they passed down the generations of boy-to-boy imitation.
Dawkins adds that it would be interesting to investigate this effect experimentally, but admits he's yet to do it. (I wonder whether he knows of Buckminster Fuller's scientific revision of The Lord's Prayer.)
But rather than mindlessly succumbing to the meme, young Dawkins found himself asking the types of profoundly philosophical
questions of which children are capable, and seeking their answers in science rather than religion:
I became a secret reader. In the holidays from boarding school, I would sneak up to my bedroom with a book: a guilty truant from the fresh
air and the virtuous outdoors. And when I started learning biology properly at school, it was still bookish pursuits that held me. I was drawn to
questions that grown-ups would have called philosophical. What is the meaning of life? Why are we here? How did it all start?
Richard Dawkins at age 7. Photograph courtesy of Edge.org
Nearly thirty years later, he came to formulate his meme theory in The Selfish Gene, which remains an essential piece of cultural
literacy. In considering the "primeval soup" of replicators responsible for the origin of all life, he casts human culture as a different kind of primeval soup driven by the same mechanisms and coins his concept of the "meme," which has since itself memetically overtaken popular culture, even offering a pronunciation pointer:
I think that a new kind of replicator has recently emerged. . . . It is staring us in the face. It is still in its infancy, still drifting clumsily about in its
primeval soup, but already it is achieving evolutionary change at a rate which leaves the old gene panting far behind.
The new soup is the soup of human culture. We need a name for the new replicator, a noun which conveys the idea of a unit of cultural
transmission, or a unit of imitation. Mimeme comes from a suitable Greek root, but I want a monosyllable that sounds a bit like gene. I hope my classicist friends will forgive me if I abbreviate mimeme to meme. If it is any consolation, it could alternatively be thought of as being related to memory, or to the French word même. It should be pronounced to rhyme with cream.
Examples of memes are tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate
themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by
leaping from brain to brain, via a process which, in the broad sense, can be called imitation.
Returning to his days at public school, Dawkins offers another intriguing example of meme theory in action by way of the weirdness of nickname evolution, which operates much like memetic mutation:
One friend of mine was called Colonel, although there was nothing remotely military about his personality. "Seen the Colonel anywhere?" Here's the evolutionary history. Years earlier, an older boy, who had by now left the school, was said to have had a crush on my friend. That older boy's nickname was Shkin (corruption of Skin, and who knows where that came from – maybe some connection with foreskin, but that name would have evolved before I arrived). So my friend inherited the name Shkin from his erstwhile admirer. Shkin rhymes with
Thynne, and at this point something akin to Cockney rhyming slang stepped in. There was a character in the BBC radio Goon Show called
Colonel Grytte Pyppe Thynne. Hence my friend became Colonel Grytte Pyppe Shkin, later contracted to Colonel. We loved the Goon
Show, and would vie with each other to mimic (as did Prince Charles, who went to a similar school around the same time) the voices of the
characters: Bluebottle, Eccles, Major Denis Bloodnok, Henry Crun, Count Jim Moriarty. And we gave each other Goon nicknames like
Colonel or Count.
An Appetite for Wonder is an altogether fantastic read, offering a fascinating glimpse of how one of today's most influential scientific minds blossomed into himself.
27 SEPTEMBER, 2013
Ironic Serif: A Brief History of Typographic Snark and the Failed
Crusade for an Irony Mark
By: Maria Popova
From 17th-century France to digital emoticons, by way of kooky characters and
spectacular failures.
In Shady Characters: The Secret Life of Punctuation, Symbols, and Other Typographical Marks (public library) – a wonderful addition to these stimulating reads about language – language-lover Keith Houston traces the secret history of punctuation, spanning from antiquity to the digital age, from the asterisk to the @-symbol, chronicling the strange and scintillating lives of the characters, glyphs, and marks that populate the nooks and crannies of human communication. Though many of them are familiar staples of everyday life, the most fascinating story is one of punctuational peril – the failed quest for a symbol to denote irony.
Frontispiece illustration by Edward Gorey from Felicia Lamport's 'Scrap Irony,' 1961.
"He was a great friend of mine," Truman Capote once said of William Faulkner, adding: "Well, as much as you could be a friend of his, unless you were a fourteen-year-old nymphet." Without our capacity for comprehending irony, Capote's literary snark would've rung hollow and nonsensical. And yet irony – along with its kin, snark and sarcasm – is an art form that thrives on the spoken word, relying on intonation and body language to distinguish it from the literal, so it's had a particularly rocky run translating into written language. That's precisely what Houston explores in the chapter on irony and sarcasm, beginning with a historical and linguistic backdrop:
The concept of irony got its name – though not yet an attendant mark of punctuation – in ancient Greece, where playwrights employed a cast of stock characters made recognizable by their physical characteristics, props, and personalities. One such staple of comic plays was the eirōn, a seeming buffoon who would best the alazon, his braggart opponent, by means of self-deprecation and feigned ignorance, and it was the cunning eirōn who gave his name first to the Greek eirōneia and then to the modern term irony.
In the ancient world, however, humorists relied on their audiences' intellect to detect the irony and didn't find it necessary to flag it as such. But in 17th-century England, writers had become increasingly restless about pointing out irony readers might miss,
and so the first documented punctuation mark denoting irony was born in 1668, the brainchild of the English vicar and natural philosopher John Wilkins – brother-in-law of the royalists' bête noire Oliver Cromwell, one-time head of Trinity College (of which pioneering astronomer Maria Mitchell wrote in her diaries that "in the opinion of a Cambridge man, to be master of Trinity is to be master of the world"), and eventually appointed as the first secretary of the newly founded Royal Society. Wilkins was a
kooky character, who believed the moon was inhabited by aliens, proposed the construction of submarine "Arks," invented transparent beehives that allowed for the extraction of honey without killing the bees inside, and wrote the very first book on cryptography published in English. But his most memorable accomplishment was the publication of his Essay Towards a Real Character and a Philosophical Language, in which he proposed, two centuries before the invention of Esperanto, a new universal language of letters and symbols – a sort of steroidal, all-encompassing Dewey Decimal System where concepts were organized into a rigid hierarchy. More than a mere linguistic diversion, however, Houston argues the concept bespoke the era's
growing concern with information overload:
The Renaissance had generated an explosion of information, with knowledge and ideas spreading like wildfire among an increasingly literate
and scientific populace. Latin, however, once the go-to language for international scholarly discourse, was in decline. More seriously, a new
breed of natural philosophers understood that the biases and limitations of natural language in general made it an imperfect tool for
communicating the new body of scientific knowledge: to Wilkins and his contemporaries, the notion of a purpose-built, universal language
with which to analyze and transmit this information held a powerful fascination. Thus it was that the mid-seventeenth century saw the
invention of a succession of philosophical languages and real characters, artificial taxonomies of things and concepts that were, crucially,
free from the myriad complexities of linguistic evolution.
So how does the irony mark fit into all this? Houston explains:
In addition to this taxonomy, however, Wilkins strayed into punctuation and writing, and in doing so made a curious innovation: he declared
that irony should be punctuated with an inverted exclamation mark (¡).
What prompted Wilkins to propose this punctuational improvement remains unknown, but we do know that he was neither the only nor the first thinker who pondered the problem. Some sixty years earlier, the revered Dutch Renaissance humanist and social critic Desiderius Erasmus, after whom Rotterdam's prestigious Erasmus University is named, lamented the lack of punctuation for irony, observing that "irony has no place, only different pronunciation." Whether or not Wilkins was aware of Erasmus's musings is subject to speculation, but Houston commends the Englishman's effort:
Regardless of his inspiration, Wilkins's choice of the ¡ seems most appropriate. The presence of an exclamation mark already modifies the tone of a statement, and inverting it to yield an i-like character both hints at the implied i-rony and simultaneously suggests the inversion
of its meaning.
And yet Wilkins's was only one of many proposed irony marks – and the first of many famous failures to make one stick:
By the end of the seventeenth century the idea that a messy, chaotic universe could be brought to order with a manmade taxonomy had
been proven quixotic, and the dream of a universal language with which to express that taxonomy had largely faded. Wilkins's Essay, the last best hope for the ill-fated universal-language movement, is nowadays regarded as a glorious failure; his little-remarked inverted exclamation
point sank along with it, seemingly without trace. A fateful precedent had been set.
Nearly two centuries later, the effort was resurrected across the English Channel, where the surveyor Jean-Baptiste-Ambroise-Marcellin Jobard – an early champion of lithography and another kooky character, who studied the propagation of the human voice using hundreds of feet of pipes and designed an elaborate system of gaslights to light his home – proposed a peculiar series of Christmas-tree-like glyphs to denote irony. In an October 1841 article in Le Courrier Belge, the newspaper of record in his adopted home of Brussels, where he had settled in 1819, he exorcised his exasperation over Europe's chaotic politics:
What to say? What (1) when France stamps and prances impatiently to get on the battlefield, when Spain, tired of a truce of some
months, again engages in civil war, Belgium remains quietly occupied by industry, trade, railways and colonization! But this is absurd.
Beneath the article, throughout which the glyph appeared several more times, a footnote (1) explained: "This is an irony point."
Houston writes:
Though his new mark went unused after this first outing, Jobard returned to the subject in a book published in 1842. Expanding his palette of
nonstandard punctuation marks, he suggested that the same arrowlike symbol could be placed at different orientations to indicate a point of
irritation, an indignation point, a point of hesitation, and mused that other symbols, yet to be invented, might be used to convey sympathy or
antipathy, affliction or satisfaction, and loud or quiet exclamations.
And yet Jobard, like his English counterpart two centuries earlier, failed to make his invention stick:
A great intellectual of his time, Jobard's works are only dimly remembered within the Francophone world and have been almost wholly
forgotten outside it. Fascinated by spiritualism in the latter part of his life, Jobard wrote a great deal on the subject. This obsession dominated
and diminished his legacy to such an extent that it has all but disappeared. Abandoned by its maker after a brief flirtation, his irony mark has
suffered a similar fate.
Eleven years later, Jobard's lament over the lack of a written equivalent to the vocal intonations of irony was echoed by none other than education icon Jean-Jacques Rousseau, but Rousseau himself didn't propose a solution. The next crusader for an irony mark, who actually proposed a practical implementation, was the poet Marcel Bernhardt, who in 1899 devised a new symbol reminiscent of a stylized mirror-image question mark. Houston explains:
Alcanter's point d'ironie dripped with knowing humor: in a nod to the sentiment often conveyed by verbal irony, he described it as taking the form of a whip, and, aware that irony loses its sting when it must be signposted in exactly the manner he was proposing, the French name for his new symbol was a pun with the additional meaning of "no irony."
Alcanter de Brahm's whiplike point d'ironie, proposed in 1899
But Brahm, too, built upon the work of thinkers who predated him. Three centuries earlier, Henry Denham dreamt up the percontation point – a reversed question mark used at the end of rhetorical questions – of which Brahm's later character was strikingly reminiscent. This tells us little more than that the irony mark was not destined for greatness:
Both Denham's percontation point and Brahm's point d'ironie fared better than Wilkins's inverted exclamation mark and Jobard's Christmas tree, though neither one quite made the jump to common use. Benefiting, perhaps, from the malleable standards of sixteenth-century punctuation, Denham's percontation point puttered on for around fifty years, while Brahm's fin-de-siècle irony mark merited an entry in the Nouveau Larousse Illustré encyclopedia, preserved behind glass, as it were, until 1960. In their respective times, neither amounted to
anything more than a grammatical curiosity. The curse of the irony mark remained in force.
Hervé Bazin's menagerie of proposed punctuation marks, including the psi-like 'point d'ironie'
In the 1960s, however, one of France's most prominent authors, Hervé Bazin, took it upon himself to break the curse. In his 1966 tome Plumons l'oiseau, a playful push for spelling and grammar reform, he dedicated several pages to what he termed Les points d'intonation, or "intonation points" – a solution to the same lack of nuance in written language that Rousseau had bemoaned more than a century earlier. Bazin created an entire system of symbols, including the love point, conviction point, authority point, acclamation point, doubt point, and his newly proposed point d'ironie, which Bazin explained thusly:
This is an arrangement of the Greek letter ψ. This letter (psi) is an arrow in the bow, corresponding to ps: that is to say the sound of that
same arrow in the air. What could be better to denote irony?
Alas, Bazin's arrow missed the target and suffered the same fate as its predecessors, perishing in obscurity. Nearly half a century later, in 2007, the quest was resumed during the annual Dutch book festival, themed "In Praise of Folly – Jest, Irony and Satire" that year. The pan-European type foundry Underware was commissioned to create a special punctuation mark for the occasion, and thus the ironieteken was born – a zigzaggy exclamation point denoting irony. But despite significant buzz across Dutch literary circles – including some criticism that, when placed in a row of several, it bore an unfortunate resemblance to the Nazi swastika – the mark quickly fizzled.
Left to right: Underware's ironieteken as rendered in 72pt Dolly, Century Catalogue, Share, and Cardo typefaces
This raises the necessary question of what it is, exactly, that makes representing irony typographically so catastrophic an endeavor. As someone who finds even the use of italics for emphasis, with very limited exceptions, a mark of weakness of style (if a writer can't wield language in a way that produces organic emotional crescendos, how pitiful to try forcing those typographically), I see the answer as obvious: irony thrives on an implicit juxtaposition of contextual intention and literal meaning, so as soon as we make it explicit, it stops being ironic.
Perhaps ironically, a separate crusade for a textual signifier of irony comes precisely in the form of a script – a reverse-italics typeface called "ironics," attributed to the iconoclastic journalist H. L. Mencken, who believed Americans were unable to recognize irony and thus needed a special typeface to indicate that the writer was being facetious. But Houston is careful to point out that the attribution is murky and Mencken is credited with the invention of ironics more by virtue of popular myth than of historical record. In fact, Houston cites a 1982 article which identifies the British journalist and politician Tom Driberg (1905–1976) as the likely originator of ironics:
Long ago the late Tom Driberg proposed that typographers should design a new face, which would slope the opposite way from italics, and
would be called ironics. In this type-face jokes would be set, and no-one would have any excuse for failing to see them. Until this happy
development takes place, I am left with the only really useful thing journalism has taught me: that there is no joke so obvious that some
bloody fool wont miss the point.
Driberg was arguably the strangest character of all irony-crusaders. Houston describes him:
Tom Driberg's life was a mess of ironies: he was a married, gay churchman who lunched with occultists; a left-wing politician who reveled in
frivolous society gossip; a patriot who spied both for his country and the dreaded KGB. It seems entirely apt for him to have proposed the
creation of a typeface to invest text with a double meaning, which would be slanted the opposite way from italics, and that would be called
ironics.
But Driberg's public image was dragged down after his death – curiously, his Times obituary made him the very first homosexual person outed in the paper's history – and along with it sank the dream of ironics.
Then came the internet, which changed the whole game:
The subtle shadings of verbal irony were bleached flat in the blinding glare of the new medium: what the Internet really wanted to
communicate was not irony, but its laser-guided offspring, sarcasm.
As Unicode – the definitive character library defining more than 109,000 symbols from multiple ancient and modern scripts, including Latin, Greek, Arabic, Cyrillic, Chinese, cuneiform, and more – emerged as the linguistic overlord of web characters, the quest for an irony symbol now had to abide by its rules. And so in 1999, a group of Ethiopian language geeks lobbied for the inclusion of an Ethiopian colloquialism denoting irony into the Unicode set, writing:
Graphically indistinguishable from [the inverted exclamation point] (¡), Temherte Slaq differs in semantic use in Ethiopia. Temherte Slaq will
come at the end of a sentence (vs at the beginning in Spanish use) and is used to indicate an unreal phrase, often sarcastical in editorial
cartoons. Temherte Slaq is also important in children's literature and in poetic use.
Ironically, everything came full circle as the digital irony mark returned to Wilkins's original analog concept. Like its analog brethren, however, the proposed Unicode character didn't catch on. Instead, the internet embraced emoticons, which use standard Unicode punctuation, as the textual signifier of emotional intonation. Curiously, emoticons weren't an invention of the digital age – they originated in 1881 – but that's another story.
The very first use of emoticons, Puck Magazine, 1881.
Shady Characters goes on to examine the origins, evolution, and anthropology of such typographic darlings as the hashtag, the ampersand, and the colophon. More than a mere catalog of curious trivia, it's an absolutely fascinating blend of history, design, sociology, and cultural poetics – highly recommended.
TAGS: BOOKS CULTURE HISTORY LANGUAGE
27 SEPTEMBER, 2013
What Elvish, Klingon, and Dothraki Reveal about Real
Language & the Essence of Human Communication
By: Maria Popova
Why a good language, like a good life, requires both rules and messiness.
Language, Darwin believed, was not a conscious invention but a phenomenon "slowly and unconsciously developed by many steps." But what makes a language a language? In this short animation from TED Ed, linguist John McWhorter, author of the indispensable The Power of Babel: A Natural History of Language (public library), explores the fascinating world of fantasy constructed languages, known as conlangs – from Game of Thrones' Dothraki to Avatar's Na'vi to Star Trek's Klingon to Lord of the Rings' Elvish. Though fictional, these conlangs reveal a great deal about the fundamentals of real human communication and help us understand the essential components of a successful language – extensive vocabulary, consistent grammar rules peppered with exceptions, and just the right amount of room for messiness and evolution.
We can see the difference between vocabulary alone and what makes a real language from a look at
how Tolkien put together grand old Elvish, a conlang with several thousand words. After all, you can
memorize 5,000 words of Russian and still be barely able to construct a sentence – a 4-year-old would talk rings around you. That's because you have to know how to put the words together – that is, a real language has grammar; Elvish does. Real languages also change over time – there's no such thing as a language that's the same today as it was 1,000 years ago: As people speak, they drift into new habits, shed old ones, make mistakes, and get creative. Real languages are messy – that's because they change, and change has a way of working against order. Real languages are never perfectly logical – that's why Tolkien made sure Elvish had plenty of exceptions.
Complement with this illustrated vintage guide to the science of language and the fascinating story of how Darwin shaped our
understanding of why language exists.
TAGS: ANIMATION CULTURE LANGUAGE TED
28 AUGUST, 2013
The Shape of Spectacular Speech: An Infographic Analysis of
What Made MLK's "I Have a Dream" Great
By: Maria Popova
The poetics of presenting, or why beautiful metaphors are better than beautiful slides.
On August 28, 1963, Martin Luther King, Jr. rose to the top of the steps of the
Lincoln Memorial during the March on Washington for Jobs and Freedom and
delivered his legendary "I Have a Dream" speech before 250,000 civil rights
supporters. It would go on to reverberate through the nation, reaching millions more,
and through history, inspiring generations and forever changing the course of culture.
But how can sixteen minutes of human speech have the power to move millions and
steer history?
That's exactly what presentation design guru Nancy Duarte, author of Resonate: Present Visual Stories that Transform Audiences (public library), probes as she analyzes the shape of Dr. King's speech and what made it so monumentally impactful – a modern-day, infographic-powered version of Kurt Vonnegut's iconic lecture on the shapes of stories, exploring oration rather than narrative.
Duarte notes that Dr. King spoke in short bursts more reminiscent of poetry than of long-winded lecture-speak, and highlights his most powerful rhetorical devices – repetition, metaphors, visual words, references to political documents, citations from sacred texts and spiritual songs – in a fascinating visualization of the speech, demonstrating how it embodies the core principles of her book.
Duarte followed up Resonate with Harvard Business Review's HBR Guide to Persuasive Presentations, offering more specific strategies for honing the power of presentation, in which she places special emphasis on the far-reaching power of metaphor and writes:
Metaphors are a powerful literary device. In Dr. Martin Luther King Jr.'s "I Have a Dream" speech, about 20% of what he said was metaphorical. For example, he likened his lack of freedom to a bad check that America has given the Negro people – a check that has come back marked "insufficient funds." King introduced his metaphor three minutes into his 16-minute talk, and it was the first time the audience roared and clapped.
Pair with five things every presenter should know about people and some timeless advice on how to give a great presentation.
The Adverb Is Not Your Friend: Stephen King on Simplicity of
Style
By: Maria Popova
I believe the road to hell is paved with adverbs, and I will
shout it from the rooftops.
"Employ a simple and straightforward style," Mark Twain instructed in the 18th of his 18 famous literary admonitions. And what greater enemy of simplicity and straightforwardness than the adverb? Or so argues Stephen King in On Writing: A Memoir of the Craft (public library), one of 9 essential books to help you write better.
Though he may have used a handful of well-placed adverbs in his recent eloquent case for gun control, King embarks upon a forceful crusade against this malignant part of speech:
The adverb is not your friend.
Adverbs are words that modify verbs, adjectives, or other adverbs. Theyre the ones that usually
end in -ly. Adverbs, like the passive voice, seem to have been created with the timid writer in mind.
With adverbs, the writer usually tells us he or she is afraid he/she isnt expressing himself/herself
clearly, that he or she is not getting the point or the picture across.
Consider the sentence He closed the door firmly. It's by no means a terrible sentence (at least it's got an active verb going for it), but ask yourself if firmly really has to be there. You can argue that it expresses a degree of difference between He closed the door and He slammed the door, and you'll get no argument from me . . . but what about context? What about all the enlightening (not to say emotionally moving) prose which came before He closed the door firmly? Shouldn't this tell us how he closed the door? And if the foregoing prose does tell us, isn't firmly an extra word? Isn't it redundant?
Someone out there is now accusing me of being tiresome and anal-retentive. I deny it. I believe the road to hell is paved with adverbs, and I will shout it from the rooftops. To put it another way, they're like dandelions. If you have one on your lawn, it looks pretty and unique. If you fail to root it out, however, you find five the next day . . . fifty the day after that . . . and then, my brothers and sisters, your lawn is totally, completely, and profligately covered with dandelions. By then you see them for the weeds they really are, but by then it's . . . GASP!! . . . too late.
I can be a good sport about adverbs, though. Yes I can. With one exception: dialogue attribution. I insist that you use the adverb in dialogue attribution only in the rarest and most special of occasions . . . and not even then, if you can avoid it. Just to make sure we all know what we're talking about, examine these three sentences:
"Put it down!" she shouted.
"Give it back," he pleaded, "it's mine."
"Don't be such a fool, Jekyll," Utterson said.
In these sentences, shouted, pleaded, and said are verbs of dialogue attribution. Now look at these dubious revisions:
"Put it down!" she shouted menacingly.
"Give it back," he pleaded abjectly, "it's mine."
"Don't be such a fool, Jekyll," Utterson said contemptuously.
The three latter sentences are all weaker than the three former ones, and most readers will see why immediately.

King uses the admonition against adverbs as a springboard for a wider lens on good and bad writing, exploring the interplay of
fear, timidity, and affectation:
I'm convinced that fear is at the root of most bad writing. If one is writing for one's own pleasure, that fear may be mild; timidity is the word I've used here. If, however, one is working under deadline (a school paper, a newspaper article, the SAT writing sample) that fear may be intense. Dumbo got airborne with the help of a magic feather; you may feel the urge to grasp a passive verb or one of those nasty adverbs for the same reason. Just remember before you do that Dumbo didn't need the feather; the magic was in him.
[...]
Good writing is often about letting go of fear and affectation. Affectation itself, beginning with the need to define some sorts of writing as "good" and other sorts as "bad," is fearful behavior.
This latter part, touching on the contrast between intrinsic and extrinsic motivation, illustrates the critical difference
between working for prestige and working for purpose.
Complement On Writing with more famous wisdom on the craft from Kurt Vonnegut, Susan Sontag, Henry Miller, Jack
Kerouac, F. Scott Fitzgerald, H. P. Lovecraft, Zadie Smith, John Steinbeck, Margaret Atwood, Neil Gaiman, Mary Karr, Isabel
Allende, and Susan Orlean.
The Speech Chain: A Vintage Illustrated Guide to the Science of
Language
By: Maria Popova
A mid-century primer on how verbal messages progress from the mind of the speaker
to the mind of the listener.
Given my documented soft spot for all kinds of vintage anatomy, I was intrigued to come across The Speech Chain: The Physics and Biology of Spoken Language (public library), a short 1963 book that promises to cover a significant subject in an interdisciplinary manner, exploring the science of speech and featuring one of the most beautifully designed mid-century book covers I've ever come across.
Today, in the age of constantly evolving textual and visual communication media, from Twitter to Instagram to Vine, the book reminds us why speech is the one (possibly the only) enduring and universal mode of relaying ideas:
Human society relies heavily on the free and easy interchange of ideas among its members and, for one reason or another, man has found speech to be his most convenient form of communication.
Through its constant use as a tool essential to daily living, speech has developed into a highly
efficient system for the exchange of even our most complex ideas. It is a system particularly
suitable for widespread use under the constantly changing and varied conditions of life.
It all sounds fine enough, until we realize that the book, which makes such statements of questionable causal implication as "the widespread use of books and printed matter may very well be an indication of a highly developed civilization, but so is the greater use of telephone systems," "areas of the world where civilization is most highly developed are also the areas with the greatest density of telephones," and "countries bound by social and political ties are usually connected by a well developed telephone system," was published by the educational division of Bell Telephone Laboratories. As we lament the rise of sponsored content in contemporary media, a book from half a century ago reminds us that publishing and corporate propaganda have always coexisted, and have always elicited outrage.
That said, the book does offer a wealth of fascinating science, including a number of delightful diagrams:

'The Speech Chain: the different forms in which a spoken message exists in its progress from the mind of the speaker to the mind of the listener.'

'The human vocal organs.'

'Outlines for the vocal tract during the articulation of various vowels.'

'The wavelengths and corresponding spectra of the vowels 'uh' (top) and 'ah' (bottom).'

'Vocal tract configurations and corresponding mouth configurations for three different vowels. (The peaks of the spectra
represent vocal tract resonances. Vertical lines for individual harmonics are not shown.)'

'Diagram of the auditory pathways linking the brain with the ear.'

'The cochlear portion of the inner ear.'

'Diagram of a section through the core of the cochlea.'

'Patterns showing the relationship between second formant transition and place-of-articulation of consonants.'
In 1993, the book was reissued with cover art by Keith Haring:

Pair The Speech Chain with Lilli Lehmann's 1902 illustrated guide to singing.
The Magic of Metaphor: What Childrens Minds Teach Us about
the Evolution of the Imagination
by Maria Popova
Metaphorical thinking is essential to how we communicate, learn, discover, and
invent.
"Children help us to mediate between the ideal and the real," MoMA's Juliet Kinchin wrote in her fascinating design history of childhood. Indeed, children have a penchant for disarming clarity and experience reality in ways profoundly different from adults, in the process illuminating the workings of our own minds. But among the most curious of these mediations of reality is children's understanding of abstraction in language, which is precisely what James Geary explores in a chapter of his altogether enthralling I Is an Other: The Secret Life of Metaphor and How It Shapes the Way We See the World (public library).
But first, Geary examines the all-permeating power of metaphor:
Metaphor is most familiar as the literary device through which we describe one thing in terms of another, as when the author of the Old Testament Song of Songs describes a lover's navel as "a round goblet never lacking mixed wine" or when the medieval Muslim rhetorician Abdalqahir Al-Jurjani pines, "The gazelle has stolen its eyes from my beloved."
Yet metaphor is much, much more than this. Metaphor is not just confined to art and literature but is at work in all fields of human endeavor, from economics and advertising, to politics and business, to science and psychology. There is no aspect of our experience not molded in some way by metaphor's almost imperceptible touch. Once you twig to metaphor's modus operandi, you'll find its fingerprints on absolutely everything.
Metaphorical thinking, our instinct not just for describing but for comprehending one thing in terms of another, for equating I with an other, shapes our view of the world, and is essential to how we communicate, learn, discover, and invent.
Metaphor is a way of thought long before it is a way with words.
Children, it turns out, are on the one hand skilled and intuitive weavers of original metaphors and, on the other, utterly (and, often, humorously) stumped by common adult metaphors, revealing that metaphor is both evolutionarily rooted and culturally constructed. Citing a primatologist's study of a bonobo, humans' closest living relative, who was able to construct simple metaphors after learning to use symbols and a keyboard, Geary traces the developmental evolution of children's natural metaphor-making ability:
Children share with bonobos an instinctive metaphor-making ability. Most early childhood metaphors are simple noun-noun substitutions. These metaphors tend to emerge first during pretend play, when children are between the ages of twelve and twenty-four months. As psychologist Alan Leslie proposed in his theory of mind, children at this age start to create metarepresentations through which they imaginatively manipulate both the objects around them and their ideas about those objects. At this stage, metaphor is, literally, child's play. During pretend play, children effortlessly describe objects as other objects and then use them as such. A comb becomes a centipede; cornflakes become freckles; a crust of bread becomes a curb.
Children's natural gift for rich and vivid metaphors, Geary argues, is propelled by the same force that drives our own adult creativity: pattern-recognition. Because kids' pattern-recognition circuits aren't yet stifled by narrow conventions of thinking and classification, they are able to produce a cornucopia of metaphorical expressions, but only a few of them actually make sense. The reason is that successful metaphors hang on perceptual similarities, and in the best of them these similarities are more abstract than literal, but children are only able to comprehend the more obvious similarities as their developmental psychology evolves toward abstraction. Geary cites an illustrative study:
Children listened to short stories that ended with either a literal or metaphorical sentence. In a story about a little girl on her way home, for example, the literal ending was "Sally was a girl running to her home," while the metaphoric ending was "Sally was a bird flying to her nest." Researchers asked the children to act out the stories using a doll. Five- to six-year-olds tended to move the Sally doll through the air when the last sentence was "Sally was a bird flying to her nest," taking the phrase literally. Eight- to nine-year-olds, however, tended to move her quickly across the ground, taking the phrase metaphorically.

In his brilliant picture-book 'People,' French illustrator Blexbolex uses visual, perceptual similarities to make clever
commentary on conceptual ideas. Click image for details.
Another study, conducted by legendary social psychologist Solomon Asch and his collaborator Harriet Nerlove in the 1960s, demonstrated a different facet of the same phenomenon by testing children's comprehension of so-called "double function terms," such as "warm," "cold," "bitter," and "sweet," which in their literal sense refer to physical sensations, but in the abstract can describe human temperament and personality:
To trace the development of double function terms in children, Asch and Nerlove presented groups of kids with a collection of different objects (ice water, sugar cubes, powder puffs) and asked them to identify the ones that were cold, sweet, or soft. This, of course, they were easily able to do.
Asch and Nerlove then asked the children, "Can a person be cold? Can a person be sweet? Can a person be soft?" While preschoolers understood the literal physical references, they did not understand the metaphorical psychological references. They described cold people as those not dressed warmly; hard people were those with firm muscles. One preschooler described his mother as sweet, but only because she cooked sweet things, not because she was nice.
Asch and Nerlove observed that only between the ages of seven and ten did children begin to understand the psychological meanings of these descriptions. Some seven- and eight-year-olds said that hard people are tough, bright people are cheerful, and crooked people do bad things. But only some of the eleven- and twelve-year-olds were able to actually describe the metaphorical link between the physical condition and the psychological state. Some nine- and ten-year-olds, for instance, were able to explain that both the sun and bright people "beamed." Children's metaphorical competence, it seems, is limited to basic perceptual metaphors, at least until early adolescence.
Younger children's inability to understand how a physical state could be mapped onto a psychological one, Geary argues, has to do with kids' lack of life experience in observing how physical circumstances like, say, poverty or violence can impact a person's character, which, as we know, is constantly evolving and responsive to life:
Children have trouble understanding more sophisticated metaphors because they have not yet had the life experiences needed to acquire
the relevant cache of associated commonplaces.
What this tells us is that while the hardware for making metaphors may be inborn, the software is earned and learned through living. This learning, Geary explains by pointing to cognitive scientist Dedre Gentner's work, takes place in stages marked by a sliding scale of increasingly complex similes:
[Gentner] presented three different age groups (five- to six-year-olds, nine- to ten-year-olds, and college students) with three different kinds of similes.
Attributional similes, such as "Pancakes are like nickels," were based on physical similarities; both are round and flat. Relational similes, such as "A roof is like a hat," were based on functional similarity; both sit on top of something to protect it. Double similes, such as "Plant stems are like drinking straws," were based on physical as well as functional similarities; both are long and cylindrical and both bring liquid from below to nourish a living thing.
Gentner found that youngsters in all age groups had no problem comprehending the attributional similes. But only the older kids understood
the relational and double similes. In subsequent research, Gentner has found that giving young children additional context enhances their
ability to pick up on the kind of relational comparisons characteristic of more complex metaphors.

Some of history's most celebrated children's books, like Alice's Adventures in Wonderland, are woven of metaphors
exploring life's complexities. Click for more.
Thus, as children's cognition develops and their understanding of the world evolves, their metaphorical range becomes more expansive, something equally true of us grown-ups, as Geary reminds us:
Any metaphor is comprehensible only to the extent that the domains from which it is drawn are familiar.
But this is where it gets most interesting: While this familiarity might be the foot in the door of understanding, a great metaphor is also an original one, thus forming new, uncommon associations of common elements rather than relying merely on the familiar ones, a beautiful manifestation of combinatorial creativity at play. And therein lies the magic:
This is one of the marvels of metaphor. Fresh, successful metaphors do not depend on conventional pre-existing associations. Instead, they highlight novel, unexpected similarities not particularly characteristic of either the source or the target, at least until the metaphor itself points them out.
I Is an Other is endlessly illuminating in its entirety, exploring how metaphors influence our experience and understanding of everything from politics to science to money. It follows Geary's equally fascinating The World in a Phrase: A History of Aphorisms and Geary's Guide to the World's Great Aphorists.
Love and Math: Equations as an Equalizer for Humanity
by Maria Popova
Mathematics is the source of timeless profound knowledge, which goes to the heart of
all matter and unites us across cultures, continents, and
centuries.
French polymath Henri Poincaré saw in mathematics a metaphor for how creativity works, while autistic savant Daniel Tammet believes that math expands our circle of empathy. So how can a field so diverse in its benefits and so rich in human value remain alienating to so many people who subscribe to the toxic cultural mythology that in order to appreciate its beauty, one needs a special kind of "mathematical mind"? That's precisely what renowned mathematician Edward Frenkel sets out to debunk in Love and Math: The Heart of Hidden Reality (public library), a quest to unravel the secrets of the "hidden parallel universe of beauty and elegance, intricately intertwined with ours," premised on the idea that math is just as valuable a part of our cultural heritage as art, music, literature, and the rest of the humanities we so treasure.
Frenkel makes the same case for math that philosopher Judith Butler made for reading and
the humanities, arguing for it as a powerful equalizer of humanity:
Mathematical knowledge is unlike any other knowledge. While our perception of the physical world can always be distorted, our perception of mathematical truths can't be. They are objective, persistent, necessary truths. A mathematical formula or theorem means the same thing to anyone anywhere, no matter what gender, religion, or skin color; it will mean the same thing to anyone a thousand years from now. And what's also amazing is that we own all of them. No one can patent a mathematical formula, it's ours to share. There is nothing in this world that is so deep and exquisite and yet so readily available to all. That such a reservoir of knowledge really exists is nearly unbelievable. It's too precious to be given away to the "initiated few." It belongs to all of us.
Math also helps lift our blinders and break the shackles of our own prejudices:
Mathematics is a way to break the barriers of the conventional, an expression of unbounded imagination in the search for truth. Georg Cantor, creator of the theory of infinity, wrote: "The essence of mathematics lies in its freedom." Mathematics teaches us to rigorously analyze reality, study the facts, follow them wherever they lead. It liberates us from dogmas and prejudice, nurtures the capacity for innovation.
BEAUTY OF MATHEMATICS by Yann Pineill & Nicolas Lefaucheux
To illustrate why our aversion to math is a product of our culture's bias rather than of math's intrinsic whimsy, Frenkel offers an analogy:
What if at school you had to take an "art class" in which you were only taught how to paint a fence? What if you were never shown the paintings of Leonardo da Vinci and Picasso? Would that make you appreciate art? Would you want to learn more about it? I doubt it. You would probably say something like this: "Learning art at school was a waste of my time. If I ever need to have my fence painted, I'll just hire people to do this for me." Of course, this sounds ridiculous, but this is how math is taught, and so in the eyes of most of us it becomes the equivalent of watching paint dry. While the paintings of the great masters are readily available, the math of the great masters is locked away.
Countering these conventional attitudes toward math, Frenkel argues that it isn't necessary to immerse yourself in the field for years of rigorous study in order to appreciate its far-reaching power and beauty:
Mathematics directs the flow of the universe, lurks behind its shapes and curves, holds the reins of everything from tiny atoms to the biggest stars.
[...]
There is a common fallacy that one has to study mathematics for years to appreciate it. Some even think that most people have an innate learning disability when it comes to math. I disagree: most of us have heard of and have at least a rudimentary understanding of such concepts as the solar system, atoms and elementary particles, the double helix of DNA, and much more, without taking courses in physics and biology. And nobody is surprised that these sophisticated ideas are part of our culture, our collective consciousness. Likewise, everybody can grasp key mathematical concepts and ideas, if they are explained in the right way. . . .
The problem is: while the world at large is always talking about planets, atoms, and DNA, chances are no one has ever talked to you about the fascinating ideas of modern math, such as symmetry groups, novel numerical systems in which 2 and 2 isn't always 4, and beautiful geometric shapes like Riemann surfaces. It's like they keep showing you a little cat and telling you that this is what a tiger looks like. But actually the tiger is an entirely different animal. I'll show it to you in all of its splendor, and you'll be able to appreciate its "fearful symmetry," as William Blake eloquently said.
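Frenkel doesn't spell out such a system in this passage, but modular ("clock") arithmetic is a classic example of a numerical system in which 2 and 2 isn't always 4: counting wraps around once it reaches the modulus. A minimal sketch in Python (the function name is mine, for illustration only):

```python
def add_mod(a, b, modulus):
    """Add two integers in arithmetic modulo `modulus`.

    On a 'clock' with positions 0 .. modulus-1, counting wraps
    around, so sums that reach the modulus start over from 0.
    """
    return (a + b) % modulus


print(add_mod(2, 2, 3))   # -> 1: on a 3-position clock, 4 wraps past 3
print(add_mod(2, 2, 4))   # -> 0: on a 4-position clock, 4 lands back on 0
print(add_mod(2, 2, 10))  # -> 4: the familiar answer once the clock is big enough
```

The point of the example is Frenkel's: the rules of arithmetic are a choice, and changing the choice yields a perfectly consistent system where 2 + 2 is 1 or 0 rather than 4.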

Drawing from Soviet artist and mathematician Anatolii Fomenko's 'Mathematical Impressions.' Click image for more.
And as if a mathematician quoting Blake weren't already an embodiment that boldly counters our cultural stereotypes, Frenkel adds even more compelling evidence from his own journey: Born in Soviet Russia, where mathematics had become an outpost of freedom in the face of an oppressive regime, he was denied entrance into Moscow State University by discriminatory policies. But already enamored with math, he secretly snuck into lectures and seminars, read books well into the night, and gave himself the education the system had attempted to bar him from. A young self-taught mathematician, he began publishing provocative papers, one of which was smuggled abroad and gained international acclaim. Soon, he was invited as a visiting professor at Harvard. He was only twenty-one.
The point of this biographical anecdote, of course, isn't that Frenkel is brilliant, though he certainly is; it's that the love math ignites in those willing to surrender to its siren call can stir hearts, move minds, and change lives. Frenkel puts it beautifully, returning to math's equalizing quality:
Mathematics is the source of timeless profound knowledge, which goes to the heart of all matter and unites us across cultures, continents, and centuries. My dream is that all of us will be able to see, appreciate, and marvel at the magic beauty and exquisite harmony of these ideas, formulas, and equations, for this will give so much more meaning to our love for this world and for each other.
Love and Math goes on to explore the alchemy of that magic through its various facets, including one of the biggest ideas that ever came from mathematics: the Langlands Program, launched in the 1960s by Robert Langlands, the mathematician who currently occupies Einstein's office at Princeton, and considered by many the Grand Unified Theory of mathematics.
Complement it with Paul Lockhart's exploration of the whimsy of math and Daniel Tammet on the poetry of numbers.
Thanks, Kirstin
Lost Cat: An Illustrated Meditation on Love, Loss, and What It
Means To Be Human
by Maria Popova
You can never know anyone as completely as you want. But that's okay, love is better.
"Dogs are not about something else. Dogs are about dogs," Malcolm Gladwell declared indignantly in the introduction to The Big New Yorker Book of Dogs. Though hailed as memetic rulers of the internet, cats too have enjoyed an admirable run as creative devices and literary muses in Joyce's children's books, T. S. Eliot's poetry, Hemingway's letters, and various verses. But hardly ever have cats been at once more about cats and more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy MacNaughton, she of many wonderful collaborations: a tender, imaginative memoir infused with equal parts humor and humanity. (You might recall a subtle teaser for this gem in Wendy's wonderful recent illustration of Gay Talese's taxonomy of cats.) Though about a cat, this heartwarming and heartbreaking tale is really about what it means to be human: about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless affection and paralyzing fear of abandonment, the unfair promise of loss implicit to every possibility of love.

After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn't that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, the phase of love that didn't obey any known rules of physics, until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies, the shy, anxious Tibby (short for Tibia, affectionately and, in these circumstances, ironically named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia), are, short of Wendy, her only joy and comfort:
Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn't shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.
Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets, but to no avail. Heartbroken, they begin to mourn Tibby's loss.

And then, one day five weeks later, Tibby reappears.

Once the initial elation of the recovery has worn off, however, Caroline begins to wonder where he'd been and why he'd left. He is now no longer eating at home and regularly leaves the house for extended periods of time; Tibby clearly has a secret place he now returns to. Even more worrisomely, he's no longer the shy, anxious tabby he'd been for thirteen years; instead, he's a half pound heavier, chirpy, with a youthful spring in his step. But why would a happy cat abandon his loving lifelong companion and find comfort (find himself, even) elsewhere?
When the relief that my cat was safe began to fade, and the joy of his prone, snoring form sprawled like an athlete after a celebratory night of boozing started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I'd known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?

There's only one obvious thing left to do: Track Tibby on his escapades. So Caroline, despite Wendy's lovingly suppressed skepticism, heads to a spy store (yes, those exist) and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby's collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline's growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.

The inimitable Maira Kalman blurbed the book admiringly:
The writing and drawings are funny. Nutty. Heartwarming. Smart. Loopy. Full of love.

"Every quest is a journey, every journey a story. Every story, in turn, has a moral," writes Caroline in the final chapter, then offers several possible morals for the story, the last of which embody everything that makes Lost Cat an absolute treat from cover to cover:
6. You can never know your cat. In fact, you can never know anyone as completely as you want.
7. But that's okay, love is better.
The Fine Art of Italian Hand Gestures: A Vintage Visual
Dictionary by Bruno Munari
By: Maria Popova
A pocket guide to Neapolitan nonverbal communication.
Somewhere between his seminal manifestos on design as art and his timelessly delightful children's books, legendary Italian artist and graphic designer Bruno Munari made time for a number of idiosyncratic side projects. Among them is Speak Italian: The Fine Art of the Gesture (UK; public library), a charming, quirky, minimalist guide to Italians' expressive nonverbal communication, originally published in 1958 as a supplement to the Italian dictionary and inspired by The Ancients' Mimic Through the Neapolitan Gestures, the first collection of gestures, made by Canon Andrea de Jorio in 1832. Unlike the hefty and sparsely illustrated 380-page original tome, however, Munari's pocket-sized version features frugally descriptive text and ample, elegant black-and-white photographs of hand-gestures for everything from mundane activities like reading and writing to emotive expressions of praise and criticism.
In the short preface, Munari notes the globalization of nonverbal vernacular, as Neapolitan gestures begin being recognized worldwide and American imports like "OK" permeate Italian culture, then promises:
We have collected a good many gestures, leaving aside vulgar ones, in order to give an idea of their meaning to foreigners visiting Italy and as a supplement to an Italian dictionary.

Old Neapolitan gestures, from left to right: money, past times, affirmation, stupid, good, wait a moment, to walk backward,
to steal, horns, to ask for.

Another illustrated page of the book of Canon Andrea de Jorio. Meaning of the gestures: silence, no, beauty, hunger, to
mock, weariness, stupid, squint, to deceive, cunning.

Gestures of drinking and eating (from an old Neapolitan print)

'You make a mockery of the 'madam'!' (from an old Neapolitan print)
For a naughty twist, complement Speak Italian: The Fine Art of the Gesture with the surrealist chart of erotic hand signaling.
29 OCTOBER, 2012
The Etymology of Hangover
By: Maria Popova
What George Washington and coarse French fabric have to do with the language of
drunkenness.
The fringes of language have a special kind of allure, especially when it comes to the unsuspected origins of common words. That's precisely what Mark Forsyth explores with equal parts wryness, curiosity, and erudition in The Etymologicon: A Circular Stroll Through the Hidden Connections of the English Language (public library), based on his popular language-geekery blog The Inky Fool. Among Forsyth's fascinating, meandering stories of linguistic historicity is that of the hangover, a phenomenon encrusted with rich empirical familiarity, and even some scientific knowledge, but paltry etymological grasp.
[George Washington] had an elder half-brother and mentor called Lawrence Washington who had, in fact, been a British soldier. Specifically, he was a marine in the Royal Navy. As a recruit from the British dominions in North America, he served under Admiral Edward Vernon in the Caribbean, and was part of the force that seized a strategically important base called Guantánamo, which has some minor position in modern history.
Lawrence Washington was very attached to Admiral Vernon. So loyal was he that when he went home to the family estate, which had been called Little Hunting Creek Plantation, he decided to rename it Mount Vernon. So Washington's house was named after a British admiral.
Admiral Vernons naming exploits didnt end there, though. In 1739 Vernon led the British assault on Porto Bello in what is now Panama. He
had only six ships, but with lots of derring-do and British pluck, et cetera, he won a startling victory. In fact, so startling was the victory that a
patriotic English farmer heard the news, dashed off to the countryside west of London, and built Portobello Farm in honour of the victorys
startlingness. Greens Lane, which was nearby, soon became known as Portobello Lane and then Portobello Road. And thats why the
London market, now one of the largest antiques markets in the world, is called Portobello Market.
But Admiral Vernons naming exploits didnt end there, either. When the seas were stormy he used to wear a thick coat made out of coarse
material called grogram (from the Frenchgros grain). So his men nicknamed him Old Grog.
British sailors used to have a daily allowance of rum. In 1740, flushed from victory at Porto Bello and perhaps under the pernicious influence
of Lawrence Washington, Vernon ordered that the rum be watered down. The resulting mixture, which eventually became standard for the
whole navy, was also named after Vernon. It was called grog.
If you drank too much grog you became drunk or groggy, and the meaning has slowly shifted from there to the wages of gin: a hangover.
The rest of The Etymologicon traces curious linguistic origin stories connecting concepts as seemingly unrelated as sex and
bread, Medieval monks and cappuccino, sausage poison and botox, and much more.
TAGS: BOOKS HISTORY LANGUAGE
12 SEPTEMBER, 2012
Thoughtful Alphabets: Edward Gorey's Lost Cryptic 26-Word
Illustrated Stories
By: Maria Popova
A delightfully dark journey into the love of language.
Having a soft spot for all things Edward Gorey and unusual alphabet books, I was thrilled
by Pomegranate's new edition of Thoughtful Alphabets: The Just Dessert and The Deadly
Blotter (public library), a collection of two cryptic 26-word stories in which the words
begin with the letters of the alphabet in order, so the story progresses in parallel with
the alphabet.

The stories belong to a mid-1990s Thoughtful Alphabets series, the first six volumes of
which were released as hand-lettered posters illustrated with clip-art. Then, several years
ago, stories XI and XVII emerged as signed limited-edition books featuring
Gorey's original drawings, but the books quickly went out of print. In this beautiful
resurrection, Gorey's signature blend of wit and dark whimsy shines in each of the micro-
vignettes, a fine complement to his beloved alphabet classic, The Gashlycrumb Tinies.

Illustrations © The Edward Gorey Charitable Trust, courtesy Pomegranate. All rights reserved.
Isaac Asimov on the Thrill of Lifelong Learning, Science vs.
Religion, and the Role of Science Fiction in Advancing Society
By: Maria Popova
It's insulting to imply that only a system of rewards and punishments can keep you a
decent human being.
Isaac Asimov was an extraordinary mind and spirit: the author of more than 400
science and science fiction books and a tireless advocate of space exploration, he also
took great joy in the humanities (and once annotated Lord Byron's epic poem Don
Juan), championed humanism over religion, and celebrated the human spirit itself (he
even wrote young Carl Sagan fan mail). Like many of the best science fiction writers, he
was as exceptional at predicting the future as he was at illuminating some of the most
timeless predicaments of the human condition. In a 1988 interview with Bill Moyers,
found in Bill Moyers: A World of Ideas (public library), the same remarkable tome
that gave us philosopher Martha Nussbaum on how to live with our human fragility,
Asimov explores several subjects that still stir enormous cultural concern and friction.
With his characteristic eloquence and sensitivity to the various dimensions of these issues, he presages computer-powered
lifelong learning and online education decades before they existed, weighs the question of how authors will make a living in a world
of free information, bemoans the extant attempts of religious fundamentalism to drown out science and rational thought, and
considers the role of science fiction as a beacon of the future.

The conversation begins with a discussion of Asimov's passionate belief that, when given the right tools, we can accomplish far
more than what we can with the typical offerings of formal education:
MOYERS: Do you think we can educate ourselves, that any one of us, at any time, can be educated in any subject that strikes our fancy?

ASIMOV: The key words here are "that strikes our fancy." There are some things that simply don't strike my fancy, and I doubt that I can force myself to be educated in them. On the other hand, when there's a subject I'm ferociously interested in, then it is easy for me to learn about it. I take it in gladly and cheerfully…

[What's exciting is] the actual process of broadening yourself, of knowing there's now a little extra facet of the universe you know about and can think about and can understand. It seems to me that when it's time to die, there would be a certain pleasure in thinking that you had utilized your life well, learned as much as you could, gathered in as much as possible of the universe, and enjoyed it. There's only this one universe and only this one lifetime to try to grasp it. And while it is inconceivable that anyone can grasp more than a tiny portion of it, at least you can do that much. What a tragedy just to pass through and get nothing out of it.

MOYERS: When I learn something new, and it happens every day, I feel a little more at home in this universe, a little more comfortable in the nest. I'm afraid that by the time I begin to feel really at home, it'll all be over.

ASIMOV: I used to worry about that. I said, "I'm gradually managing to cram more and more things into my mind. I've got this beautiful mind, and it's going to die, and it'll all be gone." And then I thought, "No, not in my case. Every idea I've ever had I've written down, and it's all there on paper. I won't be gone. It'll be there."

Page from 'Charley Harper: An Illustrated Life'
Asimov then considers how computers would usher in this profound change in learning and paints the outline of a concept that
Clay Shirky would detail and term "cognitive surplus" two decades later:

MOYERS: Is it possible that this passion for learning can be spread to ordinary folks out there? Can we have a revolution in learning?

ASIMOV: Yes, I think not only that we can but that we must. As computers take over more and more of the work that human beings shouldn't be doing in the first place, because it doesn't utilize their brains, it stifles and bores them to death, there's going to be nothing left for human beings to do but the more creative types of endeavor. The only way we can indulge in the more creative types of endeavor is to have brains that aim at that from the start.

You can't take a human being and put him to work at a job that underuses the brain and keep him working at it for decades and decades, and then say, "Well, that job isn't there, go do something more creative." You have beaten the creativity out of him. But if from the start children are educated into appreciating their own creativity, then probably almost all of us can be creative. In the olden days, very few people could read and write. Literacy was a very novel sort of thing, and it was felt that most people just didn't have it in them. But with mass education, it turned out that most people could be taught to read and write. In the same way, once we have computer outlets in every home, each of them hooked up to enormous libraries, where you can ask any question and be given answers, you can look up something you're interested in knowing, however silly it might seem to someone else.

Asimov goes on to point out the flawed industrial model of education, something Sir Ken Robinson would lament articulately
two decades later, and tells Moyers:

Today, what people call learning is forced on you. Everyone is forced to learn the same thing on the same day at the same speed in class. But everyone is different. For some, class goes too fast, for some too slow, for some in the wrong direction. But give everyone a chance, in addition to school, to follow up their own bent from the start, to find out about whatever they're interested in by looking it up in their own homes, at their own speed, in their own time, and everyone will enjoy learning.
Later, in agreeing with Moyers that this revolution in learning isn't merely for the young, Asimov adds:

That's another trouble with education as we now have it. People think of education as something that they can finish. And what's more, when they finish, it's a rite of passage. You're finished with school. You're no more a child, and therefore anything that reminds you of school, reading books, having ideas, asking questions, that's kids' stuff. Now you're an adult, you don't do that sort of thing anymore…

Every kid knows the only reason he's in school is because he's a kid and little and weak, and if he manages to get out early, if he drops out, why he's just a premature man.

Embroidered map of the infant Internet in 1983 by Debbie Millman
Speaking at a time when the Internet as we know it today was still an infant, and two decades before the golden age of online
education, Asimov offers a remarkably prescient vision for how computer-powered public access to information would spark the
very movement of lifelong learning that we've witnessed in the past decade:

You have everybody looking forward to no longer learning, and you make them ashamed afterward of going back to learning. If you have a system of education using computers, then anyone, any age, can learn by himself, can continue to be interested. If you enjoy learning, there's no reason why you should stop at a given age. People don't stop things they enjoy doing just because they reach a certain age. They don't stop playing tennis just because they've turned forty. They don't stop with sex just because they've turned forty. They keep it up as long as they can if they enjoy it, and learning will be the same thing. The trouble with learning is that most people don't enjoy it because of the circumstances. Make it possible for them to enjoy learning, and they'll keep it up.
When Moyers asks him to describe what such a teaching machine would look like, again, in 1988, when personal computers
had only just begun to appear in homes, Asimov envisions a kind of Siri-like artificial intelligence, combined with the
functionality of a discovery engine:

I suppose that one essential thing would be a screen on which you could display things… And you'll have to have a keyboard on which you ask your questions, although ideally I would like to see one that could be activated by voice. You could actually talk to it, and perhaps it could talk to you too, and say, "I have something here that may interest you. Would you like to have me print it out for you?" And you'd say, "Well, what is it exactly?" And it would tell you, and you might say, "Oh, all right, I'll take a look at it."
But one of his most prescient remarks actually has to do not with the mechanics of freely available information but with the
ethics and economics of it. Long before our present conundrum of how to make online publishing both in the public interest and
financially sustainable for publishers, Asimov shares with Moyers the all too familiar question he has been asking himself,
"How do you arrange to pay the author for the use of the material?", and addresses it with equal parts realism and idealism:

After all, if a person writes something, and this then becomes available to everybody, you deprive him of the economic reason for writing. A person like myself, if he was assured of a livelihood, might write anyway, just because he enjoyed it, but most people would want to do it in return for something. I imagine how they must have felt when free libraries were first instituted. "What? My book in a free library? Anyone can come in and read it for free?" Then you realize that there are some books that wouldn't be sold at all if you didn't have libraries.

(A century earlier, Schopenhauer had issued a much sterner admonition against the cultural malady of writing solely for material
rewards.)

Painting of hell by William Blake from John Milton's 'Paradise Lost' (click image for more)
Asimov then moves on to the subject of science vs. religion, something he would come to address with marvelous eloquence in
his memoir, and shares his concern about how mysticism and fundamentalism undercut society:

I'd like to think that people who are given a chance to learn facts and broaden their knowledge of the universe wouldn't seek so avidly after mysticism.

[…]

It isn't right to sell a person phony stock, and take money for it, and this is what mystics are doing. They're selling people phony knowledge and taking money for it. Even if people feel good about it, I can well imagine that a person who really believes in astrology is going to have a feeling of security because he knows that this is a bad day, so he'll stay at home, just as a guy who's got phony stock may look at it and feel rich. But he still has phony stock, and the person who buys mysticism still has phony knowledge.
He offers a counterpoint and considers what real knowledge is, adding to history's best definitions of science:

Science doesn't purvey absolute truth. Science is a mechanism, a way of trying to improve your knowledge of nature. It's a system for testing your thoughts against the universe and seeing whether they match. This works not just for the ordinary aspects of science, but for all of life.
Asimov goes on to bemoan the cultural complacency that has led to the decline of science in mainstream culture, a decline we
feel even today more sharply than ever when, say, a creationist politician tries to stop a little girl's campaign for a state
fossil because such an effort would "endorse evolution." Noting that we are living in a "business society" where fewer and fewer
students take math and science, Asimov laments how we've lost sight of the fact that science is driven by not-knowing rather
than certitude:

MOYERS: You wrote a few years ago that the decline in America's world power is in part brought about by our diminishing status as a world science leader. Why have we neglected science?

ASIMOV: Partly because of success. The most damaging statement that the United States has ever been subjected to is the phrase "Yankee know-how." You get the feeling somehow that Americans, just by the fact that they're American, are somehow smarter and more ingenious than other people, which really is not so. Actually, the phrase was first used in connection with the atomic bomb, which was invented and brought to fruition by a bunch of European refugees. That's "Yankee know-how."

MOYERS: There's long been a bias in this country against science. When Benjamin Franklin was experimenting with the lightning rod, a lot of good folk said, "You don't need a lightning rod. If you want to prevent lightning from striking, you just have to pray about it."

ASIMOV: The bias against science is part of being a pioneer society. You somehow feel the city life is decadent. American history is full of fables of the noble virtuous farmer and the vicious city slicker. The city slicker is an automatic villain. Unfortunately, such stereotypes can do damage. A noble ignoramus is not necessarily what the country needs.
(What might Asimov, who in 1980 voiced fears that the fundamentalists coming into power with President Reagan would turn
the country even more against science "by demanding that biblical creationism be given an equal footing with evolution in the
classroom," have said if he knew that a contemporary television station could edit out Neil deGrasse Tyson's mention of evolution?)

'The Expulsion of Adam and Eve from the Garden of Eden' by William Blake from John Milton's 'Paradise Lost' (click image
for more)
But when Moyers asks the writer whether he considers himself an enemy of religion, Asimov answers in the negative and offers
this beautifully thoughtful elaboration on the difference between the blind faith of religion and the critical thinking at the heart of
science:

My objection to fundamentalism is not that they are fundamentalists but that essentially they want me to be a fundamentalist, too. Now, they may say that I believe evolution is true and I want everyone to believe that evolution is true. But I don't want everyone to believe that evolution is true, I want them to study what we say about evolution and to decide for themselves. Fundamentalists say they want to treat creationism on an equal basis. But they can't. It's not a science. You can teach creationism in churches and in courses on religion. They would be horrified if I were to suggest that in churches they should teach secular humanism as an alternate way of looking at the universe or evolution as an alternate way of considering how life may have started. In the church they teach only what they believe, and rightly so, I suppose. But on the other hand, in schools, in science courses, we've got to teach what scientists think is the way the universe works.
He extols the "thoroughly conscious ignorance" at the heart of science as a much safer foundation of reality than dogma:

That is really the glory of science: that science is tentative, that it is not certain, that it is subject to change. What is really disgraceful is to have a set of beliefs that you think is absolute and has been so from the start and can't change, where you simply won't listen to evidence. You say, "If the evidence agrees with me, it's not necessary, and if it doesn't agree with me, it's false." This is the legendary remark of Omar when they captured Alexandria and asked him what to do with the library. He said, "If the books agree with the Koran, they are not necessary and may be burned. If they disagree with the Koran, they are pernicious and must be burned." Well, there are still these Omar-like thinkers who think all of knowledge will fit into one book called the Bible, and who refuse to allow it is possible ever to conceive of an error there. To my way of thinking, that is much more dangerous than a system of knowledge that is tentative and uncertain.
Riffing off the famous and rather ominous Dostoevsky line that "if God is dead, everything is permitted," Asimov revisits the
notion of intrinsic vs. extrinsic rewards; similarly to his earlier remark that good writing is motivated by intrinsic motives
rather than external incentives, he argues that good-personhood can't be steered by dogma but only by one's own conscience:

It's insulting to imply that only a system of rewards and punishments can keep you a decent human being. Isn't it conceivable a person wants to be a decent human being because that way he feels better?

I don't believe that I'm ever going to heaven or hell. I think that when I die, there will be nothingness. That's what I firmly believe. That's not to mean that I have the impulse to go out and rob and steal and rape and everything else because I don't fear punishment. For one thing, I fear worldly punishment. And for a second thing, I fear the punishment of my own conscience. I have a conscience. It doesn't depend on religion. And I think that's so with other people, too.

'The Rout of the Rebel Angels' by William Blake from John Milton's 'Paradise Lost' (click image for more)
He goes on to extend this conscience-driven behavior to the domain of science, which he argues is strongly motivated by
morality and a generosity of spirit uncommon in most other disciplines, where ego consumes goodwill. (Mark Twain memorably
argued that no domain was more susceptible to human egotism than religion.) Asimov offers a heartening example:

I think it's amazing how many saints there have been among scientists. I'll give you an example. In 1900, De Vries studied mutations. He found a patch of evening primrose of different types, and he studied how they inherited their characteristics. He worked out the laws of genetics. Two other guys worked out the laws of genetics at the same time, a guy called Karl Correns, who was a German, and Erich Tschermak von Seysenegg, who was an Austrian. All three worked out the laws of genetics in 1900, and having done so, all three looked through the literature, just to see what had been done before. All three discovered that in the 1860s Gregor Mendel had worked out the laws of genetics, and people hadn't paid any attention then. All three reported their findings as confirmation of what Mendel had found. Not one of the three attempted to say that it was original with him. And you know what it meant. It meant that two of them, Correns and Tschermak von Seysenegg, lived in obscurity. De Vries is known only because he was also the first to work out the theory of mutations. But as far as discovering genetics is concerned, Mendel gets all the credit. They knew at the time that this would happen. That's the sort of thing you just don't find outside of science.
Moyers, in his typical perceptive fashion, then asks Asimov why, given how much the truth of science excites him, he is best
known for writing science fiction, and Asimov responds with equal insight and outlines the difference, both cultural and creative,
between fiction in general and science fiction:

In serious fiction, fiction where the writer feels he's accomplishing something besides simply amusing people, although there's nothing wrong with simply amusing people, the writer is holding up a mirror to the human species, making it possible for you to understand people better because you've read the novel or story, and maybe making it possible for you to understand yourself better. That's an important thing.

Now science fiction uses a different method. It works up an artificial society, one which doesn't exist, or one that may possibly exist in the future, but not necessarily. And it portrays events against the background of this society in the hope that you will be able to see yourself in relation to the present society… That's why I write science fiction, because it's a way of writing fiction in a style that enables me to make points I can't make otherwise.

Painting by Rowena Morrill
But perhaps the greatest benefit of science fiction, Moyers intimates and Asimov agrees, is its capacity to warm people up to
changes that are inevitable but that seem inconceivable at the present time; after all, science fiction writers do have
a remarkable record of getting the future right. Asimov continues:

Society is always changing, but the rate of change has been accelerating all through history for a variety of reasons. One, the change is cumulative. The very changes you make now make it easier to make further changes. Until the Industrial Revolution came along, people weren't aware of change or a future. They assumed the future would be exactly like it had always been, just with different people… It was only with the coming of the Industrial Revolution that the rate of change became fast enough to be visible in a single lifetime. People were suddenly aware that not only were things changing, but that they would continue to change after they died. That was when science fiction came into being, as opposed to fantasy and adventure tales. Because people knew that they would die before they could see the changes that would happen in the next century, they thought it would be nice to imagine what they might be.

As time goes on and the rate of change still continues to accelerate, it becomes more and more important to adjust what you do today to the fact of change in the future. It's ridiculous to make your plans now on the assumption that things will continue as they are now. You have to assume that if something you're doing is going to reach fruition in ten years, that in those ten years changes will take place, and perhaps what you're doing will have no meaning then… Science fiction is important because it fights the natural notion that there's something permanent about things the way they are right now.

Painting by William Blake from Dante's 'Divine Comedy' (click image for more)
Given that accepting impermanence doesn't come easily to us, that stubborn resistance to progress and the inevitability of change
is perhaps also what Asimov sees in the religious fundamentalism he condemns; dogma, after all, is based on the premise that
truth is absolute and permanent, never mind that the cultural context is always changing. Though he doesn't draw the link
directly, in another part of the interview he revisits the problem with fundamentalism with words that illuminate the stark contrast
between the cultural role of religion and that of science fiction:

Fundamentalists take a statement that made sense at the time it was made, and because they refuse to consider that the statement may not be an absolute, eternal truth, they continue following it under conditions where to do so is deadly.
Indeed, Asimov ends the conversation on a related note as he considers what it would take to transcend the intolerance that such
fundamentalism breeds:

MOYERS: You've lived through much of this century. Have you ever known human beings to think with the perspective you're calling on them to think with now?

ASIMOV: It's perhaps not important that every human being think so. But how about the leaders and opinion-makers thinking so? Ordinary people might follow them. It would help if we didn't have leaders who were thinking in exactly the opposite way, if we didn't have people who were shouting hatred and suspicion of foreigners, if we didn't have people who were shouting that it's more important to be unfriendly than to be friendly, if we didn't have people shouting that the people inside the country who don't look exactly the way the rest of us look have something wrong with them. It's almost not necessary for us to do good; it's only necessary for us to stop doing evil, for goodness' sake.
Bill Moyers: A World of Ideas is a remarkable tome in its entirety. Complement this particular sample with Asimov on
religion vs. humanism, Buckminster Fuller's vision for the future of education, and Carl Sagan on science and spirituality.
TAGS: BILL MOYERS BOOKS CULTURE EDUCATION FUTURISM INTERVIEW ISAAC ASIMOV PHILOSOPHY PSYCHOLOGY RELIGION SCIENCE TECHNOLOGY WRITING
14 MARCH, 2014
Einstein on Fairy Tales and Education
By: Maria Popova
How far superior an education that stresses independent action and personal
responsibility is to one that relies on drill, external authority and ambition.
Albert Einstein, celebrated as the quintessential modern genius, is credited with many
things, from era-defining scientific discoveries to great wisdom on everything
from creativity to kindness to war to the secret to learning anything. Among them is also a
sentiment of admirable insight yet questionable attribution: in Christopher Frayling's 2005
book Mad, Bad and Dangerous?: The Scientist and the Cinema, Einstein is credited as
having said:

If you want your children to be intelligent, read them fairy tales. If you want them to be very
intelligent, read them more fairy tales.

Illustration by Vladimir Radunsky for 'On a Beam of Light: A Story of Albert Einstein.' Click image for details.
As an enormous lover of fairy tales and a believer in Tolkien's proposition that they are not written for children, I was, of
course, instantly gladdened by these words, but also peeved by the broken chain of proper attribution. After diligent digging
through various archives, I found the earliest reference to this in an out-of-print volume published by the Montana State Library
for Book Week in November of 1954. The entry, a second-hand account at best, reads:

In the current New Mexico Library Bulletin, Elizabeth Margulis tells a story of a woman who was a personal friend of the late dean of scientists, Dr. Albert Einstein. Motivated partly by her admiration for him, she held hopes that her son might become a scientist. One day she asked Dr. Einstein's advice about the kind of reading that would best prepare the child for this career. To her surprise, the scientist recommended "Fairy tales and more fairy tales." The mother protested that she was really serious about this and she wanted a serious answer; but Dr. Einstein persisted, adding that creative imagination is the essential element in the intellectual equipment of the true scientist, and that fairy tales are the childhood stimulus to this quality.
While we might never know the full, accurate details of Einstein's fairy-tale adage, embedded in it is something the celebrated
physicist felt very strongly about: the importance of the liberal arts and humanities in education. The preface to Dear Professor
Einstein: Albert Einstein's Letters to and from Children (public library), the same impossibly endearing volume that gave us
his encouraging advice to a little girl who wanted to be a scientist and his answer to a child who asked whether scientists pray,
features the following autobiographical reflection by Einstein:

This school with its liberal spirit and teachers with a simple earnestness that did not rely on any external authority made an unforgettable impression on me. In comparing it with six years' schooling at an authoritarian German Gymnasium, I was made acutely aware how far superior an education that stresses independent action and personal responsibility is to one that relies on drill, external authority and ambition.
Complement with Einstein on why we're alive (in a letter to a Brain Pickings reader's mother), his remarkable conversation with
Indian philosopher Tagore, and his life-story, illustrated.
TAGS: ALBERT EINSTEIN BOOKS CHILDREN'S BOOKS CULTURE EDUCATION SCIENCE
05 MARCH, 2014
Buckminster Fuller Presages Online Education, with a Touch of
TED, Netflix, and Pandora, in 1962
By: Maria Popova
A prophetic vision for mobile, time-shifted, tele-commuted, on-demand education.
In 1962, Buckminster Fuller delivered a prophetic lecture at Southern Illinois University
on the future of education, aimed at solving "[educational] problems by design competence
instead of by political reform." It was eventually published as Education Automation:
Comprehensive Learning for Emergent Humanity (public library), a prescient vision for
online education decades before the web as we know it, and half a century before the golden
age of MOOCs, with elements of TED and Pandora mixed in.
Fuller begins by tracing his own rocky start in traditional education and his eventual
disillusionment with the establishment:
I am a New Englander, and I entered Harvard immaturely. I was too puerilely in love with a special,
romantic, mythical Harvard of my own conjuring – an Olympian world of super athletes and
alluring, grown-up, worldly heroes. I was the fifth generation of a direct line of fathers and their sons
attending Harvard College. I arrived there in 1913 before World War I and found myself primarily
involved in phases of Harvard that were completely irrelevant to Harvard's educational system. For
instance, because I had been quarterback on a preparatory school team whose quarterbacks
before me had frequently become quarterbacks of the Harvard football team, I had hoped that I too might follow that precedent, but I broke
my knee, and that ambition was frustrated.
[...]
Though I had entered Harvard with honor grades, I obtained only good to passing marks in my college work, which I adolescently looked
upon as a chore done only to earn the right to live in the Harvard community. But above all, I was confronted with social problems of clubs
and so forth.
[...]
The problems they generated were solved by the great [Greek] House system that was inaugurated after World War I. My father died when
I was quite young, and though my family was relatively poor I had come to Harvard from a preparatory school for quite well-to-do families. I
soon saw that I wasn't going to be included in the clubs as I might have been if I had been very wealthy or had a father looking out for me, for
much of the clubs' membership was prearranged by the clubs' graduate committees. I was shockingly surprised by the looming situation. I
hadn't anticipated these social developments. I suddenly saw a class system existing in Harvard of which I had never dreamed. I was not
aware up to that moment that there was a social class system and that there were different grades of citizens. My thoughts had been
idealistically democratic. Some people had good luck and others bad, but not because they were not equal. I considered myself about to be
ostracized or compassionately tolerated by the boys I had grown up with. I became panicky about that disintegration of my idealistic
Harvard world, went on a pretended lark, cut classes, and was fired.
Out of college, I went to work and worked hard. In no time at all, reports went to Harvard that I was a good and able boy and that I really
ought to go back to college; so Harvard took me back. However, I was now considered a social maverick, and I saw none of my old friends;
it hurt too much. Again I cut classes, spent all my year's allowance, and once more was fired. After my second firing I again worked very
hard. If World War I hadn't come along, I am sure the university would have taken me back again, and I am sure I would have been fired
again. Each time I returned to Harvard I entered a world of gnawing apprehensions, not an educational institution, and that was the problem.
But Fuller, known for his public distaste for specialization, managed to get an education anyway – in due and slow course,
one of "[his] own inquiring, experimenting, and self-disciplining." In the thirty years since his Harvard fiasco, he was invited
as a lecturer, critic, or experimental seminarist at 106 universities around the world, including nine times at Princeton, eight at
MIT, and four at Cornell.
Then, forty-seven years after he had been expelled from Harvard, the university's Dean Bundy, at the time one of JFK's White
House advisors, invited him to return to Harvard as the Charles Eliot Norton Professor of Poetry – a prestigious and unusual
position created to emphasize the value of cross-pollinating ideas from different fields, lending to the concept of "poetry" the
most expansive possible definition, also held by such luminaries as Umberto Eco, T.S. Eliot, E.E. Cummings, Jorge Luis
Borges, Italo Calvino, and Leonard Bernstein. Fuller, who wasn't a poet in the traditional sense despite his poetic scientific
revision of The Lord's Prayer, writes:
The chair was founded because its donor felt that the university needed to bring in individuals who on their own initiative have long
undertaken objective realizations reflecting the wisdom harvested by the educators, which realizations might tend to regenerate the vigor of
the university world. Harvard fills this professorship with men* who are artists, playwrights, authors, architects, and poets. The word poet in
this professorship of poetry is a very general term for a person who puts things together in an era of great specialization wherein most people
are differentiating or taking things apart. Demonstrated capability in the integration of ideas is the general qualification for this professorship.
(* Alas, this isn't merely a reflection of the era's gender-biased language, wherein "men" really means people. "Men" really
does mean men here, as Harvard didn't appoint a female Norton poet until 1979 – Helen Gardner – and, as of this writing, has
only had two other women hold the position since its inception in 1925: Nadine Gordimer in 1994–1995 and Linda Nochlin in
2003–2004.)
Fuller considers what made him qualified for the position:
By my own rules, I may not profess any special preoccupation or capability. I am a random element. There is nothing even mildly
extraordinary about me except that I think I am durable and inquisitive in a comprehensive pattern. I have learned much; but I don't know
very much; but what I have learned, I have learned by trial and error. And I have great confidence in the meager store of wisdom that I have
secured.
He admonishes that traditional education incentivizes and measures precisely the opposite – not the ability to be inquisitive "in a
comprehensive pattern," but the ability to memorize with a narrow focus, which in turn stifles innovation:
I am convinced that humanity is characterized by extraordinary love for its new life and yet has been misinforming its new life to such an
extent that the new life is continually at a greater disadvantage than it would be if abandoned in the wilderness by the parents.
[]
The kind of examination procedure that our science foundations and other science leaders have developed is one in which they explore to
discover whether [a] capable student is able to unlearn everything he has learned, because experience has shown that that is what he is
going to have to do if he is to become a front-rank scientist. The frontiers of science are such that almost every morning many of our
hypotheses of yesterday are found inadequate or in error. So great is the frontier acceleration that now in a year of such events much of
yesterdays conceptioning becomes obsolete.
[]
I am quite confident that humanity is born with its total intellectual capability already on inventory and that human beings do not add anything
to any other human being in the way of faculties and capacities. What usually happens in the educational process is that the faculties are
dulled, overloaded, stuffed and paralyzed, so that by the time that most people are mature they have lost use of many of their innate
capabilities. My long-time hope is that we may soon begin to realize what we are doing and may alter the education process in such a way
as only to help the new life to demonstrate some of its very powerful innate capabilities.
Writing several years before the historic moon landing, Fuller argues that while such cosmic feats might be admirable aspirations
for scientific, technological, and cultural progress, "nothing is going to be quite so surprising or abrupt in the forward history of
man as the forward evolution in the educational processes." He goes on to present a prescient vision for the future of mobile,
time-shifted, tele-commuted education:
Today we are extraordinarily mobile… Comprehensively, the world is going from a Newtonian static norm to an Einsteinian all-motion norm.
That is the biggest thing that is happening at this moment in history. We are becoming quick and the graveyards of the dead become
progressively less logical. [Educational planners] will have to be serving the children of the mobile people who really, in a sense, don't have a
base…
And yet, noting that the world's population is increasing at exponential rates and greater and greater numbers of people will need
to be educated, he prefaces his vision for the future of education with a cautionary note about the importance of remaining rooted
in history and connected with the great minds who have come before. (Massimo Vignelli's famous contention that "a designer
without a sense of history is worth nothing" is equally true, after all, of any field.) Fuller writes:
The new life needs to be inspired with the realization that it has all kinds of new advantages that have been gained through great dedications
of unknown, unsung heroes of intellectual exploration and great intuitively faithful integrities of men groping in the dark. Unless the new life is
highly appreciative of those who have gone before, it won't be able to take effective advantage of its heritage. It will not be as regenerated
and inspired as it might be if it appreciated the comprehensive love invested in that heritage.
He goes on to outline the technological advances that would give our world's new life access to universal education, describing
a paradigm that presages the concept of network-based online education, combined with the high production value of TED talks:
I have taken photographs of my grandchildren looking at television. Without consideration of the value, the actual concentration of a child
on the message which is coming to him is fabulous. They really latch on. Given the chance to get accurate, logical, and lucid information at
the time when they want and need to get it, they will go after it and inhibit it in a most effective manner. I am quite certain that we are soon
going to begin to do the following: At our universities we will take the men who are the faculty leaders in research or in teaching. We are not
going to ask them to give the same lectures over and over each year from their curriculum cards, finding themselves confronted with another
roomful of people and asking themselves, "What was it I said last year?" This is a routine which deadens the faculty member. We are going
to select, instead, the people who are authorities on various subjects – the men who are most respected by other men within their
respective departments and fields. They will give their basic lecture course just once to a group of human beings, including both the experts
in their own subject and bright children and adults without special training in their field. This lecture will be recorded. . . . They will make
moving picture footage of the lecture as well as hi-fi tape recording. Then the professor and his faculty associates will listen to this recording
time and again.
"What you say is very good," his associates may comment, "but we have heard you say it a little better at other times." The professor then
dubs in a better statement. Thus begins complete reworking of the tape, cleaned up, and cleaned up some more, as in the moving picture
cutting, and new illustrative footage will be added on. The whole of a university department will work on improving the message and
conceptioning of a picture for many months, sometimes for years. The graduate students who want to be present in the university and who
also qualify to be with the men who have great powers and intellectual capability together with the faculty may spend a year getting a
documentary ready. They will not even depend upon the diction of the original lecturer, because the diction of that person may be very
inadequate to his really fundamental conceptioning and information, which should be superb. His knowledge may be very great, but he may
be a poor lecturer because of poor speaking habits or false teeth. Another voice will take over the task of getting his exact words across.
Others will gradually process the tape and moving picture footage, using communications specialists, psychologists, etc.
For instance, I am quite certain that some day we will take a subject such as Einstein's Theory of Relativity, and with the Einstein of the
subject and his colleagues working on it for a year, we will finally get it reduced down to what is "net" in the subject and enthusiastically
approved by the Einstein who gave the original lecture. What is "net" will become communicated so well that any child can turn on a
documentary device, a TV, and get the Einstein lucidity of thinking and get it quickly and firmly. I am quite sure that we are going to get
research and development laboratories of education where the faculty will become producers of extraordinary moving-picture
documentaries. That is going to be the big, new educational trend.
Noting that these documentaries will be distributed in various ways, Fuller even presages the feedback loop of content
recommendation algorithms, envisioning a sort of Pandora of education:
There is a direct, fixed, wireless connection, an actual direct linkage to individuals; and it works in both directions. Therefore, the receiving
individual can beam back, "I don't like it." He may and can say "yes" or "no." This "yes" or "no" is the basis of a binary mathematical system,
and immediately brings in the language of the modern electronic computers. With two-way TV, constant referendum of democracy will be
manifest, and democracy will become the most practical form of industrial and space-age government by all people, for all people.
It will be possible not only for an individual to say, "I don't like it," on his two-way TV but he can also beam-dial (without having to know
mathematics), "I want number so-and-so." It is also possible with this kind of two-way TV linkage with individuals' homes to send out many
different programs simultaneously; in fact, as many as there are two-way beamed-up receiving sets and programs. It would be possible to
have large central storages of documentaries – great libraries. A child could call for a special program information locally over the TV set.
While Fuller acknowledges the general baby-sitting function of traditional schools and the value of these social experiences for
kids, he argues that this new model of long-distance education also provides the additional benefit of greater capacity for solitary
contemplation of materials. He points to Einstein, whom he had met a few years earlier, to illustrate his point:
Einstein, when he wanted to study, didn't sit in the middle of a school room. That is probably the poorest place he could have gone to study.
When an individual is really thinking, he is tremendously isolated. He may manage to isolate himself in Grand Central Station, but it is despite
the environment rather than because of it. The place to study is not in a school room.
This, Fuller argues, wouldn't threaten traditional universities but, rather, fortify them and amplify their power by allowing
education to serve a broader purpose in human culture – not mere memorization, but a lens on how to live. He concludes
optimistically:
Education will then be concerned primarily with exploring to discover not only more about the universe and its history but about what the
universe is trying to do, about why man is part of it, and about how man can, and may, best function in universal evolution.
[]
The universities are going to be wonderful places. Scholars will stay there for a long, long time – the rest of their lives – while they are
developing more and more knowledge about the whole experience of man. All men will be going around the world in due process as
everyday routine search and exploration, and the world experiencing patterning will be everywhere all students from everywhere all over
the world. That is all part of the new pattern that is rushing upon us.
Fuller goes into his prescient vision in greater detail in the rest of Education Automation, including some timeless commentary
on the complex politics of education. Complement it with his scientific revision of The Lord's Prayer and his case against the
specialization of knowledge. For a contemporary alternative to the path of traditional education, see Kio Stark's
indispensable Don't Go Back to School.
Public domain photographs courtesy of the State Archives of North Carolina via Flickr Commons
The 13 Best Science and Technology Books of 2013
By: Maria Popova
The wonders of the gut, why our brains are wired to be social, what poetry and math
have in common, swarm intelligence vs. God, and more.
On the heels of the year's best reads in psychology and philosophy, art and design, history and biography, and children's books,
the season's subjective selection of best-of reading lists continues with the finest science and technology books of 2013. (For
more timeless stimulation, revisit the selections for 2012 and 2011.)
1. THIS EXPLAINS EVERYTHING
Every year since 1998, intellectual impresario and Edge editor John Brockman has been
posing a single grand question to some of our time's greatest thinkers across a wide
spectrum of disciplines, then collecting the answers in an annual anthology. Last year's
answers to the question "What scientific concept will improve everybody's cognitive
toolkit?" were released in This Will Make You Smarter: New Scientific Concepts to
Improve Your Thinking, one of the year's best psychology and philosophy books.
In 2012, the question Brockman posed, proposed by none other than Steven Pinker,
was "What is your favorite deep, elegant, or beautiful explanation?" The answers,
representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning
psychology, quantum physics, social science, political theory, philosophy, and more, are
collected in the edited compendium This Explains Everything: Deep, Beautiful, and
Elegant Theories of How the World Works (UK; public library) and are also
available online.
In the introduction preceding the micro-essays, Brockman frames the question and its
ultimate objective, adding to history's most timeless definitions of science:
The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer
science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways
of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.
[]
Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple
principles in a surprising way. These explanations are called "beautiful" or "elegant."
[]
The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about
anything including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common
thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.
Puffer fish with Akule by photographer Wayne Levin.
Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm
intelligence:
Observe a single ant, and it doesn't make much sense, walking in one direction, suddenly careening in
another for no obvious reason, doubling back on itself. Thoroughly unpredictable.
The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense.
Specialized jobs, efficient means of exploiting new food sources, complex underground nests with
temperature regulated within a few degrees. And critically, there's no blueprint or central source of
command – each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd,
where a bunch of reasonably informed individuals outperform a single expert. The ants aren't reasonably
informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules
for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.
Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay
down a pheromone trail and what to do when encountering someone else's trail – approximations of optimal solutions to the Traveling
Salesman problem. This has useful applications. In ant-based routing, simulations using virtual ants with similar rules can generate optimal
ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain,
which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons.
And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.
A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an
organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to
catalyze the formation of complex molecules.
And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don't require a blueprint, they don't
require a blueprint maker. If they don't require lightning bolts, they don't require Someone hurling lightning bolts.
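The trail-laying rules Sapolsky describes can be seen in a toy simulation. The sketch below is not from his essay or any ant-routing library; it is a minimal, deterministic mean-field version of pheromone dynamics, with the graph, parameters, and function names all invented for illustration. Two trails compete between a nest and a food source; shorter trails get re-walked, and thus re-marked, more often per unit time, while evaporation erodes whatever stops being reinforced.

```python
# A toy graph with two trails from nest "A" to food "D":
# a short one (A-B-D, length 2) and a long one (A-C-D, length 4).
EDGES = {("A", "B"): 1.0, ("B", "D"): 1.0, ("A", "C"): 2.0, ("C", "D"): 2.0}
ROUTES = [("A", "B", "D"), ("A", "C", "D")]

def route_length(route):
    # Sum the lengths of consecutive edges along the route.
    return sum(EDGES[edge] for edge in zip(route, route[1:]))

def simulate(steps=30, evaporation=0.1):
    """Mean-field pheromone dynamics: each step, a unit of ants splits
    across the routes in proportion to current pheromone strength; each
    share deposits pheromone inversely proportional to route length
    (shorter trails are re-marked more often); then every trail
    evaporates a little."""
    pheromone = {route: 1.0 for route in ROUTES}  # start undifferentiated
    for _ in range(steps):
        total = sum(pheromone.values())
        shares = {r: pheromone[r] / total for r in ROUTES}
        for r in ROUTES:
            deposit = shares[r] / route_length(r)
            pheromone[r] = (1 - evaporation) * (pheromone[r] + deposit)
    return pheromone

trails = simulate()
# The short trail ends up carrying the most pheromone, with no ant
# (and no central planner) ever "knowing" the map.
best = max(trails, key=trails.get)
```

Starting from equal pheromone, the short trail earns a slightly larger deposit each step, which attracts a larger share of ants, which earns a still larger deposit: the positive feedback loop that makes the shortest path emerge from purely local rules. Real ant-colony-optimization algorithms add stochastic individual ants and per-edge (rather than per-route) pheromone, but the feedback structure is the same.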
Developmental psychologist Howard Gardner, who famously coined the seminal theory of multiple intelligences, echoes Anaïs
Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His
answer, arguing for the importance of human beings, comes as a welcome antidote to a question that suffers the danger of being
inherently reductionist:
In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human
being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio,
Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of
Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs
(or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses
or Christ.
[]
Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or
of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We
should bear in mind anthropologist Margaret Mead's famous injunction: "Never doubt that a small group of thoughtful, committed citizens can
change the world. It is the only thing that ever has."
Uber-curator Hans Ulrich Obrist, who also contributed to last year's volume, considers the parallel role of patterns and chance
in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:
In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the
titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a
group of six corresponding abstract paintings which he gave the title Cage.
There are many relations between Richter's painting and the compositions of John Cage. In a book about
the Cage series, Robert Storr has traced them from Richter's attendance of a Cage performance at the
Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often
applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract
paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the
canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to
a large extent the outcome of chance.
[]
Richter's concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works) – but, one can
say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to
one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and
uncertainty.
Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical
meditation on the peculiar odds behind coincidences and déjà vus:
I take comfort in the fact that there are two human moments that seem to be doled out equally and
democratically within the human condition – and that there is no satisfying ultimate explanation for either.
One is coincidence, the other is déjà vu. It doesn't matter if you're Queen Elizabeth, one of the thirty-three
miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe – in the span of 365
days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say,
"Wow, that was a coincidence."
The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can
happen at any given moment, the fact is, that in practice, coincidences almost never do occur.
Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to
ward off coincidence whenever possible – the universe hates coincidence – I don't know why – it just seems to be true. So when a
coincidence happens, that coincidence had to work awfully hard to escape the system. There's a message there. What is it? Look. Look
harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what
they think it is.
What's both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six
months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece,
Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not
three. Two.
The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn't
tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they
have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our
sentience and our curse and blessing of perpetual self-awareness.
Originally featured in January – read more here.
2. YOU ARE STARDUST
"Everyone you know, everyone you ever heard of, every human being who ever was
lived there – on a mote of dust suspended in a sunbeam," Carl Sagan famously
marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph
of Earth. The stardust metaphor for our interconnection with the cosmos
soon permeated popular culture and became a vehicle for the allure of space
exploration. There's something at once incredibly empowering and incredibly
humbling in knowing that the flame in your fireplace came from the sun.
That's precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based
Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public
library) – an exquisite picture-book that instills that profound sense of connection
with the natural world, and also among the best children's books of the year.
Underpinning the narrative is a bold sense of optimism – a refreshing antidote to the
fear-appeal strategy plaguing most environmental messages today.
Kim's breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing
techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in
your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah's sprint; the
electricity that powers every thought in your brain is stronger than lightning.
But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:
Be still. Listen.
Like you, the Earth breathes.
Your breath is alive with the promise of flowers.
Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.
The book is nonetheless grounded in real science. Kelsey notes:
I wrote this book as a celebration – one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is
backed by current science. Every day, for instance, you breathe in more than a million pollen grains.
But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a
thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren
Redniss's Radioactive.
A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim's
process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.
Originally featured in March – see more here.
3. ON LOOKING
"How we spend our days," Annie Dillard wrote in her timelessly beautiful meditation
on presence over productivity, "is, of course, how we spend our lives." And nowhere do we
fail at the art of presence more miserably and more tragically than in urban life – in the
city, high on the cult of productivity, where we float past each other, past the buildings and
trees and the little boy in the purple pants, past life itself, cut off from the breathing of the
world by iPhone earbuds and solipsism. And yet: "The art of seeing has to be
learned," Marguerite Duras reverberates – and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe
in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the best psychology and
philosophy books of the year – a record of her quest to walk around a city block with eleven different experts, from an artist
to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is
undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I've read in ages. In a
way, it's the opposite but equally delightful mirror image of Christoph Niemann's Abstract City – a concrete, immersive
examination of urbanity blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.
Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call reality:
Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the
distance, and right in front of you.
By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of
information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your
chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw,
the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own
shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.
This adaptive ignorance, she argues, is there for a reason – we celebrate it as "concentration" and welcome its way of easing our
cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital
importance, and to dismiss or entirely miss all else. ("Attention is an intentional, unapologetic discriminator," Horowitz tells us.
"It asks what is relevant right now, and gears us up to notice only that.") But while this might make us more efficient in our
goal-oriented day-to-day, it also makes us inhabit a largely unlived and unremembered life, day in and day out.
For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her curly-haired, sage
mixed breed (who also inspired Horowitz's first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she
found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences
she and her canine companion were having along the exact same route:
Minor clashes between my dog's preferences as to where and how a walk should proceed and my own indicated that I was experiencing
almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a
sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my
attention invited along attention's companion: inattention to everything else.
The book was her answer to the disconnect, an effort to attend to that inattention. It is not, she warns us, about how to bring
more focus to your reading of Tolstoy or how to listen more carefully to your spouse. Rather, it is an invitation to the art of
observation:
Together, we became investigators of the ordinary, considering the block – the street and everything on it – as a living being that could be
observed.
In this way, the familiar becomes unfamiliar, and the old the new.
Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our
conditioned concentration that obscures it, and the power of individual bias in perception – or what we call "expertise," acquired
by passion or training or both – in bringing attention to elements that elude the rest of us. What follows is a whirlwind of
endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker's special fascination with the
humming life-form that is an urban street, and her diverse companions take to the city.
First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously
ravenous intellectual curiosity, Horowitz is a rare magician with language. ("The walkers trod silently; the dogs said nothing. The
only sound was the hum of air conditioners," she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she
ponders parenthetically, "how does a Q-tip escape?"; turning her final corner, she gazes at the entrance of a mansion and its pair
of stone lions "waiting patiently for royalty that never arrives." Stunning.)
But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian
efforts, she was missing pretty much everything. She arrives at a newfound, profound understanding of what William James
meant when he wrote, "My experience is what I agree to attend to. Only those items which I notice shape my mind":
I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of
mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the
signs, but not their meanings. We are not blinded, but we have blinders.
Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my
top three favorite books of the past decade, learn how to do the step-and-slide.
4. WILD ONES
Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at
People Looking at Animals in America (public library) by journalist Jon
Mooallem isn't the typical story designed to make us better by making us feel bad – to
scare us into behaving, into environmental empathy. Mooallem's is not the self-righteous
tone of capital-K Knowing typical of many environmental activists but the
scientist's disposition of not-knowing, the poet's penchant for negative
capability. Rather than ready-bake answers, he offers directions of thought and
signposts for curiosity and, in the process, somehow gently moves us a little bit closer
to our better selves – to a deep sense of, as poet Diane Ackerman beautifully put it in
1974, "the plain everythingness of everything, in cahoots with the everythingness of
everything else."
In the introduction, Mooallem recalls looking at his four-year-old daughter Isla's
menagerie of stuffed animals and the odd cultural disconnect they mime:
[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in
polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her
comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a
rubber giraffe.
Our world is different, zoologically speaking – less straightforward and more grisly. We are living in the eye of a great storm of extinction, on
a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy
bears and giggling penguins kept coming. But I didn't realize the lengths to which humankind now has to go to keep some semblance of
actual wildlife in the world. As our own species has taken over, we've tried to retain space for at least some of the others being pushed aside,
shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America's management of its
wild animals has evolved, or maybe devolved, into a surreal kind of performance art.
Yet even conservationists' small successes – crocodile species bouncing back from the brink of extinction, peregrine falcons
filling the skies once again – even these pride points demonstrate the degree to which we've assumed – usurped, even – a
puppeteer's role in the theater of organic life. Citing a scientist who lamented that right now, "nature is unable to stand on its own,"
Mooallem writes:
We've entered what some scientists are calling the Anthropocene – a new geologic epoch in which human activity, more than any other
force, steers change on the planet. Just as we're now causing the vast majority of extinctions, the vast majority of endangered species will
only survive if we keep actively rigging the world around them in their favor. We are gardening the wilderness. The line between
conservation and domestication has blurred.
He finds himself uncomfortably straddling these two animal worlds – the idyllic little-kid's dreamland and the messy, fragile
ecosystem of the real world:
Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world,
too – not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer
operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror,
the owl sitting on the rump of a wild boar silk-screened on a hipster's tote bag. I spotted wolf after wolf airbrushed on the sides of old vans,
and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. [But] maybe
we never outgrow the imaginary animal kingdom of childhood. Maybe it's the one we are trying to save.
[…]
From the very beginning, America's wild animals have inhabited the terrain of our imagination just as much as they've inhabited the actual
land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no
comment.
So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and
rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends
him on the trails of three endangered species – a bear, a butterfly, and a bird – which fall on three different points on the
spectrum of "conservation reliance," relying to various degrees on the mercy of the very humans who first disrupted the machinery
of their wildness. On the way, he encounters a remarkably vibrant cast of characters – countless passionate citizen scientists, a
professional theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart –
and finds in their relationship with the environment the same creeping disquiet about the future that Mooallem himself came to
know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:
I'm part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones,
newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it's hard to rationalize why it
should.
The truth is that most of us will never experience the Earth's endangered animals as anything more than beautiful ideas. They are figments
of our shared imagination, recognizable from TV, but stalking places – places "out there" – to which we have no intention of going. I
wondered how that imaginative connection to wildlife might fray or recalibrate as we're forced to take more responsibility for its wildness.
It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It's possible
that, thirty years from now, they'll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter – fantastical creatures
whose names and diets little kids memorize from books. And it's possible, too, I realized, that it might not even make a difference, that there
would still be polar bears on footsy pajamas and sea-turtle-shaped gummy vitamins – that there could be so much actual destruction
without ever meaningfully upsetting the ecosystems in our minds.
Originally featured in May – read more here.
5. THINKING IN NUMBERS
Daniel Tammet was born with an unusual mind – he was diagnosed with high-
functioning autistic savant syndrome, which meant his brain's uniquely wired circuits
made possible such extraordinary feats of computation and memory as learning Icelandic
in a single week and reciting the number pi up to the 22,514th digit. He is also among the
tiny fraction of people diagnosed with synesthesia – that curious crossing of the senses
that causes one to "hear" colors, "smell" sounds, or perceive words and numbers in
different hues, shapes, and textures. Synesthesia is incredibly rare – Vladimir
Nabokov was among its few famous sufferers – which makes it overwhelmingly hard for
the majority of us to imagine precisely what it's like to experience the world through this
sensory lens. Luckily, Tammet offers a fascinating first-hand account in Thinking In
Numbers: On Life, Love, Meaning, and Math (public library) – a magnificent
collection of 25 essays on the math of life, celebrating the magic of possibility in all its
dimensions. In the process, he also invites us to appreciate the poetics of numbers,
particularly of ordered sets – in other words, the very lists that dominate everything from
our productivity tools to our creative inventories to the cheapened headlines flooding the
internet.
Reflecting on his second book, Embracing the Wide Sky: A Tour Across the Horizons of the Mind, and the overwhelming
response from fascinated readers seeking to know what it's really like to experience words and numbers as colors and textures –
to experience the beauty that a poem and a prime number exert on a synesthete in equal measure – Tammet offers an absorbing
simulation of the synesthetic mind:
Imagine.
Close your eyes and imagine a space without limits, or the infinitesimal events that can stir up a country's revolution. Imagine how the perfect
game of chess might start and end: a win for white, or black, or a draw? Imagine numbers so vast that they exceed every atom in the
universe, counting with eleven or twelve fingers instead of ten, reading a single book in an infinite number of ways.
Such imagination belongs to everyone. It even possesses its own science: mathematics. Ricardo Nemirovsky and Francesca Ferrara, who
specialize in the study of mathematical cognition, write that, "like literary fiction, mathematical imagination entertains pure possibilities." This is
the distillation of what I take to be interesting and important about the way in which mathematics informs our imaginative life. Often we are
barely aware of it, but the play between numerical concepts saturates the way we experience the world.
Sketches from synesthetic artist and musician Michal Levy's animated visualization of John Coltrane's 'Giant Steps.' Click
image for details.
Tammet, above all, is enchanted by the mesmerism of the unknown, which lies at the heart of science and the heart of poetry:
The fact that we have never read an endless book, or counted to infinity (and beyond!) or made contact with an extraterrestrial civilization (all
subjects of essays in the book) should not prevent us from wondering: what if? Literature adds a further dimension to the exploration of
those pure possibilities. As Nemirovsky and Ferrara suggest, there are numerous similarities in the patterns of thinking and creating shared
by writers and mathematicians (two vocations often considered incomparable).
In fact, this very link between mathematics and fiction, between numbers and storytelling, underpins much of Tammet's
exploration. Growing up as one of nine siblings, he recounts how the oppressive nature of existing as a small number in a large
set spurred a profound appreciation of numbers as sense-making mechanisms for life:
Effaced as individuals, my brothers, sisters, and I existed only in number. The quality of our quantity became something we could not
escape. It preceded us everywhere: even in French, whose adjectives almost always follow the noun (but not when it comes to une grande
famille). From my family I learned that numbers belong to life. The majority of my math acumen came not from books but from regular
observations and day-to-day interactions. Numerical patterns, I realized, were the matter of our world.
This awareness was the beginning of Tammet's synesthetic sensibility:
Like colors, the commonest numbers give character, form, and dimension to our world. Of the most frequent – zero and one – we might
say that they are like black and white, with the other primary colors – red, blue, and yellow – akin to two, three, and four. Nine, then, might
be a sort of cobalt or indigo: in a painting it would contribute shading, rather than shape. We expect to come across samples of nine as we
might samples of a color like indigo – only occasionally, and in small and subtle ways. Thus a family of nine children surprises as much as a
man or woman with cobalt-colored hair.
Daniel Tammet. Portrait by Jerome Tabet.
Sampling from Jorge Luis Borges's humorous fictional taxonomy of animals, inspired by the work of nineteenth-century German
mathematician Georg Cantor, Tammet points to the deeper insight beneath our efforts to itemize and organize the universe –
something Umberto Eco knew when he proclaimed that "the list is the origin of culture" and Susan Sontag intuited when she
reflected on why lists appeal to us. Tammet writes:
Borges here also makes several thought-provoking points. First, though a set as familiar to our understanding as that of animals implies
containment and comprehension, the sheer number of its possible subsets actually swells toward infinity. With their handful of generic labels
("mammal," "reptile," "amphibious," etc.), standard taxonomies conceal this fact. To say, for example, that a flea is "tiny," "parasitic," and "a
champion jumper" is only to begin to scratch the surface of all its various aspects.
Second, defining a set owes more to art than it does to science. Faced with the problem of a near endless number of potential categories,
we are inclined to choose from a few those most tried and tested within our particular culture. Western descriptions of the set of all
elephants privilege subsets like "those that are very large," and "those possessing tusks," and even "those possessing an excellent
memory," while excluding other equally legitimate possibilities such as Borges's "those that at a distance resemble flies," or the Hindu "those
that are considered lucky."
[…]
Reading Borges invites me to consider the wealth of possible subsets into which my family set could be classified, far beyond those that
simply point to multiplicity.
Tammet circles back to the shared gifts of literature and mathematics, which both help cultivate our capacity for compassion:
Like works of literature, mathematical ideas help expand our circle of empathy, liberating us from the tyranny of a single, parochial point of
view. Numbers, properly considered, make us better people.
Originally featured in August – read more here.
6. SMARTER THAN YOU THINK
"The dangerous time when mechanical voices, radios, telephones, take the place of
human intimacies, and the concept of being in touch with millions brings a greater and
greater poverty in intimacy and human vision," Anaïs Nin wrote in her diary in 1946,
decades before the internet as we know it even existed. Her fear has since been echoed
again and again with every incremental advance in technology, often with simplistic
arguments about the attrition of attention in the age of digital distraction. But in Smarter
Than You Think: How Technology is Changing Our Minds for the Better (public
library), Clive Thompson – one of the finest technology writers I know, with regular
bylines for Wired and The New York Times – makes a powerful and rigorously thought-
out counterpoint. He argues that our technological tools – from search engines to status
updates to sophisticated artificial intelligence that defeats the world's best chess players
– are now inextricably linked to our minds, working in tandem with them and
profoundly changing the way we remember, learn, and act upon that knowledge –
emotionally, intellectually, and politically – and that this is a promising rather than perilous
thing.
He writes in the introduction:
These tools can make even the amateurs among us radically smarter than we'd be on our own, assuming (and this is a big assumption) we
understand how they work. At their best, today's digital tools help us see more, retain more, communicate more. At their worst, they leave us
prey to the manipulation of the toolmakers. But on balance, I'd argue, what is happening is deeply positive. This book is about the
transformation.
Page from 'Charley Harper: An Illustrated Life.' Click image for details.
But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena.
Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the
deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades
after Vannevar Bush's now-legendary meditation on how technology will impact our thinking, Thompson reaches even further
into the fringes of our cultural sensibility – past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that
intricate and ever-evolving intersection of technology and psychology.
One of his most fascinating and important points has to do with our outsourcing of memory – or, more specifically, our
increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with on-demand
access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you're trying to recall the
name of a movie but only remember certain fragmentary features – the name of the lead actor, the gist of the plot, a song from
the soundtrack. Thompson calls this "tip-of-the-tongue syndrome" and points out that, today, you'll likely be able to reverse-
engineer the name of the movie you don't remember by plugging into Google what you do remember about it. Thompson
contextualizes the phenomenon, which isn't new, then asks the obvious, important question about our culturally unprecedented
solutions to it:
Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call
it navonotootsea, which means "I have lost it on my tongue"; in Korean it's hyeu kkedu-te mam-dol-da, which has an even more gorgeous
translation: "sparkling at the end of my tongue." The phenomenon generally lasts only a minute or so; your brain eventually makes the
connection. But when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information
on the fly. If lifelogging stores episodic, or personal, memories, Internet search engines do the same for a different sort of memory:
semantic memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café,
your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage
– that's semantic memory.
[…]
What's the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in
so instantly? Or dumber with every search?
Vannevar Bush's 'memex' -- short for 'memory index' -- a primitive vision for a personal hard drive for information storage
and management. Click image for the full story.
That concern, of course, is far from unique to our age – from the invention of writing to Alvin Toffler's Future Shock, new
technology has always been a source of paralyzing resistance and apprehension:
Writing – the original technology for externalizing information – emerged around five thousand years ago, when Mesopotamian
merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the
telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of
using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what wealthy
people do; then teenagers take over and the technology becomes common to the point of banality.
Thompson reminds us of the anecdote, by now itself familiar to the point of banality, about Socrates and his admonition that
the technology of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of
committing anything to memory because knowledge stored was not really knowledge at all. He cites Socrates's parable of the
Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with
defiant indignation:
This discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external
written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence,
and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they
will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.
That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg's revolution, but it wasn't
without counter-resistance: Those who recorded their knowledge in writing – and, eventually, collected it in the form of books –
argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote
memorization made no guarantees of deeper understanding.
Ultimately, however, Thompson points out that Socrates was both right and wrong: It's true that, with some deliberately
cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of
text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond
what our own heads can hold – because, as Amanda Palmer poignantly put it, "we can only connect the dots that we
collect," and the outsourcing of memory has exponentially enlarged our dot-collections.
With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is
critical:
If you are going to read widely but often read books only once; if you are going to tackle the ever-expanding universe of ideas by skimming and
glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you'll absorb the
gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a
quote, an article, a concept.
But Thompson argues that despite history's predictable patterns of resistance followed by adoption and adaptation, there's
something immutably different about our own era:
The history of factual memory has been fairly predictable up until now. With each innovation, we've outsourced more information, then
worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric
question] on Google seems different from looking up a bit of trivia in an encyclopedia. It's less like consulting a book than like asking
someone a question, consulting a supersmart friend who lurks within our phones.
And therein lies the magic of the internet – that unprecedented access to humanity's collective brain. Thompson cites the work
of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in
the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the
household's administrative duties:
Wegner suspected this division of labor takes place because we have pretty good metamemory. We're aware of our mental strengths and
limits, and we're good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize
that while you're terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a
mile, they're great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or
encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in
speed.
[…]
Wegner called this phenomenon "transactive memory": two heads are better than one. We share the work of remembering, Wegner
argued, because it makes us collectively smarter – expanding our ability to understand the world around us.
This ability to "google" one another's memory stores, Thompson argues, is the defining feature of our evolving relationship with
information – and it's profoundly shaping our experience of knowledge:
Transactive memory helps explain how were evolving in a world of on-tap information.
He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner's, who conducted a series of
experiments demonstrating that when we know a digital tool will store information for us, we're far less likely to commit it to
memory. On the surface, this may appear like the evident and worrisome shrinkage of our mental capacity. But there's a subtler
yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that
we learn what the machine knows – a kind of meta-knowledge that enables us to retrieve the information when we need it. And,
reflecting on Sparrow's findings, Thompson points out that this is neither new nor negative:
We've been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For
many thinking tasks, we're dumber and less cognitively nimble if we're not around other people. Not only has transactive memory not hurt
us, it's allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn't until recently that
computer memory became fast enough to be consulted on the fly, but once it did – with search engines boasting that they return results in
tenths of a second – our transactive habits adapted.
Thompson's most important point, however, has to do with how outsourcing our knowledge to digital tools can hamper the
very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new
combinations – what the French polymath Henri Poincaré famously termed "sudden illuminations." Without a mental
catalog of materials to mull over and let incubate in our fringe consciousness, our capacity for such illuminations is greatly
deflated. Thompson writes:
These eureka moments are familiar to all of us; they're why we take a shower or go for a walk when we're stuck on a problem. But this
technique works only if we've actually got a lot of knowledge about the problem stored in our brains through long study and focus. You
can't come to a moment of creative insight if you haven't got any mental fuel. You can't be googling the info; it's got to be inside you.
But while this is a valid concern, Thompson doubts that we're outsourcing too many bits of knowledge and thus curtailing our
creativity. He argues, instead, that we're mostly employing this newly evolved skill to help us sift the meaningful from the
meaningless, but we remain just as capable of absorbing that which truly stimulates us:
Evidence suggests that when it comes to knowledge we're interested in – anything that truly excites us and has meaning – we don't turn
off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless
strings of information, which offer little purchase on the mind. It makes sense that our transactive brains would hand this stuff off to
machines. But when information engages us – when we really care about a subject – the evidence suggests we don't turn off our
memory at all.
Originally featured in September – read more here.
7. COSMIC APPRENTICE
As if to define what science is and what philosophy is weren't hard enough, to delineate how
the two fit together appears a formidable task, one that has spurred rather intense opinions. But
that's precisely what Dorion Sagan, who has previously examined the prehistoric history of
sex, braves in the introduction to Cosmic Apprentice: Dispatches from the Edges of
Science (public library) as he sets out to explore the intricate ways in which the two fields hang
in a kind of odd balance, "watching each other, holding hands":
The difference between science and philosophy is that the scientist learns more and more about less
and less until she knows everything about nothing, whereas a philosopher learns less and less about
more and more until he knows nothing about everything. There is truth in this clever crack, but, as Niels
Bohr impressed, while the opposite of a trivial truth is false, the opposite of a great truth is another great
truth.
I would say that applies to the flip side of the above flip takedown: Science's eye for detail, buttressed by
philosophy's broad view, makes for a kind of alembic, an antidote to both. This intellectual electrum cuts
the cloying taste of idealist and propositional philosophy with the sharp nectar of fact yet softens the edges of a technoscience that has
arguably lost both its moral and its epistemological compass, the result in part of its being funded by governments and corporations whose
relationship to the search for truth and its open dissemination can be considered problematic at best.

Sagan refutes the popular perception of science as rationally objective, a vessel of capital-T Truth, reminding us that every
scientific concept and theory was birthed by a subjective, fallible human mind:
All observations are made from distinct places and times, and in science no less than art or philosophy by particular individuals. Although
philosophy isn't fiction, it can be more personal, creative and open, a kind of counterbalance for science even as it argues that science, with
its emphasis on a kind of impersonal materialism, provides a crucial reality check for philosophy and a tendency to overtheorize that [is]
inimical to the scientific spirit. Ideally, in the search for truth, science and philosophy, the impersonal and autobiographical, can keep each
other honest, in a kind of open circuit. Philosophy as the underdog even may have an advantage, because it's not supposed to be as
advanced as science, nor does it enjoy science's level of institutional support or the commensurate heightened risks of being beholden to
one's benefactors.
Like Richard Feynman, who argued tirelessly for the scientist's responsibility to remain unsure, Sagan echoes the idea
that willful ignorance is what drives science and the fear of being wrong is one of its greatest hindrances:
Science's spirit is philosophical. It is the spirit of questioning, of curiosity, of critical inquiry combined with fact-checking. It is the spirit of being
able to admit you're wrong, of appealing to data, not authority, which does not like to admit it is wrong.
Sagan reflects on his father's conviction that the effort to popularize science is a crucial one for society, one he shared with
Richard Feynman, and what made Carl's words echo as profoundly and timelessly as they do:
Science and philosophy both had a reputation for being dry, but my father helped inject life into the former, partly by speaking in plain English
and partly by focusing on the science fiction fantasy of discovering extraterrestrial life.

In that respect, science could learn from philosophys intellectual disposition:
Philosophy today, not taught in grade school in the United States, is too often merely an academic pursuit, a handmaiden or apologetics of
science, or else a kind of existential protest, a trendy avocation of grad students and the dark-clad coffeehouse set. But philosophy, although
it historically gives rise to experimental science, sometimes preserves a distinct mode of sustained questioning that sharply distinguishes it
from modern science, which can be too quick to provide answers.
[…]
Philosophy is less cocksure, less already-knowing, or should be, than the pundits' diatribes that relieve us of the difficulties of not knowing, of
carefully weighing, of looking at the other side, of having to think things through for ourselves. "Dwell in possibility," wrote Emily Dickinson:
Philosophy at its best seems a kind of poetry, not an informational delivery but a dwelling, an opening of our thoughts to the world.

Like Buckminster Fuller, who vehemently opposed specialization, Sagan attests to the synergetic value of intellectual cross-
pollination, attesting to the idea that true breakthroughs in science require cross-disciplinary connections and originality consists
of linking up ideas whose connection was not previously suspected:
It is true that science requires analysis and that it has fractured into microdisciplines. But because of this, more than ever, it requires
synthesis. Science is about connections. Nature no more obeys the territorial divisions of scientific academic disciplines than do continents
appear from space to be colored to reflect the national divisions of their human inhabitants. For me, the great scientific satoris, epiphanies,
eurekas, and aha! moments are characterized by their ability to connect.
"In disputes upon moral or scientific points," advised Martine in his wonderful 1866 guide to the art of conversation, "ever let
your aim be to come at truth, not to conquer your opponent. So you never shall be at a loss in losing the argument, and gaining a
new discovery." Science, Sagan suggests, at least at its most elegant, is a conversation of constant revision, where each dead
end brings to life a new fruitful question:
Theories are not only practical, and wielded like intellectual swords to the death, but beautiful. A good one is worth more than all the ill-gotten
hedge fund scraps in the world. A good scientific theory shines its light, revealing the world's fearful symmetry. And its failure is also a
success, as it shows us where to look next.

Supporting Neil deGrasse Tyson's contention that intelligent design is a philosophy of ignorance, Sagan applies this very
paradigm of connection-making to the crux of the age-old science vs. religion debate, painting evolution not as a tool of certitude
but as a reminder of our connectedness to everything else:
Connecting humanity with other species in a single process was Darwin's great natural historical accomplishment. It showed that some of
the issues relegated to religion really come under the purview of science. More than just a research program for technoscience, it provides a
eureka moment, a subject of contemplation open in principle to all thinking minds. Beyond the squabbles over its mechanisms and modes,
evolution's epiphany derives from its widening of vistas, its showing of the depths of our connections to others from whom we'd thought we
were separate. Philosophy, too, in its ancient, scientifico-genic spirit of inquiry so different from a mere, let alone peevish, recounting of
facts, needs to be reconnected to science for the latter to fulfill its potential not just as something useful but as a source of numinous
moments, deep understanding, and indeed, religious-like epiphanies of cosmic comprehension and aesthetic contemplation.
Originally featured in April; see more here.
8. SOCIAL
"Without the sense of fellowship with men of like mind," Einstein wrote, "life would have
seemed to me empty." It is perhaps unsurprising that the iconic physicist, celebrated as the
quintessential modern genius, intuited something fundamental about the inner workings of
the human mind and soul long before science itself had attempted to concretize it with
empirical evidence. Now, it has: In Social: Why Our Brains Are Wired to Connect (public
library), neuroscientist Matthew D. Lieberman, director of UCLA's Social Cognitive
Neuroscience lab, sets out to get clear about who we are as social creatures and to reveal
how a more accurate understanding of our social nature can improve our lives and our
society. Lieberman, who has spent the past two decades using tools like fMRI to study how
the human brain responds to its social context, has found over and over again that our brains
aren't merely simplistic mechanisms that only respond to pain and pleasure, as philosopher
Jeremy Bentham famously claimed, but are instead wired to connect. At the heart of his
inquiry is a simple question: Why do we feel such intense agony when we lose a loved one?
He argues that, far from being a design flaw in our neural architecture, our capacity for such
overwhelming grief is a vital feature of our evolutionary constitution:
The research my wife and I have done over the past decade shows that this response, far from being an accident, is actually profoundly
important to our survival. Our brains evolved to experience threats to our social connections in much the same way they experience physical
pain. By activating the same neural circuitry that causes us to feel physical pain, our experience of social pain helps ensure the survival of our
children by helping to keep them close to their parents. The neural link between social and physical pain also ensures that staying socially
connected will be a lifelong need, like food and warmth. Given the fact that our brains treat social and physical pain similarly, should we as a
society treat social pain differently than we do? We don't expect someone with a broken leg to just "get over it." And yet when it comes to the
pain of social loss, this is a common response. The research that I and others have done using fMRI shows that how we experience social
pain is at odds with our perception of ourselves. We intuitively believe social and physical pain are radically different kinds of experiences, yet
the way our brains treat them suggests that they are more similar than we imagine.

Citing his research, Lieberman affirms the notion that there is no such thing as a nonconformist, pointing out the social
construction of what we call our individual selves, empirical evidence for what the novelist William Gibson so eloquently
termed one's "personal micro-culture," and observes our socially malleable sense of self:
The neural basis for our personal beliefs overlaps significantly with one of the regions of the brain primarily responsible for allowing other
people's beliefs to influence our own. The self is more of a superhighway for social influence than it is the impenetrable private fortress we
believe it to be.
Contextualizing it in a brief evolutionary history, he argues that this osmosis of sociality and individuality is an essential aid in
our evolutionary development rather than an aberrant defect in it:
Our sociality is woven into a series of bets that evolution has laid down again and again throughout mammalian history. These bets come in
the form of adaptations that are selected because they promote survival and reproduction. These adaptations intensify the bonds we feel
with those around us and increase our capacity to predict what is going on in the minds of others so that we can better coordinate and
cooperate with them. The pain of social loss and the ways that an audience's laughter can influence us are no accidents. To the extent that
we can characterize evolution as designing our modern brains, this is what our brains were wired for: reaching out to and interacting with
others. These are design features, not flaws. These social adaptations are central to making us the most successful species on earth.

The implications of this span across everything from the intimacy of our personal relationships to the intricacy of organizational
management and teamwork. But rather than entrusting a single cognitive social network with these vital functions, our brains
turn out to host many. Lieberman explains:
Just as there are multiple social networks on the Internet such as Facebook and Twitter, each with its own strengths, there are also multiple
social networks in our brains, sets of brain regions that work together to promote our social well-being.
These networks each have their own strengths, and they have emerged at different points in our evolutionary history, moving from
vertebrates to mammals to primates to us, Homo sapiens. Additionally, these same evolutionary steps are recapitulated in the same order
during childhood.

He goes on to explore three major adaptations that have made us so inextricably responsive to the social world:
Connection: Long before there were any primates with a neocortex, mammals split off from other vertebrates and evolved the
capacity to feel social pains and pleasures, forever linking our well-being to our social connectedness. Infants embody this deep
need to stay connected, but it is present through our entire lives.
Mindreading: Primates have developed an unparalleled ability to understand the actions and thoughts of those around them,
enhancing their ability to stay connected and interact strategically. In the toddler years, forms of social thinking develop that
outstrip those seen in the adults of any other species. This capacity allows humans to create groups that can implement nearly
any idea and to anticipate the needs and wants of those around us, keeping our groups moving smoothly.
Harmonizing: The sense of self is one of the most recent evolutionary gifts we have received. Although the self may appear to
be a mechanism for distinguishing us from others and perhaps accentuating our selfishness, the self actually operates as a
powerful force for social cohesiveness. Emerging during the preteen and teenage years, harmonizing refers to the neural
adaptations that allow group beliefs and values to influence our own.

Originally featured in November; see more here, including Lieberman's fantastic TEDxStLouis talk.
9. GULP
Few writers are able to write about science in a way that's provocative without being
sensationalistic, truthful without being dry, enchanting without being forced, and even
fewer are able to do so on subjects that don't exactly lend themselves to Saganesque
whimsy. After all, it's infinitely easier to inspire awe while discussing the bombastic
magnificence of the cosmos than, say, the function of bodily fluids and the structures that
secrete them. But Mary Roach is one of those rare writers, and that's precisely what she
proves once more in Gulp: Adventures on the Alimentary Canal (public library), a
fascinating tour of the body's most private hydraulics.
Roach writes in the introduction:
The early anatomists had that curiosity in spades. They entered the human form like an unexplored
continent. Parts were named like elements of geography: the isthmus of the thyroid, the isles of the
pancreas, the straits and inlets of the pelvis. The digestive tract was for centuries known as the
alimentary canal. How lovely to picture one's dinner making its way down a tranquil, winding
waterway, digestion and excretion no more upsetting or off-putting than a cruise along the Rhine.
Its this mood, these sentiments the excitement of exploration and the surprises and delights of travel to foreign locales that I hope to
inspire with this book.
It may take some doing. The prevailing attitude is one of disgust. I remember, for my last book, talking to the public-affairs staff who
choose what to stream on NASA TV. The cameras are often parked on the comings and goings of Mission Control. If someone spots a
staffer eating lunch at his desk, the camera is quickly repositioned. In a restaurant setting, conviviality distracts us from the biological reality of
nutrient intake and oral processing. But a man alone with a sandwich appears as what he is: an organism satisfying a need. As with other
bodily imperatives, we'd rather not be watched. Feeding, and even more so its unsavory correlates, are as much taboos as mating and
death.
The taboos have worked in my favor. The alimentary recesses hide a lode of unusual stories, mostly unmined. Authors have profiled the
brain, the heart, the eyes, the skin, the penis and the female geography, even the hair, but never the gut. The pie hole and the feed chute are
mine.
Roach goes on to bring real science to those subjects that make teenagers guffaw and that populate mediocre standup jokes,
exploring such bodily mysteries as what flatulence research reveals about death, why tasting has little to do with taste, how
thorough chewing can lower the national debt, and why we like the foods we like and loathe the rest.
10. WONDERS OF THE UNIVERSE
"I know that I am mortal by nature and ephemeral," ur-astronomer Ptolemy
contemplated nearly two millennia ago, "but when I trace at my pleasure the windings to
and fro of the heavenly bodies, I no longer touch earth with my feet. I stand in the presence
of Zeus himself and take my fill of ambrosia." But while the cosmos has fascinated
humanity since the dawn of time, its mesmerism isn't that of an abstract other but, rather,
the very self-reflexive awareness that Ptolemy attested to, that intimate and inextricable
link between the wonders of life here on Earth and the magic we've always found in our
closest cosmic neighbors.
That's precisely what modern-day science-enchanter Brian Cox explores in Wonders of the
Solar System (public library), the fantastic and illuminating book based on his BBC
series of the same title celebrating the spirit of exploration, and a follow-up to his Wonders
of Life, every bit as brimming with his signature blend of enthralling
storytelling, scientific brilliance, and contagious conviction.
Cox begins by reminding us that preserving the spirit of exploration is both a joy and a moral obligation especially at a time
when it faces tragic threats of indifference and neglect from the very authorities whose job it is to fuel it, despite a
citizenry profoundly in love with the ethos of exploration:
[The spirit of exploration] is desperately relevant, an idea so important that celebration is perhaps too weak a word. It is a plea for the spirit of
the navigators of the seas and the pioneers of aviation and spaceflight to be restored and cherished; a case made to the viewer and reader
that reaching for worlds beyond our grasp is an essential driver of progress and necessary sustenance for the human spirit. Curiosity is the
rocket fuel that powers our civilization. If we deny this innate and powerful urge, perhaps because earthly concerns seem more worthy or
pressing, then the borders of our intellectual and physical domain will shrink with our ambitions. We are part of a much wider ecosystem,
and our prosperity and even long-term survival are contingent on our understanding of it.
But most revelatory of all is Cox's gift for illustrating what our Earthly phenomena, right here on our seemingly ordinary
planet, reveal about the wonders and workings of the Solar System.

Tornadoes, for instance, tell us how our star system was born: the processes that drive these giant rotating storms obey the
same physics that caused clumps to form at the center of nebulae five billion years ago, around which the gas cloud
collapsed and began spinning ever faster, ordering the chaos, until the early Solar System was churned into existence. This
universal principle, known as the conservation of angular momentum, is also what drives a tornado's destructive spiral.
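The spin-up Cox describes can be sketched numerically. Below is a minimal illustration of conservation of angular momentum, L = Iω: as rotating material contracts, its moment of inertia drops, so its angular velocity must rise to keep L constant. The contracting mass is modeled as a thin ring (I ∝ mr²), and the numbers are invented for illustration; they are not real nebular or tornado parameters.

```python
import math

def spin_up(omega_initial, r_initial, r_final):
    """Angular velocity after contraction, assuming conservation of
    angular momentum L = I * omega with a thin-ring moment of inertia
    (I proportional to m * r**2), so omega scales as (r_i / r_f)**2."""
    return omega_initial * (r_initial / r_final) ** 2

# A parcel rotating once per day at a radius of 1000 km contracts to 100 km:
omega0 = 2 * math.pi / 86400.0          # rad/s, one revolution per day
omega1 = spin_up(omega0, 1000e3, 100e3)
print(omega1 / omega0)                  # prints 100.0: 10x smaller radius -> 100x faster spin
```

The same scaling is why both a collapsing gas cloud and the converging air column of a tornado rotate faster as they shrink.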

Cox synthesizes:
This is how our Solar System was born: rather than the whole system collapsing into the Sun, a disc of dust and gas extending billions of
kilometers into space formed around the new shining star. In just a few hundred million years, pieces of the cloud collapsed to form planets
and moons, and so a star system, our Solar System, was formed. The journey from chaos into order had begun.

Then we have Iceland's icebergs and glacial lagoons, which offer remarkable insight into the nature of Saturn's rings. Both shine
with puzzling brightness: the lagoons, here on Earth, by bringing pure water that is thousands of years old and free of
pollutants from the bottom of the seabed to the surface as they rise, forming ice crystals of exceptional vibrance; Saturn's rings,
young and ever-changing, by circling icy ring particles around the planet, constantly crashing them together and breaking them
apart, thus exposing bright new facets of ice that catch the sunlight and dazzle amidst a Solar System that is otherwise a very
dirty place.

Cox explains:
It's difficult to imagine the scale, beauty and intricacy of Saturn's rings here on Earth, but the glacial lagoons of Iceland can transport our
minds across millions of kilometers of space and help us understand the true nature of the rings. At first sight, the lagoon appears to be a
solid sheet of pristine ice, but this is an illusion. The surface is constantly shifting, an almost organic, ever-changing raft of thousands of
individual icebergs floating on the water. The structure of Saturn's rings is similar, because despite appearances the rings aren't solid. Each
ring is made up of hundreds of ringlets and each ringlet is made up of billions of separate pieces. Captured by Saturn's gravity, the ring
particles independently orbit the planet in an impossibly thin layer.
Cox goes on to explore other such illuminating parallels, from how Alaska's Lake Eyak illustrates the methane cycles of the
universe to what Hawaii's Big Island tells us about the forces that keep any planet alive to how the volcanic features of India's
Deccan Traps explain why Venus choked to death. He ends with T. S. Eliot's timeless verses on the spirit of exploration and
echoes Neil deGrasse Tyson's wisdom on your ego and the cosmic perspective, concluding:
You could take the view that our exploration of the Universe has made us somehow insignificant; one tiny planet around one star amongst
hundreds of billions. But I don't take that view, because we've discovered that it takes the rarest combination of chance and the laws of
Nature to produce a planet that can support a civilization, that most magnificent structure that allows us to explore and understand the
Universe. Thats why, for me, our civilization is the wonder of the Solar System, and if you were to be looking at the Earth from outside the
Solar System that much would be obvious. We have written the evidence of our existence onto the surface of our planet. Our civilization has
become a beacon that identifies our planet as a home to life.
Originally featured in August; see more here.
11. SAVE OUR SCIENCE
"What is crucial is not that technical ability, but it is imagination in all of its
applications," the great E. O. Wilson offered in his timeless advice to young scientists, a
conviction shared by some of history's greatest scientific minds. And yet it is rote
memorization and the unimaginative application of technical skill that our dominant
education system prioritizes, so it's no wonder it is failing to produce
the Edisons and Curies of our day. In Save Our Science: How to Inspire a New
Generation of Scientists, materials scientist, inventor, and longtime Yale professor Ainissa
Ramirez takes on a challenge Isaac Asimov presaged a quarter century ago, advocating for
the value of science education and critiquing its present failures, with a hopeful and
pragmatic eye toward improving its future. She writes in the introduction:
The 21st century requires a new kind of learner, not someone who can simply churn out
answers by rote, as has been done in the past, but a student who can think expansively and solve
problems resourcefully.
To do that, she argues, we need to replace the traditional academic skills of reading, 'riting, and 'rithmetic
with creativity, curiosity, critical thinking, and problem-solving. (Though, as psychology has recently revealed, problem-finding
might be the more valuable skill.)

Ainissa Ramirez at TED 2012 (Photograph: James Duncan Davidson for TED)
She begins with the basics:
While the acronym STEM sounds very important, STEM answers just three questions: Why does something happen? How can we apply
this knowledge in a practical way? How can we describe what is happening succinctly? Through the questions, STEM becomes a pathway
to be curious, to create, and to think and figure things out.
Even for those of us who deem STEAM (wherein the A stands for arts) superior to STEM, Ramirez's insights are razor-sharp
and consistent with the oft-affirmed idea that creativity relies heavily upon connecting the seemingly disconnected and aligning
the seemingly misaligned:
There are two schools of thought on defining creativity: divergent thinking, which is the formation of a creative idea resulting from generating
lots of ideas, and a Janusian approach, which is the act of making links between two remote ideas. The latter takes its name from the two-faced
Roman god of beginnings, Janus, who was associated with doorways and the idea of looking forward and backward at the same
time. Janusian creativity hinges on the belief that the best ideas come from linking things that previously did not seem linkable. Henri
Poincaré, a French mathematician, put it this way: "To create consists of making new combinations. The most fertile will often be those
formed of elements drawn from domains which are far apart."
Another element inherent to the scientific process but hardly rewarded, if not punished, in education is the role of ignorance, or
what the poet John Keats has eloquently and timelessly termed "negative capability": the art of brushing up against the
unknown and proceeding anyway. Ramirez writes:
My training as a scientist allows me to stare at an unknown and not run away, because I learned that this melding of uncertainty and
curiosity is where innovation and creativity occur.
Yet these very qualities are missing from science education in the United States and it shows. When the Programme for
International Student Assessment (PISA) took their annual poll in 2006, the U.S. ranked 35th in math and 29th in science out of
the 40 high-income, developed countries surveyed.

Average PISA scores versus expenditures for selected countries (Source: Organisation for Economic Co-operation and
Development)
Ramirez offers a historical context: When American universities first took root in the colonial days, their primary role was to
educate men for the clergy, so science, technology, and math were not a priority. But then Justin Smith Morrill, a little-known
congressman from Vermont who had barely completed his high school education, came along in 1861 and quietly but
purposefully sponsored legislation that forever changed American education, resulting in more than 70 new colleges and
universities that included STEM subjects in their curricula. This catapulted enrollment rates from the mere 2% of the population
who attended higher education prior to the Civil War and greatly increased diversity in academia, with the acts second revision
in 1890 extending education opportunities to women and African-Americans.

The growth of U.S. college enrollment from 1869 to 1994. (Source: S. B. Carter et al., Historical Statistics of the United
States)
But what really propelled science education, Ramirez notes, was the competitive spirit of the Space Race:
The mixture of being outdone and humiliated motivated the U.S. to create NASA and bolster the National Science Foundation's budget to
support science research and education. Sputnik forced the U.S. to think about its science position and to look hard into a mirror, and the
U.S. did not like what it saw. In 1956, before Sputnik, the National Science Foundation's budget was a modest $15.9 million. In 1958, it
tripled to $49.5 million, and it doubled again in 1959 to $132.9 million. The space race was on. We poured resources, infrastructure, and
human capital into putting an American on the moon, and with that goal, STEM education became a top priority.


President John F. Kennedy addresses a crowd of 35,000 at Rice University in 1962, proclaiming again his desire to reach
the moon with the words, "We set sail on this new sea because there is new knowledge to be gained." (Credit: NASA / Public
domain)
Ramirez argues for returning to that spirit of science education as an investment in national progress:
The U.S. has a history of changing education to meet the nations needs. We need similar innovative forward-thinking legislation now, to
prepare our children and our country for the 21st century. Looking at our history allows us to see that we have been here before and
prevailed. Let's meet this challenge, for it will, as Kennedy claimed, draw out the very best in all of us.
In confronting the problems that plague science education and the public's relationship with scientific culture, Ramirez points to
the fact that women account for only 26% of STEM bachelor's degrees and explores the heart of the glaring gender problem:
[There is a] false presumption that girls are not as good as boys in science and math. This message absolutely pervades our national
mindset. Even though girls and boys sit next to each other in class, fewer women choose STEM careers than men. This is the equivalent of
a farmer sowing seeds and then harvesting only half of the fields.

The precipitous drop in girls' enrollment in STEM classes. (Source: J. F. Latimer, What's Happened To Our High Schools)
In turning toward possible solutions, Ramirez calls out the faulty models of standardized testing, which fail to account for more
dimensional definitions of intelligence. She writes:
There is a concept in physics that the observer of an experiment can change the results just by the act of observing (this is called, not
surprisingly, the observer effect). For example, knowing the required pressure of your tires and observing that they are overinflated dictates
that you let some air out, which changes the pressure slightly.
Although this theory is really for electrons and atoms, we also see it at work in schools. Schools are evaluated, by the federal and state
governments, by tests. The students are evaluated by tests administered by the teachers. It is the process of testing that has changed the
mission of the school from instilling a wide knowledge of the subject matter to acquiring a good score on the tests.
The United States is one of the most test-taking countries in the world, and the standard weapon is the multiple-choice question. Although
multiple-choice tests are efficient in schools, they don't inspire learning. In fact, they do just the opposite. This is hugely problematic in
encouraging the skills needed for success in the 21st century. Standardized testing teaches skills that are counter to skills needed for the
future, such as curiosity, problem solving, and having a healthy relationship with failure. Standardized tests draw up a fear of failure, since
you seek a specific answer and you will be either right or wrong; they kick problem solving in the teeth, since you never need to show your
work and never develop a habit of figuring things out; and they slam the doors to curiosity, since only a small selection of the possible
answers is laid out before you. These kinds of tests produce thinkers who are unwilling to stretch and take risks and who cannot handle
failure. They crush a sense of wonder.
Like Noam Chomsky, who has questioned why schools train for passing tests rather than for creative inquiry, and Sir Ken
Robinson, who has eloquently advocated for changing the factory model of education, Ramirez urges:
While scientists passionately explore, reason, discover, synthesize, compare, contrast, and connect the dots, students drudgingly memorize,
watch, and passively consume. Students are exercising the wrong muscle. An infusion of STEM taught in compelling ways will give
students an opportunity to acquire these active learning skills.
Ramirez goes on to propose a multitude of small changes and larger shifts that communities, educators, cities, institutions, and
policy-makers could implement from neighborhood maker-spaces to wifi hotspots on school buses to university science
festivals to new curricula and testing methods that would begin to bridge the gap between what science education currently is
and what scientific culture could and should be. She concludes, echoing Alvin Toffler's famous words that "the illiterate of the
21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn":
The skills of the 21st century need us to create scholars who can link the unlinkable. Nurturing curious, creative problem solvers who can
master the art of figuring things out will make them ready for this unknown brave new world. And that is the best legacy we can possibly
leave.
Originally featured in February; see more here.
12. THE ELEMENTS OF EUCLID
Almost a century before Mondrian made his iconic red, yellow, and blue geometric
compositions, and around the time that Edward Livingston Youmans was creating
his stunning chemistry diagrams, an eccentric 19th-century civil engineer and
mathematician named Oliver Byrne produced a striking series of vibrant diagrams in
primary colors for an 1847 edition of the legendary Greek mathematical treatise Euclid's
Elements. Byrne, a vehement opponent of pseudoscience with an especial
distaste for phrenology, was early to the insight that great design and graphic elegance can
powerfully aid learning. He explained that in his edition of Euclid, "coloured diagrams
and symbols are used instead of letters for the greater ease of learners." The book, a
masterpiece of Victorian printing and graphic design long before "graphic design"
existed as a discipline, is celebrated as one of the most unusual and most beautiful books
of the 19th century.
Now, the fine folks of Taschen who have brought us such visual treasures asthe best
illustrations from 150 years of Hans Christian Andersen, the life and legacy of infographics godfather Fritz Kahn, and the visual
history of magic are resurrecting Byrnes gem in the lavish tome The First Six Books of the Elements of Euclid (public
library), edited by Swiss polymath Werner Oechslin.
Proof of the Pythagorean theorem
A masterwork of art and science in equal measure, this newly rediscovered treasure mesmerizes the eye with its brightly colored
circles, squares, and triangles while it tickles the brain with its mathematical magic.
Originally featured in November; see more here.
13. DOES MY GOLDFISH KNOW WHO I AM?
In 2012, I wrote about a lovely book titled Big Questions from Little People & Simple
Answers from Great Minds, in which some of today's greatest scientists, writers, and
philosophers answer kids' most urgent questions, deceptively simple yet profound. It
went on to become one of the year's best books and among readers' favorites. A few
months later, Gemma Elwin Harris, the editor who had envisioned the project, reached
out to invite me to participate in the book's 2013 edition by answering one randomly
assigned question from a curious child. Naturally, I was thrilled to do it, and honored to
be a part of something as heartening as Does My Goldfish Know Who I Am? (public
library), also among the best children's books of the year, a compendium of primary school children's funny, poignant,
innocent yet insightful questions about science and how life works, answered by such celebrated minds as rockstar
physicist Brian Cox, beloved broadcaster and voice-of-nature Sir David Attenborough, legendary linguist Noam Chomsky,
science writer extraordinaire Mary Roach, stat-showman Hans Rosling, Beatle Paul McCartney, biologist and Beagle
Project director Karen James, and iconic illustrator Sir Quentin Blake. As was the case with last year's edition, more than half
of the proceeds from the book, which features illustrations by the wonderful Andy Smith, are being donated to a children's
charity.
The questions range from what the purpose of science is to why onions make us cry to whether spiders can speak to why we blink
when we sneeze. Psychologist and broadcaster Claudia Hammond, who recently explained the fascinating science of why time
slows down when we're afraid, speeds up as we age, and gets all warped while we're on vacation in one of the best psychology
and philosophy books of 2013, answers the question most frequently asked by the surveyed children: Why do we cry?
It's normal to cry when you feel upset, and until the age of twelve boys cry just as often as girls. But when you think about it, it is a bit strange
that salty water spills out from the corners of your eyes just because you feel sad.
One professor noticed people often say that, despite their blotchy faces, a good cry makes them feel better. So he did an experiment where
people had to breathe in over a blender full of onions that had just been chopped up. Not surprisingly this made their eyes water. He
collected the tears and put them in the freezer. Then he got people to sit in front of a very sad film wearing special goggles which had tiny
buckets hanging off the bottom, ready to catch their tears if they cried. The people cried, but the buckets didn't work and in the end he
gathered their tears in tiny test tubes instead.
He found that the tears people cried when they were upset contained extra substances, which weren't in the tears caused by the onions. So
he thinks maybe we feel better because we get rid of these substances by crying and that this is the purpose of tears.
But not everyone agrees. Many psychologists think that the reason we cry is to let other people know that we need their sympathy or help.
So crying, provided we really mean it, brings comfort because people are nice to us.
Crying when we're happy is a bit more of a mystery, but strong emotions have a lot in common, whether happy or sad, so they seem to
trigger some of the same processes in the body.
(For a deeper dive into the biological mystery of crying, see the science of sobbing and emotional tearing.)
Joshua Foer, who knows a thing or two about superhuman memory and the limits of our mind, explains to 9-year-old Tom how
the brain can store so much information despite being that small:
An adult's brain only weighs about 1.4 kilograms, but it's made up of about 100 billion microscopic neurons. Each of those neurons looks like
a tiny branching tree, whose limbs reach out and touch other neurons. In fact, each neuron can make between 5,000 and 10,000
connections with other neurons, sometimes even more. That's more than 500 trillion connections! A memory is essentially a pattern of
connections between neurons.
Every sensation that you remember, every thought that you think, transforms your brain by altering the connections within that vast network.
By the time you get to the end of this sentence, you will have created a new memory, which means your brain will have physically changed.
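Foer's figures can be checked with quick back-of-the-envelope arithmetic. This is just a sketch of the multiplication using the rough counts quoted above; the variable names are illustrative:

```python
# Back-of-the-envelope check of the figures quoted above:
# ~100 billion neurons, each making roughly 5,000-10,000 connections.
neurons = 100e9            # ~100 billion neurons
low, high = 5_000, 10_000  # connections per neuron, per the quote

total_low = neurons * low    # 5e14, i.e. 500 trillion
total_high = neurons * high  # 1e15, i.e. a quadrillion

print(f"between {total_low:.0e} and {total_high:.0e} connections")
```

The "more than 500 trillion connections" figure thus corresponds to the low end of the estimate: 100 billion neurons times 5,000 connections each.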
Neuroscientist Tali Sharot, who has previously studied why our brains are wired for optimism, answers 8-year-old Maia's
question about why we don't have memories from the time we were babies and toddlers:
We use our brain for memory. In the first few years of our lives, our brain grows and changes a lot, just like the rest of our body. Scientists
think that because the parts of our brain that are important for memory have not fully developed when we are babies, we are unable to store
memories in the same way that we do when we are older.
Also, when we are very young we do not know how to speak. This makes it difficult to keep events in your mind and remember them later,
because we use language to remember what happened in the past.
In answering 8-year-old Hannah's question about what newspapers do when there is no news, writer and journalist Oliver
Burkeman, author of the excellent The Antidote: Happiness for People Who Can't Stand Positive Thinking, offers a primer on
media literacy, an important caveat on news that even we, as alleged grown-ups, frequently forget:
Newspapers don't really go out and find the news: they decide what gets to count as news. The same goes for television and radio. And you
might disagree with their decisions! (For example, journalists are often accused of focusing on bad news and ignoring the good, making the
world seem worse than it is.)
The important thing to remember, whenever you're reading or watching the news, is that someone decided to tell you those things, while
leaving out other things. They're presenting one particular view of the world, not the only one. There's always another side to the story.
And my answer, to 9-year-old Ottilie's question about why we have books:
Some people might tell you that books are no longer necessary now that we have the internet. Don't believe them. Books help us know
other people, know how the world works, and, in the process, know ourselves more deeply, in a way that has nothing to do with what you read
them on and everything to do with the curiosity, integrity and creative restlessness you bring to them.
Books build bridges to the lives of others, both the characters in them and your countless fellow readers across other lands and other eras,
and in doing so elevate you and anchor you more solidly into your own life. They give you a telescope into the minds of others, through
which you begin to see with ever greater clarity the starscape of your own mind.
And though the body and form of the book will continue to evolve, its heart and soul never will. Though the telescope might change, the
cosmic truths it invites you to peer into remain eternal, like the Universe.
In many ways, books are the original internet: each fact, each story, each new bit of information can be a hyperlink to another book,
another idea, another gateway into the endlessly whimsical rabbit hole of the written word. Just like the web pages you visit most regularly,
your physical bookmarks take you back to those book pages you want to return to again and again, to reabsorb and relive, finding new
meaning on each visit because the landscape of your life is different, new, reloaded by the very act of living.
Originally featured in November; read more of the questions and answers here.
HONORABLE MENTIONS
The Space Book: From the Beginning to the End of Time, 250 Milestones in the History of Space & Astronomy by Jim Bell, An
Appetite for Wonder: The Making of a Scientist by Richard Dawkins, and The Age of Edison: Electric Light and the Invention of
Modern America by Ernest Freeberg.
Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week's best articles. Here's what to
expect. Like? Sign up.
TAGS: BEST OF BOOKS CULTURE EDUCATION SCIENCE
26 SEPTEMBER, 2013
Wisdom from a MacArthur Genius: Psychologist Angela
Duckworth on Why Grit, Not IQ, Predicts Success
By: Maria Popova
Character is at least as important as intellect.
Creative history brims with embodied examples of why the secret of genius is doggedness
rather than god-given talent, from the case of young Mozart's upbringing to E. B.
White's wisdom on writing to Chuck Close's assertion about art to Tchaikovsky's
conviction about composition to Neil Gaiman's advice to aspiring writers. But it takes a
brilliant scholar of the psychology of achievement to empirically prove these creative
intuitions: Math-teacher-turned-psychologist Angela Duckworth, who began her graduate
studies under positive psychology godfather Martin Seligman at my alma mater, the
University of Pennsylvania, has done more than anyone to advance our understanding of
how self-control and grit, the relentless work ethic of sustaining your commitments
toward a long-term goal, impact success. So how heartening to hear that Duckworth is
the recipient of a 2013 MacArthur "genius grant" for her extraordinary endeavors, the
implications of which span from education to employment to human happiness.
In this short video from the MacArthur Foundation, Duckworth traces her journey and
explores the essence of her work:
We need more than the intuitions of educators to work on this problem. For sure we need the educators, but in partnership I think we need
scientists to study this from different vantage points, and that actually inspired me to move out of the classroom as a teacher and into the lab
as a research psychologist.
In the exceedingly excellent How Children Succeed: Grit, Curiosity, and the Hidden Power of Character (public library), a
necessary addition to these fantastic reads on education, Paul Tough writes of Duckworth's work:
Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese
immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at
Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the
mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied
neuroscience), management consultant for McKinsey and Company, charter-school adviser.
Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the
model didn't hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the
education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she
shared how profoundly the experience of working in schools had changed her view of school reform and wrote:
The problem, I think, is not only the schools but also the students themselves. Here's why: learning is hard. True, learning is fun, exhilarating
and gratifying, but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent
students, educators and parents must first recognize that character is at least as important as intellect.
Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group
of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her
career: She found that the students' self-discipline scores were far better predictors of their academic performance than their IQ
scores. So she became intensely interested in what strategies and tricks we might develop to maximize our self-control, and
whether those strategies can be taught. But self-control, it turned out, was only a good predictor when it came to immediate,
concrete goals, like, say, resisting a cookie. Tough writes:
Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says,
is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation
but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition, the willpower, the
self-control, to put down the cherry Danish and pick up the free weights, you're not going to succeed. If a child is highly motivated, the
self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just
aren't motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control
tricks in the world aren't going to help.
This is where grit comes in: the X-factor that helps us attain more long-term, abstract goals. To address this, Duckworth and
her colleague Chris Peterson developed the Grit Scale, a deceptively simple test on which you evaluate how much twelve
statements apply to you, from "I am a hard worker" to "New ideas and projects sometimes distract me from previous ones." The
results are profoundly predictive of success in such wide-ranging domains of achievement as the National Spelling Bee and the
West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:
For each statement, respondents score themselves on a five-point scale, ranging from 5, "very much like me," to 1, "not like me at all." The
test takes about three minutes to complete, and it relies entirely on self-report, and yet when Duckworth and Peterson took it out into the
field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ (there are smart gritty
people and dumb gritty people), but at Penn, high grit scores allowed students who had entered college with relatively low college-board
scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely
to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets
as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The
military has developed its own complex evaluation, called the "whole candidate score," to judge incoming cadets and predict which of them
will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the
more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth's simple little
twelve-item grit questionnaire.
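The mechanics of the questionnaire as Tough describes them, twelve statements each rated 1 to 5, are simple enough to sketch in a few lines of code. The particular ratings and the choice of which items are reverse-keyed below are hypothetical illustrations, not Duckworth's published scoring key:

```python
# Sketch of scoring a 12-item, 5-point self-report questionnaire like the
# one described above: 5 = "very much like me", 1 = "not like me at all".
# Statements phrased against persistence (e.g. "New ideas and projects
# sometimes distract me") would be flipped before averaging; the reversed
# item set here is an assumption for illustration only.

def grit_score(ratings, reversed_items=frozenset()):
    """Average twelve 1-5 ratings, flipping reverse-keyed items (6 - rating)."""
    if len(ratings) != 12 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("expected twelve ratings between 1 and 5")
    adjusted = [6 - r if i in reversed_items else r
                for i, r in enumerate(ratings)]
    return sum(adjusted) / len(adjusted)

# A hypothetical respondent: strongly agrees they are a hard worker (item 0),
# admits new projects often distract them (item 1, reverse-keyed here).
score = grit_score([5, 4, 3, 3, 4, 5, 2, 3, 4, 5, 3, 4], reversed_items={1, 6})
```

The striking finding is precisely that something this mechanically trivial, a three-minute average of self-ratings, outpredicted West Point's elaborate "whole candidate score."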
You can take the Grit Scale here (registration is free). For more on the impact of Duckworths work, do treat yourself to the
altogether indispensable How Children Succeed: Grit, Curiosity, and the Hidden Power of Character.
TAGS: ANGELA DUCKWORTH BOOKS CULTURE EDUCATION PSYCHOLOGY
22 AUGUST, 2013
Pioneering 19th-Century Astronomer Maria Mitchell on
Education and Women in Science
By: Maria Popova
No woman should say, "I am but a woman!" But a woman! What more can you ask to
be?
"We are women studying together," legendary astronomer and reconstructionist Maria
Mitchell said to the senior class in astronomy when it entered upon its last year at Vassar
College in 1876, where Mitchell had begun teaching after the Civil War as the only
woman on the faculty. These seemingly simple and unremarkable words sprang from a
remarkable determination that would come to pave the way for women in science. In fact,
Mitchell's strides towards equality in education are unparalleled by any other figure of
the era. From Maria Mitchell: Life, Letters and Journals (public library; free download),
which also gave us the beloved astronomer's timeless wisdom on science and life,
comes a fascinating record of Mitchell's witty, unrelenting spirit and the conviction with
which she steered the wheel of science education.
Maria Mitchell. Portrait by Lisa Congdon for The Reconstructionists project.
To get an idea of just how radical the notion of women's education was in the era's cultural context, here is an anecdote, equal
parts amusing and appalling, that Mitchell relays about one particularly anxious mother who placed her daughter in the
astronomer's care at Vassar:
One lady, who seemed to be a bright woman, got me by the button and held me a long time; she wanted this, that, and the other
impracticable thing for the girl, and told me how honest her daughter was; then with a flood of tears she said, "But she is not a Christian. I
know I put her into good hands when I put her here." (Then I was strongly tempted to avow my Unitarianism.) Miss W., who was standing
by, said, "Miss Lyman will be an excellent spiritual adviser," and we both looked very serious; when the mother wiped her weeping eyes and
said, "And, Miss Mitchell, will you ask Miss Lyman to insist that my daughter shall curl her hair? She looks very graceful when her hair is
curled, and I want it insisted upon," I made a note of it with my pencil, and as I happened to glance at Miss W. the corners of her mouth were
twitching, upon which I broke down and laughed. The mother bore it very good-naturedly, but went on. She wanted to know who would
work some buttonholes in her daughter's dress that was not quite finished, etc., and it all ended in her inviting me to make her a visit.
And yet Mitchell had extraordinary clarity of vision when it came to education in all its dimensions, one she eloquently, if
sternly, articulated to her pupils:
You cannot study anything persistently for years without becoming learned, and although I would not hold reputation up to you as a very
high object of ambition, it is a wayside flower which you are sure to have catch at your skirts.
Whatever apology other women may have for loose, ill-finished work, or work not finished at all, you will have none.
When you leave Vassar College, you leave it the best educated women in the world. Living a little outside of the college, beyond the reach of
the little currents that go up and down the corridors, I think I am a fairer judge of your advantages than you can be yourselves; and when I
say you will be the best educated women in the world, I do not mean the education of text-books, and class-rooms, and apparatus, only, but
that broader education which you receive unconsciously, that higher teaching which comes to you, all unknown to the givers, from daily
association with the noble-souled women who are around you.
Mitchell confronted the issue of women's education head-on, writing in her diary on January 3, 1868:
Meeting Dr. Hill at a private party, I asked him if Harvard College would admit girls in fifty years. He said one of the most conservative
members of the faculty had said, within sixteen days, that it would come about in twenty years.* I asked him if I could go into one of
Professor Peirce's recitations. He said there was nothing to keep me out, and that he would let me know when they came.
At eleven A.M., the next Friday, I stood at Professor Peirce's door. As the professor came in I went towards him, and asked him if I might
attend his lecture. He said "Yes." I said "Can you not say I shall be happy to have you?" and he said "I shall be happy to have you," but he
didn't look happy!
[...]
The professor was polite enough to ask us into the senior class, but I had an engagement. I asked him, if a young lady presented herself at
the door, could he keep her out, and he said "No, and I shouldn't." I told him I would send some of my girls.
* Harvard founded Radcliffe College, its sister school for women, in 1879, but female scholars remained segregated there until
1999, when the two schools finally merged and Harvard began to admit girls, more than a century after the
professor's prediction.
Upon visiting Glasgow during her European trip five years later, Mitchell notes the dismal state of women's education there,
emblematic of the era's general disposition, even, most tragically, of the era's women in power themselves:
"The Glasgow College for Girls." Seeing a sign of this sort, I rang the door-bell of the house to which it was attached, entered, and was told
the lady was at home. As I waited for her, I took up the Prospectus, and it was enough: music, dancing, drawing, needlework, and
English were the prominent features, and the pupils were children. All well enough, but why call it a college?
When the lady superintendent came in, I told her that I had supposed it was for more advanced students, and she said, "Oh, it is for girls up
to twenty; one supposes a girl is finished by twenty."
I asked, as modestly as I could, "Have you any pupils in Latin and mathematics?" and she said, "No, it's for girls, you know. Dr. M. hopes we
shall have some mathematics next year." "And," I asked, "some Latin?" "Yes, Dr. M. hopes we shall have some Latin; but I confess I believe
Latin and mathematics all bosh; give them modern languages and accomplishments. I suppose your school is for professional women."
I told her no; that the daughters of our wealthiest people demand learning; that it would scarcely be considered good society when the
women had neither Latin nor mathematics.
"Oh, well," she said, "they get married here so soon."
When I asked her if they had lady teachers, she said "Oh, no" [as if that would ruin the institution]; "nothing but first-class masters."
It was clear that the women taught the needlework.
But the very faculties that suited women for needlework, Mitchell firmly believed, were also what primed them to be great
scientists, should they choose to pursue that path. In another diary entry, she puts the issue in wonderfully poetic terms:
Nothing comes out more clearly in astronomical observations than the immense activity of the universe. All change, no loss, 'tis revolution
all.
Observations of this kind are peculiarly adapted to women. Indeed, all astronomical observing seems to be so fitted. The training of a girl fits
her for delicate work. The touch of her fingers upon the delicate screws of an astronomical instrument might become wonderfully accurate in
results; a womans eyes are trained to nicety of color. The eye that directs a needle in the delicate meshes of embroidery will equally well
bisect a star with the spider web of the micrometer. Routine observations, too, dull as they are, are less dull than the endless repetition of the
same pattern in crochet-work.
It comes as unsurprising testament to Mitchells character, then, that shortly thereafter, she makes a resolution that would guide
the rest of her life and encapsulate her greatest legacy:
Resolved, in case of my outliving father and being in good health, to give my efforts to the intellectual culture of women, without regard to
salary; if possible, connect myself with liberal Christian institutions, believing, as I do, that happiness and growth in this life are best promoted
by them, and that what is good in this life is good in any life.
This undying faith in the intellectual culture of women comes most vibrantly ablaze in a diary entry from 1874:
For women there are, undoubtedly, great difficulties in the path, but so much the more to overcome. First, no woman should say, "I am but a
woman!" But a woman! What more can you ask to be?
Born a woman, born with the average brain of humanity, born with more than the average heart, if you are mortal, what higher
destiny could you have? No matter where you are nor what you are, you are a power; your influence is incalculable; personal influence is
always underrated by the person. We are all centers of spheres; we see the portions of the sphere above us, and we see how little we
affect it. We forget the part of the sphere around and before us; it extends just as far every way.
The great gain, she writes, would be freedom of thought:
Women, more than men, are bound by tradition and authority. What the father, the brother, the doctor, and the minister have said has been
received undoubtingly. Until women throw off this reverence for authority they will not develop. When they do this, when they come to truth
through their investigations, when doubt leads them to discovery, the truth which they get will be theirs, and their minds will work on and on
unfettered.
Mitchell bemoaned the disconnect between academic honors, which reward rote memorization, and actual learning, a challenge
that remains unsolved even today:
The whole system is demoralizing and foolish. Girls study for prizes, and not for learning, when honors are at the end. The unscholarly
motive is wearing. If they studied for sound learning, the cheer which would come with every day's gain would be health-preserving.
Though Mitchell opposed standardized testing and believed in more dimensional conceptions of intelligence long before it was
fashionable to do so ("You cannot mark a human mind, because there is no intellectual unit," she remarked), she was a fierce
champion of the value of reading, particularly of the meticulous mastication of intellectual food:
My students used to say that my way of teaching was like that of the man who said to his son, "There are the letters of the English alphabet;
go into that corner and learn them."
It is not exactly my way, but I do think, as a general rule, that teachers talk too much! A book is a very good institution! To read a book, to
think it over, and to write out notes is a useful exercise; a book which will not repay some hard thought is not worth publishing. The fashion of
lecturing is becoming a rage; the teacher shows herself off, and she does not try enough to develop her pupils.
The greatest object in educating is to give a right habit of study.
She was also a proponent of intellectual and creative well-roundedness, the very wide interests that mark most successful
scientists:
Health of body is not only an accompaniment of health of mind, but is the cause; the converse may be true, that health of mind causes
health of body; but we all know that intellectual cheer and vivacity act upon the mind. If the gymnastic exercise helps the mind, the concert or
the theatre improves the health of the body.
Mitchell was an enormous champion of endowment, both financial and intellectual, and noted the brokenness of education
funding more than a century before today's student debt crisis:
A genius should wait some years to prove her genius.
Endow the already established institution with money. Endow the woman who shows genius with time.
[...]
When you aid a teacher, you improve the education of your children. It is a wonder that teachers work as well as they do. I never look at a
group of them without using, mentally, the expression, "The noble army of martyrs!"
The chemist should have had a laboratory, and the observatory should have had an astronomer; but we are too apt to bestow money
where there is no man, and to find a man where there is no money.
When she visits Russia, Mitchell remarks upon the divergent cultural attitudes towards supporting education:
St. Petersburg is about the size of Philadelphia [but] there are thousands of women studying science in St. Petersburg. How many thousand
women do you suppose are studying science in the whole State of New York? I doubt if there are five hundred.
Then again, as to language. It is rare, even among the common people, to meet one who speaks one language only. If you can speak no
Russian, try your poor French, your poor German, or your good English. You may be sure that the shopkeeper will answer in one or
another, and even the drosky-driver picks up a little of some one of them.
Of late, the Russian government has founded a medical school for women, giving them advantages which are given to men, and the same
rank when they graduate; the czar himself contributed largely to the fund.
One wonders, in a country so rich as ours, that so few men and women gratify their tastes by founding scholarships and aids for the tuition of
girls it must be such a pleasant way of spending money.
Her impression is further confirmed when she shares a train ride with a Russian mother and her daughters on a trip across the
country. Mitchell's conversation with the young girls is profoundly telling, at once tragic and hopeful:
"Are you interested in questions of government?" They replied, "All Russian women are interested in questions of that sort. How many
American women are interested in questions concerning government?"
These young girls knew exactly what questions to ask about Vassar College, the course of study, the diploma, the number of graduates,
etc. The eldest said: "We are at once excited when we hear of women studying; we have longed for opportunities to study all our lives. Our
father was the engineer of the first Russian railroad, and he spent two years in America."
I confess to a feeling of mortification when one of these girls asked me, "Did you ever read the translation of a Russian book?" and I was
obliged to answer "No." This girl had read American books in the original. They were talking Russian, French, German, and English, and yet
mourning over their need of education; and in general education, especially in that of women, I think we must be in advance of them.
One of these sisters, forgetting my ignorance, said something to me in Russian. The other laughed. "What did she say?" I asked. The eldest
replied, "She asked you to take her back with you, and educate her." "But," I said, "you read and speak your languages; the learning of the
world is open to you; found your own college!" And the young girl leaned back on the cushions, drew her mantle around her, and said,
"We have not the energy of the American girl!"
The energy of the American girl! The rich inheritance which has come down to her from men and women who sought, in the New World, a
better and higher life.
When the American girl carries her energy into the great questions of humanity, into the practical problems of life; when she takes home to
her heart the interests of education, of government, and of religion, what may we not hope for our country!
Above all, however, Mitchell championed the idea of challenging convention and never ceasing to question:
There is this great danger in student life. Now, we rest all upon what Socrates said, or what Copernicus taught; how can we dispute authority
which has come down to us, all established, for ages?
We must at least question it; we cannot accept anything as granted, beyond the first mathematical formulae. Question everything else.
But perhaps most telling of all was her students' heartfelt gratitude. In May of 1889, shortly after Mitchell announced her
reluctant retirement from Vassar as her health was rapidly declining, one student wrote to her, speaking for all whose lives the
great astronomer and educator had touched, both directly and through her landmark cultural legacy:
In all the great wonder of life, you have given me more of what I have wanted than any other creature ever gave me. I hoped I should
amount to something for your sake.
Complement Maria Mitchell: Life, Letters and Journals with pioneering astronomer Vera Rubin's Berkeley commencement
address on science and stereotypes a century after Mitchell's heyday, when both so much and so little has changed, then revisit
Mitchell's Reconstructionists profile.