Recently I noticed that I first “met” all of the following BESTs at age 14 or 15:
- The most beautiful piece of music I’ve ever played: Pavel Chesnokov’s “Salvation Is Created” (high-school summer camp concert band, I played bassoon)
- The most beautiful piece of music I’ve ever sung: Gregorio Allegri’s “Miserere mei” (church choir)
- The best short story I’ve ever read: D.H. Lawrence’s “The Rocking-Horse Winner” (found in the doorstop English class anthology-textbook)
- The best play I’ve ever seen: Jean-Paul Sartre’s No Exit (“competition play” for my school’s Theatre Guild in a “drama” year)
And I don’t really have a best in any other culture/arts category (novel, movie, popular song), even though those are categories in which I had already accrued more extensive experience before adolescence, and have continued to accrue it since.
How much psychological/philosophical/cultural “imprinting” happens in adolescence, as opposed to earlier, later, or never? The question is obviously impossible to answer, but fun to toss out there, and far from irrelevant in the wider world.
The Economist, “Young voters: let’s set the world on fire,” 18 October 2014:
In the long run, however, wooing young voters is of paramount importance. A study by Yair Ghitza of Catalist, a data firm, and Andrew Gelman of Columbia University found that whites who came of age when Democrats were in power are more likely to vote Democratic in later years, and vice versa. In other words: like tastes in pop music, political affiliations forged while young often last a lifetime.
After all, part of the point of adolescent brain development is to hardwire our own shortcuts and “best processes.” National Geographic, “Beautiful Teenage Brains,” October 2011:
The first full series of scans of the developing adolescent brain—a National Institutes of Health (NIH) project that studied over a hundred young people as they grew up during the 1990s—showed that our brains undergo a massive reorganization between our 12th and 25th years. …[A]s we move through adolescence, the brain undergoes extensive remodeling, resembling a network and wiring upgrade.
… [S]ynapses that see little use begin to wither. This synaptic pruning, as it is called, causes the brain’s cortex—the outer layer of gray matter where we do much of our conscious and complicated thinking—to become thinner but more efficient. Taken together, these changes make the entire brain a much faster and more sophisticated organ.
This process of maturation, once thought to be largely finished by elementary school, continues throughout adolescence. Imaging work done since the 1990s shows that these physical changes move in a slow wave from the brain’s rear to its front, from areas close to the brain stem that look after older and more behaviorally basic functions, such as vision, movement, and fundamental processing, to the evolutionarily newer and more complicated thinking areas up front.
I am a consumer of cultural media. How far do its producers imprint during adolescence, or before or after? Well, Robert Pinsky admitted a “very early influence” from Philip K. Dick, one of his favorite writers in his early teens, in response to a question from me about inspiration for his poems’ imagery. (He continued, “So I would have liked to think that all the wonderful writers I’ve encountered since, like Cather and Twain and Hemingway, would have covered up that very early influence, but I guess you saw through to it.” Yes, I’d love to share the full version of “my Pinsky story” with anyone who would like to hear it.)
What about you?
My daughter discovered Disney princesses around age 3.
I had known to expect this. I read Peggy Orenstein’s Cinderella Ate My Daughter a year in advance. My daughter eased us into princesses: she went through an intense Teenage Mutant Ninja Turtles phase 6 months earlier. (I told her recently how an insect or crab is built, using the term “exoskeleton,” and she responded, “Just like in Mikey’s Monster!”) She has plenty of favorite books unrelated to branding and of impeccable quality: we’ve read Phantom Tollbooth with her twice already. And though she does think of princesses in terms of branding very very often (Band-aids = toddler bling, and Princess Band-aids = a tantrum waiting to happen) she also tells stories about them, like when Rapunzel and Belle go visit a party Ariel’s hosting under the sea. (I think she borrowed from the plot of Anansi’s Party Time for that story, too!)
I have no standing to complain. I remember the first time I heard “Part of Your World”: my younger cousin sang it to me; I was eight or nine. During undergrad I was once in a mixed group of 8-10 people walking past Widener after dark when the song was casually mentioned, then immediately sung through with true passion and without a moment’s hesitation over a single word by every female present. (Admittedly, this was a group of Noteables). In the early-mid-nineties I believed as firmly as anyone that seeing the next big Disney animated flick was not optional but required. (I did have at least the good sense to be thoroughly and permanently disabused of that opinion by Pocahontas.)
The focus of my daughter’s obsession is indeed Ariel; said obsession has only been fed by receiving Little Mermaid birthday gifts from two great-aunts and one grandmother (DVD, novelization, swimsuit). As I already admitted, I have no standing to complain.
And yet … my acquaintance with Frozen is so far limited to a novelization and several dozen repetitions of certain YouTube videos. I live in fear of the day I’ll have to see it with my daughter. My husband says it seems like a fairly good story. That’s the problem.
“Let It Go” was first described to me as, “Because ‘Defying Gravity’ would be too awesome for a Disney princess movie.” The first time I heard it I agreed – which is funny, because it’s the same singer-actress. I respect the performance (sung and drawn) more now, but I feel terrible thinking about it: this character is going insane. She’s expecting and intending never to see another human being again. She’s not losing her humanity: she’s deliberately renouncing it. The visuals are pretty, but a lyric likening one’s own soul to “frozen fractals” is nothing but sad and frightening. “The perfect girl is gone” not because this person discovers herself to be a different girl, or a woman. She’s becoming elemental, and she’s been so miserable for so long that she actually relishes this prospect.
It seems that the movie Frozen takes place during the first few days with any measurable level of plain speaking inside the royal family after at least 15 years of rigid, unbroken, jaw-clenched withdrawal. So even if some real trust begins to grow during those few days, I imagine so much heartbreak and fear and anger over their next decade or so. Family doesn’t heal so easily.
I read at least once some yammering about Beauty and the Beast encouraging girls to stay with abusive partners in hopes of reforming them. I can’t identify with that concern. (BTW: Fifty Shades of Grey is extremely transparently a “Beauty and the Beast” plot.) I don’t fear for what my daughter may learn from Frozen about family relationships, or for that matter how to deal with wolves. I just feel really, really sad for the characters who had to live through that pain, and I can’t believe the pain really ends for them.
Why is fiction always about people and relationships?
I started thinking and talking about this question a few years ago. (I labeled it a “personal-intellectual project” of which my “Who’s the medium now?” set of posts in 2010-2011 is one expression). The only counter-example I know well is Dougal Dixon’s After Man: a Zoology of the Future. An author friend, Alec Nevala-Lee, pointed me to “hysterical realism” (a term for packed-to-the-gills works such as many by DF Wallace, Pynchon, Don DeLillo, and Zadie Smith) as potential counterexamples:
From James Wood’s review coining the term:
Zadie Smith has said, in an interview, that her concern is with “ideas and themes that I can tie together–problem-solving from other places and worlds.” It is not the writer’s job, she says, “to tell us how somebody felt about something, it’s to tell us how the world works.”
My dad offered many more well-known and better-loved counter-examples: Beatrix Potter’s books. “They’re about animals, not people.” “Of course they’re about people!” I said. “How so?” “Because they’re about how people feel and think and interact with each other.” My dad then accused me of tautology: I define any work of fiction as being “about people.”
That might be a fair criticism, but not with respect to Beatrix Potter stories, and particularly not with respect to The Tale of the Pie and the Patty-Pan. Its first line may be, “Once upon a time there was a Pussy-cat called Ribby, who invited a little dog called Duchess, to tea,” but The Pie and the Patty-Pan is, in fact, exactly like the Sex and the City (TV) episode “A Woman’s Right to Shoes.” Both are surface-funny and subtext-baffling, showing absurd examples of what one may and may not say in certain social circles, what kinds of miscommunications or properly understood communications can or cannot threaten a friendship. Both poke fun at social conventions but apparently without denouncing them. The little animal details in Potter’s book, like Duchess begging for a sugar cube on her nose or the doctor being a known and accepted kleptomaniac by virtue of his species (magpie), are probably all that makes Potter’s version easier for me to stomach than HBO’s.
This is very similar to my argument for Invisible Cities and The Twilight Zone being about people: they use tight, focused, beautiful metaphors to externalize a mind or soul’s internal conflicts. And I think it’s an argument very relevant to speculative fiction fans: the main message of a show like Babylon 5 is always, “Aliens are just people, too.”
So who’s seen Ender’s Game?
Is it any good as a movie?
Unrelated question: which, if any, parts of the feel & experience of the book does it get right?
It’s funny, isn’t it, that those are unrelated questions? And movie reviews can only answer the second question when it’s obvious that the reviewer is a fan. e.g. for The Hobbit: an Unexpected Journey (3-part Hobbit: WTF?), go straight to Anthony Lane’s review. It summarizes very neatly to “I loved it — but I’m a fanatic” (which is also, word-for-word, my own review of Terminator: Salvation).
I haven’t seen a fan’s review of Ender’s Game. (No, I haven’t searched at Hatrack River, why do you ask?) I knew I was in the wrong place at Entertainment Weekly upon reading, “The problem is, these initiation and training scenes go on forever.” I’m not a Card fanatic, but I’m pretty confident that anyone who considers Battle Room a problem hasn’t the slightest idea what Ender’s Game fandom is about.
P.S. Ben Kingsley as Mazer Rackham is, in theory, an unadulterated stroke of genius. Please tell me that, unlike Russell Crowe as Javert, it also works in practice.
One of the great parts of being a speculative fiction fan is watching reality catch up to and surpass one’s favorite authors’ imaginations — or just never take a step in that direction at all. As I’ve written before, I live in dread of the genetic-discrimination world of Gattaca, which I now fear may be here before today’s children are dead (though I’m still hoping it won’t be before I am dead).
On the other hand, I’m champing at the bit for movie acting to become independent of the actors’ own physical attributes, as in The Diamond Age. Avatar was a great leap forward, but so was Final Fantasy: The Spirits Within, because that was the first time CGI tried to render human faces and skin in a way that might fool the audience, even for half-second intervals. (Remember how amazing that one-second teaser of a single eye blinking was at the time?) Avatar was not ambitious on that particular score: only the non-human characters are rendered.
But some day we’ll see a movie where all the actors are wearing suits like Andy Serkis’s (LOTR and Planet of the Apes!), and I can barely wait, because then we’ll finally be able to have movie stars whose ability to use their faces matters more than their facial features. (My favorite part about this in The Diamond Age is the little bit about Miranda studying how to ract a character with “cat eyes,” since she in real life has “bunny eyes,” which are used differently.)
This is how my husband convinced me to see Les Misérables. I didn’t expect it to be impressive musically as compared to any stage production, and indeed I was pretty much not impressed on that score. But letting the actors sing on-camera and mixing in the orchestra afterwards is a new attempt. The actors were clearly and justifiably over the moon about the chance. So I went along promising to have an open mind in trying to evaluate whether this presages the future of movie musicals.
And does it? Well, of course, I haven’t a clue. Quite irrespective of impressiveness, Les Misérables is always overwhelming, and thus hard to evaluate. Yes, the sung sequences are obviously more immediate than in Singin’ in the Rain. But movies, movie stars, moviemaking, and movie audiences — to say nothing of acting styles — are so different now from then that it seems arbitrary to compare how the songs were recorded between the two.
What new innovations are you watching for (happily or no)?
Rolling Jubilee is about to kick off, billing itself as “a bailout of the people, by the people, for the people.”
Other comments I’ve seen on this:
I like the idea, in some ways especially the “random acts of kindness” aspect of it. One imagines there’s no way they could actually eliminate any significant fraction of American personal debt, so in some sense choosing randomly is the ‘fairest’ way to try to help anyone. And they will get some pretty impressive ‘bang for the buck’ (in a more literal sense than usual).
This also led my husband and me to look a bit into the actual Biblical concept of the Jubilee — which turns out to have probably made sense in ancient Near Eastern cultures for reasons including provisioning the armies. See Michael Hudson’s article in Bible Review 15:01 (1999), “The Economic Roots of the Jubilee.”
A shout-out to HRSFAN Aaron J. Dinkin, linguist of the dialectological variety, who appeared as a Major Quoted Someone for Slate last month in an article on the Northern Cities Vowel Shift (NCVS aka NCS).
The article is raising awareness of some recent (~our lifetime) re-jiggering of “linguistic turf” for short vowels (cat, cot, caught, &c.), which seems to be radiating outward from areas like Buffalo, Cleveland, Detroit &c. The write-up is fun, and Aaron sounds in his element.
I like the content of the article, but am not sure where the tone is coming from. Aaron, Emily, and anyone else with opinions and/or data, please chime in:
- Why does the introductory expert, William Labov, explicitly present the NCS as a PROBLEM? It’s kind of cool to be catching systematic pronunciation change in the act — especially if it may truly be as big a vowel shift as we’ve seen (heard) in the past millennium. And it’s not like Northeast/Midwesterners feel like we can’t understand or be understood by others. Is this actually an aesthetic judgment? I think most of us already feel English vowels are dead ugly, and don’t care except (possibly) in an operatic context.
- Are the experiments described as supporting lack of self-awareness on the part of NCS speakers (Preston, Niedzielski) presented accurately? Neither seems damning to me. How is “flipping a mental coin” for cat v. cot in isolation — if in one’s own pronunciation they are homonyms — different from flipping a mental coin for to v. two v. too in isolation?
I owe Elisabeth a massive apology for some poorly considered writing of mine last winter.
I’m really sorry: it was thoughtless of me not to ask you directly first.
A standard apology probably would have sufficed had I given it when it came due, but that was seven months ago. And culpability, like Rumour, grows swiftly and fearfully. In other cases when someone else here has written something that baffles or otherwise inspires me, I have been more courteous about prior notification of what I’m thinking, so I really have no excuse for neglecting that step in February.
So, just a few more lines to throw out on the question of ‘pleasure’ and/or ‘work’ reading:
- A new short essay on Aristotelian leisure
- Franz of Sunday in the Park with George
Work is what you do for others — Liebchen —
Art is what you do for yourself
- A family member overheard someone in an airport security line complaining that he was 100s of pages into a book and nothing had happened. He turned around to see what book the other passenger was reading, and it was American Gods. My family member nearly blurted out, “What do you mean?”
- This same relative was chewed out by a friend to whom he had recommended Ilium for sending him through an 800-page “slog.” To me (and my relative), reading Ilium is much more like being poor Phaeton in Apollo’s chariot. Slog? How? I’m being dragged too fast for my feet to stick in anything!
- But one of my blockmates says he would characterize both American Gods and Ilium very similarly, simply because he would define “something happening” in a novel as “a scene advancing the main plot.”
- Huh. I never would have thought of that.
So in conclusion, I do grasp that, very often, someone else’s reading tastes utterly confuse me because I’m just not imaginative enough. And I should remind myself more often to ask before I expound, even if I can’t ask before I wonder.
Oh, my. I no longer believe I’ll die before Gattaca. This is frightening.
The NYTimes article where I encountered this news has estimates from the study team that the technology could be available in as little as 3-5 years: whole-genome sequencing of a fetus based on only a maternal blood sample and a paternal saliva sample, with the fetal genome reconstructed from fragments in the mother’s blood.
I still think I’ll die before direct genome sequencing is used in hiring decisions. But it seems suddenly only too feasible that my grandchildren will be born into a society where parents choose to learn genetic propensities before birth, the same as today’s parents often choose to learn a child’s sex before birth.
OK, so now I’ve had two humanities people tell me more or less categorically that non-fiction reading is not pleasure reading. (The first instance prompted much of what I’ve written here in the past year and a half; the second came initially as a comment on this weblog.) I would not have expected that particularly of humanities people, honestly. I suppose I had assumed that English or classics or folklore majors and suchlike were more likely to be into any and all reading.
In both cases there has been the clarification that the information gained (“understanding stuff about the world”) may give one pleasure “even when reading about it feels like work.” And yet … this still implies that the readers approach vast categories of written documents strictly from a utilitarian point of view—news, debate, most forms of essay, and (most pertinently to these discussions) academic and non-academic book-length works: pop science and ethnography, self-help and philosophy, history and historiography, history of science, biography, literary and art criticism, poli-sci. It truly does surprise me if the ‘not-for-pleasure’ category is that broad for either elisabeth or the grad student who told me she doesn’t read non-fiction.
I am accustomed to expansive, voracious, and usually compulsive reading from my close associates in the hard sciences and social sciences. My dad, an old-school sysadmin, keeps on hand nearly the complete œuvres of Faulkner, Vonnegut, Lessing, and Erdrich (one of his most evocative comments on the last: “… so fierce that I can frankly understand her husband committed suicide“). He also keeps a personal subscription to Science magazine, setting himself the goal of understanding one article per issue (an ambitious goal far from always reached). The mother of a mathematician friend had to set a rule during middle school that she would select every other book for his pleasure reading: she chose good classic YA, he plowed through the local library’s math collection. This is the same person who introduced me to The Ancestor’s Tale and, on my recommendation, read The Archivist in a day and a half. My little brother, a history-major-turned-‘financial-analyst’ of whom I was seriously proud when he started (with The Fourth Hand) recommending to me books based on his own taste, is an Andrew Jackson buff who is also my original source for King Leopold’s Ghost. A chemistry undergrad friend of mine, now in graduate school, recruited friends for ‘salon’ book groups in two states in which he’s recently lived.
King Leopold’s Ghost and The Ancestor’s Tale are both written with exceptional clarity, perceptiveness, and outreach towards the audience. I have actually “grown” a favoritism for interdisciplinary non-fiction author Steven Johnson, and have wished that I had been a Harvard undergrad more recently so I could have become a disciple of Daniel Lord Smail, whose academic training is in 14th-century French legal documents, but who also passionately advocates for historians to claim as their field all of human history, the way they used to before Western culture imagined how many orders of magnitude longer than Biblical history it all is.
And non-fiction can inspire such awe—often for its subjects (Catherine of Aragon or the early epidemiological triumph of 1854 London, and more shrouded figures/incidents such as Mary Eleanor Bowes, Countess of Strathmore, and the training of medical residents), but also for the brilliance of the research (Montaillou) or the mind (An Experiment in Criticism).
Truly … people can dismiss all the vast variety of non-fiction as not intended for pleasure reading? Is this another case where I am failing to understand what kind of pleasures others seek from their reading?