Tag Archives: science

I’m 2.7% ‘Neanderthal’: The language of evolution

According to a recent analysis of my DNA done by this company (which was used by Henry Louis Gates Jr.’s series “Finding Your Roots” on PBS), 2.7% of my genetic material comes from Neanderthals.


Not that this amount of Neanderthal makes me unique — I’m only at the 41st percentile. But it was only recently that humans were understood to have any Neanderthal ancestry at all. Knowing this about myself, I now watch shows about Neanderthals, like this one I watched last night, in a different way than I used to. They’re my people — well, are Neanderthals people? I guess it depends on how we define people.

Definitions like this matter a lot when we start talking about genetics and evolution, as “Your Inner Fish” does (also, here). Watching episode 2 — “Your Inner Reptile” — last night, I was struck by how easy it is, when talking about evolution, to make it seem like evolution is an active force that guides/aims the change in organisms toward the end of becoming what they have become, rather than thinking of evolution as a passive descriptor of the essentially random process it technically is.

I’m not a biologist (and if I’m getting this wrong, I’d appreciate hearing from an evolutionary expert), but my understanding of evolution is that what causes alterations to the bodies of a certain line of creatures over generations is genetic change in new offspring. Due to sexual recombination of genes, as well as random mutations, individual creatures are born with features that may be different from what either parent has. For a crude example, perhaps a baby squirrel is born with four eyes. And while many new features are useless or even harmful to the individual creature, perhaps having four eyes helps that creature survive and reproduce more than its fellow squirrels. Four-eyed squirrels could be so much more successful at living and reproducing than normal two-eyed squirrels that eventually all of the squirrels born and surviving are four-eyed squirrels.
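The selection process described above can be sketched as a toy simulation in Python. Everything here — the population size, the size of the reproductive advantage, the starting number of mutants — is invented purely for illustration; the point is only that a trait’s frequency can shift across generations without any individual ever changing:

```python
import random

def simulate(generations=60, pop_size=1000, advantage=1.5, seed=0):
    """Toy model of selection: a 'four-eyed' variant reproduces
    `advantage` times as often as the ordinary variant. No squirrel
    ever changes; only the mix of offspring shifts each generation."""
    rng = random.Random(seed)
    # Start with a small number of four-eyed mutants (True = four-eyed).
    pop = [True] * 50 + [False] * (pop_size - 50)
    for _ in range(generations):
        # Each individual's chance of leaving offspring is weighted
        # by its reproductive advantage.
        weights = [advantage if four_eyed else 1.0 for four_eyed in pop]
        pop = rng.choices(pop, weights=weights, k=pop_size)
    # Fraction of the final population that is four-eyed.
    return sum(pop) / pop_size
```

With a reproductive edge, the four-eyed variant tends to sweep the population within a few dozen generations; set `advantage` to 1.0 and the frequency merely drifts around its starting value.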

Now, of course, none of the two-eyed squirrels became four-eyed squirrels. No squirrel born without four eyes would spontaneously start to grow four eyes. A squirrel is born with the genes it gets, and even if some person decided to give that squirrel two additional eyes via surgical implantation, that squirrel would not have the genes to create children with four eyes. (Of course, there could be genetic engineering to do such a thing, and some bacteria just share their genes).

And so lately I’ve been thinking that what we label as “evolution” is an abstraction used to describe the perception of physical changes in successive generations of offspring. This is associated with the idea that each person alive now (assuming no human has yet been made in a lab from one parent’s doubled genes) must have had ancestors going back to, well, when life first started. I am here because my parents created me, and their parents created them, and so on, back to the first molecules that could replicate themselves. So I come from a genetic line of individuals who were successful at reproducing themselves going back to early humans, to proto-humans, to proto-mammals, to creatures who looked more like reptiles, to creatures who looked more like fish, to single-celled organisms.

And the change between any two generations, children compared to parents, was likely quite small — the big changes, like from one species of proto-human to Homo sapiens, can be seen only by comparing individuals who are millions of years (and many, many generations) apart.

But of course, to label different creatures as being of differing species is to draw distinctions that are perhaps useful but certainly arbitrary. (Yes, a species is defined here as “the largest group of organisms capable of interbreeding and producing fertile offspring,” but of course, this boundary is apparent only many, many generations after the original divergence).  Any two individual organisms that are compared will have some things in common but not other things. The term “evolution” then is applied to explain these differences in individuals where one individual may be an ancestor of the other, but “evolution” is not itself a physical entity. Physical organs are touchable things, as are offspring, but “evolution” is an abstraction.

Links: Be present grieving, diaries, story, etc.

1. What to say to those who are grieving? Maybe say nothing.

2. Bad Lip Reading in the NFL. Sounds almost like random-word poetry.

3. A history of computation.

4. Five published diaries, from which this quote: “I like them because although people can fake diaries I tend to feel that they don’t. And I like the perspective of a diary, in that people don’t know what’s going to happen next.”

5. Story rather than science, to explain lived experience.

Wisdom of the unknown, or Why doesn’t Santa bring me a Lexus?

Reading the previous post a day later, I realize that I’m being intense about — I’m taking very seriously — my desire to remove magic from my life. It’s not real, so why deal with it?

But as I was emailing a friend this morning, I thought of another angle here: why is there no Santa for adults? I mean, why shouldn’t there be? And I’m not saying that there should be some group of humans who go around fulfilling adults’ wishes, leaving us new cars in our driveways (I always wonder when I see those holiday car ads  where a new Lexus, or whatever, is wrapped in a bow: who the fuck gets a NEW LUXURY CAR for Christmas? Maybe they’re not new cars — maybe they’re, per the patois, “gently used.” I guess this could happen somewhere, but not in my ZIP code.), although that actually would be pretty cool.

But it’s an interesting, if daydreamy, question to ask: Why isn’t there magic? I know some people like to describe certain things — like winning the lottery, or recovering from a serious illness — as “miraculous,” which is very similar to magic. But, as the saying goes, the Lord works in mysterious ways. But why be so mysterious, Lord? Why shouldn’t I walk outside tomorrow morning or, OK, Christmas morning (it’d be acceptable for me if God, like Santa, delivered miracles only once a year), and find a new hundred-thousand-dollar car in my driveway? Or what if God actually solved real problems, like curing addiction or preventing poverty or ending child abuse?

This brings us to the problem of evil, for which of course, there are no good answers. And one could also argue that if God, or Santa, really did things for us, we’d get lazy or something. On the other hand, we’re such dependent creatures, anyway. I mean, we can’t go more than a few minutes without needing some oxygen from the world — why should being so dependent on oxygen be well and good, but being hooked on nicotine be unhealthy?

I’m not trying to be entirely facetious here, either (a little facetious, but not entirely so). I’ve long thought that certain things are our birthright as humans: oxygen, for one, but also water and food. Of course, in a world of lots of people and limited resources, not everyone would agree that having clean water and healthy food should be human rights. We humans were born here, in a world where there exist the things we need to survive, and yet, we find ourselves at times having to face challenges to our survival, such as threats (lions, tigers, bears, the Marburg virus, etc.) and competition (from other humans).

So we learn — as individuals, and as a species — ways to live in the world, ways to get what we need, and even what we want. The things we learn, we call rules, ideas, laws of science, etc., and we feel that this knowledge can tell us how to act, so that we can be not merely passive, not helpless (even if we sometimes still are powerless, as when there’s an earthquake or a tornado, and even if what we do makes things worse — global warming, endocrine-disrupting chemicals, etc.). We who live in cultures that give science and rationality the authority to determine what’s real don’t accept any explanation that requires God (or magic, or Santa, or ghosts, etc.). Relying on science discourages us from burning witches and following leaders who do what their dreams command, but it also gives us a worldview in which some things are easily knowable (for instance, acceleration due to gravity) and other things (such as why moms and dogs die) completely unknowable.

So I know why Santa won’t bring me a Lexus — science says there is no Santa, and there has to be a set of physical (and paperwork) steps for a Lexus to come to me and be my own. It’s not impossible that humans would conspire to bring about my Lexus (hint, hint?), but there’s no evidence for God or Santa there. If God and Santa work completely through the acts of others (of people or of nature), well, then we could subtract God and Santa and still have the same people and nature, without losing anything but the empty ideas of “God” and “Santa.”

Of course, I’ll grant that it can be fun to mislead children — I am a teacher, after all. But instead of asserting that people should have unjustifiable faith in the particular idea that magic, including Santa, and our humanly defined God, are real, we can instead know that there is an unknown, wherein we can be humble about, and even hopeful in, what we do not know. To assert knowledge about the limits of the possible is perhaps as faulty as asserting knowledge about the physically impossible. I don’t have to ask God to change the physical world for my betterment, and also, I don’t have to think that such change is impossible. Whether Santa brings me a Lexus, or my wife does, I still would have a Lexus. (I don’t have a Lexus.) Also, what’s more beautiful than having a Lexus is, of course, realizing that I don’t need to have a Lexus at all, that having a Lexus doesn’t make me smarter or write more goodly.

I often find a refuge in the unknown, in thinking that I don’t need to know the answers. I take as existing what seems to exist, and I generally feel pretty good in my own existence without postulating divine, magical beings. I can make and find my own meanings, without needing to get those meanings from, or ground them in, some unknowable supernatural entity.

So who am I to complain about the concept of Santa? In some ways, it’s pretty wonderful to be a child and to believe that some dude you don’t even have to thank is gonna bring you some pretty terrific loud-and-shiny stuff. Just because adults don’t get to believe that doesn’t mean that adults really know the world, either. (It’s actually kinda interesting to consider how adults made up such a simple entity as Santa. It’s as if someone took human capabilities — generosity, material wealth, sleigh-driving — and just magnified or distorted those — giving gifts to all houses in a night, driving a flying sleigh — to create some kind of magnified super-person — a superhero, as it were. I sometimes wonder why we humans have such small-bore imaginations: instead of coming up with beings who are us, but a little bit more, why not imagine heroes who are unlike us, beings whose realms are beyond comprehension? Even when we try to describe God as being all-powerful and all-creating, we end up in logical cul-de-sacs such as this one. If instead we just say, “there are things beyond comprehension,” we at least allow ourselves to be wise in our silence.)

A philosophy of ghosts: How the scary unreal illuminates the real

I don’t like being scared.

If there’s a biological component to thrill-seeking, I don’t have it. (Some people, of course, may have it.) As a kid, I forced myself to go on roller coasters to prove to myself that I could face my fear — and having done that, I don’t have to go on roller coasters anymore. It’s just not fun for me. Likewise, I don’t watch horror films, I don’t go to “haunted houses,” and I even get a little anxious after seeing my neighbors’ Halloween decorations.

Pretty much all of Halloween is tough on those of us who are prone to anxiety. I get scared enough worrying about the various aspects of my present and my future that I don’t need any more reminders of death or the unknown. I much prefer those holidays where we celebrate life and have pastel bunnies and evergreen trees and whatnot.

I’m not the first to say that what’s scary about Halloween decorations like scarecrows and sheet-ghosts is that they somewhat, but not precisely, resemble real people and inanimate objects. Like the “uncanny valley” of human reactions to robots who have near-but-not-yet-human bodies and movements, seeing levitating, wind-fluttered sheets in a tree and human forms in unaccustomed positions and places (like scarecrow decorations) perhaps causes an anxious need to resolve the differences between what we see and what we expect to see.

And sometimes it’s hard to resolve this difference. In my life, I have had experiences that seemed to be a little “otherworldly.” I have had moments of “déjà vu,” where I’d see a particular situation in front of me and feel like I’d dreamt that situation earlier. Another time, I remember having a strange, almost intoxicated feeling after talking with a person of a religious tradition little known to me. But rather than interpret these feelings as implying that there really was an “other world,” in which there could be prophetic dreams and people in contact with spirits, I just labeled these as odd, unexplained experiences, and I went on living my life in a world of regular physical things with a mind that sometimes has weird experiences.

And of course, how our minds operate, and how they interact with the physical world (for example, how nonphysical minds arise from physical brains) are themselves mysteries. But just because something is unexplained or mysterious does not mean that it can justify belief in the supernatural.

We educated moderns have mostly agreed to let science be the basis of our understanding of reality. What is real are things that many people can witness repeatedly. Rainbows and cows and electricity are real because we can observe these things under repeatable conditions. And in this world, certain things happen, and certain things don’t: for instance, objects don’t pop into and out of existence. If a pen I expected to find on my desk is no longer there, I assume that there is some physical explanation for where it went (maybe I bumped it off the desk, or my cat did, or a vibration from a passing truck pushed it off, etc.), rather than assuming that either the pen disappeared (as if by magic) or that some ghost took the pen.

We never see magical or supernatural things in our everyday perceptions of the world. (This is where it gets tricky: those who do see supernatural things, we would call mentally disordered — because brain malfunction is a more scientific explanation than assuming someone is beyond-human, no matter what a large number of fiction storytellers propose.) If we are to acknowledge ghosts as scientifically real, we would need to see them appear to groups of people (and not just one individual) in repeatable ways — like rainbows do. Even if scientists were to verify by repeated observations that some of the phenomena that so-called “ghost hunters” look for — weird voices, cold spots, inexplicable phenomena — were real, scientists could not declare “ghosts” to be real, because “ghost” is a causal interpretation/explanation that requires a nonphysical definition. A ghost, as commonly understood, is the soul or spirit of a dead person — and this connection cannot be made by rational argument. It must be made on faith alone.

Now, of course, some people choose to see the world through an understanding based on faith. They believe something is real because, well, they believe it’s real. Faith does not require evidence. Faith takes over where science cannot comment, which is in any realm in which there is no physical evidence. Science has no access to my personal, subjective experience; scientists can watch my brain scans and try to correlate those results with what I report experiencing, but no scientist can experience anything directly from or in anyone else’s mind.

But it is within one’s mind that one makes meaning from, one interprets, what one sees and feels. And so one is free to choose what one’s experiences mean. And so some people, including some of my students, assign to their unusual experiences the meaning of “ghost.” I choose not to accept that interpretation for my own irregular experiences because, frankly, I don’t want to believe in ghosts. I don’t want to believe the world is full of supernatural things. I find the idea of ghosts scary, and I choose to not be scared, so I accept the scientific view that ghosts, as a theory of what causes observed reality, cannot be justified as physically real.

However, my students who believe in ghosts often say that they want to believe in ghosts, because this belief allows them to think their deceased family members are still with them. (Mary Todd Lincoln reportedly believed in the ghost of Abe for the same reason.) One student this year told me she believes in ghosts because if they do exist, they would treat her better for having believed in them (an argument that seems silly but is pretty much the same argument made by the respected thinker Blaise Pascal.)

And I like having this discussion in my English classes because it makes clear some of the issues between science and religions, observations and theories, epistemology and metaphysics. I don’t understand ghosts as physically real, but I appreciate ghosts as a real idea that can be discussed.

The Iliad, consciousness, reality: How I get tired this evening

I’m tired tonight, so I’m not sure how coherent this post will be, but I’ve been waiting for a chance to post some things, so here goes:

I’m reading selections from Homer’s Iliad (in a recent translation, though the translator’s name escapes me just now) and as we’re reading, I’m finding lots of weird and wonderful things that I point out to my students, and things I’d also love to talk to other adults about. For instance, there are moments in this serious work about war and grief that seem to me to be just plain funny, as when Hector says he will fight Achilles and kill him, or he will die an honorable death — and then when they meet, Hector turns and runs around the city of Troy, three whole laps.

It occurs to me that discussing artworks is one of the few things in life where many people can share the same experience and then discuss it. We can all read or watch the same book or movie, and then compare our experiences of reading or viewing. In much the rest of our lives, we have experiences separately (for example, even if two friends are each parents, they are parenting distinct children, in different houses, etc.), and while we can discuss our separate experiences, we cannot directly compare our experiences, the way we can when we experience artworks.

I experience subjectively — that is, even if you are standing next to me, you do not know what I experience. At best, I can communicate through words what I experience, but of course, that’s not direct experience. You can get my symbolic interpretation/representation of my experience, but you do not see through my eyes, or sense my mind.

So, when we experience, we are sensing (seeing, touching, etc.) and we are processing/interpreting what we sense. Much of what we experience, we forget. We may remember certain sights and smells, etc., but what links those senses to meaning is the stories we form from our experiences. For me, at least, much of what I know about my past is in the form of stories — that is, abstracted experiences, ideas of connected interpretations that often describe not the experience that was had but the world itself. These stories tend to compress time and ignore the moment-by-moment nature of our lived experience.

These stories may help us to structure and remember our experiences, but these stories may also be complete bullshit. Our memories are often faulty, but even if they are not, our stories edit out moments from continuous time. It’s so easy to look back at our own lives and think that all we were thinking about was the experience at hand — but I don’t seem to experience my waking moments that way; I’m often doing one thing now but also aware of what I should do, or would like to do, next.

I realize it’s sorta futile to discuss, in words and ideas, the limitations of words and ideas, and how words and ideas are always at best a kind of (what physical metaphor to use here?) layer, a kind of overlay, on top of physical reality.

Another of my classes is discussing the definition of “real,” and so far we have “something that exists or is proven to exist.” We’ve spent many minutes discussing what a “thing” is, and what we’ve come up with is that a thing is a boundary we imagine around a piece of matter so that we can talk about the physical realm one piece at a time. We notice that a certain piece of matter, a fork, can be separated from another, a table. To simply be able to see pieces of matter as separate is an abstraction — and of course even words like “matter” and “physical realm” are abstractions.

No words exist outside human consciousness (or so it seems — it’s quite a generalization to make there). Or, perhaps some animals — like apes who use sign-language — can think symbolically. But the point remains — a fork can never declare itself to be a fork.

But to see how arbitrary the label of fork is, is also to see how hard it is to keep talking about the physical realm without the help of differentiating labels. We revert to “object” and “thing” and “this thing” and “that thing.”

So maybe we can’t escape words, but we can, through the ongoing process of thinking, become aware how loosely our ideas about the world are connected to the world itself (even such a loose term as “the world” starts to feel like bullshit and the word wilts, somehow — “wilting” is a pretty good metaphor).

And I asked my students how we can talk about things we don’t have labels for, and they suggested we talk about relative terms, and that we make comparisons — a platypus has a beak like a duck’s, but a body like a beaver’s, for example. So our ideas connect one to another, from these we can build whole systems of ideas, and yet, …

And yet, it seems to me lately that whole systems of ideas — Hegel’s metaphysics, histories of World War II, mathematics — start to seem deflated, as if they were held up by hot air that, once it escapes, leaves the idea-systems flat on the ground, unimpressive, step-on-able.

Taking a bit of a leap here, but it makes sense in my head to do this (and what are all writings, all texts, if not signs that there was a consciousness that produced them?), to say that fiction works and nonfiction works have in common that they are both ideas. Sure, nonfiction purports to be about the real world, but if the “real world” is itself an idea, a construct … and further, there are no facts in nature — there is no tree or rock on which facts are discovered. Facts are made by people, in the form of words, ideas, symbols, and these are what we are comparing nonfiction or fiction to.

But we have a notion of what the real world looks like. As my class has read The Iliad, I’ve become aware of how careful the story is to make most of the human-god interactions believably subjective, so that the story could be read in two different ways: as a fantasy-tale featuring personified gods who intervene directly in human activities, or as a realistic tale of human-only activities (and where the gods speak to only one person at a time, or in the guise of a human, so that the gods could be said to be the product of a particular person’s subjective experience).

That The Iliad can be approached in two ways, or as two distinct stories, seems very subtle, very wise, and it suggests that we can approach any text and decide whether it’s fiction or not based on what the text contains. I mean, if there is no truth “out there” — and where, exactly, would that be if there were? — but all ideas are products of human minds, then what exactly are we asking for in a distinction between fiction and nonfiction (or in any distinction, really — guilty/not guilty, here/there, up/down, etc.)?

I’m not quite sure what I’m getting at, which to me is the beauty of the writing process — if I knew what I was saying, I wouldn’t need to say it. Sometimes I have ideas, and they seem cool, and I start to think I should write them up — but then I think that maybe they are just so much inert deflated ideas (as described above). But then I think, eh, what I write is just the byproduct of my mind’s ongoing function, and perhaps somebody else will have some of their own ideas provoked by something here.

One of the earlier discussions my class of sophomores had before we started The Iliad was about where the world began, where everything came from. I gave the case from science, that there was a Big Bang from which all matter and energy and life descend, and we also discussed the Bible’s Creation story in the first chapter of Genesis, in which God creates the world. But science can’t know what came before the Big Bang (because how could there ever be evidence before there could have been evidence?), and Judaism and Christianity can’t explain how God came to exist, and so both the religion and science accounts are just stories, are sets of ideas. Yes, the science account has more physical evidence to explain the physical realm, and religion can go beyond what has evidence, but both science, in its generalizations called facts and theories, and religion, in its formal structure of creeds and theology, have little to say to inform my personal, particular, subjective experiences.

After all, my mind contains ideas from many external sources, but whatever it is that gives rise to my mind, to my thoughts, my words, my experiences — whatever it is that is me feels like it’s beyond explanation, beyond theory, beyond labeling. I am complete in every moment, in every thought, continuously the same through the years I’ve been alive, but I experience my consciousness discontinuously, leaping from crystallized thought to the next crystallized thought, each thought whole-born. I exist only and wholly now. And now. And now again. (And even talking about “now” or “the present moment” feels inadequately abstract.)

But in my thinking, I’m attracted to discovering the limits of ideas, the boundaries of what can be known. I’m not sure why this feels more important and interesting to me than other sorts of thinking. This, too, is part of the mystery of where ideas come from. (See here for related post.)

And now, I really am getting tired, and I’m feeling that in my attempt to distance myself from abstraction, I’ve gotten quite abstract. Ah, well. Such is a mind and its chatter. The ideas come and go but the thinking goes on.  Living is more than merely figuring stuff out abstractly, of course. Living is also falling asleep in my comfy bed.

So this post may not satisfy — but writing it felt good.

Links: 30 July 2013

1. Ta-Nehisi Coates visits France and notes differences between French and American culture (for one, there are fewer overweight people AND fewer bodybuilders), and points out that the particular ways we do things in each country may be inseparable from larger cultural contexts.

2. James Fallows points out that NSA surveillance will cost the United States its Internet influence and commerce in coming years.

3. A list of 10 influential pieces of software.

4. Andrew Sullivan points out that Biblical literalism is “itself an inherent contradiction, since the Bible repeatedly contradicts itself if taken literally,” and he criticizes those who “are terrified of using their minds because their faith is so often mindless.”

5. Comic “Frazz” on science and wonder.

6. Massive open online courses — MOOCs — “could be disastrous for students and most professors.”

7. Governing by sabotage: here, and here.

Math isn’t real, but neither are ‘atoms’

This short video is a little manic, but generally does a nice job of summarizing the philosophical (specifically, metaphysical and epistemological) discussion as to whether math exists somewhere in the universe (though it can’t be, you know, physically detected) or whether math is just a human construct, a set of ideas, or as the video describes it, a “fiction.”

The video’s narrator describes the “math realist” position as believing that math is discovered, like new species are discovered, “out there” in the world. However, since math’s ideas are not physical and thus can never be directly observed by our senses, then believing in “objective” existence of math ideas requires, well, faith.

The video contrasts math to physics and other sciences, which take as their objects of study things that happen or exist in nature. I’d point out, however, that while we can sense things, like a round-shaped, sticky, sweet-smelling pastry we label “pie,” we cannot sense the measurements of the circumference and diameter of said pie placed in a ratio we call “pi” (also, it’s hard to see how any of the terms just used — circumference, diameter, measurements, ratio, pi, and for that matter, pie — exist except as concepts).
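To make that distinction concrete, here’s a toy sketch in Python — the measurements are invented for illustration, and the point is only that the ratio exists as a calculation imposed on the pastry, not as a sensible thing:

```python
import math

# Hypothetical measurements of a physical pie, in centimeters.
# The tape measure touches the pastry; the ratio touches nothing.
circumference_cm = 72.3
diameter_cm = 23.0

# "Pi" only appears once we put the measurements into a ratio.
ratio = circumference_cm / diameter_cm
print(round(ratio, 2))  # 3.14 — close to, but not exactly, math.pi
```

Real measurements of a real pie would only ever approximate pi; the exact value lives in the idea of a circle, not in any pastry.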

I sometimes tell my students that one way of defining real is by determining whether one would mind being struck by that thing. I don’t want to get hit by a pie, but I don’t see how throwing pi at me could ever hurt.

And so, of course, while physics, chemistry, and biology deal with sensible things, as “ologies,” they are sets of ideas, and so these science concepts, models, theories, and laws may not exist physically any more than math ideas do.

In “Zen and the Art of Motorcycle Maintenance,” Robert Pirsig describes science theories as our contemporary technological culture’s equivalent of earlier cultures’ ghosts — things we think are real but, of course, can’t be seen, etc.

And, of course, none of our words are real, either — depending on how you define ‘real’! If real describes things available to our senses, then words are only interpretations of sounds heard or shapes seen.

So it can get complicated to figure out what’s real when one sees something like IBM’s movie of atoms:

The words “These are atoms. Magnified over 100 million times.” are needed to explain what we are seeing in the video, because what I seem to be seeing is something that looks like little metallic spheres — ball bearings, perhaps — on a gray background with shading variations.

But we are informed that these are atoms — defined in the press release as “one of the tiniest elements in the universe” — but then, these are not quite atoms. They are “molecules” of carbon monoxide, one atom of carbon and one of oxygen — but this molecule doesn’t appear in the movie to have two components.

But “look” and “see” also become problem terms here. The images in the movie aren’t a result of light directly striking a light-sensor (as in a digital camera or in our eyes), but are more like graphs made by information received by the probe in the scanning tunneling microscope. According to the Wikipedia page:

The STM is based on the concept of quantum tunneling. When a conducting tip is brought very near to the surface to be examined, a bias (voltage difference) applied between the two can allow electrons to tunnel through the vacuum between them. The resulting tunneling current is a function of tip position, applied voltage, and the local density of states (LDOS) of the sample.[4] Information is acquired by monitoring the current as the tip’s position scans across the surface, and is usually displayed in image form.

So we’re not seeing atoms. In all likelihood, we can never see atoms. But, OK, let’s call what is detected by the scanning tunneling microscope an atom. But an atom, as we’ve all learned, has constituent parts, which are vastly smaller: protons, neutrons, and electrons. So “atom” just means an organizational level, like, say, a “class” is a bunch of students who gather at the same time in the same room, but there’s no class as a particular physical entity — just particular students.

However, the picture gets more complex, as protons and neutrons are composed of smaller-yet pieces called “quarks,” and these quarks and other fundamental particles may be made of smaller items yet.

Of course, these definitions are part of an elaborate, technical model of the most basic components and interactions of physical reality, which itself is not seen as a complete model.

But of course, any explanation of things we see in the physical world is going to be an idea, an interpretation, which means it’s not the same as the physical world itself. Even these ideas I’m using now are merely ideas, by which I mean they are arbitrary, subject to replacement or revision as we see fit.

In talking about the arbitrariness of ideas with my high school students, some asked why schools teach “fake ideas,” to which I responded, “It’s better to learn fake things than nothing at all.” I’m not sure I would stand by that position all the time, but it’s worth considering from a philosophical and educational perspective.

Link: Teaching reading is crap-detecting

This New York Times article, saying it’s harder to raise students’ test scores in reading than it is in math, indirectly raises some cultural/epistemological questions about the differences between math and reading.

The article opens with this anecdote from someone who has apparently taught both:

David Javsicas, a popular seventh-grade reading teacher known for urging students to act out dialogue in the books they read in class, sometimes feels wistful for the days when he taught math.

A quiz, he recalls, could quickly determine which concepts students had not yet learned. Then, “you teach the kids how to do it, and within a week or two you can usually fix it,” he said.

Helping students to puzzle through different narrative perspectives or subtext or character motivation, though, can be much more challenging. “It could take months to see if what I’m teaching is effective,” he said.

I have taught high school science and English, and I’m not sure I’d say it’s easier to teach science, because of what it means to “teach science.” The expectation (as I was informed after I pursued a different goal) is for students to learn and apply the set of science ideas (theory of evolution, atomic theory, Newton’s Laws, etc.) that are provided in the textbooks. The discipline of science observes and tries to explain the physical world, but most science classes don’t engage students in that work. Students take notes, do equations, take tests. Real research is not done by most students (though some high school science competitions, such as this one, show that students are capable of doing impressive work).

Science classes, then, just teach a set of ideas — let’s call it a story, Science Stories — and so do math classes. Math classes could be theoretical explorations of these abstract ideas, but many high school math classes simply teach procedures (algorithms) for doing things: to find the area of a rectangle, multiply the base times the height. Sure, that’s useful information, but hardly all that intellectually challenging. Math, as taught to high school students, is a tidy system of right and wrong methods for arriving at an answer. What mathematicians do is far more abstract and creative, of course, but we don’t generally let students see that.

In English, however, we’re actually asking students to do the same things (though obviously adapted to younger minds) that English professors do — read and analyze texts and write about them. What I love about teaching writing is that students are truly CREATING texts. Students in our science and math classes are not making anything — they are just taking in the ideas that others have made.

Of course, creating something is more intellectually demanding than just memorizing and applying an idea (even Bloom’s Taxonomy, that education cliche, says so). So we writing and literature teachers give our students guidelines and models to help them “scaffold” (in the teachers’ vernacular) their way to completed projects.

But of course, there are very few right or wrong essays or literature interpretations — there are worse ones and better ones, and judging which is which is highly subjective. The student essays I like best are those that go beyond what is merely stated in a text to make connections that are not obvious. In other words, I like essays that are interesting, that say things I hadn’t read or thought of before.

Lately I’ve been suspecting that maybe the best way to teach this kind of creative thinking and individual judgment is to model it for my students. As a teacher, I have my own biases and peculiarities, and so I’m not an ideal (Platonic?) model — but maybe learning to be analytical and/or creative is really more of an apprenticeship anyway, rather than something that has set standards for students to adhere to.

And here’s where teaching the study of literature gets interesting and/or controversial. The recent Common Core State Standards for teaching literature include statements such as the following:

CCSS.ELA-Literacy.RL.11-12.2 Determine two or more themes or central ideas of a text and analyze their development over the course of the text, including how they interact and build on one another to produce a complex account; provide an objective summary of the text.

The very use of the word “objective” in relation to a text is nearly absurd to someone familiar with the critical theories of literary interpretation that arose in the last few decades. Taking undergrad literature classes in the mid-1990s, I gained just a limited understanding of some of these approaches, but judging by the enthusiasms of the younger professors and by the resistance of the older ones, I understood these ideas to be important.

But the Common Core standards seem written in ignorance of these developments in interpretation, as if the standards writers were just gonna elide the last 50+ years of criticism. And though the standards are careful to call their lists of texts for use in classes “illustrative” rather than “recommended,” this listing shows 14 texts, only two of which were published in the last 50 years.

So the act of reading and interpreting texts is something that, in addition to necessitating word-processing skills, also “requires background knowledge of cultural, historical and social references” (as the Times article states), and from these basic skills and resources, we ask students to make coherent, logical statements of analysis. That’s asking a lot of anybody. But then, all too often, standardized reading tests ask students to select an interpretation from multiple choices, which requires students to also analyze the test to see which of the many possible interpretations of a text is the one that the test will honor as the “right” answer. The student has to match minds both with the text-writer and with the test-writer. In the Times article,

But when [the teacher] asked [students] to select which of two descriptions fit Terabithia, the magic kingdom created by the two main characters, the class stumbled to draw inferences from the text.

Uh, yeah. Why only two descriptions? We ask students to make this complex, creative, personal interpretation, and then ask them to compare theirs to an adult’s?

This might all be despair-inducing, except for the fact that when we teach interpretation skills to students, we also empower them to see the tests and the standards as the bullsh*t they so often are. This reminds me of the Hemingway quote featuring his one standard of education:

“Every man should have a built-in automatic crap detector operating inside him.”

(One, too, might employ such a crap-detector while reading Hemingway.)

 

Links: 30 April 2013: Technology, pets, food stamps, etc.

Playing catch-up here with links to sundry articles:

1. Writing and reading as more interactive than before. (via The Dish)

2. Food stamp participation by county.

3. U.S. students make up the largest proportion of top-scoring students. It turns out that we don’t need education reform so much as we need poverty reform.

4. We have relationships with our dogs, relationships we can tell stories about; but we only look at our cats, of whom we make images. Thus, there are more books about dogs but more online videos and photos of cats. From my experience living with both, I’d say that’s about right.

5. The first World Wide Web page, recreated. Already, I feel like an oldster, telling my students of the days when I first got online, in 1992, when I used the Gopher program to find addresses of people at other universities, and when I had email but only two or three other people with whom to communicate online. I liked this story for both the Gopher mention and the screen image from NeXT computers, which I also used in fall 1992 and which now seem like the Edsel of computers.

6. The New York Times Book Review may be on its last legs, and with it, “Book reviews, I am afraid, are a downer, an outdated form. Literary editors – hell, literary people in general – are mightily outdated, too.” And as much as I enjoyed reading the Book Review as a younger person who wished to participate in the community it represented, I’m not sure any more that the end of “literary people” is necessarily a bad thing. “Literary culture” now seems an idea founded as much on myth and opinion and posturing as on anything else.

7. Birth of a new conjunction: “slash.”

8. What you eat helps form what you like to eat.

9. A “Lord of the Flies” real-life adventure that wasn’t so “Lord of the Flies”-ish at all:

One day, in 1977, six boys set out from Tonga on a fishing trip. They left safe harbor, and fate befell them. Badly. Caught in a huge storm, the boys were shipwrecked on a deserted island. What do they do, this little tribe? They made a pact never to quarrel, because they could see that arguing could lead to mutually assured destruction. They promised each other that wherever they went on the island, they would go in twos, in case they got lost or had an accident. They agreed to have a rotation of being on guard, night and day, to watch out for anything that might harm them or anything that might help. And they kept their promises—for a day that became a week, a month, a year. After fifteen months, two boys, on watch as they had agreed, saw a speck of a boat on the horizon. The boys were found and rescued, all of them, grace intact and promises held.

10. A post about literary pets contains this quotation from William S. Burroughs about his cats:

Thinking is not enough. Nothing is. There is no final enough of wisdom, experience — any fucking thing. Only thing can resolve conflict is love, like I felt for Fletch and Ruski, Spooner, and Calico. Pure love.

Love? What is It?
Most natural painkiller what there is.

11. Pictures from the frontlines of TV news on-location reports, showing some of what the edited image excludes. This reminds me of some of the press conferences I went to as an agriculture reporter, where my first-person accounts could have easily been more interesting to read than the items being conferred.

12. Media reporting tends to misunderstand and misstate science results.

13. Andrew Sullivan considers how a lot of online media exposure may influence/alter our thinking.

Links: 6 April 2013: Miracles, eggheads, and so much more

1. This post points out some goofy thinking about the meaning of miracles and science.

2. The cultural status of the intellectual elite, the “egghead.”

3. Some of Roger Ebert’s advice on writing. Also, this article contains some more of Ebert’s thinking about writing and writing careers, such as:

He emphasized that such ephemera like “career” and “success” were mostly beside the point. “Just write, get better, keep writing, keep getting better. It’s the only thing you can control.”

4. An older piece attempting to explain why Nietzsche gets celebrated by those who misunderstand him.

5. I’m cautious about anyone who makes assertions about reality, but I’m usually pretty open to those who find fault in others’ reality-assertions. Here is a take-down of people who misunderstand and/or distort vaccine and climate science.

6. A justifiably angry piece about the difficulties of seeking a tenure-track job in literature (though this probably applies to many parts of academia now):

During graduate school, you will be broken down and reconfigured in the image of the academy. By the time you finish—if you even do—your academic self will be the culmination of your entire self, and thus you will believe, incomprehensibly, that not having a tenure-track job makes you worthless. You will believe this so strongly that when you do not land a job, it will destroy you, and nobody outside of academia will understand why.

The bold-emphasis above is mine. The more I learn to trust my own instincts in my creative writing (and it took a while to overcome my training in the standards of journalism — in learning to do what others thought was valuable — and to learn to trust my own standards), the more I question the value of what exactly it is that education does. We teachers, after all, mostly can only teach students to become more like a model student, and we mostly don’t know what that is, but it often resembles what the teacher him-/herself is capable of, as R. Hugo wrote:

You’ll never be a poet until you realize that everything I say today and this quarter is wrong. It may be right for me, but it is wrong for you. Every moment, I am, without wanting or trying to, telling you to write like me. But I hope you learn to write like you. In a sense, I hope I don’t teach you how to write but how to teach yourself how to write.

Of course, much of what we teachers do is widely valuable, but I suspect this becomes less true the higher one goes in academia. And when I occasionally consider getting a creative writing MFA, I remind myself that the writing I do and want to do and need to do doesn’t really have much to do with the writing I would be trained to do in an MFA program. I’m not saying such programs are never useful, but I suspect they can’t help people develop as writers unless those people want to write texts very much like the texts produced by writing faculty members, who need to write things that tenure committees will agree have general value.

Let’s bluntly overstate my point: I’m asserting here that grad schools are not receptive to the new and unusual ideas that I most love reading in others’ texts and I most love having as I write texts.

(P.S. A small quibble with the Slate article: If there was a “boom of the late 1990s” in hiring associate professors, that was not the message of Bérubé and Cary Nelson in their book The Employment of English, which in 1997 (if I remember correctly) advised English lit grad students to seek employment in high schools rather than in colleges.)