Shut Up and Take the Picture!

May 14, 2023

Mercado do Bolhão, Porto, Portugal (1955) by Henri Cartier-Bresson 

Even physicists themselves acknowledge that “quantum weirdness” is an apt description for the strange behavior of subatomic particles — and nothing is weirder than the phenomenon known as “wave function collapse.”  Einstein was among the first to show that light (photons) behaves like both particles and waves.  So which is it?  Succeeding generations of theorists pursued the question, concluding that elementary particles remain in an indeterminate state with regard to such measurable qualities as mass, location and velocity until there is an act of observation.  This triggers a “wave function collapse” that causes photons and other elementary particles to take on measurable qualities — a phenomenon that physicists refer to as the “observer effect.”  It was as if the building blocks of physical reality were waiting around for an audience before strutting their stuff.

Both Einstein and Max Planck — who between them had laid the theoretical foundations for quantum physics — were profoundly disturbed by its implications, which indicated that the universe did not exist independently of an observer. "No reasonable definition of reality could be expected to permit this," Einstein protested in a paper he wrote in 1935 with Boris Podolsky and Nathan Rosen.  Nevertheless, nearly a century of experimentation has confirmed that elementary particles do indeed play peekaboo with physical reality, even though physicists remain sharply divided on the whys and hows.  Many subscribe to the “shut up and do the math” school, rather than waste time trying to explain the seemingly inexplicable.

John Wheeler, a theoretical physicist on the Manhattan Project who coined the term “black holes” for collapsed stars, suggested that if an act of observation is required to bring elementary particles into existence, the same must be true for everything else. He theorized that we live in a “participatory universe” in which consciousness is not a bystander to physical reality but is an essential element in its formation. In a sense, you could say the world comes into being because we are here to witness it.  We become collaborators in what Wheeler called the “genesis of observership.”  

You might say the universe is reborn at every moment through the action of human consciousness.  If so, then artists are surely its midwives. “Things are because we see them,” the writer Oscar Wilde insisted, long before quantum physicists did the math.

I doubt the iconic street photographer Henri Cartier-Bresson knew anything about quantum mechanics.  But his description of the creative process dovetails nicely with what physicists say goes on at the subatomic level. “There is a creative fraction of a second when you are taking a picture. Your eye must see a composition or an expression that life itself offers you, and you must know with intuition when to click the camera.”  If, as Wheeler suggests, we live in a participatory universe, then that “decisive moment,” as Cartier-Bresson characterized it, becomes the equivalent of a wave function collapse.  Click!  In that moment, color, form, texture, light and shadow come together just so to capture a tiny piece of reality.  In that moment, as the physicist Erwin Schrödinger might have put it, “subject and object are only one.”  But his colleagues in the scientific community might just as easily have said, “Shut up and take the picture.”   


The Mists of Time

April 14, 2023

"Tracks in Fog 4" by Eric Rennie

I am looking at a photograph I took several years ago.  When I snapped the picture, I was standing on rusty railroad tracks that ran straight down the center of the resulting photograph before disappearing into the foggy distance.  The picture was taken early one morning in late fall.  There are bare trees and scrubby bushes down the rocky embankments on either side of the track.   The sun was starting to break through in the foreground, but the fog predominated in the middle distance. A little way further on, the tracks dissolved into silvery oblivion before they could converge on the horizon.  

Looking at the picture now, it occurs to me it is a perfect representation of time.  Spatial metaphors are often used to delineate time. The nearer an object, the closer to the present time, whether in the past or in the future.  Here objects become progressively more obscure as they recede into the distance.  You might say they disappear into the mists of time.

In reality, of course, time does not move either forward or backward, at least not in the sense of an object moving through physical space.  Time has no spatial direction or dimension whatsoever.  If you strip away the metaphors, you are left with the thing itself, which can be measured with great precision but which essentially remains an enigma.  We all have a sense of time passing — of duration — but what exactly passes?  Time is not detectable by any of the physical senses, and clocks measure nothing beyond their own ticking.  Time, for all practical purposes, doesn’t appear to have any tangible existence, except as an abstract measure of change.

An astronomer can look through a telescope at stars and galaxies as they existed billions of years ago, but their light can only be seen right now.  The actual celestial bodies may have burned themselves out long ago.  Similarly, a geologist may study rocks that were formed when our planet was new, but his hypotheses about their origins depend on what he sees right now.  A paleontologist digs up bones from creatures that roamed the earth tens of millions of years ago, but his findings are based on what is in hand right now.  There is evidence aplenty of things that existed before now, but the fact remains that right now is all we have to work with — and all we will ever have to work with.  Right now is really all there is.           

So what is it that disappears into the mist?  I might vividly recall something that happened just yesterday, yet my memories of last week or decades ago are less and less distinct, much like the railroad tracks in my picture receding into the fog. But do past events recede into the mists of time, whatever that might be — or is it just the mists of memory?  If the present moment is all that tangibly exists, and it has no duration in itself, then our sense of time passing must come from our recollection of previous moments.  If we have no memory of prior events, every moment is sui generis.  But, of course, we do remember previous moments, and we assume we are looking through a window into the past.  But the memory isn’t happening in the past; it’s occurring right now.  

Memory, said Ralph Waldo Emerson, is “the thread on which the beads of man are strung.”  Without memory, the moments of our life follow one another in meaningless succession.  Like time itself, my understanding of who I am exists only in relation to my past.  Without memory there is no past and no duration, no sense of time moving forward.  I am reborn from moment to moment but die just as quickly.  

I gained crucial insights into the interplay of time, memory and identity as caregiver to my mother, who suffered from vascular dementia in the final years of her life.  As her memory faded, she lost not only any sense of her own past but even a firm sense of time passing.  Familiar people in her life became strangers, and in the end she became a stranger to herself.  I watched helplessly as her life slowly faded into the mist before disappearing altogether into oblivion.  


An Unspeakable Perfect Miracle

March 18, 2023

If the world were perfect, it wouldn't be.
-- Yogi Berra

"To me, every hour of the day and night is an unspeakable perfect miracle,” said Walt Whitman, a poet given to meme-worthy ecstatic utterances.  One can’t help wondering whether he walked the same earth as the rest of us.  And yet Whitman was witness to some of the worst carnage in the American Civil War and was anything but starry-eyed.  At age 43, he had traveled to the front after the bloodbath at Fredericksburg in 1862, searching for a younger brother who he feared was mortally wounded.  He found his brother George had been only slightly injured in battle, but the sights that greeted him left an indelible impression.  Visiting a nearby field hospital, he recounted, “Outdoors, at the foot of a tree, within ten yards of the front of the house, I noticed a heap of amputated feet, legs, arms, hands, etc. -- about a load for a one-horse cart. Several dead bodies lie near, each covered with its brown woolen blanket. In the dooryard, toward the river, are fresh graves, mostly of officers, their names on pieces of barrel staves or broken board, stuck in the dirt.”  Whitman was so moved by his experience that he relocated to Washington, DC, took a job as a clerk in a government office and spent all his free time for the remainder of the war visiting the wounded and dying in area military hospitals.

Whitman’s having witnessed the worst of humanity raises the question of what he could possibly have meant when he said that every hour of day and night was an unspeakable perfect miracle.  Even those who are spiritually minded are apt to concede that this world is a vale of tears and that perfection can only be found in the Great Beyond.  For Buddhists, the process of finding perfection can take many lifetimes.  Clearly, Whitman’s understanding of perfection must have differed radically from what we normally mean by the term.

The Greek word for “perfection,” teleios, comes from the root telos, meaning “end” or “goal.”  In other words, it implies a process with perfection as the end state.  As a fine arts photographer who works with a digital camera, I regard the taking of a picture as merely the first step in a process that usually involves considerable manipulation of the raw image using Photoshop and other software.  I may adjust the exposure, crop the picture, sharpen the focus, lighten or darken certain elements within the picture and carefully remove any imperfections that I may find.  The process is painstaking and can take considerable time — certainly far longer than the fraction of a second required to snap the picture in the first place. The result is never perfect, but it is as close to perfect as I can make it, given my level of skill and experience. 

Like my photography workflow, spiritual perfection is a painstaking process that involves the removal of imperfections and many other adjustments.  In Christianity, it is known as sanctification, which literally means "to set apart for special use or purpose” — in this case, transforming a raw Christian into the very image of a saint.  Orthodox Christianity takes it a step further in a process called theosis, or deification, inspired by St. Athanasius’ statement that “God became man so that man might become God.”  There are sharp differences among Christian denominations as to whether perfect union with God is even possible on this side of the grave.  Martin Luther eventually came to realize the utter futility of his striving for perfection — or indeed of any moral self-improvement — if salvation were the aim.  Earlier, the Buddha had pretty much reached the same conclusion and abandoned asceticism as a path to enlightenment.  He famously resolved to do nothing but sit under a bodhi tree until he found what he was looking for.

If perfection is our aim, then we must first deal with the fact that mankind was created in God’s image, at least according to our biblical creation story.  If that is our starting point, where do we find room for improvement?  The creation story itself provides the answer.  There was one bad apple that threw everything into disarray: the forbidden fruit that Adam and Eve ate when God told them not to.*  So whose fault was that?  The miscreants were clueless.  The serpent put them up to it, but who made the serpent the “subtlest of God’s creatures,” if not God himself?  You could make the case that God himself introduced imperfection into his creation.

According to an old Persian proverb, a Persian rug is “perfectly imperfect, and precisely imprecise.”  Traditional rug makers deliberately weave minor flaws into their hand-made carpets because God alone is perfect, and it would be impious to strive for perfection.  But perhaps they need not have bothered.  Suppose God had deliberately woven imperfections into his own creation so that the whole enterprise is perfectly imperfect and precisely imprecise? 

Surely it is within God’s power — if, indeed, God exists — to make his creation without flaws.  So why would he bother to produce creatures in his own image and then leave them to the tender mercies of the wily serpent that he himself introduced into the Garden of Eden?  The whole thing looks like a deliberate set-up.  The question is why.  

The problem with perfection is that it is the final state of a process, not the beginning or any intermediate stage.  If life were perfect, it wouldn’t be life; it would be a vast tableau vivant, a simulacrum of life in which nothing ever changed. Had Adam and Eve eaten the fruit from the Tree of Life, they might have lived forever in a childlike state of perfection. But the serpent promised them they would become like God if they ate the fruit of the knowledge of good and evil instead.  As it happened, the serpent spoke the truth.  The Lord acknowledged as much when he said, "Behold, the man has become like one of us, knowing good and evil.”  

How is it that knowing good and evil makes you like God?  If God is perfect, where is the evil, unless he made it part of his perfectly imperfect creation?  The preamble to the U.S. Constitution declares that the document is intended to form a more perfect union, establish justice and insure domestic tranquility.  And yet in countenancing the evil of slavery, the Constitution set the nation up for the catastrophic failure of the Civil War.  Thus, Walt Whitman was greeted by the sight of amputated feet, legs, arms and hands heaped up outside a field hospital near Fredericksburg in 1862. 

We might be forgiven for concluding that Whitman’s experience at Fredericksburg was an unspeakable perfect hell.  And so it was, a perfectly imperfect flaw woven into the carpet of God’s creation.  Toward what end?  Alas, we are not given to know the end.  We know only that the carnage suffered in the American Civil War did lead to a more perfect union, although it is still far from perfect.  The painstaking process of removing imperfections continues.    

“You seek perfection and it lies in everything that happens to you – your suffering, your actions, your impulses are the mysteries under which God reveals himself to you,” the 18th-century Jesuit spiritual director Jean-Pierre de Caussade instructed.  We turn our eye to the horizon, but it is never given us to know more than the next step.  “We must confine ourselves to the present moment without taking thought for the one before or the one to come,” de Caussade said.  He goes on: “No moment is trivial, since each one contains a divine Kingdom, and heavenly sustenance.”  This perhaps is what Whitman was getting at when he exulted that every hour of the day and night is an unspeakable perfect miracle.          

*The creation story in Genesis does not actually specify that the forbidden fruit was an apple, although that is generally how it is portrayed.

Genesis 3

Jean-Pierre de Caussade, The Sacrament of the Present Moment


Look, See

February 14, 2023

Henry Thoreau

“Wisdom does not inspect, but behold,” Thoreau wrote. “We must look a long time before we can see.” He was speaking of scientific observation, but he might as well have been writing about what artists and poets do.  We are all looking for things, but how many of us really see?  To begin with, seeing requires us to unlearn all those shortcuts that enable us to swiftly find what we are looking for without really noticing anything else.  There are obvious evolutionary advantages to being able to quickly spot predators or prey without being distracted by the view.  But this is not the same thing as seeing what you are not looking for, which is another name for discovery. 

Two years after his sojourn at Walden Pond, Thoreau embarked on a new daily regime that involved long walks around Concord to observe and record the natural world.  In the evenings, his notes were carefully transferred to his journal, which by the time of his early death from tuberculosis at age 44 already amounted to some two million words.  Thoreau was a younger contemporary of Charles Darwin and avidly read his works.  Like Darwin he was always careful to ground his conclusions in a close observation of nature.  He wrote, “The true man of science will know nature better by his finer organization; he will smell, taste, see, hear, feel, better than other men. His will be a deeper and finer experience. We do not learn by inference and deduction, and the application of mathematics to philosophy, but by direct intercourse and sympathy.”

One must truly see in order to learn, but one must also learn to see, as Thoreau well knew. “For the newly sighted,” Annie Dillard wrote in Pilgrim at Tinker Creek, “vision is pure sensation unencumbered by meaning.”  She was speaking of those who had gained their sight following advances in cataract surgery in the 19th century, but the same applies to newborns for whom the world is still a crazy-quilt of raw color and motion.  They must learn the ins and outs of this world, quite literally: its heights and depths, foreground from background, object from surroundings.  Eventually names are put to the things they see.  

For scientific observers, the process undergoes continual refinement as they learn the ins and outs of the world they see in a microscope or cloud chamber. “Direct perception of form requires being experienced in the relevant field of thought,” stated the Polish bacteriologist and philosopher Ludwik Fleck. “The ability directly to perceive meaning, form, and self‐contained unity is acquired only after much experience, perhaps with preliminary training.”

Fleck added, “At the same time, of course, we lose the ability to see something that contradicts the form.”  In effect, the process of learning to see blinds us to things that don’t conform to our habits of seeing.  Thomas Kuhn used the word “paradigm” as a term of art to describe a particular way of seeing the world in his book The Structure of Scientific Revolutions.  A paradigm is the conceptual framework around which scientific observations are organized, and observations that fall outside that framework are apt to be ignored.  Scientists are supposed to base their theories on the careful observation of natural phenomena, but Kuhn noted they are as prone to fit facts to theory as anyone else.  They tend to reexamine their theories only when the weight of accumulated anomalies forces them to do so.

Photographers and other visual artists must also go against habits of seeing in order to do their work properly.  To see the world afresh, they must forget everything they think they know about it.  For the artist, learning to see is at once unlearning all they have seen, stripping away all the words and concepts, until nothing remains but pure sensation unencumbered by meaning.  It is the world as a newborn might find it, all color, shape, texture, light and shadow.  Realistically, one can sustain such a vision only momentarily.  For a photographer, that is precisely the moment when you snap the picture.



January 10, 2023

Lumière Brothers

Motion picture pioneers Auguste and Louis Lumière created a sensation before the turn of the last century with their actualités, or “actuality films.”  These were one-minute documentaries of everyday life in Lyons, Paris and elsewhere, shot by setting up a camera on a street corner or railway platform and just letting the world pass by.  Once the novelty of life-sized images moving on a screen had worn off, the Lumière brothers dispatched camera crews to Russia, Japan and the Middle East in search of new and exotic locales.  Eventually, however, audiences demanded that films do what it turned out they were made for: storytelling. 

With narrative films, you can’t just set the camera up and watch the world go by.  You have to create scenes that advance the narrative and then edit them so that audiences can follow the storyline.  Stories can rarely be told minute by minute in real time in a single shot.  Andy Warhol released an avant-garde film in 1964 called Empire  in which a camera was trained on the Empire State Building at night for eight hours straight without moving.  Warhol’s announced intention was "to see time go by."  In this he succeeded.  But the film was unwatchable, causing one cultural critic to observe: "If I were the camera, I would faint with boredom, staring that long at one thing...." 

To tell a story, you have to move things along both in time and in space.  The Great Train Robbery (1903), one of the earliest narrative films, featured 20 separate shots in 10 separate indoor and outdoor locations during 12 minutes of total running time.  Filmmakers had previously been reluctant to splice together different scenes for fear of confusing audiences.  However, viewers had little difficulty following the story in this early classic Western, even without dialogue or title cards.  It was understood that when an exterior shot of robbers boarding a train was followed by an interior shot of those bad guys breaking into a mail car, the mail car was on that same train.  One scene led to another as the robbers made off with the loot and were eventually gunned down by a posse that gave chase on horseback.

Nothing in more than four million years of hominid evolution would seemingly have prepared viewers to make sense of discontinuous visual information, and yet they do it with ease.  As Washington University psychologist Jeffrey M. Zacks puts it, “Why don’t our brains explode when we watch movies?”  The short answer is that even though we evolved in a continuous physical world, that’s not how we see things.  We are constantly sampling bits and pieces of the world around us and putting them together to form a coherent picture of reality.  Intentionally or not, early filmmakers developed editing techniques that closely track with how we normally perceive the world. 

University of North Carolina professors Todd Berliner and Dale J. Cohen write that “the brain perceives spatial coherence when observing classically edited cinema because the perceptual system evolved to accept imperfect and disjointed visual information, to reconstruct the fragmented information into a model of the physical world, and to ignore gaps and discontinuities.”  This enables filmmakers to manipulate time and space without any sense of dislocation in telling their stories.  Thus, The Great Train Robbery was able seamlessly to take audiences on an extended journey by train and horseback within 12 minutes of actual running time and without leaving their seats.

The mental processes we use to track the action in a movie and to make sense of the real world are essentially the same.  In each case, memory plays a prominent role.  A person who is suffering from severe anterograde amnesia and who is unable to form new memories would be incapable of following a movie’s plot, because every scene would appear sui generis, unconnected to its vanished predecessor or to the scene that follows.  Such persons are also incapable of functioning in the real world, because for them time itself has effectively come to a standstill.

The term filmmakers use for techniques to move a story forward is “continuity.”  The physical world, of course, is presumably already continuous.  But the mind’s perceptual apparatus doesn’t see it that way.  The mind must stitch together its fragmentary perceptions into the facsimile of a continuous whole, much as a film editor would.  This enables us to wake up, get ready for the day ahead, eat breakfast, bring in the morning paper and go about our business each day without any sense of dislocation in time or space.  And yet, if we were to try to retrace our steps from moment to moment, we would quickly discover that our short-term memory has been emptied of everything that is not needed to keep things moving forward. 

There are no doubt sound evolutionary reasons for this.  Imagine if every moment of your life, no matter how fleeting or trivial, were engraved in memory forever.  You would be unable to recall anything without subjecting yourself to the mnemonic equivalent of Warhol’s Empire.  The effect would be as paralyzing as being unable to form new memories at all.  Your life would lose any sense of continuity because you would be stuck in an endless feedback loop, condemned to relive every moment in excruciating detail from start to finish every time you remembered it.  

Our sense of continuity comes from being able to remember what happened before now but then of forgetting the details and allowing events to recede into the past.  This may account for the apparent foreshortening of perceived time as we grow older and accumulate more experience.  Time seems to speed up, and distant events appear closer than they actually are — a psychological phenomenon called “telescoping,” much like objects seen through a telescope.  Of course, time doesn’t actually have a spatial dimension, so you can’t really call a past event “distant.”  If you retrace a journey of 70 miles, there is no getting around the distance you would have to travel.  But if you retrace a journey of 70 years, you would find you have forgotten most of it.  Perhaps this is why we come to think life is so short.  The mechanisms that control our sense of continuity have edited out all the boring bits.


