Tuesday, November 05, 2024

The Ethics of Belief

An excerpt from “The Ethics of Belief,” by William Kingdon Clifford—an essay on epistemology, rationality, and the care we should apply when forming our beliefs:

A bad action is always bad at the time when it is done, no matter what happens afterwards. Every time we let ourselves believe for unworthy reasons, we weaken our powers of self-control, of doubting, of judicially and fairly weighing evidence. We all suffer severely enough from the maintenance and support of false beliefs and the fatally wrong actions which they lead to, and the evil born when one such belief is entertained is great and wide. But a greater and wider evil arises when the credulous character is maintained and supported, when a habit of believing for unworthy reasons is fostered and made permanent. If I steal money from any person, there may be no harm done by the mere transfer of possession; he may not feel the loss, or it may prevent him from using the money badly. But I cannot help doing this great wrong towards Man, that I make myself dishonest. What hurts society is not that it should lose its property, but that it should become a den of thieves; for then it must cease to be society. This is why we ought not to do evil that good may come; for at any rate this great evil has come, that we have done evil and are made wicked thereby. In like manner, if I let myself believe anything on insufficient evidence, there may be no great harm done by the mere belief; it may be true after all, or I may never have occasion to exhibit it in outward acts. But I cannot help doing this great wrong towards Man, that I make myself credulous. The danger to society is not merely that it should believe wrong things, though that is great enough; but that it should become credulous, and lose the habit of testing things and inquiring into them; for then it must sink back into savagery.

The harm which is done by credulity in a man is not confined to the fostering of a credulous character in others, and consequent support of false beliefs. Habitual want of care about what I believe leads to habitual want of care in others about the truth of what is told to me. Men speak the truth to one another when each reveres the truth in his own mind and in the other’s mind; but how shall my friend revere the truth in my mind when I myself am careless about it, when I believe things because I want to believe them, and because they are comforting and pleasant? Will he not learn to cry, “Peace,” to me, when there is no peace? By such a course I shall surround myself with a thick atmosphere of falsehood and fraud, and in that I must live. It may matter little to me, in my cloud-castle of sweet illusions and darling lies; but it matters much to Man that I have made my neighbours ready to deceive. The credulous man is father to the liar and the cheat; he lives in the bosom of this his family, and it is no marvel if he should become even as they are. So closely are our duties knit together, that whoso shall keep the whole law, and yet offend in one point, he is guilty of all.

To sum up: it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence.

If a man, holding a belief which he was taught in childhood or persuaded of afterwards, keeps down and pushes away any doubts which arise about it in his mind, purposely avoids the reading of books and the company of men that call in question or discuss it, and regards as impious those questions which cannot easily be asked without disturbing it—the life of that man is one long sin against mankind.

If this judgment seems harsh when applied to those simple souls who have never known better, who have been brought up from the cradle with a horror of doubt, and taught that their eternal welfare depends on what they believe, then it leads to the very serious question, Who hath made Israel to sin?

It may be permitted me to fortify this judgment with the sentence of Milton – “A man may be a heretic in the truth; and if he believe things only because his pastor says so, or the assembly so determine, without knowing other reason, though his belief be true, yet the very truth he holds becomes his heresy.”

And with this famous aphorism of Coleridge – “He who begins by loving Christianity better than Truth, will proceed by loving his own sect or Church better than Christianity, and end in loving himself better than all.”

Inquiry into the evidence of a doctrine is not to be made once for all, and then taken as finally settled. It is never lawful to stifle a doubt; for either it can be honestly answered by means of the inquiry already made, or else it proves that the inquiry was not complete.

“But,” says one, “I am a busy man; I have no time for the long course of study which would be necessary to make me in any degree a competent judge of certain questions, or even able to understand the nature of the arguments.” Then he should have no time to believe.

Saturday, November 02, 2024

Origins of Life

Today I learned the abiotic origin of organic compounds was established in the early 1800s, but the experiment wasn't actually intended to put forth a hypothesis for "abiogenesis"—or how life began on Earth.

The question of abiogenesis is the following one: how does so-called inanimate, non-living matter become animate, living matter? 

Friedrich Wöhler's seminal contributions to organic chemistry would eventually lead to further exploration of hypotheses about abiogenesis. Wöhler reacted two inorganic compounds—silver cyanate and ammonium chloride—to synthesize urea, an organic compound previously believed to be produced only by living things carrying a "life force."

After Wöhler's experiment, a large number of similar organic chemistry experiments followed throughout the 19th century—and later those experiments would be followed by the Miller-Urey experiment.

The Miller experiment explored an origin-of-life scenario—simulating possible early conditions on Earth. Miller combined the gases methane (CH4), ammonia (NH3), and hydrogen (H2) with water, exposed the mixture to electric discharges, and produced various amino acids—the building blocks of proteins. The related hypothesis is known as the prebiotic or primordial soup hypothesis.

But is the “prebiotic soup” theory a reasonable explanation for the emergence of life? Contemporary geoscientists tend to doubt that the primitive atmosphere had the highly reducing composition used by Miller in 1953. Many have suggested that the organic compounds needed for the origin of life may have originated from extraterrestrial sources such as meteorites. However, there is evidence that amino acids and other biochemical monomers found in meteorites were synthesized in parent bodies by reactions similar to those in the Miller experiment. Localized reducing environments may have existed on primitive Earth, especially near volcanic plumes, where electric discharges may have driven prebiotic synthesis. In the early 1950s, several groups were attempting organic synthesis under primitive conditions. But it was the Miller experiment, placed in the Darwinian perspective provided by Oparin’s ideas and deeply rooted in the 19th-century tradition of synthetic organic chemistry, that almost overnight transformed the study of the origin of life into a respectable field of inquiry. (via Prebiotic Soup—Revisiting the Miller Experiment)

The question of whether Earth's early atmospheric conditions differed from those in the Miller experiment remains up for debate. The synthesis, however, continues to be a pioneering experiment in the study of abiogenesis, since it demonstrated that inorganic compounds can form simple—and then more complex—organic compounds under conditions potentially like those on prebiotic Earth following asteroid impacts.

The hypotheses involving volcanic plumes and hydrothermal vents aren't the only abiogenesis hypotheses, of course. But they are particularly compelling ones, since one of the earliest forms of life on Earth was discovered in a ~3.42-billion-year-old subseafloor hydrothermal environment.

However, our last universal common ancestor (LUCA) is thought to have lived as early as 4.2 billion years ago.

Friday, November 01, 2024

On Forgetting

I saw a post on Twitter today. Someone asked, “There are a number of techniques to help one recall and remember anything. From a neuroscience perspective, is it possible to intentionally learn to forget?”

Questions and thoughts like this are amusing. Someone appends the word "neuroscience" to a question or remark that is, in a way, about neuroscience. But it can also be thought of in simpler, broader terms.

The term neuroscience sometimes evokes ideas of complex explanations—often for totally mundane things. The mere use of the word in a discussion can make an argument sound more authoritative than it actually is.

What I'm trying to say is that, more often than we might expect, neuroscience is about… other things. (Unless you work in a lab somewhere.)

For example, take the question "How can I use the power of neuroscience to forget bad memories?" Practical, useful answers probably have very little to do with neuroscience itself.

I believe it’s both more helpful and accurate to think of “forgetting” as one's brain recontextualizing and reshaping itself. 

We find ourselves on the outside of an old context we were once in, and now in a new context we've yet to become fully aware of.

And I can't shake the intuition that a lot of what happens during such a change relies on messy meaning-making processes, rather than formulaic things we can pin down exactly with computational models.

I’m also not convinced that “forgetting” is a completely achievable objective in the first place: being able to literally erase a memory. Episodic and long-term memories tend to stick around. This is a feature of the mind, not a bug.

For example, our brain remembers that time we burned our hand on the stove, so in the future, we don't do that again. Initially, we wince. But later, our brain molds itself around this memory. Without the ability to store information over large time scales, neither language, relationships, nor our own personal identities would develop.

Your memories probably don't change much. It’s your perception that rotates as you learn new things about your memories. There’s a lot of literature to support this hypothesis (brain plasticity, constructivism, memory consolidation, etc.).

But our brains are also imperfect—prone to distortion, illusions, and biases. Sometimes we wish we could remember things more clearly. And other times, we wish we could forget them.

In the spirit of Hebbian theory—"neurons that fire together, wire together"—it seems to me that the only way to even come close to “forgetting” a memory is to replace it with a more powerful one.

In other words, you think you want to forget, but what you really want is to think new thoughts. And thinking new thoughts requires noticing new things, seeing old things in new ways, or visiting new places—whether figuratively or literally.

The primary paradox of the past is this: the past has a grip on us, not because of the past itself, but because we're unable to conceptualize things that haven't occurred.

In the end, I tend to believe that forgetting is more about changing how we understand and relate to our memories than anything to do with forgetting itself. 

Friday, September 13, 2024

Stuff

Lately, I've been tightening my own personal feedback loops. And working out. I've also been thinking about how sometimes it can be positive to forget things.

Wednesday, July 03, 2024

A Taxonomy of Communicative Modes

While lurking on the internet, I stumbled across a post from 2019 by @literalbanana on Twitter/X outlining different modes of communication.

Sunday, May 26, 2024

Using Reflection in Go

Have you ever been writing Go and needed to quickly find all the possible methods or fields you can use with a particular function?

Tuesday, April 09, 2024

Knowledge vs Information

One way to conceptualize the difference between knowledge and information is this: knowledge requires some amount of computational work to arrive at, while mere information lacks this property.
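A toy illustration of the asymmetry, under one possible reading of "computational difficulty": recovering the prime factors of a number takes search, while checking a proposed answer is a single multiplication. (The example and the `factor` helper are mine, not part of the original note.)

```go
package main

import "fmt"

// factor finds the smallest nontrivial factorization of n by
// trial division -- the "work" needed to turn the bare number
// (information) into its factors (knowledge).
func factor(n int) (int, int) {
	for p := 2; p*p <= n; p++ {
		if n%p == 0 {
			return p, n / p
		}
	}
	return 1, n // n is prime (or < 4)
}

func main() {
	n := 9991
	p, q := factor(n)           // arriving at the answer: a search
	fmt.Println(p, q, p*q == n) // checking it: one multiplication
	// → 97 103 true
}
```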

Using Python To Access archive.today, July 2025

It seems like a lot of the previous software wrappers to interact with archive.today (and archive.is, archive.ph, etc) via the command-line ...