Rock, apparently, is not a synonym for stone. Now, I know a lot of words, and the definitions of most of them, but I went a little over three decades before I encountered this geological distinction in an anecdote that Barry Lopez shares in his book Horizon—a distinction that he, too, missed for years. A rock is still what you would expect. The hard stuff made of minerals. A stone, however, is a rock that has been modified in some way for human use. Crushed for gravel, shaped for a cornerstone, knapped into an arrowhead. I guess that means that rocks belong to geologists, and stones to archaeologists*.

*I also just learned how to spell the word archaeologist. Who knew there were so many “a”’s?

Something about it bugged me, though. Maybe it was my annoyance at being so fundamentally mistaken. The notion that a human picking up a rock, smashing off a shear face, and slicing through deer hide can change the definition of the thing just didn’t sit right. Wasn’t it still made of the same stuff, just shaped a little differently? For sure, it would be called a tool, but a stone? It seemed like a sharp, useful rock to me.

Then it dawned on me: what we’re talking about isn’t the thing, but the category of it. There are many subcategories under the “rock” heading—granite, gneiss, limestone, etc.—that refer to more specific properties. Stone would be separate from all of them, another subcategory that refers to a rock modified for human use. Wondering if a fragment is a rock or a stone is an error of logical type. It’s akin to wondering if this crayon is “purple,” or “color”. The real question should be: is this rock also a stone? All stones are rocks. Not all rocks are stones.
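
For the programmers in the audience, this is exactly the relationship a type hierarchy encodes. Here’s a minimal sketch, assuming Python classes stand in for the categories (the class names are mine, purely illustrative, not geology’s actual taxonomy):

    # Illustrative only: Python classes standing in for the categories.
    class Rock:
        """Any hard stuff made of minerals."""

    class Stone(Rock):
        """A rock modified in some way for human use."""

    arrowhead = Stone()
    boulder = Rock()

    # Membership in the subcategory implies membership in the category...
    assert isinstance(arrowhead, Rock)       # all stones are rocks
    # ...but not the reverse.
    assert not isinstance(boulder, Stone)    # not all rocks are stones

Asking “is this a rock or a stone?” is like asking whether the arrowhead is a Stone or a Rock; the isinstance checks show it’s both, because the categories are nested, not parallel.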

Human use doesn’t change the fundamental nature of the rock; it only provides additional context, allowing us to glean more information. Let’s say an archaeologist finds a rock at a dig site. Knowing that it’s flint is useful information in the context of known nearby flint deposits. But identifying it as a stone, a rock used by humans as a tool, adds another rich context, from which much more can be deduced. An arrowhead might tell the style of tool, which can give an approximate date when compared to other finds. It shows the skill of the maker, and the likely game. If the nearest flint deposits are far away, it shows something of the user’s range and migration patterns, possibly contact with other settlements. Its presence also betrays human hunting grounds, and possibly a habitation. And it means that there may be other artifacts nearby. All of this and more can come not from any difference in the object, which is the message, but from placing the message in an additional context.

But who cares, other than nitpicky geologists? As far as this particular example is concerned, it doesn’t matter. Except that it provides an easy-to-grasp doorknob into a discussion of context mistakes that lead to much deeper problems. Errors of logical type are common fallacies that pervade many people’s thinking. They compare members of a category to the category, as if both exist on the same level and we have to make a binary choice between them. Or they assign parts, like a particular piece of flint, to a broader category like “rock”, and miss all the information provided by a correct assignment to an additional intermediate category, like “stone”. I intend to argue that this type of error, which everyone including myself has committed, can, when made into a habit, lead down a very dark road to things like madness, or worse: bureaucratic intervention.

The particulars of how these mistakes rear their ugly heads are Hydra-like. For simplicity, I’ll lump them into three categories and give examples to illustrate each.

1. Missing a subcategory

The first mistake is when we have a message (an object or an event, for example), and we’ve identified it as part of a larger category, or existing within a larger context, but we fail to recognize the existence of a subcategory that significantly modifies the information we get from that message. This is what would have happened if our archaeologist had misidentified the piece of flint as just a rock, instead of as both a rock and a stone cutting tool.

It becomes a problem when missing out on that information leads us to take actions that are potentially harmful, when having the information would have led to actions that decrease the likelihood of harm. For example, a couple of recent storms have turned the Houston and Winnie areas into inland waterways. Those storms could be seen as particulars that are part of a larger category, “harmful weather events.” We have no control over the weather, but that categorization alone might lead to actions like improving drainage on one hand, and stocking up on emergency supplies and making plans on the other.

But what if we also assigned it to a subcategory, “harmful weather events exacerbated by human action”? While we can’t control the weather, we can certainly (ok, theoretically) control ourselves. When you have a massive sprawling slab of concrete like the Houston metropolitan area, it’s hard for huge amounts of water to soak into the ground quickly enough. In fact, concrete roads and barriers make great artificial rivers that concentrate the water in certain areas. And urban sprawl pushes residents and developers to ever-more-marginal land located deeper in floodplains. Maybe people can’t find anything affordable in a 50-year floodplain, so they go for something in the 25-year range, or even the 10. Then they build up a lot of dirt work to elevate the new house they don’t want flooded, which pushes the water toward the nearest unelevated neighbor.

Failure to recognize this—or recognition and failure to act—would mean that rare floods will become less rare and cause more damage as people just pay higher premiums for flood insurance and hope for good weather. But that additional context could change the way both private citizens and government agencies prepare. Maybe the floodplain maps need an update after a recent round of development. Maybe people will avoid buying in low areas, or just move to higher ones. And the city can place certain limits on new developments to avoid compounding the problem.

I don’t actually know what Houston is doing, or the specifics about the situation. This is just an example of how a modifying category in the middle changes information, and thus changes action.

Another example is the recent problem with honeybee die-offs. A nationwide collapse of pollinators is only bad news for people who like eating food, and those people might file the events under the category “threat to food security.” I’m sure if people connected the events to that category, they would take certain actions to shore up honeybee populations. But if we were able to also include a middle subcategory, “pesticide-induced die-off,” and realized that a large number of the bees were killed by neonicotinoids, the actions we took to avoid famine might include not spraying the stuff everywhere, and requiring manufacturers of such things to prove harmlessness before going to market, or to suffer massive, rolling-head penalties if it proved otherwise.

Those two examples involve seeing something unpleasant and failing to see an intervening category that made it so. You can also have the reverse, in which we see something good, something we want more of, occurring more or less in a natural state, and decide to “make some more of it” by intervening with policies, organization, etc. Imagine two small kids playing together, having a great time, when an adult notices and decides to help them have an even better time by directing their play, until they get bored and wander their separate ways.

2. Comparing a part to the whole of which it’s a member

Is that crayon “purple,” or “color”? Is it a rock, or a stone? Comparing a category and its member, or a system and its part, as if they operate on the same playing field leads to confusion. It would be wrong to say that since this stone arrowhead is knapped to a razor-sharp edge, I should also expect any other rock to have a razor-sharp edge. There may be properties of the arrowhead that are common to all rocks, but there are also properties common only to knapped stone arrowheads, or only to arrowheads of the Poverty Point culture. That seems obvious, yet we do this all the time.

Consider sampling errors in polls or studies. Do the answers given by a few hundred people willing to speak to a pollster on the phone represent the entire voting population? Do the choices of a handful of middle-class American college students who needed extra cash for weed and pizza, and so signed up for a psychological study, tell us anything at all about human decision-making across all cultures and classes? Isolating variables and studying them is vital to the scientific process, and it works well in things like chemistry. But some systems literally can’t be modeled accurately with a model that’s any less complex than the system itself. Recall Jorge Luis Borges’ On Exactitude in Science, a story about a map as large as the territory itself, an idea Lewis Carroll also poked fun at. The systems that defy easy models tend to be complex and organic. We know a lot about the human body, but even something as thoroughly understood as chemistry gets murky when you try to understand it in that context. Well-studied drugs end up having unexpected consequences, because they don’t behave in the body as they would in isolation, because they don’t behave the same way in all bodies as they did in the sample subjects, or because their harmful effects are longer-term and more abstract.
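
To make the part-for-whole sampling error concrete, here’s a small sketch in Python. Every number in it is invented for illustration: a population split into two hypothetical subgroups with different opinions, and a poll that can only ever reach one of them.

    # All numbers invented; this only illustrates the shape of the error.
    import random

    random.seed(42)

    # Two subgroups with very different opinions (1 = support, 0 = oppose).
    answers_phone = [1] * 7000 + [0] * 3000   # people who talk to pollsters
    never_answer  = [1] * 2000 + [0] * 8000   # people who never pick up

    population = answers_phone + never_answer
    true_support = sum(population) / len(population)   # 0.45

    # The poll can only sample the subgroup that answers the phone.
    sample = random.sample(answers_phone, 500)
    polled_support = sum(sample) / len(sample)         # roughly 0.70

    print(f"true support:   {true_support:.2f}")
    print(f"polled support: {polled_support:.2f}")

The sample describes its own subcategory, people who answer pollsters, quite accurately; extrapolating it to the whole population is the part-for-whole mistake.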

Those cases involve an error in extrapolating from a part to the whole, which means the people making them at least recognized that there were two different categories involved. Sometimes that distinction is missed altogether. Take leadership, for example. We generally assume that a good parent guiding her child is a helpful thing. So we look for individual leaders at all levels. As soon as you add more people, though, it isn’t individual leading individual. It’s individual leading group-of-which-she-is-a-member. A part leading a whole, a group mind which behaves differently than all the individuals in it. I think we could still agree that solid leadership from an individual has been demonstrated to benefit groups time and time again. The question is how far that remains true, and at which order of magnitude the leadership skills exist. Is it possible for an individual, or even a group of them, to lead a massive nation? At some point the complexity of the whole system exceeds the order of magnitude at which the leader can reasonably be expected to predict results (since you can’t model certain kinds of complexity with anything less than the system itself, and the human brain is not that whole system). That means good leaders of large populations aren’t much more than good guessers.

There are also different skills required in mentoring an individual, leading a fire team, marshaling hundreds of people to run an event, running a vast global organization, or ruling over a country. It’s reasonable to expect that “good leader” refers to a set of skills that apply only to specific orders of magnitude. A good parent of four boys won’t necessarily be a good military general. A good CEO won’t necessarily make a good 10th-grade business teacher—not only due to the different realms, but the size and interactions of the group of people being led.

We also assume that what’s good at one order will be good at the next. When a kid shows signs of healthy physical growth, we decide that growth is good. Can you apply that same principle to a growing human population? Different order. A person never runs the risk of growing to be 72 feet tall and 144,000 pounds. That would be inconvenient, and we may find that feeding him becomes a burden. Populations have no genetic cap to their size, only resource caps. Assuming that because growth is good for a person or a fruit tree it’s always good on a population order is a mistake that means the difference between diversified and sustainable on one hand, and sprawling and fragile on the other. Yet every time I hear news of population growth slowing down, let alone declining, it’s accompanied by weeping and gnashing of teeth in search of a solution.

Finally, to bring it back to this recent essay, we may interpret the unpleasant information received from a small failure to mean that risk (that is, information-seeking) is unpleasant, and thus avoid it. The class of behaviors called “exploration,” though, seeks information to inform future actions; it involves many things, both pleasant and unpleasant, that guide further actions and set up conditions to produce...well, more desirable conditions. If a rat stopped exploring the maze when the first box it came to contained a small shock instead of cheese, it would starve to death. Rats never do that. People do.
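
The rat’s situation maps onto what reinforcement learning calls the exploration/exploitation trade-off. Here’s a hedged sketch in Python, with every payoff invented: one agent quits exploring after its first shock, the other keeps visiting the box.

    # The rat's choice as a tiny one-box experiment; all payoffs invented.
    import random

    random.seed(1)

    def risky_box():
        """Mostly cheese (+1), occasionally a shock (-1)."""
        return -1 if random.random() < 0.2 else 1

    TRIALS = 100

    def give_up_agent():
        """Stops exploring forever after the first shock (reward 0 after)."""
        total, exploring = 0, True
        for _ in range(TRIALS):
            if exploring:
                reward = risky_box()
                total += reward
                if reward < 0:
                    exploring = False  # one bad outcome ends all exploration
        return total

    def explorer_agent():
        """Keeps visiting the box despite occasional shocks."""
        return sum(risky_box() for _ in range(TRIALS))

    print("gave up after first shock:", give_up_agent())
    print("kept exploring:           ", explorer_agent())

The box pays off 0.8*(+1) + 0.2*(-1) = +0.6 per visit on average, so treating the first shock as proof that exploration itself is bad forfeits nearly all the cheese.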

3. Missing a higher category

When farmers in pre-grocery-store human civilization harvested their crops, they didn’t look at the great bounty before them and assume that “endless prosperity has arrived,” with more food than anyone could ever eat, and that things would be easy now that they could just feast and reap—no more annoying sowing. They knew that this was, in fact, Fall: a season of harvest, followed by Winter, a season of rationing and letting fields lie fallow, which was followed by Spring, planting, and Summer, tending. In other words, they could see whole classes of events nested in larger cycles. Had they blown through all the wheat like a lottery winner, we wouldn’t be here, because they’d have committed the error of failing to recognize the existence of a higher category.

In ecology, a stage involving a certain community of life is called a sere, and seral succession is the process by which one stage gives way to the next. Imagine a wildfire annihilates a woodland. The first plants to return are the pioneers. Weeds shoot up and dominate. Soon they’re replaced by grasses and brush, and all the animals that prefer them. Then come smaller trees and slower-maturing vegetation, and eventually a new woodland. If you happened upon the meadow when it was just grass and low brush, and decided that this was the landscape of the area now and always, you’d have missed a higher category: the seral cycle. Organic communities, and things that behave organically, tend to move through such cycles. Yet it isn’t hard to find someone who’ll tell you that “the market always goes up,” or that when there are more people than resources, the next logical stage is “terraform Mars” instead of “contraction to a level well below actual carrying capacity, followed by modest recovery,” as happens with literally every other living thing.

We also miss the higher category when we pass the risk and the harm up the taxonomy chart. If a person fails ineptly but is insulated from harm by the family’s money, it’s the family (and the people who have to deal with that person’s next misguided ventures) that take the blow, unless they too can pass it up. When a business—or a whole lot of them, as in 2008—fails and is bailed out by taxpayers, the harm isn’t avoided. It’s kicked not just down the road but up to a higher order of magnitude where, once it has had time to catch up to that scale and hit again, it will cause an order of magnitude more misery than it would have if it had been limited to the firms involved.

Summary

Errors of logical type like the ones illustrated here are common and harmful. Being aware of—and sensitive to—context gives us the most reliable information on which to base our actions. Ignoring or misreading context confuses us and leads to poor decisions. That’s because what we’re doing is using maps to navigate a territory. Good maps lead us in a predictable fashion, even though they aren’t comprehensive by any means. Bad maps fail to correspond to the territory, and so lead us astray. In the next essay, I’ll discuss how consistently missing the context of information leads to consistently bad maps, which in turn lead to mental illness, and how context blindness might as well be a synonym for madness.
