mellowtigger: (changed priorities)

I still want to write up some definitions of capitalism and neoliberalism, mention alternatives, and generally just think about what we're doing to ourselves by choice. First, though, I feel it's necessary to draw attention specifically to the central lie of the free market economy: the invisible hand of the market will produce the best outcomes for the system.

It sounds like a brilliant idea inspired by the principles of evolution, namely the self-organization of parts into a complex whole without central coordination. It's true that competition can encourage creativity, but only insofar as any necessity does. That necessity can be imposed by the harsh physics of the universe or by a reasoned decision to accept certain limitations. The claim that unfettered competition is superior is easily exposed as hypocrisy. Consider this simple question:

If a sports game is so good in its current form, would it be even better to remove the officials, so that each team and each player could freely choose their own rules?
A related question is also interesting: if sports really would be better off without officials, why does English have so many names for them? Why the emotional investment in rules and enforcement within sports but nowhere else?

Capitalism and neoliberalism simply do not care about the cost to the environment or to the human population, as long as wealth is extracted and collected first for a few key individuals. We see this effect in practice with predictable bank failures, pollution scandals, falling real wages, and the ongoing biosphere crisis. Now, we even have PFAS "forever chemicals" in toilet paper. Yes, capitalism has managed to poison toilet paper. Profit is the sole driving force of our current economic system, for as long as it can be maintained, even though it's obvious to everyone (since 1993, at least) that it can't be maintained forever.

Which world do we want to live in? World 1, in which corporations and people work to extract profit from everyone, everything, and everywhere at all times? World 2, in which corporations and people work to continue their existence within constraints of sustainability and ethics? It's simply a choice, and it always has been. Will we choose to add constraints to make the game more enjoyable? Which of those two systems do we want to optimize for efficiency?

Choose. May you live in the world that your efforts create. Whether my wish for you is a blessing or a curse probably depends on your choice.

people are not Pokemon

2023-Mar-14, Tuesday 11:39 am
mellowtigger: (crazy)

As a rule: I don't remember names. The verbalizations, the spoken syllables, are not a part of my experience of another person, so they get lost as extraneous information. The same applies to the written form of the names. These data are not part of my subjective visual, auditory, tactile, or olfactory experience of the other person... so they eventually get lost. Sometimes, the erasure is practically immediate.

At a job 20 years ago, I learned that in order to send emails to my coworkers with whom I had already worked for years, I needed to stand up from my cubicle, look down the hallway at the nameplates on the cubicle walls, identify their name, then sit down again and address my email to them. At my most recent job, I had supervisors who lived in other states, so I couldn't use the nameplate trick. Instead, I searched my email folders for topics that I knew we had discussed (a subjective experience of mine that was associated with the person), so I could retrieve their name and begin a new email conversation.

Once or twice as an adult, I couldn't remember my mother's name, although I'm sure it's the name I've forgotten less often than any other. I eventually retrieved the name (which I didn't associate with the person) by recalling a very old memory (which I did associate with the person) of me at a very young age learning to spell my mother's name. I've been told that I learned to speak and read at a young age. I don't remember that age, but I do remember a mnemonic device that I was proud of creating at the time. I already understood how to speak the name "Carole", but I had difficulty with the spelling because it was an unreasonable combination of letters. I knew how to spell "car", so I created an inappropriate 3-syllable pronunciation to help me remember: /kar-OH-lee/. By remembering the 3-syllable /kar-OH-lee/ word, I then knew to assemble the unreasonable letters 'c-a-r-o-l-e' to produce the 2-syllable name "Carole".

Without going into such details, I usually explain this forgetfulness to others by stating simply that people are not Pokemon. I can remember the name of Pikachu, because every verbal utterance of that character is of the form /PEE-kuh-CHOO/. In essence, the name becomes part of my auditory experience of them. People, however, don't do that... as a rule. Many years ago, someone tried to popularize the technique of repeating a person's name many times during conversation with them, as a way to overcome exactly this limitation. I once experienced someone stating my name at me repeatedly, and I thought they were being peculiar until I remembered that piece of advice and realized what they were trying to accomplish.

I'm not the only one. This article (free archive copy) provides a solid explanation:

"Studies like this one, from the Quarterly Journal of Experimental Psychology, suggest that we’re better at remembering names than faces. In my case, the opposite is true. I’ll recognize a face, but their name escapes me. It turns out that one of the reasons is I’m not giving my brain a chance to process the information.

“The hippocampus is key to our ability to take two things that are not associated in our minds and put them together,” says Bradley Lega, associate professor of neurological surgery at UT Southwestern/Texas Health Resources in Dallas. When you meet someone whose name and face aren’t previously associated in your mind, your hippocampus plays an important role in putting these things together into a single memory. That gives you the ability to know how to address the person. The good news: Familiar names no longer depend on your hippocampus."

Bingo. Hippocampus irregularities are a known feature of autism, although how this brain difference affects behavioral traits within individual persons seems not to be clear cut. I'm certain this effect plays a role in my current dislike of shared-pronoun social protocols in English. Individually-assigned pronouns are now pro-names instead of pro-nouns, and as such will assuredly be forgotten. I'm still self-examining to tease apart the factors involved here, but the forecast doesn't look good for my future social experiences, given this new proname trend. I'd much rather share gender identity information, if that's really the goal. And I want to add a universal pronoun so that people like me can still participate socially without being rude.

mellowtigger: (the more you know)

English is weird, as everyone knows. There are, however, practical rules about the sounds that humans can create.

The official linguistic rules that we learn as students in school are not absolute. My favorite broken rule is the one about using the articles "a" versus "an" before the next word. We always use "an" before a word that begins with a vowel letter, right? Wrong. We use "an" before a word that begins with a vowel sound. The best example is the simple (if nonsensical) phrase: a /y/unicorn with an umbrella. In this case, there is a y-consonant sound before the leading vowel in unicorn. To separate consecutive vowel sounds, umbrella needs a consonant prefix provided by the article "an", but unicorn already leads with a consonant sound, albeit a weak one. The mechanics of moving the tongue and throat through consecutive vowel sounds demand the brief "n" sound that the article "an" provides.
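The sound-based rule can even be sketched in code. This is a toy heuristic of my own, not a real linguistics tool: the exception lists below are illustrative placeholders only, and a serious version would consult a pronunciation dictionary such as CMUdict.

```python
# Toy heuristic: choose "a" or "an" by approximating the first *sound*
# of the next word, not its first letter. The exception lists are
# illustrative placeholders, not a complete inventory of English.

VOWEL_LETTERS = set("aeiou")

# Vowel letter, but consonant sound (/y/ or /w/), so they take "a".
CONSONANT_SOUND = {"unicorn", "user", "one", "european", "ukulele"}

# Consonant letter, but vowel sound (silent "h"), so they take "an".
VOWEL_SOUND = {"hour", "honest", "heir"}

def article(word: str) -> str:
    """Pick the article by first sound, falling back to first letter."""
    w = word.lower()
    if w in CONSONANT_SOUND:
        return "a"
    if w in VOWEL_SOUND:
        return "an"
    return "an" if w[0] in VOWEL_LETTERS else "a"

print(article("unicorn"), "unicorn")    # a unicorn
print(article("umbrella"), "umbrella")  # an umbrella
```

Spelling-only rules fail on exactly these exception words, which is the point: the ear, not the eye, decides.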

This 6-minute video is excellent at teaching these biological/physical rules of vocalization. It also reveals the sounds that could exist in human languages but are nowhere to be found (yet).

So... why doesn't English explicitly encode that "y" letter spelling into words that pronounce it? (For example, "a yunicorn with an umbrella".) As I keep repeating, English is weird.

mellowtigger: (flameproof)

Words hold power. There's no argument against it. We've known this truth for ages, succinctly expressed by Edward Bulwer-Lytton in England in 1839: "The pen is mightier than the sword." How we use that power is certainly an important consideration.

Read where I use 4 words of power...

1) 'nigger' is the same as 'faggot'

Besides the same consonant-vowel pattern, these two words share a common problem. At a gay male social event several years ago, someone showed up and said, "Hey, faggots, how's it going?" That greeting flew like a lead brick, as you'd expect. There may be situations in which 'faggot' can be used deftly to illustrate a clever point, but it's useful that way only because it still holds its negative connotation, verbally held at bay through some artful linguistic construction. There is no automatic "free pass" to use the word just because the speaker is gay. It requires interpretation to find a positive use of it in any context.

Culture is now telling us that 'nigger' has exactly that bizarre distinction. Some people can use it safely, and some people never can, or so we're told. I think it's a big lie. I claim here that both words work in exactly the same fashion within human brains. These two words are always negative, and only creative use can find a clever positive point to make with them.

As a white American, I can't provide evidence for my assertion myself, but black Americans could. What we need are brain scans of black Americans listening to audio recordings of different people speaking this word of power. Some recordings should have racial clues within the voice pattern, and some should not. Some recordings should be of speakers unknown to the listener, and some should be familiar and trusted voices. Maybe even throw in some AI-constructed voices and some actual song recordings. I expect a "danger!" signal to arise in the listener's brain scan every single time that word is presented. Perhaps that signal is short-lived after a subjective evaluation of the speaker's intent, but that delayed amelioration is a separate issue here.

Some claim that 'nigger' is NOT dangerous when black people say it, but I say it is, because it works exactly the same way in our receiving brains as 'faggot'. Gatekeeping of words is a process that never did, and never will, reclaim a word's power. Or so I declare, prior to brain scans as proof.

Brain scans would reveal the truth of the matter. Either way the results trend, we learn something very important. Either there IS a neural network in the brain that determines linguistic content solely via speaker context (which is something new that I haven't heard proposed before), or the word IS harmful in any context and is subconsciously damaging black listeners who hear the word frequently, particularly in music lyrics. We will benefit by learning more. We need universities somewhere to perform these tests.

2) 'blacklist' and 'whitelist' have nothing to do with skin color

All physical surfaces reflect some color of light when illuminated. It's very unfortunate that any color has achieved social relevance. For humans, those colors primarily are black and white, these being associated (inaccurately) with skin colors. Should we reclaim or eliminate words that use these colors?

This black-versus-white symbolism can be traced backwards in 4 obvious phases.

  1. Overt racism is still here, using skin color as a proxy for social bad-versus-good. It remains powerful symbolism today, because the U.S. Civil War never truly ended. We still carry that same cultural baggage.
  2. Black and white hat symbolism in old television shows was widespread. It was partially a visual shortcut, allowing easy identification of people in distant views using a monochrome medium that lacked detailed resolution. It fell out of style because modern images can provide identification without those color-coded cues. I argue, however, that this metaphor was just a shortcut to convey an older truth.
  3. Black and white magic are an older version of this principle. The dark forces should be feared, while the bright forces should be welcomed. Bad versus good. I argue again, however, that this metaphor is just a shortcut to convey an even older truth.
  4. Darkness versus light. This is the core of the matter. No amount of linguistic manipulation will make humans (as a whole) stop being afraid of the dark. This is our truth. Our ancient mammal ancestors might have been crepuscular or even nocturnal, but that doesn't negate the worry of hidden predators threatening us from the shadows. We welcome the light of knowledge as it dispels the threat in the darkness.

I think that blacklisting and whitelisting hark back to this oldest and most meaningful metaphor. Some interactions are bad and should be discouraged (blacklisted), while others are good and should be encouraged (whitelisted). I think that universally forbidding words of color that might refer to human skin tones (black, white, yellow, brown, red) would be disastrous for human psychology in the long term. I recommend reclaiming these words for their old powers, which are rooted in human experience and the natural environment, not in superficial similarities to stupid human social conflicts which are hopefully temporary.

So there you have my 2 unpopular opinions. I'm sticking to them, at least until someone can provide hard evidence against them and prove where the greater harm occurs. I'll keep ostracizing or reclaiming these words differently than the mainstream culture dictates.

mellowtigger: (clock spiral)
I've called myself old since age 50. I figure "half a century" is plenty of years for that description. I'm now 55.

Other people think differently, though, so I've been pondering what objective measure we could use more definitively than that easy-to-identify number. It's more complicated than it sounds. It turns out there are different plausible ways to measure it.
  1. Age 45. The typical human begins losing about 1% of muscle mass per year around age 30, a convenient marker for "top of the hill" metabolism. Allow roughly 15 years of adulthood beforehand (following puberty), then add 15 years after that peak, and you've got age 45 as the other terminus for the range of adulthood.
  2. Age 50. "Half a century". I still think it sounds as good as any number.
  3. Age 65. This is bureaucratic "retirement age" in the USA, qualifying for Social Security benefits.
  4. 2% mortality chance per year. This standard is probably the most accurate, but it's also where it starts getting complicated.
This chart and story provide a good explanation for this standard and how it changes over time. The article provides a similar USA chart for females.
[chart: age at which males reach 1%, 2%, and 4% mortality, from CBS News 2017 June 29]
If your chance of dying within the next year is 2 percent or more, Shoven suggests you might be considered "old." The above chart shows that the threshold age for being considered old for men increased from about 55 in the 1920s to 70 today.
- https://www.cbsnews.com/news/what-age-is-considered-old-nowadays/, Steve Vernon, 2016 June 29
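The 2% rule itself is simple arithmetic: scan an annual mortality table for the first age at which the chance of dying within the year reaches the threshold. A minimal sketch, using made-up sample rates rather than real actuarial data:

```python
# Find the first age whose annual mortality rate meets a threshold.
# The sample rates below are invented for illustration; real numbers
# come from actuarial life tables.

def old_age_threshold(mortality_by_age, threshold=0.02):
    """Return the first age with annual mortality >= threshold, else None."""
    for age in sorted(mortality_by_age):
        if mortality_by_age[age] >= threshold:
            return age
    return None

# Hypothetical probabilities of dying within one year, by age.
sample_rates = {50: 0.005, 55: 0.008, 60: 0.011, 65: 0.015,
                70: 0.021, 75: 0.034, 80: 0.058}

print(old_age_threshold(sample_rates))  # 70, with these invented numbers
```

By this standard, "old" shifts whenever the table shifts, which is exactly why the threshold age for men climbed from about 55 in the 1920s to 70 today.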

So there was a time in USA history, a century ago, when my current age would qualify as "old". But if "old" is situational, depending on how fast everyone around you is dying, then there are other considerations in play too.
  1. What about the effects of systemic racism and plutocracy? Pollution is higher in the parts of towns with more black residents. Access to healthcare is lower for poor people. Healthcare providers respond worse in both cases, and worse still if you're female. The stress of racism and/or poverty increases health problems. The list goes on.

    Luckily, the USA has a government program designed to check this kind of stuff at a fine-grained geographic level, the U.S. Small-area Life Expectancy Estimates Project, USALEEP. So I checked the CDC's visual map for life expectancy in my census tract. Sure enough, total life expectancy is markedly lower than in the rest of my county, state, or country. Here in my little corner of the warzone, we are in the lowest quintile (the bottom 20%).
    [chart: life expectancy in census tract 1257 of Hennepin County, MN; state life expectancy 81.0, tract life expectancy 74.5]

  2. And what about the last epidemic? I often oversimplify and state that "half of my demographic died", referring to gay men who were out in the 1980s. The details, while more accurate, are still just as horrifying.
    Life expectancy at age 20 for gay and bisexual men ranged from 34.0 years to 46.3 years for the 3% and 9% scenarios respectively. These were all lower than the 54.3 year life expectancy at age 20 for all men.
    - https://pubmed.ncbi.nlm.nih.gov/9222793/, "Modelling the impact of HIV disease on mortality in gay and bisexual men" (in Canada)

    In other words, I passed the 2% mark into old age a very long time ago. Maybe even in my early 20s, when I already knew people around me who were dying. And if we're going with socially-defined seniority, then here's an article on gay authors teaching us how to "age gracefully", presumably from people who have done it already. They range in chronological age from 54 to 82. They're the elders we have left from the last epidemic.

  3. And autistics don't fare any better. We are so much more likely to die from bodily injuries that life expectancy can be as low as 36 years! Seriously, it's that bad.

    One study, published in the American Journal of Public Health in April 2017, finds the life expectancy in the United States of those with ASD to be 36 years old as compared to 72 years old for the general population. They note that those with ASD are 40 times more likely to die from various injuries. About 28 percent of those with ASD die of an injury. Most of these are suffocation, asphyxiation, and drowning. The risk of drowning peaks at about 5 to 7 years old. ...Those with ASD without a learning disability had an average age of death at about 58 years.
    - https://www.psychologytoday.com/gb/blog/caring-autism/201810/early-death-in-those-autism-spectrum-disorder

    I noted a decade ago that I try to avoid power tools. This is why. I'm a klutz, and I know it, and I don't want to injure myself any more than the usual range of cuts, bruises, and scrapes would entail. Several years ago, I had a doctor x-ray my foot, because I seriously had no idea how I got the seemingly bruised toe that wouldn't heal quickly. At least he confirmed that it wasn't broken.

So, apparently, "old" is relative, but I qualified decades ago if we use the 2% rule. There just aren't as many people like me left in the world as there should be.
