Consumption and Creation

[Note to self: add this to the Quantitative Philosophy Index when it posts]

My personal life philosophy bases an individual’s value on the activities one engages in when in an autonomous state. More simply: what you do with your personal time quantifies your life’s meaningfulness. I don’t see the level of impact itself as the defining factor, since so few of us are ever granted the circumstances under which to achieve greatness, but that doesn’t preclude us from seeking a virtuous life, even if the tangible results are comparatively minor.

With that premise set: after a day of my daughter watching anime and binge-eating, I tried to explain, in some not-so-friendly terms, that she was being a completely self-indulgent and useless sack of loafing teenage flesh. In the aftermath of that conversation, however, I thought it more helpful to create some definitions. Here’s how I break them down:

All voluntary human activities fall into one of four categories:

  1. Active Creation (Cra): activities that require direct engagement and production.
  2. Active Consumption (Coa): activities that involve using someone else’s creation, but still require direct engagement.
  3. Passive Creation (Crp): activities that are either a secondary component of active creation, or prerequisites/maintenance activities to support active creation.
  4. Passive Consumption (Cop): activities that involve using someone else’s creation in a manner that is strictly self-indulgent.

These activities are not equal in value. Cra ranks highest, Coa and Crp sit in the middle, and Cop ranks lowest.

Creation/Consumption vs Active/Passive graph

As an example, washing dishes and doing some reading rank above watching TV all day, but below cooking dinner. Coa and Crp ultimately support Cra – without them, Cra couldn’t take place – while Cop remains generally nonconstructive outside some mental health benefits. Obviously these baselines require some interpretation: I’d consider reading a classic novel to be Coa, but reading a trashy romance novel Cop – one must be honest with oneself.

This is all fine for abstraction, but let’s quantify. What constitutes a day seized? At what point does one achieve virtue for the day? I’ll assign values:

Cra = 5

Coa = 3

Crp = 3

Cop = 1

This almost works as a Fibonacci sequence. Indeed, Coa and Crp could probably be split into 2- and 3-point sub-tiers, but I’ll keep it simpler for the sake of this exercise.

Virtue = Cra + Coa + Crp + Cop points, summed over all of the day’s activities.

Day’s value = amount of daily virtue.

As for a daily virtue benchmark, here are the highlights from a recent Saturday, which I feel was a notable example of one such virtuous day. I…

Made pizza, made my own cheesy bread, cleaned the kitchen x3, cleaned out the fireplace, started a fire, watched Fallout, took measurements and material inventory for needed house projects.

I’m sure there were more, but these are what I remember. This would come out to:

5+5+3+3+3+3+3+1+3 = 29

It was a busy day, so let’s round down to 25 to be more realistic with goals. A virtuous day off requires 25 points. As for a working day, let’s say 12 – half, rounded down.

Now math:

Cra=5,Coa=3,Crp=3,Cop=1

S:=Cra+Coa+Crp+Cop

O = day off

V = day is virtuous

V⟺(O∧S≥25)∨(¬O∧S≥12)
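The scoring and threshold check above can be sketched directly in code. Here’s a minimal Python sketch – the category labels per activity are my own encoding of the Saturday tally:

```python
# Point values per category, as assigned above.
POINTS = {"Cra": 5, "Coa": 3, "Crp": 3, "Cop": 1}

def day_score(activities):
    """S: sum the point value of each logged activity."""
    return sum(POINTS[category] for category in activities)

def is_virtuous(score, day_off):
    """V ⟺ (O ∧ S ≥ 25) ∨ (¬O ∧ S ≥ 12)"""
    return score >= (25 if day_off else 12)

# The Saturday tally: pizza, cheesy bread (Cra); kitchen x3, fireplace,
# fire, measurements (Crp); Fallout (Cop).
saturday = ["Cra", "Cra", "Crp", "Crp", "Crp", "Crp", "Crp", "Cop", "Crp"]
print(day_score(saturday))                      # 29
print(is_virtuous(day_score(saturday), True))   # True
```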

Not having a philosophy background, the kid seems unable to grasp the concepts of virtue and excellence. Maybe this could add context. If not, it’s a good overview and a reminder to myself for when I start to feel lazy, now that I’ve thought the concept through. Virtue is universally available. All we have to do is act towards it.

–Simon

The Decline of Restaurants: An Anecdotal Observation

[Note to self: add this to the Quantitative Philosophy Index when it posts]

Remember those times when eating at restaurants was fun? I had attributed this to a combination of not having to eat mom’s boiled vegetables and having no idea what a restaurant meal actually cost. Childhood, in essence, was the best time to eat out at restaurants.

But now, it’s usually disappointing. And there are so many more dining options out there than what was available to me as a kid! There has to be more to it.

So I sat down and compiled an arbitrary list. Here goes:

Given that the experience quality is defined by 5 variables:

  1. (A) Base cost of restaurant food
  2. (B) How much I’m expected to tip
  3. (C) How good I am at cooking
  4. (D) Novelty of eating at a restaurant
  5. (E) Perceived quality of restaurant food

Then:

D+E-(A+B+C) = Quality of the experience.

As these are mostly relative measures, attempts at quantification prove difficult. This approach also fails to represent why restaurants were fun before but suck now. No – a timeline representation is needed for this one:

Now I’ll point out some observations having thought back through this timeline:

  • The novelty of eating at a restaurant started high as a child, then declined in adulthood once I could make that choice any time I wanted. This trend continued until COVID lockdowns took the option away entirely; novelty peaked when places began to reopen, then dropped back to prior levels.
  • The perceived quality of restaurant food also started high in childhood, generally maintained its allure through adulthood, and seemed even better when it was less available during lockdowns, then drastically collapsed thereafter, following the industry’s maladaptation to post-COVID labor costs and everything those costs touch. American businesses never cut profits, so restaurants instead turned to lower-quality ingredients and even less-skilled labor.
  • Also, to further counter rising business costs, restaurants raised prices, and very quickly indeed.
  • Then, restaurants and the dining culture turned to collective guilt and overhauled tipping expectations. A tip based on a percentage of the meal’s cost already grows as menu prices rise to offset overhead, so in theory the workers would see a proportional increase in their compensation automatically. Yet now we’re expected to give them a greater percentage on top of that, out of our own pockets. I don’t need guilt added to my dining experience, nor an additional expense to further raise the final cost.
  • And all this might be tolerable if I didn’t know how to cook. But I do, and my standards are often higher.
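Scoring the formula era by era tells the same story. A quick Python sketch – every rating here is a made-up illustrative value on a 0–10 scale, not a measurement:

```python
# A: base cost, B: tipping pressure, C: my cooking skill,
# D: novelty, E: perceived food quality (all hypothetical 0-10 ratings).
eras = {
    "childhood":  dict(A=2, B=1, C=0, D=9, E=8),
    "adulthood":  dict(A=5, B=3, C=6, D=4, E=7),
    "lockdowns":  dict(A=5, B=3, C=7, D=8, E=8),
    "post-COVID": dict(A=8, B=6, C=8, D=4, E=3),
}

def experience_quality(v):
    """Quality of the experience = D + E - (A + B + C)."""
    return v["D"] + v["E"] - (v["A"] + v["B"] + v["C"])

for era, ratings in eras.items():
    print(era, experience_quality(ratings))
```

With these placeholder numbers, quality starts strongly positive in childhood and goes sharply negative post-COVID – the “industry failure point” in miniature.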

That said, here’s a final observation to further drive home the point: chronologically, all of these variables mostly intersected a couple of years back, at what I’ve visually represented as the “Approximate industry failure point”. This was the moment at which dining out became almost entirely non-viable for me.

Everyone will have their own version of the graph, and perhaps restaurants still make sense to some people. But unless either the quality and novelty of fine dining drastically increase, or costs go way down, I don’t see this industry as a cost-effective source of entertainment for the foreseeable future.

–Simon

Fingernail Growth Rate and Life Expectancy

“Wow Simon, this is such an enticing title for a blog post!”

“I know, right?!”

Okay, chill out. This is indeed mildly interesting, at least to me, because the thought never occurred to me to actually measure this. That is, until some time following the events of March 17, 2024:

Das blood!

A slipup washing knives, totally unrelated to this being St. Patrick’s Day, resulted in 3 stitches.

Ouch!

Here’s a better pic, 4 days later:

Twitches get stitches

The wound, being a clean incision brought to me by alcohol impairment and fine German steel, closed within 2 weeks. But as my fingernail grew out, I noticed that it had been damaged behind the cuticle. What started as a divot turned into a flaking nail as the damage neared the fingertip.

So much for being a hand model

Eventually though, the damage grew out and was clipped away. And unlike my toe – which slid under the bathroom door as I was exiting the shower some 30 years ago – the growth cells were not permanently rendered incapable of uniform nail growth. Huzzah! At long last, the injury is fully healed.

Except for that small split on the side, which might be permanent. But it’s not very noticeable and the skin doesn’t even have a perceptible scar, or more importantly, lasting numbness (I was worried about that for a while).

So how long did this injury take to completely heal? 181 days! Essentially two full seasons. So back to the original line of thought: what is my nails’ growth rate, and naturally – is that normal? Squander not an opportunity, for I have definitive empirical measurements based on when that crack grew out.

This was a difficult picture to take myself

It would appear, based on photos of the original injury’s location, that between March 17 and September 14, 15 millimeters’ worth of nail grew out. So if we apply some basic math, that’s…

~2.5mm per month, or…

~0.08mm per day.

Which seems like a long damn time to be catching that cracked nail on things. But is that normal?
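A quick sanity check on that arithmetic (Python, using the two dates from the photos):

```python
from datetime import date

# Healing span between the injury photo and the grown-out photo.
days = (date(2024, 9, 14) - date(2024, 3, 17)).days  # 181 days
growth_mm = 15                                       # nail grown out in that span

per_day = growth_mm / days     # ~0.08 mm/day
per_month = per_day * 30       # ~2.5 mm/month
print(days, round(per_day, 2), round(per_month, 1))  # 181 0.08 2.5
```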

According to healthonline.com (seems like a legit website), the average nail grows 3.47mm per month, or “roughly a tenth of a millimeter per day”. By my math, 3.47mm per month is 0.12mm per day, so they’re using fuzzy math, but whatever. Dear God! My nails are growing at 2/3 the average rate for a healthy human! Do I need more alpha-keratin in my diet?

Okay, so digging deeper reveals that’s a rough estimate, and that nail growth peaks at age 10. I’m 40, so not exactly in my prime anymore, granted. That means, if my nail growth indeed peaked at age 10, then for each decade since, my nail growth has decreased by 22.2% – assuming a linear function, which is all I can do with two data points: 66.7/3 = 22.2.

Now the important question: can I use nail regeneration rate as a benchmark for all my cellular regeneration? And if so, can I use that to predict the point at which I’ll no longer be able to adequately heal – i.e. die?

Let’s try. So for X, when X = current percent rate of nail growth (66.7%)…

And when Y = # of decades passed since X, then…

My predicted rate of cellular regeneration: C = 100 - ((X/3) * (1 + Y))

Then we see where C falls to zero. Then I can simply narrow it down by dividing (X/3) by 10 to determine degeneration rate per year.
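Plugging in the numbers – a Python sketch of the derivation above, carrying over the stated linear-decline assumption and my current age of 40:

```python
X = 66.7                        # current nail growth, as % of the average rate
loss_per_decade = X / 3         # ~22.2 points lost per decade (linear assumption)

# C = 100 - (X/3) * (1 + Y); set C = 0 and solve for Y (decades from now).
Y = 100 / loss_per_decade - 1   # ~3.5 decades
predicted_age = 40 + 10 * Y
print(round(predicted_age, 1))  # 75.0
```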

My conclusion: I will die sometime just before my 75th birthday!

Not very encouraging. I think I need more data points. Give me another 10 years and I’ll complete another measurement. I hope the prediction is a little more encouraging, otherwise I’ll be looking at early retirement!

–Simon

P.S., This counts as Quantitative Philosophy!

Is It Reading?

As I often quip, I’ve received much accusation that I was never a reader – from my mother, owner of a library of double-stacked bookshelves full of romance novels, which totally isn’t pornography, unlike, apparently, my father’s collection of annual Sports Illustrated: Swimsuit Edition magazines (she HATED those). I guess if it isn’t visual stimulation then it doesn’t count, which is good news considering my personal enjoyment of all those Literotica stories from the good ol’ days of the Internet. Had I stopped there, I might have been able to go to heaven after all.

And I’m not so arrogantly boastful as to post my résumé as evidence to the contrary, but I didn’t exactly reach my current socioeconomic position through my original read-free occupations: bagging dirt at a greenhouse and bussing tables. So normally I shrug off this odd perception of illiteracy. But naturally success, however moderate, will attract hate. Haters gonna hate hate hate, right? So it is that my Family of Origin* must find ways to negate any merit.

*(I discovered this term recently. It’s used to differentiate one’s family they spent childhood with from their current one. I like it, because I don’t consider the former group to be my family anymore, as it’s essentially been disbanded, and I’ve since started my own. Oh, and I found the term through reading, incidentally.)

So it was that my father joked about my presumed lack of mathematical skills. Or he did, until he caught on that I was taking a tally and timestamp every time he brought it up. Pity. I was going to use that in a Quantitative Philosophy post: Time to Math. Oh well.

And so it is that certain other members of my FOO bring up the reading bit, and it’s not just my mother. I recently overheard a snide comment during a phone conversation that made this particular snipe at me yet again (it’s no wonder my daughter hesitates to answer calls when the caller inevitably insults her own father). But unlike the math bit, which has a basis in actual personal struggles, I never quite got the illiteracy dig. Surely my FOO knows that I read to some extent or I wouldn’t be able to function in my daily occupation, but apparently that doesn’t qualify as reading. I was therefore determined to build a logic tree that determines what is considered reading – which, in their minds, I’m not doing – based upon all the reading they’re apparently doing that actually counts as reading. Here goes:

  1. Is the medium paper? If yes, then proceed to question 2. If no, proceed to question 4.
  2. Is the content in novel form (printouts/PDFs don’t count)? If yes, then proceed to question 3. If no, then proceed to question 6.
  3. Is the content technical in nature? If no, then this counts as reading. If yes, then this does not count as reading.
  4. Was the content in its original form paper (e.g. now in ebook format)? If no (e.g. news articles, blogs), proceed to question 5. If yes, then go to question 2.
  5. Is the content related to your occupation? If no, then this does not count as reading. If yes then go to question 6.
  6. Is your job academia or are you working a job based on an advanced STEM degree? If yes, then this counts as reading. If no, then this does not count as reading.
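The six questions can be walked mechanically. A minimal Python sketch – the boolean flag names are my own invention:

```python
def counts_as_reading(is_paper, novel_form, technical,
                      originally_paper, job_related, academia_or_stem):
    """Walks the six-question logic tree; each argument is a yes/no answer."""
    def q6():  # academia or advanced-STEM job?
        return academia_or_stem
    def q3():  # technical content doesn't count
        return not technical
    def q2():  # novel form?
        return q3() if novel_form else q6()
    def q5():  # related to your occupation?
        return q6() if job_related else False
    def q4():  # originally paper (e.g. now an ebook)?
        return q2() if originally_paper else q5()
    return q2() if is_paper else q4()  # Q1: is the medium paper?

# A paper romance novel counts; a work-related blog for a non-academic doesn't.
print(counts_as_reading(True, True, False, False, False, False))   # True
print(counts_as_reading(False, False, True, False, True, False))   # False
```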

After thinking it through, I found it’s easily distilled down to 2 scenarios. Reading is only reading if the text is:

  • On paper in novel form, but the content cannot be related to knowledge gain unless your job is in academia or based on an advanced STEM degree. Or…
  • In any other form of media besides paper, but only if the original text was in novel form, or if your job is in academia or based on an advanced STEM degree.

Observant readers will have noticed some implications. Here’s my psychological take on how my FOO defines reading:

  • My job is more important than yours and more difficult, I’m sure, so any reading I do is important, unlike yours, and therefore qualifies as reading while whatever it is that you “read” doesn’t.
  • I have an insecurity and when I can’t justify the importance of my own existence I turn a leisure activity into an intellectual one in my own mind.
  • Either or both of the above.

So what’s the answer? Well, in my case, it’s to have fewer conversations with my FOO and answer the phone less. But in a broader sense, it does raise some societal questions. Intellectual snobbery aside, what counts as “reading”, in the sense that the consumed content is literature or “higher” information? That’s a question that warrants significant debate beyond individual opinion – one that needs the involvement of educators and policy-makers alike.

As a final aside, here’s a related article I stumbled upon after writing this. I wanted to know how others have thought this through. Excluding the personal irritations with family, I’m certainly not alone in the pursuit of discovering what true reading actually is (even though reading this article isn’t true reading per the criteria outlined above):

https://medium.com/@bbayless15/what-counts-as-literacy-and-for-whom-510b073a402e

(I know, it’s Medium. But that also betrays my own prejudice against defining sources whose content consumption qualifies as reading.)

Myself, I’ll just talk to family less.

–Simon

Life Expectancy and Weighted Voting

Today, as with all election days, I waited in line, internally judged all the decrepit husks of barely-living people around me, and wondered why they should have a hand in forming government policy when they probably won’t live through the term. Many of them couldn’t walk; hell, several couldn’t even breathe on their own without compressed oxygen. Yet they get a say in how future generations will live.

Why? I don’t presume to know, so I’ll defer to a historical political precedent for a reference point: age requirements for political office. Specifically, the POTUS, which carries a minimum age of 35. Apparently this rule was enacted on the grounds that a candidate should possess an unquantifiable degree of experience that can only be obtained by living long enough.

Conversely, while minimum age requirements remain in effect, maximum age restrictions for political office remain primarily absent. Apart from the fact that people eventually die.

So I’m going to call out a number of inferred points:

  • Life experience is needed to make good political decisions
  • 18 is the minimum age requirement to officially make any political decisions
  • 18 is therefore the publicly-accepted minimum life experience requirement for politics
  • 35 is the minimum age requirement to hold the office of the US Presidency
  • 35 is therefore the minimum age requirement to officially make political decisions of the greatest import
  • 43 is the age at which the youngest eligible president would complete a second term (assuming the terms are sequential, which they usually are)
  • 43 is therefore the maximum age at which we expect the president to be fully competent to make the most important political decisions
  • Death is the ultimate limiter for making any political decisions
  • 79 is the current American life expectancy

Therefore 18 to 79 is the age range in which we can make political decisions, with 35 being the age at which we are qualified to make the most important political decisions.

Next point to consider: does this mean that 35 to 79 is the period in which we are fully suited to making the most important political decisions? Cognitively-speaking, the jury is out on that. Without citing specific sources, I’ll say that from the studies I’ve seen reported, peak intelligence occurs earlier in life, with some mental decline thereafter, but long term memory stays intact and contributes to total intelligence until dementia sets in. So rather than argue for a specific age limit on voting or holding office, which no one has agreed on yet, I’ll make a simpler point:

  • Who is most impacted by our voting decisions?

Or rather: younger people have to live longer with a political decision unless a future vote changes the policy.

More pragmatically: if we all vote in our own self-interest, we have less time to benefit from doing so as we get older, and any such policies enacted in this space of time will be of greater impact to those who are younger. Once we hit the age of average life expectancy, it’s a crapshoot how long we’ll live to see the results of how we vote.

Now to the point. I will offer a final formula that weighs an individual’s vote based on age, with the following criteria (that’s right-it’s a Quantitative Philosophy post!):

  • Under 18: static weight of 0%, since you can’t legally vote yet.
  • 18-35: increasing weight to account for increasing experience, culminating in a maximum weight of 100% at age 35, the age we decided as a country that you have sufficient life experience to hold the highest political office and make the most impactful decisions.
  • 35-43: the tenure period for a sequential two-term presidency, which assumes this is the age range during which someone is most qualified to make the most impactful political decisions-therefore a static weight of 100%.
  • 43-79: decreasing weight to account for the decrease in time that we have left alive, corresponding to how many years we potentially have left to live under any new political policy changes.
  • 80+: static weight of 50%. At this point you’re still entitled to vote, but the uncertainty of living to see the impact of your voting should greatly limit how much your vote counts.

Formula (in Excel format, because I work in finance and that’s the format I know):

For: age = X

=IF(X<18,0,IF(X>79,50,IF(X<35,100*(X+35)/70,IF(AND(X>=35,X<=43),100,100*1/((X+35)/70)))))
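For anyone allergic to nested Excel IFs, the same piecewise weighting translates directly to code. A Python sketch – the spot-check ages are my own picks, with the “new retiree” assumed to be 66:

```python
def vote_weight(age):
    """Piecewise vote weight by age, in percent, per the criteria above."""
    if age < 18:
        return 0                        # can't legally vote yet
    if age > 79:
        return 50                       # past average life expectancy
    if age < 35:
        return 100 * (age + 35) / 70    # ramps from ~76% at 18 to 100% at 35
    if age <= 43:
        return 100                      # presidential-tenure plateau
    return 100 / ((age + 35) / 70)      # decays as remaining years shrink

print(vote_weight(39))                  # 100 (full weight)
print(round(vote_weight(18)))           # 76
print(round(vote_weight(66)))           # 69 (assuming a new retiree is 66)
print(vote_weight(85))                  # 50
```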

I’m 39, and my vote should count as 100% of one vote (for now). The 18-year-old still in high school gets counted as 76% of a vote. A new retiree is counted as 69% of a vote. And that old geezer on oxygen, living on Medicare and Social Security, gets counted as a half vote.

Live in the present and shape the future, but then abdicate it to those who follow.

(Oh, and no one’s using abortion as birth control…whatever the fuck that means.)

–Simon