Fingernail Growth Rate and Life Expectancy

“Wow Simon, this is such an enticing title for a blog post!”

“I know, right?!”

Okay, chill out. This is indeed mildly interesting, at least to me, because the thought never occurred to me to actually measure this. That is, until some time following the events of March 17, 2024:

Das blood!

A slipup washing knives, totally unrelated to this being St. Patrick's Day, resulted in 3 stitches.

Ouch!

Here’s a better pic, 4 days later:

Twitches get stitches

The wound, being a clean incision brought to me by alcohol impairment and fine German steel, closed within 2 weeks. But as my fingernail grew out, I noticed that it had been damaged behind the cuticle. What started as a divot turned into a flaking nail as the damage neared the fingertip.

So much for being a hand model

Eventually though, the damage grew out and was clipped away. And unlike my toe, which slid under the bathroom door as I was exiting the shower some 30 years ago, the growth cells were not permanently rendered incapable of uniform nail growth. Huzzah! At long last, the injury is fully healed.

Except for that small split on the side, which might be permanent. But it's not very noticeable, and the skin doesn't even have a perceptible scar or, more importantly, lasting numbness (I was worried about that for a while).

So how long did this injury take to completely heal? 181 days! Essentially two full seasons. So back to the original line of thought: what is my nail's growth rate, and naturally: is that normal? Squander not an opportunity, for I have definitive empirical measurements based on when that crack grew out.

This was a difficult picture to take myself

It would appear, based on photos of the original injury's location, that between March 17 and September 14, 15 millimeters' worth of nail grew out. So if we apply some basic math, that's…

~2.5mm per month, or…

~0.08mm per day.

Which seems like a long damn time to be catching that cracked nail on things. But is that normal?

According to healthonline.com (seems like a legit website), the average nail grows 3.47mm per month, or "roughly a tenth of a millimeter per day". Their 3.47mm per month actually works out to 0.12mm per day, so they're using fuzzy math, but whatever. Dear God! My nails are growing at 2/3 the average rate for a healthy human! Do I need more alpha-keratin in my diet?

Okay, so digging deeper reveals that's a rough estimate, and that nail growth peaks at age 10. I'm 40, so not exactly in my prime anymore, granted. So that means, if my nail growth indeed peaked at age 10, then for each decade since, my nail growth has decreased by 22.2%: 66.7/3 = 22.2 (assuming a linear function, which is all I can do with two data points).

Now the important question: can I use nail regeneration rate as a benchmark for all my cellular regeneration? And if so, can I use that to predict the point at which I’ll no longer be able to adequately heal – i.e. die?

Let's try. Where X = the current percent rate of nail growth (66.7%)…

And Y = # of decades passed from now, then…

My predicted rate of cellular regeneration is: C = 100 - ((X/3)*(1+Y))

Then we see where C falls to zero. From there, I can narrow it down by dividing (X/3) by 10 to get the degeneration rate per year.
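Sketched as code, because spreadsheets are for work (a Python toy; the age of 40 and the 66.7% figure come from above, everything else is the same napkin math):

  # Napkin math only: X = current growth rate (% of average), Y = decades from now.
  X = 66.7

  def C(Y):
      """Predicted cellular regeneration, per the formula above."""
      return 100 - (X / 3) * (1 + Y)

  # Step forward one year (0.1 decade) at a time until C hits zero.
  Y = 0.0
  while C(Y) > 0:
      Y += 0.1

  print(f"C hits zero about {Y:.1f} decades out, around age {40 + 10 * Y:.0f}")
  # -> about 3.5 decades out, around age 75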

My conclusion: I will die sometime just before my 75th birthday!

Not very encouraging. I think I need more data points. Give me another 10 years and I'll complete another measurement. I hope the prediction is a little more encouraging then; otherwise I'll be looking at early retirement!

–Simon

P.S. This counts as Quantitative Philosophy!

Is It Reading?

As I often quip, I've received much accusation, chiefly from my mother, that I was never a reader. This from the owner of a library of double-stacked bookshelves containing romance novels, which totally isn't pornography, unlike, apparently, my father's collection of annual Sports Illustrated: Swimsuit Edition magazines (she HATED those). I guess if it isn't visual stimulation then it doesn't count, which is good news considering my personal enjoyment of all those Literotica stories from the good ol' days of the Internet. Had I stopped there, I might have been able to go to heaven after all.

And I'm not so arrogantly boastful that I'll post my résumé as evidence of a contrarian opinion, but I didn't exactly reach my current socioeconomic position via my original read-free occupations: bagging dirt at a greenhouse and bussing tables. So normally I shrug off this odd perception of illiteracy. But naturally success, however moderate, will attract hate. Haters gonna hate hate hate, right? So it is that my Family of Origin* (FOO) must find merit negations.

*(I discovered this term recently. It's used to differentiate the family one spent childhood with from one's current family. I like it, because I don't consider the former group to be my family anymore, as it's essentially been disbanded, and I've since started my own. Oh, and I found the term through reading, incidentally.)

So it was that my father joked about my presumed lack of mathematical skills. Or he did, until he caught on that I was taking a tally and timestamp every time he brought it up. Pity. I was going to use that in a Quantitative Philosophy post: Time to Math. Oh well.

And so it is that certain other members of my FOO bring up the reading bit, and it's not just my mother. I recently overheard a snide comment during a phone conversation that took just this particular snipe at me again (it's no wonder my daughter hesitates to answer calls when the caller inevitably insults her own father). But unlike the math bit, which has a basis in actual personal struggles, I never quite got the illiteracy dig. Surely my FOO knows that I read to some extent or I wouldn't be able to function in my daily occupation, but apparently that doesn't qualify as reading? I was therefore determined to build a logic tree that determines what is considered reading, which in their minds I'm not doing, based upon all the reading they're apparently doing that actually counts as reading. Here goes (with a code sketch after the list, for the so-inclined):

  1. Is the medium paper? If yes, then proceed to question 2. If no, proceed to question 4.
  2. Is the content in novel form (printouts/PDFs don’t count)? If yes, then proceed to question 3. If no, then proceed to question 6.
  3. Is the content technical in nature? If no, then this counts as reading. If yes, then this does not count as reading.
  4. Was the content in its original form paper (e.g. now in ebook format)? If no (e.g. news articles, blogs), proceed to question 5. If yes, then go to question 2.
  5. Is the content related to your occupation? If no, then this does not count as reading. If yes then go to question 6.
  6. Is your job academia or are you working a job based on an advanced STEM degree? If yes, then this counts as reading. If no, then this does not count as reading.
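And here's that tree distilled into a function (a sketch; the parameter names are mine):

  def counts_as_reading(on_paper: bool, novel_form: bool, technical: bool,
                        originally_paper: bool, job_related: bool,
                        academia_or_stem: bool) -> bool:
      """Questions 1-6 above, encoded as FOO-approved reading rules."""
      def novel_branch():                # Q2, then Q3 or Q6
          if novel_form:
              return not technical       # Q3: technical content doesn't count
          return academia_or_stem        # Q6
      if on_paper:                       # Q1: paper goes to Q2
          return novel_branch()
      if originally_paper:               # Q4: e.g. an ebook of a print novel -> Q2
          return novel_branch()
      if job_related:                    # Q5: occupational content -> Q6
          return academia_or_stem
      return False                       # Q5: non-occupational web content: no

So a paper novel that isn't technical counts; this blog post does not, unless reading it is somehow your job and that job required an advanced STEM degree.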

After thinking it through, I found it’s easily distilled down to 2 scenarios. Reading is only reading if the text is:

  • On paper in novel form, but the content cannot be related to knowledge gain unless your job is in academia or you are working a job based on an advanced STEM degree. Or…
  • In any other form of media besides paper, but only if the original text was in novel form or if your job is in academia or you are working a job based on an advanced STEM degree.

Observant readers will have noticed some implications. Here’s my psychological take on how my FOO defines reading:

  • My job is more important than yours and more difficult, I’m sure, so any reading I do is important, unlike yours, and therefore qualifies as reading while whatever it is that you “read” doesn’t.
  • I have an insecurity and when I can’t justify the importance of my own existence I turn a leisure activity into an intellectual one in my own mind.
  • Either or both of the above.

So what's the answer? Well, in my case, it's to have fewer conversations with my FOO and answer the phone less. But in a broader sense, it does raise some societal questions. Intellectual snobbery aside, is "reading" only reading when the consumed content is literature or some "higher" form of information? That's a question that warrants significant debate beyond individual opinion. It's a question that needs the involvement of educators and policy-makers alike.

As a final outtake, here's a related article I stumbled upon after writing this. I wanted to know how others have thought this through. Personal irritations with family aside, I'm certainly not alone in the pursuit of discovering what true reading actually is (even though reading this article isn't true reading per the criteria outlined above):

https://medium.com/@bbayless15/what-counts-as-literacy-and-for-whom-510b073a402e

(I know, it’s Medium. But that also betrays my own prejudice against defining sources whose content consumption qualifies as reading.)

Myself, I’ll just talk to family less.

–Simon

Life Expectancy and Weighted Voting

Today, as with all election days, I waited in line and internally judged all the decrepit husks of barely-living people around me, and wondered why they should have a hand in forming government policy when they probably wouldn't live through the term. Many of them couldn't walk; hell, several of them couldn't even breathe on their own without compressed oxygen. Yet they get a say in how future generations will live.

Why? I don't presume to know, so I'll defer to a historical political precedent for a reference point: age requirements for political office. Specifically, the POTUS, which carries a minimum age of 35. Apparently this rule was enacted on the grounds that a candidate's background should include some unquantifiable degree of experience that could only be obtained by living long enough.

Conversely, while minimum age requirements remain in effect, maximum age restrictions for political office are largely absent. Apart from the fact that people eventually die.

So I’m going to call out a number of inferred points:

  • Life experience is needed to make good political decisions
  • 18 is the minimum age requirement to officially make any political decisions
  • 18 is therefore the publicly-accepted minimum life experience requirement for politics
  • 35 is the minimum age requirement to hold the office of the US Presidency
  • 35 is therefore the minimum age requirement to officially make political decisions of the greatest import
  • 43 is the age at which a president first elected at 35 will complete a second term (assuming the terms are sequential, which they usually are)
  • 43 is therefore the maximum age at which we expect the president to be fully competent to make the most important political decisions
  • Death is the ultimate limiter for making any political decisions
  • 79 is the current American life expectancy

Therefore 18 to 79 is the age range in which we can make political decisions, with 35 being the age at which we are qualified to make the most important political decisions.

Next point to consider: does this mean that 35 to 79 is the period in which we are fully suited to making the most important political decisions? Cognitively-speaking, the jury is out on that. Without citing specific sources, I’ll say that from the studies I’ve seen reported, peak intelligence occurs earlier in life, with some mental decline thereafter, but long term memory stays intact and contributes to total intelligence until dementia sets in. So rather than argue for a specific age limit on voting or holding office, which no one has agreed on yet, I’ll make a simpler point:

  • Who is most impacted by our voting decisions?

Or rather: younger people have to live longer with a political decision unless a future vote changes the policy.

More pragmatically: if we all vote in our own self-interest, we have less time to benefit from doing so as we get older, and any such policies enacted in this space of time will be of greater impact to those who are younger. Once we hit the age of average life expectancy, it’s a crapshoot how long we’ll live to see the results of how we vote.

Now to the point. I will offer a final formula that weights an individual's vote based on age, with the following criteria (that's right, it's a Quantitative Philosophy post!):

  • Under 18: static weight of 0%, since you can't legally vote yet.
  • 18-35: increasing weight to account for increasing experience (starting at about 76% at age 18), culminating in a maximum weight of 100% at age 35, the age at which we decided as a country that you have sufficient life experience to hold the highest political office and make the most impactful decisions.
  • 35-43: the tenure period for a sequential two-term presidency, which assumes this is the age range during which someone is most qualified to make the most impactful political decisions; therefore, a static weight of 100%.
  • 43-79: decreasing weight to account for the shrinking time we have left alive, corresponding to how many years we potentially have left to live under any new political policy changes.
  • 80+: static weight of 50%. At this point you’re still entitled to vote, but the uncertainty of living to see the impact of your voting should greatly limit how much your vote counts.

Formula (in Excel format, because I work in finance and that’s the format I know):

For: age = X

=IF(X<18,0,IF(X>79,50,IF(X<35,100*(X+35)/70,IF(AND(X>=35,X<=43),100,100*1/((X+35)/70)))))

I’m 39 and my vote should count as 100% of one vote (for now). The kid in highschool gets counted as 76% of a vote. A new retiree is counted as 69% of a vote. And that old geezer on oxygen and living on Medicare and Social Security gets counted as a half vote.

Live in the present and shape the future, but then abdicate it to those who follow.

(Oh, and no one’s using abortion as birth control…whatever the fuck that means.)

–Simon

XP Padding

Did you know that Liz and I have a total of 23 years of finance experience?  That’s pretty amazing to think about.  A family unit has over half an entire career lifetime’s worth of knowledge in an industry?  Wow!

That means, collectively, we know as much about the credit/deposit industry as someone who’s worked in it since the 1990s.  And to think that in 1998, we were in middle school.

Yes, I’m being obnoxiously sarcastic here, because this crap needs to stop.

It's encountered most among younger managers with lower-payband teams.  Some smoothskin fresh out of business school wants to make a large group of grunts feel important, so they come up with ways to make menial work sound valued with big numbers.  Now, pulling from my own career experience: 1,000 people with 1-2 years of tenure in a call center have, according to this asinine logic, 1,000-2,000 years of experience with the company!  Big numbers are exciting and I feel like I'm actually contributing significantly to the bottom line!

No, I don’t.  I felt patronized.

I will explain why this is stupid.

Given that entry level employees share the same basic knowledge pool from their training, this knowledge overlaps.  It doesn’t compound.

Given that knowledge is dependent on the individual’s memory to be of use.

Given that memories fade after their creation.

Then a large pool of shared knowledge only increases the chance that a selection of said knowledge is retained somewhere in the group, but still fails on the individual level at the same rate.

Therefore increasing the labor pool only increases the chance that someone retains an element of training, not that the collective unit as a whole can all access this information simply because one person has it.

Therefore experience is not cumulative across a group.  It can only complement the total group's value.  It's part of the equation, certainly, but a different formula is needed beyond Excel 101's SUM(A:A).  Something more complicated is required.

***

I will begin with Hermann Ebbinghaus's oft-referenced simplified formula on memory loss.  Where t is time and S is the relative strength of a memory, R is the probability of that memory being recalled:

R = exp(-t/S)

For the sake of this exercise, I will assign t to the number of days since the memory was created, and S to a static value of 25, which I'm arbitrarily defining as a 25% value to the individual, because work training material is really riveting.

In this example, a person trying to recall a fact after 7 days would have a 76% chance of doing so.
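A quick check of that number, for the skeptical (Python, using the values just assigned):

  from math import exp

  t, S = 7, 25                       # days elapsed; arbitrary memory strength
  print(f"R = {exp(-t / S):.0%}")    # -> R = 76%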

Now if we scale this to a group, cumulative probability calculates the chance that all people in a group of size P recall that memory (Rc):

Rc = (exp(-t/S))^P

Let’s say 3 people are in this group.  Scaling the above example would yield a 43% chance of every person remembering the fact.  The more people we add to the group, the less the chance that all members would remember the same fact.
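Checking again with the group of 3:

  from math import exp

  t, S, P = 7, 25, 3                      # same values, plus group size
  print(f"Rc = {exp(-t / S) ** P:.0%}")   # -> Rc = 43%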

I’m going to get crazy here and use this as a basis for my own theorem: Simon’s Theorem on Group Memory Loss Dynamic Experience Offset over Time.

And theorems are great, because they're hypothetical formulas extrapolated as mathematical representations of empirical observations.  As long as the math itself is correct, no one can deny what I've witnessed personally.  Ergo, while I can never prove my theorem right, no one can prove it wrong.  Suck it!

Ahem.  Anyway…

I'll now assign a value to the group (Ev).  As in usefulness, not headcount.  A 1:1 ratio (one person's worth of usefulness per person) would be ideal, but that's not going to happen, because of the initial premise.

Ev = P((exp(-t/S))^P)

So after 7 days, the retention of those 3 people on a 25%-interest piece of information turns their usefulness, as units of the whole, into the equivalent of 1.3 people.  Note how piling on yet more personnel eventually reduces the usefulness further.  That's because, again, information isn't pooled across the group.
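The same running example, one variable later:

  from math import exp

  t, S, P = 7, 25, 3
  Ev = P * exp(-t / S) ** P         # equivalent value of the group, in people
  print(f"Ev = {Ev:.1f} people")    # -> Ev = 1.3 people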

But also remember that increasing the group size increases the probability that any one individual will remember the information (Rg).  So we take the individual retention rate and raise it to the inverse of the group size.  Retention will never be perfect.  A data point may be lost to time no matter how many people are hired.  But it does continually raise the probability:

Rg = exp(-t/S)^(1/P)

Of those 3 people: individually, there's only a 76% chance that a specific person will remember a piece of information; as a group, there's only a 43% chance that they will all retain it; but across the group, there's a 91% chance that any one of them will remember it.
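And the check for that last number:

  from math import exp

  t, S, P = 7, 25, 3
  Rg = exp(-t / S) ** (1 / P)   # chance that any one person still remembers
  print(f"Rg = {Rg:.0%}")       # -> Rg = 91%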

This is where group size makes an impact: on the chance that, somewhere across the group as a whole, someone will prove their use by having retained the necessary information.  By increasing the group size, we increase that possibility.

But let’s go even further.  Because if you’re still reading, I feel we’re now on a journey together and I don’t want to disappoint.  I’ve grown fond of you, dear internet reader.

And because, if you’re very attentive, you’ll note that time will still gnaw away at the group recollection chance.  More people will increase the chance, but that’s not scalable.  What we need is a third way to increase value, since we can’t ever reduce time, and staff size always has a limit.  We need another variable.

That’s right!  We increase the number of informational items, which we have to do over time, else memory loss will still degrade the total usefulness at the same rate.  So we increase the total number of informational points learned per day.

I offer one final formula: the ultimate value of the group (Uv).  It incorporates the logic of the prior formulas: it takes the group's value in equivalent people-units (Ev), scales it by the chance of any one person remembering a select piece of information (Rg), and grows it with the number of information points presented per day (I) over the duration t:

Uv = (exp(-t/S)^(1/P)) * P((exp(-t/S))^P) * tI
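Plugging in the running example (I'll assume I = 1, one new item per day, since no figure was fixed above):

  from math import exp

  t, S, P, I = 7, 25, 3, 1      # I = 1 item/day is my assumed example value
  R = exp(-t / S)
  Rg = R ** (1 / P)             # any-one-person recall chance
  Ev = P * R ** P               # people-unit value of the group
  Uv = Rg * Ev * t * I          # ultimate value of the group
  print(f"Uv = {Uv:.1f}")       # -> Uv = 8.3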

As mentioned, this value degrades with time, but can be increased with additional information points.  Also known as experience.  Ah, we’ve come full circle finally.

Conclusion:

The value of a group is more complicated than its collective time.  If we base the value on total information, we can’t assume that all members of a group retain that information, and a linear function doesn’t apply.  We can increase the value of the group by increasing its number, which in turn will increase the chance that information will be retained by an individual, but to ultimately avoid group value loss, additional information–or novel experience–must find its way into each individual of a group on a continual basis.

And this is why we can't just add up everyone's tenure.  Experience isn't cumulative.  It's one variable in a probability function describing the chance that someone in the group will increase group value through novel experience recollection.

Maybe lower management should cut back on the three-martini team-building lunches.

–Simon


Variable legend:

  • t = # days
  • S = strength of memory (set to 25)
  • P = total # of people trying to remember
  • I = items of value learned per t
  • R = probability of memory retention
  • Rc = Chance of all people remembering
  • Ev = Equivalent value of total people as units
  • Rg = Chance of any one person remembering from total # of people
  • Uv = Ultimate value of group

Kópsimodendroacrophobia

The fear of cutting wood at heights

Also: Phobia Quotient!

The neighbors rented a boom.

(A tangent here–I don’t think I’ve ever created a name for these neighbors, probably because they’re nice and reasonably normal.  I’ve just called them by their first names: Brian and Kelly.  Let’s change that now.  I shall call them the Busybees.  Because they’re always rather busy.)

Anyway, they hate trees.  Well, to be fair, all Ohioans hate trees.  Almost as much as they hate dressing appropriately for the weather.  Liz is a prime example.  She also hates trees.  Here’s a typical conversation:

Statement: “This tree looks a little brown.”

Response: “Cut it down!”

Statement: “This branch looks dead.”

Response: “Cut it down!”

Statement: “This tree isn’t perfectly erect.”

Response: “‘Erect’…*teehee….Cut it down!”

But this year the trees in question really did look dead, and so I agreed after much insistence to cut them down.  Liz, the Ohioan, had already been convinced.

Cut it down!

So after this lengthy, roundabout preamble, I arrive at the point of my post: I don't like heights.  Never did.  Figured those who do are idiots or showoffs.  Of course, in my youthful egocentric stubbornness, I forced myself to endure them.  Indoor rock climbing, rappelling, mountain hiking, amusement parks: been there; done that.  And while youth grants a greater allowance for risk in the face of death, probably due to the amount of testosterone oozing out of my every orifice, approaching middle age has forced a more practical approach to death, like fearing things that cause it.

Consequently, my parasympathetic nervous system now strongly advises me that death should be avoided and doing certain things increases its risk potential.

But damned if I didn’t try.  I went up there twice and cut branches, though in the end, Liz did the bulk of the work.

So this got me thinking.  Is my phobia truly debilitating, or just a common healthy fear of death, albeit somewhat too strong?  Internet time!

I didn’t vet this information at all, but it seems sound.  Let’s see how I stack up:

  1. Snakes?  Some Indiana Jones shit right there.  But they do have a creepy shape and are among the few large terrestrial animals that are venomous, so I get it.  I do not have this fear.  Pass.
  2. Heights.  Already discussed.  Good to know this is #2.  Fail.
  3. Public Speaking.  I don’t really think this is a phobia.  It’s anxiety over social acceptance, not a life or death scenario, unless you consider the tribal fear of being banished which might lead to death.  Exempted.
  4. Spiders.  See #1, though they're smaller.  I like spiders.  Pass.
  5. Claustrophobia.  I don’t like being restrained, probably from childhood memories.  My parents thought it was funny to sit on me for extended lengths of time.  Sick Boomer humor.  But small places don’t bother me.  Pass.
  6. Airplanes.  Nah.  I hate them more than fear them.  Smell farts for hours, get felt up by security, then packed in like an Amazon warehouse.  But not fear.  Pass.
  7. Mice?  No.  Pass.
  8. Needles.  I hate getting poked.  Triggers a primal fear, though I don’t have a panic attack from it.  Pass.
  9. Crowds.  Nah.  Just an inconvenience.  Pass.
  10. Darkness?  Only after watching Alien or Jurassic Park.  Pass.
  11. Blood?  Only my own.  Pass.
  12. Dogs.  I love dogs.  Pass.
  13. Clowns?  I hate them, but it’s not fear.  Sort of like cats.  Shoot them for entertainment, but that’s it.  Pass.

My total score: 1/12.  But these are weighted based on commonality, so I will use sketchy math to quantify this.

I'll take the inverse of each item's "very afraid" percentage (only those numbers, because really, most of us are probably "a little afraid" of many of these, which does not a phobia make), multiply by 100, and exclude #3; the total equals 169.9.  This is the total max sissy quotient, which I'll set as the baseline of 100% total sissy.

I possess #2, the inverse of which is 4.2.  Scaled against the baseline, that's 4.2*100/169.9, which equals 2.5%.  I am a 2.5% sissy.

But where is the median sissy?  I really don't know, because I don't see these as cumulative probabilities, so let's take a nice midpoint of the range: 5+((32-5)/2)=18.5.  1/18.5*100=5.4.  5.4*100/169.9=3.2% sissy.  So I'm lower than the median, according to my questionable math from unvetted sources.
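The same sketchy math, scripted (only the two survey figures actually used here are known to me; the 169.9 total comes from summing the full list):

  max_sissy = 169.9              # sum of all 12 inverses: 100% total sissy

  my_inverse = 4.2               # item #2 (heights), inverted and scaled by 100
  print(f"me: {my_inverse * 100 / max_sissy:.1f}% sissy")
  # -> me: 2.5% sissy

  median_pct = 5 + (32 - 5) / 2  # midpoint of the survey's 5-32% range: 18.5
  median_inverse = 1 / median_pct * 100
  print(f"median: {median_inverse * 100 / max_sissy:.1f}% sissy")
  # -> median: 3.2% sissy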

I guess I’m pretty normal after all.

But you’re a total sissy if you fear blood.

–Simon