Or, why I think it's useful to be able to see extinct Shorthorn EPD information whether or not the owner is an ASA member or WHR dues payer.
Unless we understand the algorithm generating EPDs, we can't really weigh them. I'm not aware of any place to access the mathematics by which the EPDs are generated, and I couldn't comprehend it anyway.
But I can recognize patterns and compare what I see, or what an old timer said they saw, to what the numbers tell me I can expect to see. Without the ability to look back at the change in EPDs over time, how does one compare the past values (however inaccurate) to the present and future inaccuracies? It's tempting to dismiss the exercise as irrelevant, but the relevance is that EPDs are driving reasoning in commercial breeding.
If Shorthorn EPDs are a pay-to-play database, how does this influence commercial acceptance?
Some will say the past doesn't matter, but with all this information technology, the pattern seems to be to erase the trail. All trails lead somewhere; even fictional numbers tell a story.
This part is probably irrelevant, unless you want insight into how disconnected from anything tangible information rumination through computers really is.
I tried to learn what an algorithm is and ran into something called the Principle of Maximum Entropy: https://en.m.wikipedia.org/wiki/Principle_of_maximum_entropy
"In ordinary language, the principle of maximum entropy can be said to express a claim of epistemic modesty, or of maximum ignorance. The selected distribution is the one that makes the least claim to being informed beyond the stated prior data, that is to say the one that admits the most ignorance beyond the stated prior data."
On testable information: "The principle of maximum entropy is useful explicitly only when applied to testable information. Testable information is a statement about a probability distribution whose truth or falsity is well-defined."
Entropy is a measure of unpredictability of information content
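To make that "unpredictability" idea concrete, here is a small sketch of my own (nothing to do with how any breed association actually computes EPDs): Shannon entropy, the standard formula behind that Wikipedia passage, is highest when every outcome is equally likely, which is exactly the "maximum ignorance" the principle describes.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: higher means more unpredictable."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip is maximally unpredictable: 1.0 bit of entropy.
print(shannon_entropy([0.5, 0.5]))

# A heavily biased coin is more predictable, so entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

The maximum-entropy principle says: among all distributions consistent with what you actually know, pick the one with the highest entropy, because any lower-entropy choice would be pretending to knowledge you don't have.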
And on and on, until it turns out informational entropy has little in common with thermodynamic entropy, which is rather beautifully described as Time's Arrow:
"Let us draw an arrow arbitrarily. If as we follow the arrow we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases the arrow points towards the past. That is the only distinction known to physics. This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone. I shall use the phrase ‘time's arrow’ to express this one-way property of time which has no analogue in space."
A related mental arrow arises because one has the sense that one's perception is a continuous movement from the known (past) to the unknown (future). Anticipating the unknown forms the psychological future which always seems to be something one is moving towards, but, like a projection in a mirror, it makes what is actually already a part of memory, such as desires, dreams, and hopes, seem ahead of the observer.