Who's down with entropy? (Yeah, you know me.)
I was brainstorming at Peet's this evening -- quite productively, actually, since I also came up with an excellent idea for an expository article -- and I hit on a way of defining "entropy" for scientists. The formula is simple:
scientific entropy = log(arrogance/talent)
For other types of entropy, see Wikipedia.
One could (and should), of course, incorporate things like reputation and the time-dependent factors behind it (number of papers, citations, etc.), but let's keep it simple. The more talented you are, the more people will put up with your shit (i.e., the more arrogance you can exhibit).
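If you insist on actually computing this, here's a toy sketch in Python -- the scientific_entropy function and the arrogance/talent numbers are entirely made up, since nobody has published a calibrated scale for either quantity:

import math

def scientific_entropy(arrogance, talent):
    # Naive definition: log(arrogance / talent). Blows up if talent is zero.
    return math.log(arrogance / talent)

# Made-up numbers; units and calibration are left as an exercise.
print(scientific_entropy(arrogance=10.0, talent=2.0))  # ~1.61: insufferable but tolerated
print(scientific_entropy(arrogance=2.0, talent=10.0))  # ~-1.61: a quiet saint
print(scientific_entropy(arrogance=5.0, talent=5.0))   # 0.0: perfectly balanced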
As John von Neumann once said while talking to Claude Shannon, "Nobody knows what entropy really is, so in a debate you will always have the advantage."
8 comments:
So, if I were to apply that formula to mysel...
PROCESS INTERRUPTED: SIGFPE
Are you having confidence issues?
Yes, but the idea is that this could be my salvation -- once my confidence goes to zero (and my talent with it), the ratio is 0/0, so I can apply L'Hopital's rule and (hopefully) end up with something finite.
I could also add a fudge factor and refine the definition to be log[(arrogance + 1)/(talent + 1)], where arrogance and talent are both nonnegative. (Or I could stipulate that they're both positive, but it's probably better to just add the fudge factor.)
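In toy-Python terms (same caveats as before -- the numbers are made up), the fudged version just shifts both inputs by one, so a zero-talent scientist no longer triggers a SIGFPE:

import math

def scientific_entropy(arrogance, talent):
    # Fudged definition: log((arrogance + 1) / (talent + 1)), finite for any nonnegative inputs.
    return math.log((arrogance + 1) / (talent + 1))

print(scientific_entropy(arrogance=0.0, talent=0.0))  # 0.0
print(scientific_entropy(arrogance=3.0, talent=0.0))  # ~1.39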
Good luck with the L'Hopital thing.
You guys are dorks.
*sigh*
By the way, I sigh because I understand you....
That's comforting. I was worried the *sigh* was because you just had a fantastic bowel movement.
As we've so poignantly demonstrated, a Caltech education prepares us very well for life, the universe, and everything.