Monday, December 04, 2006

Scientific entropy

Who's down with entropy? (Yeah, you know me.)

I was brainstorming at Peet's this evening -- quite productively, actually, since I also came up with an excellent idea for an expository article -- and I hit on a way of defining "entropy" for scientists. The formula is simple:

scientific entropy = log(arrogance/talent)

For other types of entropy, see Wikipedia.

One could (and should), of course, incorporate factors like reputation and whatever drives its time dependence (number of papers, citations, etc.), but let's keep it simple. The more talented you are, the more people will put up with your shit (i.e., the more arrogance you can exhibit).
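For the irrepressibly literal-minded, here's a toy sketch of the formula in Python -- the function name and inputs are of course made up for this post, and measuring arrogance is left as an exercise:

```python
import math

def scientific_entropy(arrogance, talent):
    """Toy implementation of log(arrogance / talent).

    Blows up (ZeroDivisionError) when talent is zero --
    consider yourself warned before applying it to yourself.
    """
    return math.log(arrogance / talent)

# A scientist whose arrogance exactly matches their talent
# has zero scientific entropy:
print(scientific_entropy(3.0, 3.0))  # → 0.0
```

Note that the formula is scale-invariant: doubling both arrogance and talent leaves your entropy unchanged.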

As John von Neumann reportedly told Claude Shannon, "Nobody knows what entropy really is, so in a debate you will always have the advantage."

8 comments:

  1. So, if I were to apply that formula to mysel...
    PROCESS INTERRUPTED: SIGFPE

  2. Are you having confidence issues?

  3. Yes, but the idea is that that can be my salvation -- once I have no confidence, I can apply L'Hopital's rule and (hopefully) end up with something finite.

  4. I could also add a fudge factor and refine the definition to be log[(arrogance + 1)/(talent + 1)], where arrogance and talent are both nonnegative. (Or I could stipulate that they're both positive, but it's probably better to just add the fudge factor.)

    Good luck with the L'Hopital thing.

  5. By the way, I sigh because I understand you....

  6. That's comforting. I was worried the *sigh* was because you just had a fantastic bowel movement.

  7. As we've so poignantly demonstrated, a Caltech education prepares us very well for life, the universe, and everything.
