Wednesday, June 27, 2012

Mathematical notation....and mathematical NOTAAAAAAAAAAAAAAAAAAAAAAAATION

There's mathematical notation.

And then there's mathematical notaaaaaaaaaaaaaaaaaaaation.

'Mathematical notation' includes all kinds of notation: '+' as the symbol for addition, 'e_i' for basis vectors, even the picky stuff like 'capital letters refer to matrices and lower case letters refer to scalars'.  And, unfortunately, as I'll get into, the VAST vast VAST VAST...vast?...VAST VAST VAST majority of 'mathematical notation' is just conventions about which kinds of letters stand for what (f, g, h are typically reserved for functions, x, y, z for variables, etc.).

But then...then there's notaaaaaaaaaaaaaaaaaaaaaation.

Notaaaaaaaaaaation is the really big, fundamental stuff.  It's when some mathematician realizes that writing something in a different way completely changes your conception of the world.

Here is an example:

'x+y'

This is standard, basic notation for a standard, basic operation.  But the point is that, written this way, while analysts can use it perfectly well, it is really suited more to algebraists.  However, rewrite the above as:

'+(x,y)'

And now, just by writing it differently, the conception has changed completely: instead of thinking of '+' as an operator, you think of it as a function.  It certainly made sense before to think of '+' as a function and to say, for instance, that it is continuous; but simply writing it in this form makes the point pedagogically much clearer.  The notation reflects the paradigm of thought.
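(A concrete aside, in Python rather than in math, and purely my own illustration: once '+' is written as a prefix function, it becomes the kind of object you can pass around and manipulate like any other function.)

    import operator
    from functools import partial, reduce

    # '+' written in the '+(x, y)' style: an ordinary two-argument function
    plus = operator.add

    print(plus(2, 3))                    # 5, the same value as 2 + 3

    # Since it is a function, it can be handed to other functions...
    print(reduce(plus, [1, 2, 3, 4]))    # 10

    # ...or partially applied, which the infix 'x + y' form tends to obscure
    add_five = partial(plus, 5)
    print(add_five(10))                  # 15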

When notation can do that, then it's notaaaaaaaaaaation.

But then there's even bigger notaaaaaaaaaaation.  This is stuff like...the guy who first thought of writing:

A->>B

Meaning that A maps surjectively onto B.  Modern module theory would not have been possible had someone not thought of making these arrow-chasing diagrams of modules, with each arrow decoration standing for a different kind of map (surjective, injective, etc.).  Notaaaaaaaaaaaaaation is also about having a way to represent certain things at all.
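(For instance, here is my own LaTeX rendering of the standard picture: a short exact sequence of modules is drawn with exactly these decorated arrows, a hooked arrow for an injection and a two-headed arrow for a surjection.)

    % a short exact sequence: A injects into B, and B surjects onto C
    0 \longrightarrow A \hookrightarrow B \twoheadrightarrow C \longrightarrow 0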

To a certain extent, I suppose this is time dependent.  Nowadays, saying 'x' stands for a variable is just notation; but when the first mathematician thought of representing algebraic equations with x's and y's, that was first-class notaaaaaaaaaation, even if it's rather taken for granted now.

I often think that mathematicians come in a few grades:

-Average mathematicians slightly generalize a result, or show a certain, VERY specific counterexample.
-Good mathematicians create new results, or completely pathological counterexamples.
-Great mathematicians develop amazing notation and definitions.
(FWIW, I'm a bad mathematician :P)

Possibly for this reason, I might controversially say that Leibniz was a better mathematician than Newton: he developed first-rate notation that enabled calculus to really get going.  (Nonetheless, Newton was a great mathematician by this criterion as well, as he did invent some notation that caught on, at least in physics: the dot notation.)
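(To make that concrete, here, in LaTeX, is the sort of thing I have in mind; the examples are mine, not a claim about which formulas either of them actually wrote.  In Leibniz's d-notation the chain rule reads almost as if the du's cancel, while Newton's dot marks a time derivative.)

    % Leibniz: the chain rule looks like cancellation of differentials
    \frac{dy}{dx} = \frac{dy}{du}\,\frac{du}{dx}

    % Newton: the dot denotes differentiation with respect to time,
    % as physicists still write it
    \dot{x} = \frac{dx}{dt}, \qquad \ddot{x} = \frac{d^2 x}{dt^2}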

FWIW, I remember talking to Charles Van Loan, and he very much thought of tensors and the tensor product as notaaaaaaaaaaation for the upcoming generation.
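(An example of what that might mean, purely my own and not something from that conversation: Kronecker-product notation lets a whole family of structured matrix equations be written, and manipulated, as a single line.)

    % the matrix equation B X A^T = C, rewritten via the Kronecker product
    (A \otimes B)\,\mathrm{vec}(X) = \mathrm{vec}(C)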
