
Mathematicians do it... lazily?!

Mathematician writes equations on chalkboard.

Mathematicians, by and large, are bad programmers.

Despite what some people think, programming is really just a small extension of generalized algebra.  The only real differences are the notation and the extension to include explicit input/output and procedural side effects.  A few programming languages lack one or both of those last two.  An example of procedural side effects is the following, which tends to be nonsensical to people who know math but have had no exposure to procedural languages.

a = 10
b = 20
a = b

What are a and b?  The mathematician would say 10 and 20, and possibly point out that 10 can’t be equal to 20.  A programmer would tell you they are both 20.  The problem with that example is that many (but not all) programming languages use ‘=’ for assignment and ‘==’ for equality testing, rather than prefixing the statement with ‘let’, which would also explicitly signal to the mathematician that it’s a sequential procedure with memory of the previous step.  Whether that particular notation change was a good idea is open to debate; it’s mainly a point of personal preference, and you’re welcome to use the languages that have a ‘let’ keyword and reserve ‘=’ for equality.
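For instance, here is the example as a minimal runnable sketch in Python, one of the languages that uses ‘=’ for assignment and ‘==’ for equality testing:

```python
a = 10       # bind the name a to the value 10
b = 20       # bind the name b to the value 20
a = b        # re-bind a to the *current* value of b

print(a, b)      # prints: 20 20
print(a == b)    # equality test, not assignment: prints True
```

Each line executes in sequence and remembers the result of the previous step, which is exactly the procedural memory the ‘let’-style notation would have signaled.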

As a programmer, my first priority is, of course, that the code works as intended every time, no matter the input. Second, and just as important, is that the code is maintainable. For that to be true, the code must be easily readable and have little if any ambiguity. Of course, compilers also require that there be little if any ambiguity, and language standards strive to define behavior where the notation is unavoidably ambiguous.

A good example of this is the ‘dangling else’, a common ambiguity in languages that use the if-else construct and allow nesting. Most languages get around this by defining ‘else’ so that its matching ‘if’ is the one nearest to it. A few languages ban nesting outright.  Most also have some sort of grouping that can resolve the ambiguity and recommend (or occasionally require) that you use it.
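Python happens to sidestep this particular ambiguity because indentation *is* the grouping: the two possible readings of a nested if-else must be written differently. A small sketch (the function names are mine, chosen purely for illustration):

```python
def nearest_if(p, q):
    # Reading 1: the else binds to the *inner* if
    # (the reading C-family grammars choose for a dangling else).
    if p:
        if q:
            return "f"
        else:
            return "g"
    return None

def outer_if(p, q):
    # Reading 2: the else binds to the *outer* if
    # (in C this reading requires explicit braces).
    if p:
        if q:
            return "f"
    else:
        return "g"
    return None

# The two readings disagree, e.g. when p is true and q is false:
print(nearest_if(True, False))   # prints: g
print(outer_if(True, False))     # prints: None
```

On paper, `if p: if q: f else: g` could mean either function; a grammar rule (or explicit grouping) has to pick one.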

Additionally, all modern style guides recommend consistent and readable naming of variables and functions. If you have a variable that holds the total number of albatrosses in the world, you do not call it ‘a’.  You call it ‘totalAlbatrosses’.  The casing convention is unimportant, though you should be consistent about it. The only common exceptions to this rule are iterators (i, j or k, similar to their mathematical use as indices), spatial coordinates (w, x, y and z, which only have meaning in relation to each other) and, very occasionally, temporary intermediate values which have no easy description of what they are and only exist because it’s clearer or performs better if you break parts of the function into smaller steps.
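A quick sketch of the difference (the survey numbers below are made up purely for illustration):

```python
colony_counts = [120, 340, 95]   # hypothetical survey data

# Opaque: single-letter names force the reader to reverse-engineer intent.
t = 0
for c in colony_counts:
    t += c

# Clear: the same computation, named for what it means.
totalAlbatrosses = 0
for colony_count in colony_counts:
    totalAlbatrosses += colony_count

# The iterator exception: a bare i is fine when the index itself is the point.
for i in range(len(colony_counts)):
    print(f"colony {i}: {colony_counts[i]} albatrosses")
```

Both loops compute the same total; only the second one tells the reader what the total *is*.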

The point of this isn’t to pick nits. If you don’t name your variables and functions well, reading the code is exceptionally difficult and error-prone. Even if it’s your own code, if you have to come back and modify the program later, you’ll have a difficult time of it unless things are clear. For anyone else, it’s a nightmare.  Further, if it still isn’t clear, we add comments to explain why we do things a certain way.

Programming wasn’t always done this way.  The first programmers were, of course, mathematicians (like Ada Lovelace), so the older the code, the more likely it is to be full of single-letter variable and function names or cryptic Hungarian Notation.  We’ve learned over the years that these practices have terrible real-world costs associated with them in time, dollars and proliferation of bugs. Of course, not every programmer follows good naming practice even now, and nobody is perfect, but it is, and should be, the goal.

Mathematics is different, however. Despite algebras and calculus being essentially the same thing as programming languages, mathematicians and scientists of all stripes still to this day persist in using single-letter variables, Greek letters, strange symbols that can have many different meanings in different contexts, single-letter function names (often just the one letter, ‘f’, which they sometimes fail to define for wider audiences) and, by and large, naming things after one or more discoverers rather than after what they are or mean.  Don’t even get me started on mathematicians who use functions, parenthetical grouping and implicit multiplication simultaneously.

This is especially troubling for people like myself who are not visual thinkers. Less than 35% of the population are primarily visual thinkers, and less than 65% think visually at all.  If you are a teacher, more than a third of your students will have issues with this unless they were filtered out by the educational system. I personally have no visual memory or visualization capability at all. My thinking is entirely auditory, spatial and linguistic. Icons in applications are exceptionally confusing to me.  I get around much faster with text menus. When I do learn an icon-based system, as soon as you rearrange the icons, I’m lost again, because I remember not which icon does what I want, but where (spatially) I have to click to get the desired result. Greek letters, single-letter variables (particularly if they aren’t even mnemonic) and a proliferation of other specialized symbols make it almost impossible for me to get through a mathematics course.

That doesn’t mean I can’t do math. I quite enjoy math and my intuition when it comes to geometry is excellent since that’s spatial as well as visual.  I’ve successfully implemented libraries for physics simulation, signal processing, compression, encryption, error correction, neural networks, genetic algorithms and statistical analysis among other things. All of this without formal training in those subjects and with the results rigorously validated. I scour the available information on the topic and painstakingly translate it into code.  It often requires trial and error as I test the outputs against known results to help clarify things that weren’t obvious in the information given.  Then, and only then, do I truly understand it.  It really doesn’t need to be as hard as it is, though.

I’m not saying that the notation of various algebras is bad just because it’s different.  I program in over 30 languages, and all of them have their strengths and weaknesses. Which one is best often depends on the particular task or context, much like different algebras.  The best ones help you avoid ambiguity and push you towards clarity.  Additionally, we now have developer tools that analyze your code for obvious bugs and style violations.  Unless there’s a significant performance or capability cost, though, it’s better if the languages are designed to avoid common problems in the first place.

I’m also not saying get rid of all symbols. I’m just saying that the 95(!) printable characters found on most keyboards are probably enough.  Have a look at this List of Mathematical Symbols and note not only how many of them there are but how many different meanings each has listed.  It’s excessive, and it doesn’t even include most of them.  Follow most of the links in the long ‘See Also’ section at the bottom of that page for far more symbols, abbreviations and other cryptic shorthand notation used across the various disciplines.  I seriously doubt that’s all of them, either.

Now that the information age is fully upon us, whole generations of kids have grown up with a computer in their pocket and almost never using a pencil (or chalkboard) outside of school.  I think it’s about time for a reform. Greeks and funky symbols are painfully difficult to use on computers and often require special software and markup to use. Instead of trying to get every application in the world to support something like MathML and complex input mechanisms, wouldn’t it make more sense to just fix what’s wrong with mathematical notation?  You’re welcome to reserve radical abbreviations and shorthand for intimate informal communication with colleagues and personal notes.  I’m certainly not nearly as strict about function and variable naming or comments when I code a little one off tool or send a quick example to a colleague.

You no longer have the easy excuse of laziness to cling to, and it’s certainly not economical.  The sheer volume of human knowledge is now such that no one person can know more than a tiny fraction of it. Science and engineering are increasingly cross-disciplinary, and only partly due to the necessary proliferation of specialty subjects.  It’s especially important that things be recorded for posterity in a way that isn’t impenetrable to everyone but experts in your particular niche subject.  This isn’t just about accommodating people who think differently or have learning disabilities.

I will say that I think the biggest thing preventing this is probably the classroom chalkboard/whiteboard.  With any luck someone will soon find an inexpensive way to do touch sensitive e-ink displays as large as a university chalkboard.  It will have handwriting recognition, gesture manipulation and the ability to distinguish multiple input devices like “colored” styluses and erasers. Maybe even add speech recognition so teachers don’t continue to spend half their valuable lecture time writing out what they just said.  The technology is mostly there. The displays need to get cheaper and speech recognition needs to improve but both are definitely happening even now.

If you are a visual-only thinker, please let me know if I’m wrong in thinking that words work almost as well as arbitrary symbols/icons for you.  I don’t mean word problems (though, the linguistic thinkers of the world would like that); just a few letters grouped together in a non-arbitrary way.  A grouping you probably already know since you’ve managed to get this far in my rant.

The current large corpus of information could be an issue, but we’ve done it before.  Go back and read papers by James Clerk Maxwell and you’ll notice quite a bit of difference in terminology, variable names and notation, to the point that it’s challenging to understand.  Hell, the famous equations named after him are completely different from what he wrote, because he used quaternion algebra, which is counter-intuitive and these days mostly used by game programmers.  Every time the world agrees on a new Greek letter, a new or simpler way of notating an existing method, or names something after the person who invented it, you are obsoleting earlier work.  There should always be some people who bother to learn the old way.  If not, we have documentation like Wikipedia’s to use as a reference.

What’s my suggestion?  Adopt a functional programming language based on lambda calculus like Haskell, ML, Caml, or F#.  Nearly everyone in applied mathematics already uses that sort of thing every day (MATLAB and other tools), so why not use code for teaching and published papers?  Additionally, promulgate a style guide that covers variable and function naming, like programmers do for pretty much every other language.
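As a hedged sketch of what that could look like (written in Python here for brevity rather than one of the functional languages named above, and with names of my own invention rather than any standard), here is the textbook one-liner Var(X) = E[(X - mean)^2] spelled out with descriptive names instead of mu and sigma squared:

```python
def mean(samples):
    """E[X]: the arithmetic mean, named in full instead of the Greek mu."""
    return sum(samples) / len(samples)

def variance(samples):
    """Var(X) = E[(X - mean)^2], named in full instead of sigma squared."""
    sample_mean = mean(samples)
    return sum((x - sample_mean) ** 2 for x in samples) / len(samples)

print(variance([2, 4, 4, 4, 5, 5, 7, 9]))   # prints: 4.0
```

Nothing about the definition changed; a reader who has never seen the Greek shorthand can still follow it line by line.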

The only other objection I can think of is that it’s arguably easier to see how you might be able to cancel things out or rearrange them with a shorthand notation.  I’m not convinced that’s a huge problem, though.

If you’ve got any suggestions for how to solve that or know of any other reasons not to reform mathematical notation, leave a comment.  Please refrain from citing tradition (they change, get over it) or technophobia (our generations will all be dead one day soon, get over it) but otherwise I’d appreciate feedback.

If you want to call me an ignorant fool, crappy writer, ugly butt-face, or accuse me of bestiality, fine, but please expand on your creative insult by providing detailed reasons of why you think so. Insults, sadly, don’t advance the argument toward a resolution on their own.
