Mathematics I use

Fan Cross:

Recently, on an online forum, a question was posed: How much, and what kind, of mathematics does a working programmer actually use? Here is my answer.

First, I and almost all programmers use a lot of boolean logic, from evaluating boolean expressions for conditionals and loop exit criteria, to rearranging the terms of such expressions according to, e.g., De Morgan's laws. Much of our work borders on the first-order predicate calculus and other predicate logics in the guise of analysis of preconditions, invariants, etc. (though it may not always be presented as such).
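As a small sketch of the kind of rearrangement meant here (the function names are mine, purely illustrative), De Morgan's laws let you rewrite a negated compound condition into an equivalent form that is sometimes clearer or cheaper to evaluate:

```python
# De Morgan's law: not (A or B)  ==  (not A) and (not B).
# Illustrative example: a loop-continuation test for a reader.

def keep_reading(at_eof: bool, error_seen: bool) -> bool:
    # Original form: continue while we have NOT hit (EOF or an error).
    return not (at_eof or error_seen)

def keep_reading_demorgan(at_eof: bool, error_seen: bool) -> bool:
    # Equivalent form after applying De Morgan: not-EOF AND not-error.
    return (not at_eof) and (not error_seen)

# The two forms agree on every combination of inputs.
for eof in (False, True):
    for err in (False, True):
        assert keep_reading(eof, err) == keep_reading_demorgan(eof, err)
```

Either form is correct; which one you keep is a readability judgment, and being fluent in the laws lets you move between them safely.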

Next, I do a lot of performance analysis. The kinds of data sets we process these days are massive. In 2010, Eric Schmidt made a comment at the Techonomy conference that we (humans) produce as much data in two days as ever existed worldwide in 2003. I want to be able to process large chunks of that and infer things from it, and understanding the space and time complexity of the operations we apply to the data is critical to determining whether the computations are even feasible. Further, unlike in much traditional big-O or big-Θ analysis, the constant factors matter very much at that kind of scale: a factor of 2 will not change the asymptotic time complexity of an algorithm, but if it means the difference between running it over 10,000 or 20,000 processors, now we are talking about real resources. The calculations tend to be much more intricate as a result. Examples: can I take some linear computation and reduce it in strength to a logarithmic computation? Can I reduce memory usage by a factor of three? Etc.
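A concrete sketch of the linear-to-logarithmic reduction mentioned above (the example and names are mine, not a specific case from the author): membership testing in a collection. A straight scan is O(n), but if the data is kept sorted, binary search does the same job in O(log n):

```python
import bisect

def contains_linear(sorted_values, x):
    # O(n): examine every element until a match (or the end).
    for v in sorted_values:
        if v == x:
            return True
    return False

def contains_logarithmic(sorted_values, x):
    # O(log n): binary search, exploiting the sort order.
    i = bisect.bisect_left(sorted_values, x)
    return i < len(sorted_values) and sorted_values[i] == x

data = list(range(0, 1_000_000, 2))  # even numbers, already sorted

# Both agree, but the second touches ~20 elements instead of ~500,000.
assert contains_linear(data, 123456) is True
assert contains_logarithmic(data, 123456) is True
assert contains_linear(data, 123457) is False
assert contains_logarithmic(data, 123457) is False
```

At the scales discussed above, that difference compounds: applied across billions of lookups, it is the difference between a feasible job and an infeasible one, before constant factors are even considered.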