E. O. Wilson has an interesting brief essay (excerpted from a new book entitled *Letters to a Young Scientist*) on the role of mathematics and mathematical expertise in science. “Most of the stereotypical photographs of scientists studying rows of equations on a blackboard,” he notes, “are instructors explaining discoveries already made.”

Real progress comes in the field writing notes, at the office amid a litter of doodled paper, in the hallway struggling to explain something to a friend, or eating lunch alone. . . . Ideas in science emerge most readily when some part of the world is studied for its own sake. They follow from thorough, well-organized knowledge of all that is known or can be imagined of real entities and processes within that fragment of existence.

Wilson acknowledges that “When something new is encountered, the follow-up steps usually require mathematical and statistical methods to move the analysis forward.” At that point, he suggests finding a collaborator. But technical expertise in itself is of little avail: “The annals of theoretical biology are clogged with mathematical models that either can be safely ignored or, when tested, fail. Possibly no more than 10% have any lasting value. Only those linked solidly to knowledge of real living systems have much chance of being used.”

Paul Krugman concurs, but with a caveat:

[A]t least in the areas I work in, you do need some mathematical intuition, even if you don’t necessarily need to know a lot of formal theorems. . . . [T]he intuition is crucial, and not just for writing academic papers. If you’re going to talk about economics at all, you need some sense of how magnitudes play off against each other, which is the only way to have a chance of seeing how the pieces fit together. . . . [M]aybe the thing to say is that higher math isn’t usually essential; arithmetic is.

My own work has become rather less mathematical over the course of my career. When people ask why, I usually say that as I have come to learn more about politics, the “sophisticated” wrinkles have seemed to distract more than they added. Krugman’s comment seems to me to help illuminate why that might be the case. “Seeing how the pieces fit together” requires “some sense of how magnitudes play off against each other.” But, paradoxically, “higher math” can get in the way of “mathematical intuition” about magnitudes. Formal theory is often couched in purely qualitative terms: under such and such conditions, more *X* should produce more *Y*. And quantitative analysis—which ought to focus squarely on magnitudes—is *less* likely to do so the more it is justified and valued on technical rather than substantive grounds.

I recently spent some time doing an informal meta-analysis of studies of the impact of campaign advertising. At the heart of that literature is a pretty simple question: how much does one more ad contribute to the sponsoring candidate’s vote share? Alas, most of the studies I reviewed provided no intelligible answer to that question; and the correlation between methodological “sophistication” (logarithmic transformations, multinomial logits, fixed effects, distributed lag models) and intelligibility was decidedly negative. The authors of these studies rarely seemed to know or care what their results implied about the magnitude of the effect, as long as those results could be billed as “statistically significant.” Competing estimates differ (once their implications are unpacked) by orders of magnitude, with no indication from anyone that anything might be amiss. Of course, there is no reason why mathematically sophisticated analyses cannot be sensibly interpreted. Nevertheless, it seems clear that this is one corner of political science—and I believe there are many others—in which “higher math” is much less urgently needed than “arithmetic.”
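To make the unpacking concrete, here is a hypothetical sketch. The coefficients and baselines below are invented for illustration, not taken from any actual study; the point is only that two estimates which both look fine as regression output can imply wildly different magnitudes once translated into “points per 1,000 ads.”

```python
# Hypothetical numbers, not from any actual study: two "statistically
# significant" estimates whose implied magnitudes diverge sharply
# once they are unpacked at a common baseline.

# Study A: linear model -- vote-share points per 1,000 additional ads
beta_linear = 0.02
# Study B: log-log model -- elasticity of vote share with respect to ads
beta_elasticity = 0.15

# Unpack both at a common baseline: 10,000 ads aired, 50-point vote share.
ads, share = 10_000, 50.0
effect_a = beta_linear                           # +1,000 ads -> 0.02 points
extra_frac = 1_000 / ads                         # +1,000 ads is a 10% increase
effect_b = share * beta_elasticity * extra_frac  # approx. points implied

print(f"Study A implies +{effect_a:.2f} points per 1,000 ads")
print(f"Study B implies +{effect_b:.2f} points per 1,000 ads")
print(f"They differ by a factor of {effect_b / effect_a:.0f}")
```

Nothing here is beyond arithmetic, yet it is exactly this translation step that many of the studies skipped.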

I think logic plays a greater role in analysis than math. I believe it is possible to teach an introduction to undergraduate statistics relying almost exclusively on logic (and maybe some arithmetic), and that students would benefit from such an introduction, particularly since consuming statistical information (like the margins of error reported in newspapers) requires an understanding of the logical intuition rather than the formal mathematical proofs.
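For instance, the margin of error a newspaper reports alongside a poll is essentially arithmetic. Here is a sketch using the standard worst-case assumption (a 50/50 split) and made-up sample sizes; the familiar “roughly 1 over the square root of n” rule of thumb falls out directly:

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case 95% margin of error for a proportion (p = 0.5),
    the figure newspapers usually report."""
    return z * math.sqrt(0.25 / n)

# Compare the exact figure with the 1/sqrt(n) rule of thumb.
for n in (400, 1000, 2500):
    exact = margin_of_error(n) * 100
    rough = 100 / math.sqrt(n)
    print(f"n={n:>5}: about {exact:.1f} points (rule of thumb: {rough:.1f})")
```

Understanding why a poll of 1,000 people carries a margin of about 3 points needs no calculus, just this.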

At least when it comes to empirics, I must disagree that higher math is unimportant. Deeper knowledge of statistical theory (and mathematics) is, I think, a balm against the obsession with statistical significance, the lack of predictive model checking, and so on, though it is clearly not a panacea.

MRP needs higher math.

How so? (Honest question from someone interested in the method, yet without anything more than intuition about its mechanics)

What is MRP?

To run multilevel regression and poststratification you don’t need math, any more than you need math to use an iPad. But to make new developments in this area, a bit of math helps, both in the modeling and in the computation. Our recent paper does not use heavy math, but a strong math background will be helpful in understanding it.

I use your book on multilevel modeling and have training in political methodology. Your paper seemed a bit more difficult to follow because of the multidimensional strata and their associated formal expression. That said, I just skimmed it and was thinking of ways to apply it in other countries. I was asking in case I needed to bone up on any math in particular.
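For readers wondering what the poststratification step (the “P” in MRP) actually involves: it is just a population-weighted average of cell estimates. The sketch below uses made-up cell estimates and population counts; in a real application the estimates would come from a fitted multilevel regression, not be typed in by hand.

```python
# Minimal sketch of poststratification. Each cell pairs a model-based
# estimate of support with that cell's population count. All numbers
# here are invented for illustration.
cells = {
    ("young", "college"):    (0.62, 1_200),
    ("young", "no_college"): (0.48, 2_800),
    ("old",   "college"):    (0.55,   900),
    ("old",   "no_college"): (0.40, 3_100),
}

total = sum(n for _, n in cells.values())
estimate = sum(p * n for p, n in cells.values()) / total
print(f"poststratified estimate: {estimate:.3f}")
```

The math lives in producing good cell estimates; the weighting itself is arithmetic.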

Excellent example. The basic idea of MRP is straightforward and very appealing. Estimation is computationally intensive but, thanks to people like Andrew, easy to automate. And the results are often of very uncertain value because analysts spend too little time exploring and justifying the substantive assumptions underlying the model. I have never been tempted to employ MRP myself because, in circumstances where I think I know enough to be confident that it will work, I think I know enough to do without it. But that’s not to say that it can’t be employed with intelligence and substantive insight, or that it isn’t a valuable tool when it _is_ employed with intelligence and substantive insight.

This is spot-on. For example, one real-world implementation of MRP is value-added models of teacher quality. At least in New York City’s implementation, school quality and teacher quality were assumed by the model to be uncorrelated: a substantive assumption that requires some empirical justification rather than mathematical sophistication.

Zach makes a valid point; however, it is important to acknowledge the incentives that reward increased sophistication. Publish, publish, publish means having to continually contribute to a body of knowledge despite facing increased time pressures. Tweaking a model or using a different estimator or sampling distribution is inherently less time-intensive than trying to ponder and re-conceptualize a problem. Moreover, as has been pointed out elsewhere, null results seldom merit publication (despite the fact that they could be substantively interesting).

Given that this issue is nothing new, it is important to realize that mathematics is a language capable of expressing things both meaningful and nonsensical. As a discipline, coming to terms with mathematical language means engaging with it so as to better distinguish the meaningful from the nonsensical, not demanding that scholars abandon or curtail its usage.

I very much agree. Publication bias is an important problem. I do think increased transparency in the form of public revision control archives and/or study registration (at the very least public data) goes a long way towards resolving the issue though (were they to be widely adopted).

I find it rather hard to empathize with this sort of assertion, at least from my personal standpoint. So what follows is a statement of my own experience.

I agree, of course, that science is only possible with a mixture of “intuition” and “technique.” However, my experience is that the “intuition” side not only can hardly be taught (you can only acquire it by doing science, something for which I think you need to actually know math), but is almost never the binding constraint. Let me state some advantages of higher math:

1. Stuff is teachable and communicable in an easier way; it is therefore more open to scrutiny, as long as you have the right skills.

2. In trying to model ideas you force yourself to state them more precisely, and therefore rather often discover that what seemed intuitive is actually nonsense; or you may discover unexpected causal channels.

3. I have myself, doing abstract math, become aware of things I did not even suspect. For example, learning linear algebra and multivariable analysis radically changes your spatial vision; most people, for example, do not even think that there may be non-Euclidean spaces. Or take game theory: once you start thinking about things in terms of equilibria and strategies, you acquire several (in my view healthy) mental habits. This is something that, once you have learned it, becomes embedded in your informal reasoning, and maybe you don’t actually need to do the math every time; but it is really hard to learn it informally.

By the way, I think this is part of what lies at the heart of the ideas of people who minimize the role of math. You may already have become so acquainted with abstraction, models, and so on that you have forgotten how you actually learned them in the first place.

Look what Mr. P has to say on the topic (p. 34, http://clacs.as.nyu.edu/docs/IO/2800/munck.pdf).

I totally agree with this. Studying math is the way to learn new intuitions, and intuitions are necessary to really grasp math (as long as you can look past them when you need to, just like intuitions in any other field). That E. O. Wilson considers intuition and mathematics to be two separate things probably explains why he isn’t good at math.

It might be that biology and the social sciences don’t require “higher math.” But “higher math” is just a synonym for “complex intuitions,” so to say that biology doesn’t require higher math is to say that biology requires nothing but observation of facts and the building of simplistic arithmetic relations between them.

My suspicion, though, is that if higher math is of limited utility to the “messier” sciences, it’s because it needs to be *even higher*: more complicated, messier topics will require kinds of math that haven’t been invented yet.

I agree. Not only that, but when you understand higher math, basic stuff is easier to understand. This is clearest, I think, with linear algebra. I know nothing of quantum physics or quantum computers, but once I watched a few videos about the subject, linear algebra helped me a lot in grasping what was going on.

Take, for instance, Markov chains. I was having a hard time understanding some of the material via linear algebra; then, approaching it through probability, things became easier.
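As a tiny illustration of the two views (with made-up transition probabilities), here is the stationary distribution of a two-state chain found both ways: the linear-algebra route iterates the distribution against the transition matrix, while the probability route solves the balance equation directly.

```python
# P[i][j] = probability of moving from state i to state j
# (numbers invented for illustration)
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Linear-algebra view: power iteration, pi <- pi P, until convergence.
pi = [0.5, 0.5]
for _ in range(1000):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

# Probability view: flow balance between the two states gives
# pi_0 / pi_1 = P[1][0] / P[0][1].
ratio = P[1][0] / P[0][1]
pi_prob = [ratio / (1 + ratio), 1 / (1 + ratio)]

print(pi, pi_prob)  # the two routes agree
```

Same answer, two intuitions; which one feels natural depends on the background you bring.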

However, I do agree with what Andrew said on his main blog about economists focusing too much on being rigorous and forgetting about being right. See: http://andrewgelman.com/2013/01/10/theyd-rather-be-rigorous-than-right/

Deirdre McCloskey has been complaining for a long time about the focus on “statistical significance” as opposed to actual significance/magnitude, or “oomph.”
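A toy calculation makes the oomph-versus-significance distinction concrete (all numbers invented): a fixed, practically negligible effect becomes “statistically significant” once the sample is large enough, while its magnitude never changes.

```python
import math

effect = 0.01   # a tiny effect: one hundredth of a standard deviation
sd = 1.0

results = {}
for n in (100, 10_000, 1_000_000):
    se = sd * math.sqrt(2 / n)   # s.e. of a two-group mean difference
    results[n] = effect / se     # z statistic
    verdict = "significant" if results[n] > 1.96 else "not significant"
    print(f"n={n:>9,}  z={results[n]:5.2f}  {verdict}")
```

The star next to the coefficient says something about n, not about oomph.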

6? Pah! Everyone knows the answer is 42. These guys dropped a constant somewhere.

@mike3350, you wrote

“I think logic plays a greater role in analysis than math. I believe it possible to teach an introduction to undergraduate statistics relying almost exclusively on logic (and maybe some arithmetic) and that students would benefit from such an introduction, particularly since consuming statistical information (like margins of errors reported in newspapers) requires an understanding of the logical intuition rather than the formal mathematical proofs.”

This idea has already been put into practice. Some community colleges have started to offer statistics as an alternative to pre-college-level algebra courses. The idea is that English, arts, and other humanities students benefit more from this than from traditional algebra.

You can read more here.

http://www.carnegiefoundation.org/statway

I agree with Krugman’s comment: “you need some sense of how magnitudes play off against each other, which is the only way to have a chance of seeing how the pieces fit together.”

Analysis tools are needed not only to provide a sense of magnitude and a sense of how the “pieces fit together,” but to provide a better understanding of how the moving parts interact with each other: how a system’s structure generates patterns of behavior and eventually produces specific events. Of course, I’m referring to System Dynamics, the modeling technology that evolved from control theory to address nonlinear systems that include feedback and delay (e.g., Jay Forrester).
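As a toy illustration of the feedback-plus-delay structure (all parameters invented), a single stock adjusted toward a target through a delayed correction already overshoots and oscillates before settling, which is the kind of behavior these models are built to expose:

```python
# A minimal stock-and-flow sketch in the system-dynamics spirit.
target = 100.0
stock = 50.0
gain = 0.2          # how aggressively the gap is corrected
delay = 3           # steps before a correction takes effect
pipeline = [0.0] * delay
history = []

for t in range(60):
    order = gain * (target - stock)   # feedback: order proportional to the gap
    pipeline.append(order)
    stock += pipeline.pop(0)          # corrections arrive `delay` steps late
    history.append(stock)

print(f"peak: {max(history):.1f}, final: {history[-1]:.1f}")
```

Remove the delay and the approach is smooth; with it, the system keeps ordering against a gap that has already been closed by corrections still in the pipeline.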

I think the key is the ability to embrace a discipline that forces the explicit articulation and modeling of our mental models about cause, effect, and relationships, and to test the resulting models against our experience.

Increasingly, however, for many of the important issues, the system components have broader interdependence than the various functional chimneys dealing with facets of the problem, while the response horizons of the systems are longer than our attention spans. Absent a disciplined modeling technology, we cannot learn from experience, let alone agree on what we’ve learned.