Intelligence just ain’t what it used to be. For much of human history, knowledge signified smarts. Those who remembered facts, recounted history, and deployed big vocabulary words were regarded as the formidable intellects of their towns, villages, or duchies. Simply knowing things is what made us smart.
But a funny thing happened on the way to the 21st century: a mere mastery of facts stopped being sufficient evidence of real intelligence. Knowledge remains necessary, certainly, but far from sufficient. Why isn’t remembering facts enough to be considered very smart? Albert Einstein, that immortal avatar of genius, described the distinction when asked to recall a simple fact, in this case the speed of sound:
“[I do not] carry such information in my mind since it is readily available in books. …The value of a college education is not the learning of many facts but the training of the mind to think.”
True intelligence manifests in the way we think. In fact, we need many ways to think in order to understand and interact productively with the wide world. If we were to analyze the particular way a person thinks about a particular activity or domain of experience, we might find an automated routine of reasoning, representation, and calculation attached to a theoretical framework. In the vernacular of machine intelligence, this would be called an algorithm. However, the term mental model makes more sense. These models define how we think about the world. The better our models, the smarter we can be.
Education today works with facts, of course, but the best teachers also emphasize critical thinking, abstract reasoning, and informational synthesis to promote the building of mental models. Just as important, the best tests assess models, not the regurgitation of facts.
Calculators can perform phenomenal feats of calculation. The computers we carry around everywhere can connect us to the most obscure facts. But what good, really, is either of these tools in the hands of someone who doesn’t know how to think?