By Kevin D. Williamson
Wednesday, January 27, 2016
Marvin Minsky, a seminal figure in the world of artificial
intelligence, has died at the age of 88. Given his life’s work, it is ironic
that he was betrayed by that three-pound ball of meat we humans still use for
thinking, felled by a cerebral hemorrhage. His father, also an extraordinary
man (a pioneering ophthalmologist), died of the same affliction. History is
short when viewed properly: The elder Minsky was born in the 19th century;
Marvin Minsky, though born in 1927, was a man of the 21st century, waiting for
it to catch up with him.
Minsky was a mathematician by training (Ph.D. from
Princeton, 1954) but a philosopher by temperament, who spent his life working
on the questions of what it means to think, to know, and to understand. The
computational tools available to him in the early years of his career were
crude — the telephone in your pocket is many orders of magnitude more powerful
than the most powerful computer of that time — but the relevant limitation was
not merely technological. We had not yet developed our ideas about what a
computer could do. Minsky set about that project, and technology has been
struggling to keep up ever since.
He invented some useful things along the way — the first
head-mounted graphical display, the confocal microscope, tactile-feedback
gloves, and the first neural network “learning machine” — and helped found some
institutions, too, notable among them the Artificial Intelligence Laboratory at
MIT. He wrote important books in his field, generated at least one major
intellectual controversy, and influenced everything from robotics to the
structure and culture of the Internet.
His philosophical disposition is perhaps best captured
for the layman by a genre of humor that developed at MIT, the AI koan. My
favorite of these is one describing an interaction with another AI giant,
Gerald Sussman:
In the days when Sussman was a novice, Minsky once came to him as he sat
hacking at the PDP-6.
“What are you doing?” asked Minsky.
“I am training a randomly wired neural net to play Tic-tac-toe,” Sussman
replied.
“Why is the net wired randomly?” asked Minsky.
“I do not want it to have any preconceptions of how to play,” Sussman
said.
Minsky then shut his eyes.
“Why do you close your eyes?” Sussman asked his teacher.
“So that the room will be empty.”
At that moment, Sussman was enlightened.
He was, to be sure, a genius, and a rare one at that — he
also enjoyed improvising fugues at
the piano. But Minsky probably will not be remembered the way we remember Henry
Ford or Thomas Edison, even though his work probably will prove to be much more
significant to us — us humans — than either of those worthy men’s contributions
have been, impressive as they are. In simpler/cruder (your preference) times, a
man with a good general education might comprehend much of what was going on at
the higher levels of endeavor outside his own specialty: A doctor could keep
abreast of developments in science, music, theater, politics, etc., in a way
that is nearly impossible now. That might be pleasing in a way to Minsky, too,
who drew parallels between the way independent systems perform in the brain and
the way separate “resourceful agents” perform in society. If that sounds like a
familiar political idea, you won’t be surprised to find the works of F. A.
Hayek cited in Minsky’s own.
And therein lies an interesting question.
***
Marvin Minsky was many things, but not much of an
entrepreneur. He was on the MIT faculty from the 1950s onward, and though his
work contributed to countless commercial endeavors, he apparently never felt
all that much pull to move to Silicon Valley and become a titan of industry. A
professor’s life is a pretty good one, after all. Because there is no Minsky
Inc. with a market capitalization representing his intellectual capital, it is
impossible to put a dollar value on his contributions. And one gets the feeling
that he was playing a much larger game than that.
But it is worth considering how much the individual
genius does contribute to economic life. The Randian conception of the
individual, heroic capitalist is in bad odor just now, in this populist moment,
the People being one of the great enemies of the person. Not long ago, I asked
a student with whom I was working what she wanted to do after college, and she
answered: “Work for a nonprofit.” I asked her what sort of nonprofit, doing
what sort of work, and she said she hadn’t thought about it. “So anything,” I
asked, “just so long as it doesn’t turn a profit?”
Maybe just a little profit: Even Bernie Sanders, the
howling socialist who might, with a bit of luck, be the Democratic party’s
nominee for the presidency this time around, will countenance a little profit.
Everybody likes small businesses: small and cuddly. The U.S. government
maintains a Small Business Administration, and there are hundreds (perhaps
thousands) of tax breaks, credits, exemptions, regulatory exceptions, etc.,
made on behalf of small businesses. Mrs. Clinton promises “tax relief to families
and small businesses,” and practically every politician walking this good green
earth will tell you: “Small business drives the economy.”
The problem is, as it so often is in politics, that that
isn’t really true.
Most businesses start off as small businesses, but the
ones that actually drive economic growth and employment are in the main the ones
that do not stay that way. Deduct the effect of a relatively small number of
businesses — Apple, Google, Facebook — and the economy of the United States (to
say nothing of the economy of California!) looks very different, and not in a
good way. A few years ago, the economist Enrico Moretti set about trying to
calculate the real economic footprint of technology firms such as Apple, which
at the time employed around 12,000 people in Cupertino, Calif., but was
responsible for some 60,000 jobs by his estimate. Facebook, at the time an
employer of around 1,500 people, was a critical factor in some 200,000 jobs. A
great many people are employed in small businesses, but an overlooked factor in
that situation is that the major customer of small business is big business. As
John Tamny put it in his writeup of Moretti’s findings in Forbes, “much of the small-business job creation that has the
commentariat so giddy is a function of those big businesses that the same
commentators have a tendency to demonize.”
It is easy to drink too deeply from the cup when toasting
the heroic entrepreneur, but the facts are the facts: No Larry Page and Sergey
Brin, no Google; no Steve Jobs, no Apple as we know it; no Bill Gates, no
Microsoft; no Mark Zuckerberg, no Facebook. Again, this isn’t a brief for Ayn
Rand–style CEO worship, or for doctrinaire free-market thinking, either: No
Minsky and much of the work that supported these developments might not have
happened, and he spent his life in a comfy professorship. (Yes, at MIT, a private
university.) No big-money military-industrial-complex spending and no ARPANET,
etc. There is no need to rewrite history to satisfy our ideological
preferences.
Our geniuses, including our entrepreneurial geniuses —
our “vital few” — are embedded in a society, one that has both a public sector
and a private sector. It is all well and good to argue over the relative size
and prominence of those — I myself spend a fair amount of time doing so — but
it is probably more important to understand how each of them works, and which
aspects of each of them work in what ways. This ought to inform, among other
things, the current debate about immigration: It is the case that our current
system is too chaotic and that overall immigration levels are too high, which
is a source of social and institutional stress; it is also the case that there
were no Zuckerbergs, Brins, Jobses, or Minskys on the Mayflower. If you have an ideological disposition that leaves you
ill-equipped to deal with that reality, then you need a new set of beliefs —
resist the temptation to seek a new set of facts.
It also should inform our thinking about one of the least
understood and most destructively misconstrued issues in American public life
at the moment: finance.
***
Just as everybody loves to love small business, everybody
loves to hate high finance. It is not coincidental that American finance, like
the American ruling class, is bicoastal, with Wall Street on one side and Sand
Hill Road on the other. When we talk about our geniuses, those rare birds who
seem to usher companies and industries into existence ex nihilo, we often hear
such sentiments (“You didn’t build that!”) as: “Yes, but where would they have
been without the public universities?” And
that is a fair question: Our public universities, despite the sorry state of
their liberal-arts departments, are the envy of the world. We should cherish
them, and nurture them. (Which also means, on occasion, pruning them.) They
are, by and large, an institution that works. So does, to a remarkable extent,
government support of basic science and applied research, at both public and
private universities.
But considering Apple or Google, there’s another question
that should be answered:
Where on Earth would they be without venture capital?
Stanford’s Ilya A. Strebulaev and Will Gornall considered
that question a few months back in their straightforwardly titled article, “How
Much Does Venture Capital Drive the U.S. Economy?” Their findings are not
astounding — not if you’ve really been paying attention — but they are
dramatic:
Of the currently public U.S. companies we have founding dates for,
approximately 1,330 were founded between 1979 and 2013. Of those, 574, or 43
percent, are VC-backed. These companies comprise 57 percent of the market
capitalization and 38 percent of the employees of all such “new” public
companies. Moreover, their R&D expenditure constitutes an overwhelming 82
percent of the total R&D of new public companies. Given that the VC
industry has been in large part spurred by the relaxation of the Prudent Man
Rule, these results also provide an illustration of the impact that changes in
government regulation can have on the overall economy.
. . .
VC-backed companies include some of the most innovative companies in the
world. To get an idea of the importance of these companies, it is instructive
to look at research and development. In 2013, VC-backed U.S. public companies
spent $115 billion on research and development; up from essentially zero in
1979. These VC-backed companies now account for . . . 42 percent of the R&D
spending by U.S. public companies. That R&D spending produces value for not
just those companies, but also the entire world through positive spillovers.
If you are like me, the first thing you ask about this
is: What the heck happened in 1979?
Before I answer that, I want to offer up a parallel case
as an appetizer. (For your pre-consideration?)
In the darkest shadows of ancient telecom history, the federal government wrote
rules strictly regulating the use of certain low-power radio-communication
devices, out of fear that they would interfere with commercial and government
radio. But within a few years, the scientists and engineers began to suspect
that this wasn’t necessary. A report — a report written in 1938 — found that in
many circumstances “there would be no reason for suppressing their use.” And
all of the relevant gentlemen stroked their chins, and did — nothing. It wasn’t
until the Reagan administration and the unloosed deregulatory energies of the
1980s that these rules were relaxed, as a result of which we have ubiquitous
cellular communication and wi-fi. But that process started decades later than
it needed to. Government is what happens when the power to say no meets the
power to move slow.
What happened in 1979 had to do with the “Prudent Man
Rule.” Prior to the relaxation of this rule, the largest holders of investment
capital — pension funds — were effectively banned from investing in venture
capital. The same geniuses who sat on the emergence of wireless technology for
a generation or two sat on venture capital a lot more heavily. That
deregulation, Strebulaev and Gornall write, “led to a greater than tenfold
increase in the money entrusted to VC funds: VC funds raised $4.5 billion annually
from 1982 to 1987, up from just $0.1 billion ten years earlier.”
No Prudent Man Rule, no Google? No Apple? Alternative
history is a fiction genre, not a journalistic one, but the implications are
worth meditating upon. “Genius without education is like silver in the mine,”
according to the poster my fifth-grade teacher gave me. (Thanks, Mrs.
Shackles.) Genius without capital is like mining that silver with teaspoons (“Why
not use spoons?”), which would leave us all poorer. Strebulaev and Gornall:
“VC-backed companies play an increasingly important role in the U.S. economy.
Over the past 20 years, these companies have been a prime driver of both
economic growth and private-sector employment.”
Our current national mood is very grim: anti-immigration,
because we fear foreigners will steal our jobs; anti–Big Business, which wants
to help foreigners steal our jobs; anti-finance, because we fear Wall Street
will somehow figure out a way to make money stealing our jobs; anti-technology,
because we fear that robots will steal our jobs. And, too often, we are
anti-entrepreneur as well, resentfully suspecting that the men and women
who create the things that we do not want to live without are somehow getting
over on us.
How much prosperity has that fear suppressed? And for how
long will we continue to suppress the creative energies of our best and most
innovative people — not all of whom are CEOs and venture capitalists — in
deference to that fear?
“Speed is what distinguishes intelligence,” Minsky wrote.
“No bird discovers how to fly: Evolution used a trillion bird-years to
‘discover’ that — where merely hundreds of person-years sufficed.” And after
that, not quite 70 years from first flight at Kitty Hawk to landing on the moon.
We ought to have enough self-interest — if not sufficient
courage — to get out of our own way.