
180 Park Ave - Building 103
Florham Park, NJ
David S. Johnson is Head of the Algorithms and Optimization Department at AT&T Labs-Research in Florham Park, NJ, and has been a researcher at the Labs (in its many incarnations) since 1973, when he received his Ph.D. from MIT.
The author of over 100 technical papers, he is perhaps best known as the co-author of COMPUTERS AND INTRACTABILITY: A GUIDE TO THE THEORY OF NP-COMPLETENESS, for which he shared the Lanchester Prize of the Operations Research Society of America (1979). His research interests (in addition to the theory of NP-completeness) include approximation algorithms for combinatorial problems and the experimental analysis of algorithms, with special emphasis on bin packing, graph coloring, and the traveling salesman problem. As one of the founders of ACM's Kanellakis Prize, he is particularly interested in recognizing and encouraging the interaction between theory and practice in computer science.

A brief history of NP-completeness, 1954-2012
David Johnson
Documenta Mathematica, Extra Volume: "Optimization Stories", Martin Grötschel, Editor,
2012.
[PDF]
[BIB]
Documenta Mathematica Copyright
The definitive version was published in Documenta Mathematica, Extra Volume, 2012-08-19, www.math.uiuc.edu/documenta/
{The year 2012 marks the 40th anniversary of the publication of the
influential paper ``Reducibility among combinatorial problems'' by Richard
Karp, which first demonstrated the wide applicability of the concept now
known as NP-completeness, introduced independently the previous year by
Stephen Cook and Leonid Levin.
It also marks the 100th anniversary of the birth of Alan Turing,
whose invention of what is now known as the ``Turing machine'' underlay that
concept. In this chapter, I shall briefly sketch the history and pre-history of
NP-completeness (with pictures), and provide a brief personal survey of the
developments in the theory over the
last 40 years and their impact (or lack thereof) on the practice and
theory of optimization.
I assume the reader is familiar with the basic concepts
of NP-completeness, P, and NP, although I hope the story will still
be interesting to those with only a fuzzy recollection of the definitions.
}
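Not part of the paper, but for readers with only that fuzzy recollection: a minimal Python sketch of the idea the abstract takes as given, namely that a problem is in NP when a proposed solution (a ``certificate'') can be checked in polynomial time. The CNF encoding below (signed integer literals) is my own assumption, chosen purely for illustration.

# Illustrative sketch (not from the paper): NP membership means a proposed
# solution ("certificate") can be checked in polynomial time. Here a truth
# assignment is verified against a CNF formula, the problem at the heart of
# Cook's theorem. Clause format is an assumption: a list of integer
# literals, where k means variable k is true and -k means it is false.

def verifies(cnf, assignment):
    """Return True iff `assignment` (dict: var -> bool) satisfies every clause.

    Runs in time linear in the size of the formula -- the polynomial-time
    check that places SAT in NP.
    """
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in cnf
    )

# Example: (x1 or not x2) and (x2 or x3)
cnf = [[1, -2], [2, 3]]
print(verifies(cnf, {1: True, 2: False, 3: True}))    # True
print(verifies(cnf, {1: False, 2: False, 3: False}))  # False

Finding a satisfying assignment, of course, is the hard direction: no polynomial-time method for that is known, which is exactly what the P versus NP question asks.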

Network architecture for joint failure recovery and traffic engineering
Dahai Xu, Robert Doverspike, David Johnson, Martin Suchara, Jennifer Rexford
ACM SIGMETRICS 2011,
2011.
[PDF]
[BIB]
ACM Copyright
(c) ACM, 2011. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM SIGMETRICS, 2011-06-07.
{Today's networks typically handle traffic engineering (e.g., tuning the routing-protocol parameters to optimize the flow of traffic) and failure recovery (e.g., pre-installed backup paths) independently. In this paper, we propose a unified way to balance load efficiently under a wide range of failure scenarios. Our architecture supports flexible splitting of traffic over multiple precomputed paths, with efficient path-level failure detection and automatic load balancing over the remaining paths. We propose two candidate solutions that differ in how the routers rebalance the load after a failure, leading to a trade-off between router complexity and load-balancing performance. We present and solve the optimization problems that compute the configuration state for each router. Our experiments with traffic measurements and topology data (including shared risks in the underlying transport network) from a large ISP identify a "sweet spot" that achieves near-optimal load balancing under a variety of failure scenarios, with a relatively small amount of state in the routers. We believe that our solution for joint traffic engineering and failure recovery will appeal to Internet Service Providers as well as the operators of data-center networks.}
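A toy Python sketch of the simpler of the two rebalancing styles the abstract describes: each router holds precomputed split fractions over multiple paths and, when path-level failure detection removes a path, renormalizes the surviving fractions. The path names and weights are hypothetical; in the paper, the actual per-router configuration state comes from solving an optimization problem.

# Hypothetical illustration, not the paper's algorithm: proportional
# rebalancing of precomputed traffic splits over the paths that survive
# a failure.

def rebalance(weights, failed):
    """Redistribute load proportionally over surviving paths.

    weights: dict mapping path id -> precomputed split fraction (sums to 1).
    failed:  set of path ids reported down by path-level failure detection.
    """
    surviving = {p: w for p, w in weights.items() if p not in failed}
    total = sum(surviving.values())
    if total == 0:
        raise RuntimeError("no surviving path for this destination")
    return {p: w / total for p, w in surviving.items()}

# Example: three precomputed paths; path "p2" fails.
splits = {"p1": 0.5, "p2": 0.3, "p3": 0.2}
print(rebalance(splits, {"p2"}))  # {'p1': 0.714..., 'p3': 0.285...}

The trade-off the paper studies is between this kind of cheap local renormalization and smarter rebalancing that needs more state in the routers but gets closer to optimal load balancing.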

Voting power and target-based site prioritization
Steven Phillips, Aaron Archer, David Applegate, David Johnson, Robert Pressey, Desmond Torkornoo, Matthew Watts
Biological Conservation,
2010.
[PDF]
[BIB]
Elsevier Copyright
The definitive version was published in Biological Conservation (Elsevier), Volume 143, 2010-07-01
{Indices for site prioritization are widely used to address the
question: which sites are most important for conservation of
biodiversity? We investigate the theoretical underpinnings of
target-based prioritization, which measures sites' contribution to
achieving predetermined conservation targets. We show a strong
connection between site prioritization and the mathematical theory of
voting power. Current site prioritization indices are afflicted by
well-known paradoxes of voting power: a site can have zero priority
despite having non-zero habitat (the paradox of dummies) and discovery
of habitat in a new site can raise the priority of existing sites (the
paradox of new members). These paradoxes arise because of the razor's
edge nature of voting, and therefore we seek a new index that is not
strictly based on voting. By negating such paradoxes, we develop a set
of intuitive axioms that an index should obey. We introduce a simple
new index, "fraction-of-spare," that satisfies all the axioms. For
single-species site prioritization, the fraction-of-spare(s) of a site
s equals zero if s has no habitat for the species and one if s is
essential for meeting the target area for the species. In-between
those limits it is linearly interpolated, and equals area(s) / (total
area - target). In an evaluation involving multi-year scheduling of
site acquisitions for conservation of forest types in New South Wales
under specified clearing rates, fraction-of-spare outperforms 58
existing prioritization indices. We also compute the optimal schedule
of acquisitions for each of three evaluation measures (under the
assumed clearing rates) using integer programming, which indicates
that there is still potential for improvement in site prioritization
for conservation scheduling.
}
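The single-species formula from the abstract is simple enough to transcribe directly; the short Python sketch below does so, for illustration only (the paper itself handles multiple species and targets, which this toy version does not).

# Direct transcription of the single-species fraction-of-spare formula
# stated in the abstract. All areas are in the same (arbitrary) units.

def fraction_of_spare(site_area, total_area, target_area):
    """Priority of site s for one species.

    0 if s holds no habitat; 1 if s is essential, i.e. the target cannot
    be met without it; linearly interpolated in between:
    area(s) / (total area - target).
    """
    if site_area <= 0:
        return 0.0                    # no habitat: zero priority
    spare = total_area - target_area  # habitat that could be lost while still meeting the target
    if site_area >= spare:
        return 1.0                    # losing s would make the target unreachable
    return site_area / spare

# Example: 100 units of habitat in total, target of 60.
print(fraction_of_spare(10, 100, 60))  # 0.25  (10 / (100 - 60))
print(fraction_of_spare(45, 100, 60))  # 1.0   (essential: only 55 would remain)

Note how the two voting-power paradoxes are avoided: any site with habitat gets strictly positive priority, and the index varies smoothly with area rather than jumping at a razor's-edge threshold.
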
Knuth Prize, 2009.
Named 2009 Knuth Prize winner for innovations that impacted the foundations of computer science.
SIAM Fellow, 2009.
For contributions to algorithms and complexity theory.
INFORMS Computing Society Prize, 2007.
AT&T Fellow, 2005.
Application of Algorithms: Honored for fundamental contributions to the field of computer science and the application of algorithms to problems of importance to AT&T.
ACM Fellow, 1995.
For fundamental contributions to the theories of approximation algorithms and computational complexity, and for outstanding service to ACM.