
In recent years a lot of people have asked me just what the phrase "analytic combinatorics" actually means. In this lecture I'm going to describe what the field is and tell the story of how it came into being. I think this context is important for anyone who's interested in learning anything about the field. This lecture is dedicated to the memory of my friend and colleague Philippe Flajolet, who really was the driving force behind the development of the field and who died suddenly in 2011.

I'll start off with a brief history, to give some context for how we got here. I first met Philippe in 1977.

The first research paper that I wrote after getting my PhD was on an algorithm called odd-even merging. In those days you would get your paper typed by a secretary, then go to a conference and present it. I was very proud to have developed a formula, involving the gamma function and the zeta function, that gives a precise description of the performance of this particular algorithm. Just a few months later there was a conference in Providence, Rhode Island, where I was at the time. In those days, the first thing you would do at a conference was look at the table of contents of the proceedings to see what was there. And there was another typed paper, and I was amazed to see a formula very much like mine, involving the zeta function and the gamma function, even though it was studying a completely different problem. Just as I was realizing that, Philippe came up to me and said, "I believe that we have a formula in common." Both of us were very surprised to see the similarities among these formulas. It might be said that we spent the rest of our careers trying to understand why.

Now it's worth thinking about what the world was like at the time we started our research careers; we were both then in just the earliest part of them, and the world was changing in very important ways all around us. Without going into too much detail, it really was the case that when we started school, people wore coats and ties, coats and ties to dinner, and so forth. By the time we got our PhDs, there was Woodstock and hippies and so forth. But with respect to the technology, there were huge changes. When we started school, computers were big, expensive, and rare; there was a physical device for every switch, for every bit. Not that much later, when we started research and teaching, we had integrated circuits, and computers were becoming ubiquitous, fast, and cheap. Another big change was access to computers. For most of the time that we were in college and in graduate school, to develop a program you had to put each line of the program on a punched card and give a box of punched cards to a computer operator, and you would get to run your program once a day.

Not that much later, when we started research and teaching, we had time-shared terminals; we were always connected, and we have been connected ever since. And as I mentioned, when we started school my thesis was typed by a secretary. You would present the result, and six months later you would see what it looked like and submit it; it might be a year between the time you got the result and the time somebody saw it.

But not that much later we had word processing and mathematical typesetting, and we could have much quicker and much broader communication of our research results. Another important thing is that when we were in school and graduate school, the curriculum was about math. Everybody learned lots of math; I learned PDEs and abstract algebra and probability and topology. That's what people with an interest in working in technical fields did.

But by the time we started research and teaching, there was computer science, and people had to learn about compilers and algorithms and data structures and graphics and operating systems and programming languages and numerical analysis, and all kinds of fields related to computer science. So these are huge differences in a relatively short amount of time, and in thinking about it when preparing this talk, I really came to understand and believe that this was a profound change in the way the world worked, maybe even more profound than the evolution of personal computing, or even the internet. The world was a vastly different place when we started to get to work. So that's the context where this story starts.

Now, analysis of algorithms: that's the field of study that both Philippe and I were engaged in.

It addresses natural questions, and it actually started with Babbage. This is a quote from Babbage, who is widely credited with designing the first computational engine, a mechanical device that could do arithmetic computations. What he said, even before building the thing, was that as soon as an analytical engine exists, it will necessarily guide the future course of the science, because you'd be able to do computations. But he also said: whenever any result is sought, the question will arise, by what course of calculation can these results be arrived at by the machine in the shortest time? That was in 1864, and you can see why it was important to Babbage: this thing actually had a crank, and the only way it could compute anything was by somebody turning the crank. Obviously, you want to minimize the number of times that you need to turn the crank. And computers were expensive and slow and used energy and so forth, so minimizing the cost of computation was always very important. Even Turing, whom many consider the founder of theoretical computer science, could see the importance of these kinds of practical questions: we want to have a measure of the amount of work involved in the computing process, even though it might be a crude one, so we count up the number of times that elementary operations are applied in the whole process, in order to figure out how much work a computation is going to take, and to help in designing efficient computations.

But the field of analysis of algorithms was really initiated by Knuth in the 1960s. What Knuth told the world, and there was some debate about it at the time, was that classical mathematics already has the tools that we need for understanding the performance of algorithms: things like recurrence relations, generating functions, and asymptotic analysis.
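As a minimal sketch of how these tools fit together (using mergesort as a standard textbook example, not one discussed in this lecture): the worst-case number of compares $C(n)$ used to sort $n$ keys satisfies a recurrence,

$$C(n) = C(\lceil n/2 \rceil) + C(\lfloor n/2 \rfloor) + n - 1, \qquad C(1) = 0,$$

since merging two sorted halves takes at most $n - 1$ compares, and asymptotic analysis then summarizes the solution as $C(n) \sim n \lg n$.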

That has the benefit of giving a scientific foundation for the analysis of algorithms. Knuth wrote a series of books, four so far; the first one came out in the late 1960s and two more came out in the early 1970s. They really set out this scientific foundation: we can use classical mathematics to understand the performance of algorithms. And with those mathematical models, we can go ahead and accurately predict performance and compare the efficiency of algorithms. That's what we found exciting: we could use classical mathematics to understand the cost of a computation and then test out those results. We could formulate a hypothesis about how long it would take to do something, and then validate that hypothesis by actually implementing and running the programs and checking them against the math. There were many, many practical applications where people needed these kinds of accurate mathematical models and predictions, and Knuth's books were very densely filled with information that helped us advance this science.
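That predict-then-validate loop can be sketched in a few lines. Here is one possible version (my illustration, not code from the lecture): take the hypothesis that mergesort uses at most n⌈lg n⌉ compares to sort n keys, instrument an implementation to count compares, and check actual runs against the bound.

```python
import math
import random

def merge_sort(a, count):
    """Sort list a, accumulating the number of key compares in count[0]."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], count)
    right = merge_sort(a[mid:], count)
    # Merge the two sorted halves, counting each key compare.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        count[0] += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Hypothesis: at most n * ceil(lg n) compares. Validate on random inputs.
for n in [10, 100, 1000]:
    count = [0]
    data = random.sample(range(10 * n), n)
    assert merge_sort(data, count) == sorted(data)
    bound = n * math.ceil(math.log2(n))
    print(n, count[0], bound, count[0] <= bound)
```

On every run the observed count stays below the bound, which is exactly the kind of check against the math described above (the exact worst case is in fact n⌈lg n⌉ − 2^⌈lg n⌉ + 1).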

So that's a brief history of where we got started with analysis of algorithms.

Â