One of the recurring ideas about progress and research is the notion of the technological singularity. Anirul had a post on his blog about this. The idea behind the term is that technological progress keeps accelerating, and that at some point research and discoveries are done by machines and outpace human understanding.
This theory combines several elements that make me quite suspicious:
- Extrapolation of an exponential curve into the future
- Artificial Intelligence
- A historical discontinuity that only a fraction of the population is aware of
One curve that was exponential for a long time was the growth of the population on Earth. In the 70s, by extrapolating this curve, one could calculate when the expanding wall of flesh would pass the speed of light. Artificial Intelligence is also a blurry thing from the 70s, with crazy promises that were never fulfilled and a definition of the problem that has changed too many times to count. I remember my professor of Artificial Intelligence claiming the whole branch should be renamed Advanced Informatics just to get rid of that legacy. As for the last point, history is full of announced apocalypses of various shapes and colors. The guys announcing such changes are often not fundamentally wrong, but they usually misjudge the other changes that surround their breaking point.
Still, the rate of technological progress has been increasing for a long time. What could stop it? Administrative complexity. I am not arguing that social or governmental decisions will slow down the rate of progress; while some governments and societies are trying to control research, whether they will succeed or not is another discussion altogether. What I think will slow down the rate of progress is the problem of coordination between researchers and research subjects.
The cool thing about Renaissance researchers was that they were everything at once: chemists, mathematicians, physicists, astronomers, strategists and philosophers. They could often hold a good overview of cutting-edge science in their heads and knew all the eminent specialists of their time by name.
As the rate of technological progress has increased, researchers have become more and more specialized, and at the same time they spend more and more time looking up information. The lone mad scientist has been replaced by larger and larger teams. The infrastructure needed for experiments has also grown tremendously. The speed of light can be measured in one room with a few mirrors; contrast this with the LHC experiments, which involve tens of thousands of people, cover the area of a small city and cost billions of dollars. Meanwhile, research is ever more fragmented, and communication between its branches is more and more of a problem.
The core problem is one of scaling. Research is an activity that does not scale well beyond a certain point. If ten scientists discover something in one year, a hundred and twenty will not do it in one month. Instead, in that month they might just have finished getting introduced and organizing the next symposium. As the number of researchers grows, so does the need for librarians, assistants, secretaries, managers, conferences, surveys, summaries and symposiums. In short: administration.
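To make the intuition concrete, here is a toy model, my own illustration rather than anything from a real study: the total amount of work is fixed, but every pair of researchers adds a small coordination cost. The function name and constants below are made up for the sketch.

```python
def months_to_finish(team_size, work_months=120, overhead_per_pair=0.002):
    """Toy model: time = parallelized work + pairwise coordination cost.

    work_months is the total effort (10 scientists x 12 months);
    every pair of team members costs a little coordination time.
    """
    pairs = team_size * (team_size - 1) / 2
    return work_months / team_size + overhead_per_pair * pairs

for n in (10, 40, 120, 360):
    print(n, round(months_to_finish(n), 1))
```

Under these (arbitrary) numbers, a team of forty is faster than a team of ten, but a team of a hundred and twenty is already slower again: the pairwise coordination term grows quadratically and eventually dominates.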
As they grow bigger, organizations need a larger and larger administrative body to support them. A nice metaphor I heard is that organizations are like spheres: the creative interaction happens at the surface, the interface to the outside world, but as they grow they need more and more internal support. While the surface grows quadratically, the volume grows cubically. Similarly, as an object approaches the speed of light, added energy contributes less and less to its speed and more and more to its mass. I feel research follows the same trend: beyond a certain scale, energy is increasingly converted into administrative dead weight.
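The sphere metaphor can be checked with a few lines of arithmetic: for a sphere of radius r, the surface is 4πr² and the volume 4/3·πr³, so the surface-to-volume ratio is 3/r and shrinks as the sphere grows. The helper names here are my own.

```python
import math

def surface(r):
    # Surface area of a sphere: 4 * pi * r^2
    return 4 * math.pi * r**2

def volume(r):
    # Volume of a sphere: 4/3 * pi * r^3
    return (4 / 3) * math.pi * r**3

for r in (1, 10, 100):
    # Ratio of productive "surface" to supporting "volume": 3 / r
    print(r, round(surface(r) / volume(r), 3))  # prints 3.0, 0.3, 0.03
```

Grow the organization a hundredfold and, in this picture, only a hundredth as much of it, proportionally, still touches the outside world.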
Of course, those are human limits. The central idea of the singularity is that technological progress will no longer be carried out by humans, but by artificial intelligences. Still, computer systems suffer from scalability problems of their own. While Moore's "Law" states that the number of transistors doubles every eighteen months, the usable computing power does not double: more and more of it is wasted on synchronization, communication and fault tolerance. So even those systems will tend towards an asymptotic limit. A sigmoid function looks a lot like an exponential function in the beginning…
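That last point can be made numerical. The sketch below, my own illustration with arbitrary constants, compares e^t with a logistic (sigmoid) curve that starts at the same value with the same initial growth rate: the two are nearly indistinguishable at first, then the logistic flattens towards its ceiling.

```python
import math

def exponential(t):
    return math.exp(t)

def logistic(t, carrying_capacity=1000.0):
    # Logistic growth starting at 1 with the same initial rate as exp(t);
    # the carrying capacity of 1000 is an arbitrary ceiling for illustration.
    K = carrying_capacity
    return K / (1 + (K - 1) * math.exp(-t))

for t in (0, 2, 4, 8, 12):
    # Early on the two columns track each other; later the logistic
    # saturates near 1000 while the exponential keeps exploding.
    print(t, round(exponential(t), 1), round(logistic(t), 1))
```

An observer sitting on the early part of the curve has no way to tell, from the data alone, which of the two functions they are riding.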