Monday, April 30, 2007

Comments on the death of computing

This article starts off as a complaint, or a lament, about the state of edge CS, and probably serves as a warning, though the conclusion is not as hopeful or optimistic as it could be. Or maybe that's just a lack of imagination. To start:
There was excitement at making the computer do anything at all. Manipulating the code of information technology was the realm of experts: the complexities of hardware, the construction of compilers and the logic of programming were the basis of university degrees.
...
However, the basics of programming have not changed. The elements of computing are the same as fifty years ago, however we dress them up as object-oriented computing or service-oriented architecture. What has changed is the need to know low-level programming or any programming at all. Who needs C when there's Ruby on Rails?
Well, part of it is probably a lament by the author--presumably a scholar--on the loss of status and the general dilution in the quality of people in the field. And the other part is about how there's nowhere interesting left to explore in the field.

To address the first part, it's well known that engineers, programmers (and people in any other profession) like to work with great, smart people. Usually, when a leading field explodes, it attracts those great and smart people. However, the nature of technology is to make doing something cheaper, faster, or easier, and as a technology matures, the barriers to entry lower. As a result, people who couldn't make it into the field before now can, and the average quality of people in it gets diluted. People used to do all sorts of research on file access, but now any Joe programmer doesn't think about any of that and just calls 'open' to access files on disk. But that's the nature of technology, and it's as it should be.
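
Just to make that point concrete, here's a small sketch (in Python, my choice of language here, not something from the article) of how much that one high-level call hides:

    # High-level file access: one call hides the buffering, caching, and
    # disk-scheduling work that used to be active research topics.
    with open("notes.txt", "w") as f:
        f.write("hello, disk\n")

    # Roughly the same thing one level down, via raw OS calls, and still
    # nowhere near the file-system and driver code underneath.
    import os
    fd = os.open("notes.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    os.write(fd, b"hello, disk\n")
    os.close(fd)

Nobody has to think about the layers below either version anymore, which is exactly the dilution the author is describing.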

The environment within which computing operates in the 21st century is dramatically different to that of the 60s, 70s, 80s and even early 90s. Computers are an accepted part of the furniture of life, ubiquitous and commoditised.
And again, this is the expected effect of technology. Unlike other professions, in engineering one is able to build technology that gives people leverage over those who don't use it. This gives a scalable advantage in acceleration and productivity that you won't find in other professions. If you're a dentist, there is an upper limit to the number of patients you can see. To be more productive, you'd need to create a clinic--a dentist farm--to parallelize patient treatment, and you need other dentists to do that. If you're an engineer, the technology you build is a multiplier, and you don't even need other people to use it.

But at a certain point, the mass adoption of a technology makes it cheaper, and hence your leverage over other people isn't as great, and you begin to look for other technologies to make your life easier or to give you an edge over your competition. But these are all arguments about applications of CS; while important for attracting new talent, they don't address where the field still has left to go at the edge.

As for whether CS is really dead or not, I think there's still quite a bit of work to be done at the edges. Physicists in the late 1800s claimed there wasn't much interesting left to do until General Relativity blew up in their faces. Biology had its big paradigm shift with Darwin, but there's still a host of interesting unknown animals being discovered (like the giant squid), and I'm sure alien biology or a revival of Darwin's sexual selection would help open up another shift. Engineering suffered the same thing in the early 1900s, when people with only a background in electromechanical and steam-powered devices thought there wasn't much left to invent or explore, until the advent of computing, spurred on by the Second World War.

In terms of near-term computing problems, there's still a lot of work to be done in AI and all its offshoot children, such as data mining, information retrieval, and information extraction. We still can't build software systems reliably, so better programming constructs are constantly being explored. And since multi-core processors are starting to emerge, better concurrent programming constructs are being developed (or rather, taken up again... Seymour Cray was building vector processors a long while back).
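
To give a flavor of what I mean by better concurrent constructs, here's a minimal sketch (again in Python, my choice; the pool construct is just an illustration of the style, not anything specific from the article):

    # Instead of juggling threads and locks by hand, describe the work and
    # let a process pool spread it across however many cores are available.
    from multiprocessing import Pool

    def square(n):
        return n * n

    if __name__ == "__main__":
        with Pool() as pool:                       # one worker per core by default
            results = pool.map(square, range(10))  # parallel map, no explicit locking
        print(results)                             # [0, 1, 4, 9, ...]

The point is that the programmer states the shape of the computation and the runtime worries about the cores.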

But I'm guessing the author of the article is looking for something like a paradigm shift, something so grand that it would make the field prestigious again and attract bright minds again.

In the end, he is somewhat hopeful:
The new computing discipline will really be an inter-discipline, connecting with other spheres, working with diverse scientific and artistic departments to create new ideas. Its strength and value will be in its relationships.

There is a need for innovation, for creativity, for divergent thinking which pulls in ideas from many sources and connects them in different ways.
This, I don't disagree with. I think far-term computing can draw from other disciplines as well as be applied to them. In physics, there's currently work on quantum computers. In biology, computing contributes through bioinformatics and gene sequencing, and draws ideas back, like ant colony optimization algorithms and DNA computers. In the social sciences, it contributes through concurrent, decentralized simulations of social phenomena, and draws ideas back, like particle swarm optimization.
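
For a taste of that last one, here's a bare-bones particle swarm optimization sketch (in Python; the toy objective and the update constants are my own illustrative choices, not from any particular paper):

    # Each particle is nudged toward its own best-known position and the
    # swarm's best-known position, an idea borrowed from models of social
    # and flocking behaviour.
    import random

    def f(x, y):                       # toy objective: minimize x^2 + y^2
        return x * x + y * y

    particles = [{"pos": [random.uniform(-5, 5), random.uniform(-5, 5)],
                  "vel": [0.0, 0.0]} for _ in range(20)]
    for p in particles:
        p["best"] = list(p["pos"])
    global_best = list(min((p["best"] for p in particles), key=lambda b: f(*b)))

    for _ in range(100):
        for p in particles:
            for d in range(2):
                # inertia + pull toward personal best + pull toward global best
                p["vel"][d] = (0.7 * p["vel"][d]
                               + 1.5 * random.random() * (p["best"][d] - p["pos"][d])
                               + 1.5 * random.random() * (global_best[d] - p["pos"][d]))
                p["pos"][d] += p["vel"][d]
            if f(*p["pos"]) < f(*p["best"]):
                p["best"] = list(p["pos"])
                if f(*p["best"]) < f(*global_best):
                    global_best = list(p["best"])

    print(global_best)                 # should land near (0, 0)

A social metaphor driving a numerical optimizer is exactly the kind of borrowing the article is hoping for.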

One day, maybe it will be feasible to hack your own bacteria and program them just as you would a computer. And then a professor might lament that any 14-year-old kid can hack his own lifeform when it used to be the realm of professors. But rest assured, there will always be other horizons in the field to pursue.
