I was walking around Eureka today, taking some photos with my new camera, and as usual I got to thinking--this time about digital cameras. I've watched them explode in popularity over just the last six years or so, concurrent with their meteoric rise in capability, to the extent that there is now hardly anything film can do that digital can't do better. The digital method is, though I admit some prejudice here, a much more logical (if not quite as straightforward) way of recording visual input, and it crossed my mind that if digital cameras had come first, the analog method, when it surfaced, would have seemed quite unconventional and marvelously clever. Rather than interpreting the input and distilling it into data, the analog approach transfers the input directly onto another physical medium, one that is static and long-lasting. It is, when it comes down to it, just another method of storage, but one particularly suited to its domain.
This line of thought brought me to the wholly unoriginal insight that digital processing is designed to be domain-general. It still exists in the physical world, of course, but the physical has been distilled out of it to the greatest extent possible. The majority of processing is done by clever arrangements of a single physical object--the transistor--with domain-specific hardware needed only for the interfaces to the outside world (the buttons, the displays, the light sensors, etc.). "Processing" has been purified and commoditized, and in the process the concept of "economy of scale" has been taken to a level never before seen in human history. As with most commodities, there is a healthy amount of competition in processor development, and the product has been further refined and expanded upon... but it's still the same basic product.
Processing, as a concept, need not be limited to how we think of it today. Film cameras, after all, still process and store information in their own way, and although that way is highly domain-specific, it is also highly efficient. By thinking of processing as a theoretical task, rather than a uniform commodity, I realized that the super-fast single-stream CPU that dominates computing today is not the only option--and that its monopoly may leave other potential approaches withering in the shadows. Multi-core processors are a step in a new direction, but they are still merely an adaptation of the existing paradigm, weighted down by the heavy burden of their pedigree. The tortured adaptation of popular programming languages in an attempt to ease parallelism is evidence that a fresh start is needed.
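To make that last complaint a bit more concrete, here's a rough sketch in Python (chosen arbitrarily; the task and numbers are made up for illustration) of the ceremony it takes to spread even a trivially parallel computation across cores:

    # A deliberately trivial, embarrassingly parallel task: square a list of numbers.
    # Even here, parallelism has to be bolted on through an explicit process pool
    # rather than falling out of the language itself.
    from concurrent.futures import ProcessPoolExecutor

    def square(x):
        return x * x

    if __name__ == "__main__":
        data = list(range(100_000))

        # The serial version: one line, but confined to a single core.
        serial = [square(x) for x in data]

        # The parallel version: spin up worker processes, ship the inputs across
        # process boundaries, and gather the results back again -- and for a job
        # this small, that overhead will likely swamp any gain from the extra cores.
        with ProcessPoolExecutor() as pool:
            parallel = list(pool.map(square, data, chunksize=10_000))

        assert serial == parallel

The point isn't that this is hard, exactly--it's that the parallel version is an afterthought grafted onto a fundamentally sequential language.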
I'm aware, or at least I suspect, that stripping "processing" of some of its uniformity would injure one of computing's greatest strengths. Nevertheless, I think it might be just what is needed to wring more concrete benefits out of Moore's law. Even exponential advancement in single-stream processing will produce only minimal benefits for problems best suited to a parallel approach, and meanwhile the problems best suited to linear, step-by-step processing already have more than enough speed. Moore's law is creating a surplus of a currency we can't spend, and the exchange rates are usurious.
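To put a rough number on that unspendable surplus, here's a back-of-the-envelope calculation (the fractions are invented, and the formula is just the familiar observation that overall speedup is capped by whatever portion of the work stays serial):

    # Back-of-the-envelope: what a pile of extra cores is worth to a program
    # written in the usual linear style. The parallelizable fractions below are
    # invented for illustration.
    def speedup(parallel_fraction, cores):
        serial_fraction = 1.0 - parallel_fraction
        return 1.0 / (serial_fraction + parallel_fraction / cores)

    for parallel_fraction in (0.1, 0.5, 0.9):
        estimates = ", ".join(
            f"{cores} cores: {speedup(parallel_fraction, cores):.2f}x"
            for cores in (4, 16, 64)
        )
        print(f"{parallel_fraction:.0%} parallelizable -> {estimates}")
    # A program that is only 10% parallelizable gets barely 1.1x out of 64 cores;
    # even at 90% the gain flattens toward a 10x ceiling. The extra transistors
    # are there, but linear code has no way to spend them.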
I should note, for the record, that this post was not researched, and there's a chance that all of this blathering is nothing more than me reinventing a square wheel. I know that parallelism is not a new concept, and frankly I'd be surprised if a dedicated parallel programming language hadn't been invented by now (albeit perhaps an ancient, user-unfriendly one like Lisp or Haskell). I'm also not well versed in the state of processor hardware, but there certainly seems to be something of a monoculture going on--at least in the commercial sphere. But I'm no stranger to intuiting existing insights, and it's actually kind of fun. If anything might be new in all of this rambling, it's the realization that processing is treated as a commodity, and that this unspoken assumption may be the root of today's difficulties in transitioning towards parallelism. Also, this post was not initially all about parallel processing--that's just the example that sprang to mind. My aim is to conceive of processing, even digital processing, as a broader concept than just the classic CPU. That's an insight... isn't it?