Tuesday, June 21, 2005

Does the OS really matter anymore?

In the realm of general-purpose operating systems (not special-purpose embedded ones such as those that run your cell phone/PDA), the commercial world usually runs either Windows or some flavor of Unix. I have spent most of my career developing server software that runs in Windows and Unix environments, along with a brief stint in the embedded world working with special-purpose operating systems. Years ago, most shops would not seriously consider Windows for 24/7 mission-critical apps. Unix was designed from the ground up to run distributed server applications; Windows, on the other hand, started out specializing in the desktop. Windows has had its growing pains in the server world, but for several years now it has been generally stable, and its performance is good enough for a large class of server apps. Both operating systems support networking, simultaneous users, concurrent processes and threads, journaling and network file systems, and remote administration. At the 10,000-foot level, you can view Windows and the Unices as providing equivalent functionality.

In the realm of programming languages and platforms, there has been a steady shift from languages compiled down to native code (C/C++) to languages compiled to bytecode for virtual machines (the Java VM, Microsoft's CLR). The VM concept is certainly not new, but mainstream corporate software shops were not generally using such technologies until Java became a buzzword around 1996.
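To make that shift concrete, here is a minimal, hypothetical example of what "compile to bytecode" buys you. The class name is mine; the point is that javac produces JVM bytecode, not native instructions for any particular OS or CPU:

    // HelloPortable.java -- illustrative only
    public class HelloPortable {
        public static void main(String[] args) {
            // os.name is resolved at runtime by whichever JVM is hosting the bytecode:
            // Windows, Solaris, Linux, OS X -- the .class file is identical on all of them.
            System.out.println("Running on: " + System.getProperty("os.name"));
        }
    }

Compile this once and the resulting HelloPortable.class runs unchanged on any compliant JVM. The equivalent C program would have to be recompiled (and quite possibly #ifdef'd) for every platform you ship on.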
Java was officially made public in 1995, but corporations did not really start to evaluate it until 1996. Microsoft's .NET effort began soon after, when MS execs realized that Java was bad for their business model (I suppose it has not been that great for Sun in the long run either). Moore's law and the increasing demand for complex software that runs on multiple operating systems have brought VM technologies to the forefront of software development. Yeah...so what? Well, this commercialization of VM technology means that a large percentage of the commercial software written over the last several years can run on multiple operating systems without being ported or recompiled. This should have happened decades ago, but it required commercial acceptance and standardized VM specifications; today we have two, Java's VM and Microsoft's CLR. Virtual machines are nothing new, but only within the last eight or so years has most non-native commercial software run on a VM implementation available from more than one vendor.

This shift to "managed code" platforms is becoming mainstream at an interesting time with regard to the desktop. Google is in the process of showing the world that you can do all your daily tasks via their server-side software, and they have already demonstrated that web-based user interfaces can be very usable and responsive. Once this is widely accepted, all that really matters are the protocols involved. The meat of the computing will happen on the server side, so it won't matter whether the server is running OS X, FreeBSD, Win2003, Linux, Solaris, or even BeOS. The OS has been a commodity for some time now; many companies are simply in denial.
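To put the "only the protocols matter" point in concrete terms, here is a minimal, hypothetical client-side sketch (the URL is just a placeholder). Everything the client knows about the application is expressed in HTTP; nothing in the exchange depends on, or reveals, the server's operating system:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ProtocolDemo {
        public static void main(String[] args) throws Exception {
            // The contract is HTTP: a method, a status code, some headers, and a body.
            HttpURLConnection conn =
                (HttpURLConnection) new URL("http://example.com/").openConnection();
            conn.setRequestMethod("GET");
            System.out.println("HTTP status: " + conn.getResponseCode());

            // Whether the box answering this request runs Linux, Solaris, or Win2003
            // is invisible -- and irrelevant -- at this level.
            BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }

Swap the server's OS out from under that client and, as long as it still speaks HTTP, nothing on the client side has to change.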