When it comes to computer software applications, in many ways we have come full circle since the mainframe days of the 1960s and 1970s. Back then, corporate information was housed in central locations with strict rules for access and modification. To use the applications, we relied on “dumb terminals”, whose job was nothing more than displaying information and accepting user input.
When the personal computer revolution came, much more power was given to individuals, so mainframe computers were no longer needed for simple tasks such as word processing. This allowed for huge productivity improvements because it reduced the dependency on the IT department for anything other than centralized systems, such as billing or inventory control.
But then “islands of productivity”, as we called them, emerged with no central control, very little security, and no sharing of corporate data. The first step towards solving this problem was to implement local area networks, which connected personal computers and allowed information to be shared.
Since then, many of the programs we use have been what we call “client-server” applications, where a central database (the “server”) shares data with applications running on desktop computers (the “clients”).
It can be very difficult to support applications in this environment. The IT support group has to maintain the client applications on whatever types of hardware and operating systems they are running, and make sure all the users are running the same version of the software application. If there is a change made to the application programs, all the desktop computers have to be updated.
Now with newer technologies available, there is a move back to doing all the processing on powerful servers and distributing data back to the desktops via the Internet where, at the client end, only a web browser is required. Akin to the “dumb terminals” of the 1970s, these “thin clients” don’t need any specialized programs. This solves the problem of installing software upgrades because the software only changes at the server end, i.e. one copy instead of dozens, hundreds or even thousands of copies.
Another advantage of thin clients is that they can run on just about any computer with just about any operating system. If the computer can run a web browser, it can run the application. This can mean significant savings for the IT support group.
Earlier versions of these thin client applications were slow because whenever new data arrived from the server, the browser would refresh the entire screen. Now we are able to minimize network traffic by using special technologies that refresh only the parts that change. One such technology is called AJAX (which stands for “Asynchronous JavaScript and XML”, as if that clarifies things).
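The partial-refresh idea can be sketched in a few lines. This is a minimal illustration rather than production code: the fragment names and the applyPartialUpdate helper are hypothetical, and in a real AJAX application the changed fragments would arrive from the server via an asynchronous request instead of a local object.

```typescript
// A page modeled as a map from screen region to rendered HTML (hypothetical).
type Page = Record<string, string>;

// Merge in only the fragments the server says have changed;
// every other part of the "screen" is left untouched.
function applyPartialUpdate(page: Page, changed: Page): Page {
  return { ...page, ...changed };
}

const page: Page = {
  header: "<h1>Order Status</h1>",
  total: "<b>Total: $100</b>",
};

// The server sends back just the one fragment that changed,
// instead of a whole new page.
const updated = applyPartialUpdate(page, { total: "<b>Total: $120</b>" });

console.log(updated.header); // unchanged, never re-sent over the network
console.log(updated.total);  // refreshed in place
```

The point of the sketch is the bandwidth saving: only the changed fragment travels over the network, which is why AJAX applications feel so much faster than full-page refreshes.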
Along with speed improvements, AJAX allows much more refined screens, better than what you normally see on web sites. Customers can therefore implement secure, centralized corporate applications that they can share with their employees, or with the whole world if they wish.
At Nicom, we’re finding that sometimes we are able to convert traditional client-server applications to web-enabled applications while retaining and reusing much of the programming residing in the old applications. The way data is displayed is different (i.e. via a web browser rather than via a program running on the desktop), but the processing often doesn’t have to change much at all. That way we are able to save significantly on the cost of converting from “client-server” to “web-enabled” applications.
The big return on investment for our customers comes later, however, when the software is maintained in one place only, yet is usable by many people no matter what equipment sits on their desktops.