Talk of thin clients has been around for at least 20 years. The original vision of computing was a model in which dumb terminals connected to powerful mainframes and displayed nothing more than an image of the user’s session on the server. This was, in fact, more practical then than it is now, because all that travelled over the wire was raw text for a terminal session of a fixed size, typically 80 columns by 24 lines.
We used to see ‘thin clients’ everywhere: in workplaces and libraries, Wyse terminals and their like were ubiquitous. This was how business applications worked; it was simply the computing of the time. Centralised computing offered significant control advantages and potentially lower support costs.
What killed the thin client
The question is: what happened to all of this?
The Graphical User Interface arrived and made it impractical to keep working this way. Network speeds were slow, both on the internet and even on LANs (Token Ring networks ran at a blistering 4 or 16 Mb/s), while computers had internal transfer rates many times faster than the best networks available. Combined with the falling cost of storage, application power moved to the PC, and network servers were reduced to simple storage. Often the PC would cache information from the server locally, querying the server only when absolutely necessary.
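The local caching described above can be sketched as a simple read-through cache: the client answers repeat requests from its own store and only goes back to the server on a miss or expiry. This is a minimal illustration; the function names, the simulated server, and the TTL value are all assumptions, not any particular product's API.

```python
import time

# Hypothetical server call: the global counter just lets us observe
# how many round trips actually happen.
SERVER_CALLS = 0

def fetch_from_server(key):
    """Simulate an expensive round trip to a central server."""
    global SERVER_CALLS
    SERVER_CALLS += 1
    return f"record-for-{key}"

class LocalCache:
    """Read-through cache: query the server only on a miss or expiry."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, fetched_at)

    def get(self, key):
        entry = self.store.get(key)
        if entry is not None:
            value, fetched_at = entry
            if time.monotonic() - fetched_at < self.ttl:
                return value  # served locally, no network traffic
        value = fetch_from_server(key)
        self.store[key] = (value, time.monotonic())
        return value

cache = LocalCache(ttl_seconds=60)
cache.get("customer:42")  # first read goes to the server
cache.get("customer:42")  # second read is served from the local cache
```

The point of the sketch is the ratio: two reads, one network round trip — exactly the trade the 1990s PC made against its slow LAN.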
Can this trend be reversed?
Fast forward to today and we’re seeing renewed focus and excitement around shifting back to the older thin client model. While thin clients have become very feasible thanks to lightweight image transfer technology, there are always exceptions that cause problems. Firstly, there can be issues communicating with the various peripheral devices attached to the thin client. There is also less flexibility in the applications that can be installed. This is why thin client technology may be appropriate for organisations like banks, retailers and financial institutions, but will never be appropriate for design businesses or professional services.
The other thing standing in the way of a complete thin client solution for many homes and businesses is the increasing demand of graphics-heavy, rich-interface devices.
As technology such as Virtual Reality headsets comes to the PC, we can appreciate that the bandwidth demands of such devices will make streaming all of their content over a network difficult.
As long as networks and the internet suffer latency in delivering content, there will always be a place for rich client technology alongside thin clients. The ultimate solution lies in harnessing the benefits of both: using rich clients to access network information while transferring the lightest amount of data possible. The network will always be the slower link, and the rich client is here to stay.
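The hybrid approach above — a rich client that asks the network for as little as possible — can be sketched with version-tag revalidation, in the spirit of HTTP's ETag mechanism. Everything here is illustrative: the in-memory "server", the document names, and the version tokens are assumptions, not a real protocol implementation.

```python
# Hypothetical document store standing in for a remote server.
SERVER_DOCS = {"report": ("v1", "quarterly figures")}

def server_fetch(doc_id, client_version):
    """Return the full body only if the client's copy is stale."""
    version, body = SERVER_DOCS[doc_id]
    if client_version == version:
        return version, None  # "not modified": a tiny response
    return version, body      # full payload only when needed

class RichClient:
    """Keeps a local copy and revalidates it with a cheap version check."""

    def __init__(self):
        self.local = {}  # doc_id -> (version, body)

    def read(self, doc_id):
        held_version, held_body = self.local.get(doc_id, (None, None))
        version, body = server_fetch(doc_id, held_version)
        if body is None:
            return held_body  # local copy is current; almost nothing sent
        self.local[doc_id] = (version, body)
        return body

client = RichClient()
client.read("report")  # full transfer on the first read
client.read("report")  # near-zero transfer on every read thereafter
```

After the first read, the client pays only for a version check per request, and a full transfer happens again only when the server-side document actually changes — the "lightest amount of data transfer possible" in practice.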
Whichever form it takes, a personal computer with large amounts of processing power will always be something we need to be as productive as possible.
This article about desktop computing was brought to you by HirePulse, a site that links Australian businesses with consultants, contractors, freelancers and professional services. Produced with the help of Living Online.