Thin Client Has Failed

The idea of a ThinClient has failed for CrudScreen applications overall. JavaScript (JS) crept in and grew, and now HtmlDomJsCss = FatClient. All the stuff that comes with a browser is as big as any VB/Delphi/PowerBuilder app engine ever was (minus some widget features we expect). Perhaps they should have turned VB/Delphi/PB into web browsers instead of the other way around. Whether thin could have succeeded may be debatable, but the people voted for fat regardless. -- top

Failed to do what, exactly?

It failed to be thin...

To replace the corporate desktop? For many apps, a fat client is better. But the thin client industry ain't exactly dead.

No... it is just becoming fat...

And JavaScript apps served up to a client over the web don't in any way contradict the notion of "thin". What were you thinking - that nothing more sophisticated than a dumb xterm qualifies?

Well, if JavaScript doesn't in any way contradict the notion of "thin", then JavaApplets? or WinThirtyTwo applications are also "thin" (and therefore everything is thin).

I generally consider "thin" to mean not relying on a TuringComplete application language on the client side. Thus, HTML and HTML forms are thin-client.


I think top has this one correct. The "thin client" mantra has kept web applications at the 3270 level for nearly a decade now. Replace the corporate desktop? The major corporations I'm familiar with don't allow any new desktop application to be deployed until it has been demonstrated to be totally compatible with existing applications and "corporate standards", has been thoroughly tested in dozens of configurations, and is supplied by an "approved vendor" - a process that takes years. One practical result of this is that the corporate desktop is essentially impenetrable for new desktop applications from startups. A second, more pervasive and more important result is the creation of an enormous demand for web-based applications, especially for technology-intensive corporate users - scientists in big pharmas, for example.

So long as those web-based applications have been "thin client", all those corporate users have had to live with ancient, 3270 form-based user interaction. The FatClient (JS+DOM+HTML+AsynchronousComm?+CSS+XML) solves this problem. Google Map mashups are just the beginning.

This appears capable of finally breaking the log-jam that has made it so difficult for individual entrepreneurs to sustain themselves economically by providing innovative software.

-- TomStambaugh

I also agree that thin client efforts to date have been a dismal failure in terms of performance, though I would quibble with Tom's comment that current web-based efforts are equivalent to a 3270 capability. I believe many old telnet/VT100 applications over 9600 baud outperform current Web applications at 56K baud or even over DSL and internal corporate LANs. I also note, however, that web-based applications have been a success for wide deployment. For things I do not need to do on a day-to-day basis, I am willing to live with the degraded performance in order to bypass the hassles of obtaining, installing, and configuring client-side software. -- WayneMack

When I say "3270 level", I refer to the page-at-a-time interaction metaphor (as opposed to performance). The trade-off Wayne refers to ("the hassles of obtaining, installing, and configuring client-side software" versus web-based models) is the driver for the revolution towards AJAX-style FatClient web applications. -- TomStambaugh
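For concreteness, here is a rough sketch of the kind of in-page update being described, using the plain XMLHttpRequest object; the URL and element id are made up for illustration:

  // Refresh a single value in place, without a full page reload.
  // The /inventory/count URL and the "count-" element ids are hypothetical.
  function refreshCount(itemId) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/inventory/count?item=" + itemId, true); // asynchronous
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById("count-" + itemId).innerHTML = xhr.responseText;
      }
    };
    xhr.send(null);
  }

Only the affected element changes; the rest of the page is untouched, which is exactly the break from the page-at-a-time metaphor.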

Although I think AJAX will have some fringe effects, I still think it is a patch for the dismal state of client-side web interface logic. Why can't the client have a text editing function along the lines of a word processor (i.e., common formatting and spell checking functions) or table manipulation along the lines of Excel or a simple database (i.e., why should the developer be writing column sorting code)? Why can't list boxes (a personal pet peeve) use standard type-ahead selection rather than retyping the first character to cycle through the list? Where I fear AJAX will fall short is in caching and synchronization of mostly static code and tables. Instead of paying the penalty of re-downloading client code and fixed tables for each screen refresh, why not have the client provide caching with a versioning system to resynchronize when changes occur? Although I think AJAX may start to give more of the feel of a fat client, I don't think it will go far enough to get web-based applications near the performance of a desktop-resident application. -- WayneMack
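As an illustration of the type-ahead selection Wayne asks for, here is a rough client-side sketch against an ordinary select element; the element id is made up for illustration:

  // Rough type-ahead for a plain <select>: collect keystrokes for about a
  // second and jump to the first option whose text starts with the buffer.
  // The "stateList" element id is hypothetical.
  var buffer = "", lastKey = 0;
  document.getElementById("stateList").onkeypress = function (e) {
    var now = new Date().getTime();
    if (now - lastKey > 1000) buffer = "";      // start over after a pause
    lastKey = now;
    buffer += String.fromCharCode(e.charCode || e.keyCode).toLowerCase();
    for (var i = 0; i < this.options.length; i++) {
      if (this.options[i].text.toLowerCase().indexOf(buffer) === 0) {
        this.selectedIndex = i;
        break;
      }
    }
    return false;  // suppress the browser's one-character cycling
  };

The point is not that this is hard to write, but that every developer has to write (and download) it again instead of getting it from the widget.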

Have patience, my son, have patience. Desktop widgets did not acquire the behavior you desire overnight. The limitations you describe are issues of implementation, not theory. In the meantime, there is little "re-downloading" of client code on a screen refresh -- that's the point of it. Instead, the cached material is attached to a convenient hook in the browser (such as the Window object) and kept around. Not the JavaScript, but the objects that the JavaScript describes. I'm not sure what you meant by "near the capability of ClientServer"; did you mean "near the capability of desktop" instead? -- TomStambaugh

Add "caching and versioning"??? You have no idea how HTTP works do you? There are fairly complex caching features, that a lot of people don't use. You have ETags (which you can use with version numbers or more complex stuff), modification date, and conditional requests ("send content only if you have something newer than I have"). And ways to keep all that working when content negotiation is involved. And rules on how caching proxies fit in all that. In addition, the over-use of cookies in today's web2.0 world makes all that harder (you have to avoid a caching proxy giving a user a cached version of a page as loaded by another user). -- Nicolas

Yes, "near the capability of the desktop" is a more correct description of what I mean and I have edited my statement to reflect that wording (Please let me know if it is still unclear). And, if I may deliberately misunderstand your previous comment, I agree that having patience is the key to working with web-based apps. :-) -- Wayne

Re: "have patience. Desktop widgets did not acquire the behavior you desire overnight."

They did happen pretty fast, as I remember it. In the six years or so after Windows 3.x became a de facto standard around 1991, the power of widgets exploded. I remember the VBextras catalog growing and growing with fancier widgets (some for C++ as well). Of course, they were proprietary and sometimes buggy.


We must walk before we run. At least an AJAX app can finally drag and drop an icon across a graphic image without server intervention. You mentioned richer text editing, Excel-style table manipulation, and type-ahead list selection.

If you mean can these be provided using AJAX, I think they almost certainly can be and probably are -- a Google search would be helpful. If you mean can these be provided as part of a "toolbox" (perhaps wired into the browser), I think there might be questions about how widespread their appeal is. In any case, these are precisely the sorts of things that AJAX is good at.
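As a small example of the purely client-side interaction mentioned above (dragging an icon across an image without a round trip to the server), here is a rough sketch; the element id is made up for illustration:

  // Drag an absolutely-positioned icon around the page with the mouse;
  // no server round trip is involved. The "icon" element id is hypothetical,
  // and the element is assumed to have style position:absolute.
  var icon = document.getElementById("icon");
  var dragging = false, offsetX = 0, offsetY = 0;
  icon.onmousedown = function (e) {
    dragging = true;
    offsetX = e.clientX - icon.offsetLeft;
    offsetY = e.clientY - icon.offsetTop;
    return false;                          // prevent the native image drag
  };
  document.onmousemove = function (e) {
    if (!dragging) return;
    icon.style.left = (e.clientX - offsetX) + "px";
    icon.style.top  = (e.clientY - offsetY) + "px";
  };
  document.onmouseup = function () { dragging = false; };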

The remaining missing piece is a common-graphics-style GUI toolbox wrapper that surfaces the underlying toolbox to the JavaScript code. This explicitly violates the current JavaScript security provisions, so some dancing is required. I would think this piece could turn into a standard browser component.

I am not sure how extending the desktop primitives would violate anything. JavaScript already interacts with the HTML primitives; the problem is that the primitives do too little. It would be valuable to have a wider range of desktop primitives providing more capability. This would avoid having custom code developed and downloaded for common tasks, and the resulting primitives could still be called and extended by JavaScript in a manner similar to what is currently done.


No mention of the horrible, crippling browser UI? Performance and fat/thinness are a very distant second in my book to good usability. Windows Forms is unparalleled in usability. I don't have to refresh the whole page to update a grid column, and I can exactly control the mouse/keyboard/display/etc.

The problem is that Windows Forms is highly proprietary. But it may spark ideas.


I know this is a quibble, but the "em-dash" is NOT the same as the "hyphen". In the following sentence (of mine): "The trade-off Wayne refers to -- 'the hassles of obtaining, installing, and configuring client-side software' versus web-based models -- is the driver for the revolution towards AJAX-style FatClient web applications", the parenthetical clause is correctly denoted with the ascii-equivalent of the em-dash ("--"). Please do not "correct" it to a hyphen. I've taken the liberty to parenthesize the clause instead. Please see RulesForUsingHyphensAndDashes. -- TomStambaugh


Our desktop is our ThinClient... we connect to the internet, which is the FatServers?. At least relatively speaking, our desktops are thinner than the fat servers. And the thin client hasn't failed, because people use crummy programs like AIM/MSN/ICQ... although I see the point that those are getting fatter each day.

Compared to an elephant, Orson Welles is also a "thin client". For now, I would define a "thin client" as using protocols or interface languages that are not TuringComplete on the client side. HTML is thus thin-client until one adds JavaScript. --top

That sounds like a reasonable criterion, but I'd consider a 'thin client' any one where the logic and semantics processing is performed server-side, even if the protocols/interface language/communicated InteractiveSceneGraph is TuringComplete. JavaScript, I think, can cross both boundaries; merely using JavaScript doesn't mean you don't have a ThinClient.

Consider what 'ThinClient' might mean in terms of a videogame - it would be a system that displays what it is told to display, precaches what it is told to precache (possibly without keys to lock the data... precaching as optimization while forbidding cheating), allows certain interactions, and relays back interactions to the 'server'. It doesn't do AI work and such except, perhaps, insofar as it is allocated certain calculations to perform as part of the price for joining the game. Merely having a TuringComplete display language (heck, modern pixel-shader languages are TuringComplete) does not a fat client make... at least not in my opinion. You need to use the language in certain ways for it to be 'fat'.

Perhaps DataflowAnalysis would be the way to decide 'thin client' vs. 'fat client'. If you use a pixel-shader to make logic decisions and update a remote database (and it is doable...) then you've got one heckuva fat GPU. If you transmit high-level TuringComplete language information merely to render a pretty image to a ray tracer or printer, like PovRay or PostScript, then it is still ThinClient.

I think a ThinClient is where the server ultimately does a lot of the work, with the thin client doing not as much. Some architectures seem to be server-to-server, with both sides doing a lot of the work, but others are biased in that the server does much more work. Servers are also usually much more central than the client, so for example, even if Ajax is now doing a lot of the work on the client side, it is still server-biased if it is connected to the web. If it is a localhost application, then it is not even a full client per se. Then again, ThinClient is kind of a BuzzWord too.

I don't believe that it is a matter of the amount of work being done. In terms of sheer computation, displaying a webpage (along with its JPEG images, SVGA, fonts, etc.) can cost much more than the processing done prior to delivery. What matters must be the sort of work being done.

The total work done by one server is much more than that done by an individual client. The client-side work is divided among 10,000 clients, while the server is one or two machines handling all 10,000 requests. For each request, an individual client does a lot of work, but not in relation to the total work the server does for those 10,000 clients or so.

Irrelevant, IMO. 'ThinClient' ought to have the exact same meaning whether the server is serving ten clients or ten million. By the way you're defining it, clients get 'thinner' as a service becomes more popular - a notion that I find ludicrous.

["Thin client" used to refer to a workstation of highly-restricted capability intended purely as a keyboard & display device on behalf of some external host(s), but with a more positive connotation than the deprecatory "dumb terminal". It usually has minimal (if any) local non-volatile storage, and cannot provide useful functionality without connection to a host. Think IBM 3270, Wyse 25, DEC 220, X terminals (maybe...), and so on. "Fat client" used to refer to a general-purpose workstation, with a hard disk and/or remote boot capability, and at least the theoretical possibility to host applications independently of any server, though policy or lack of installed hard drive may require it to load them from a server. Over time, the distinction between these terms has blurred into an indistinct and amorphous region of meaninglessness, to the point that they are now essentially useless -- especially when referring to software that only runs on machines that are, undeniably, fat clients. I suggest these terms be avoided. There *might* still be some value in regarding a "thin client" as one that is wholly dependent on a server to obtain *any* useful functionality, and a "fat client" is one that can (at least theoretically) provide useful functionality independently of any server, but I doubt it.] -- DaveVoorhis

I've always, in my mind, distinguished 'dumb terminal' (which is a hardware distinction) from 'thin client' (which is a distinction on the processing logic and communications role in the client-server model). And I believe the distinction still has merit. Whereas we once sent low-level display messages to hardware directly (RS232) and sent unbuffered keypresses back, we now use higher-level languages for what to display and accept higher-level semantic information back. Sure, it takes software to display the higher-level language, but that doesn't mean hardware can't do it; it would be more accurate to say that it is far more economical to do it in software. Whereas once we had 'minimal (if any) local non-volatile storage', we now have enough storage capacity to use for optimizations (like caching), but have 'minimal (if any) data storage' - by which I mean 'real' data (as per WhatIsData) - the stuff used in fact processing. For a thin client, the stuff stored locally is 'for display purposes only' and it is the server that does fact processing. Only a few pieces of configuration data about how the user likes his display configured or modified client-side are stored locally, and given the tendency to use many computers even keeping that data local is questionable. Client-side data should be either storage to support the display software or storage for optimization (caching, especially). If any 'fact processing' does happen on a ThinClient, it only happens because the server is leeching processor power from the remote machines; the client doesn't need to have any clue as to the context of the information it is processing. "Fat software" or "fat client" is like modern tax software or single-player video-games that carry with them the information for fact processing and are capable of making decisions on their own and sending back new facts or controlling the server. That, to me, is the most relevant distinction - where the logic and fact processing is happening. Sure, there is software that blurs the lines, but I'd hesitate to agree with a statement that there is a great deal of it.

{Perhaps move definition issues to ThinClient, or at least a definition-related topic if too big.}


Silverlight (fat client) sits on a browser (thin client) which accesses an application (fat client) which consumes resources via web services/ODBC/etc (thin client). Every step in this hodgepodge of technology is full of bugs, exploits, differing tool/IDE maturity, data format transformations, proprietary learning curves and integration nightmares. It's time for one consistent scalable environment/language/storage.

The presentation of something (web, desktop, PDA, OS, etc.), access to something (URI, connection string, stream), and storage of something (array, serialized XML, database, object, file) should just be configurable attributes optimizable by the run-time.

--BrianG

AlternateFatAndThinLayers??


See also GoogleGears, HtmlFive


MarchZeroSix

CategoryWebDesign, CategoryUserInterface

