Xwindow Protocol Should Be Stabbed And Burnt

ShouldXwindowsBeReplaced? The thesis of this page is "Hell, yes!"

Strong EditHint: This is a ThreadMess. If you do not resolve it yourself, I will do a SurfaceRefactoring (because I do not want to become involved in useless cross talk).

Almost all of it centers on rants that are provably, factually incorrect, as pointed out multiple times below, but of course ignored by the ranters. Any refactoring should bring things together by topic, to make it clear that each has been rebutted.


Initially in HowToQuashMicrosoft, now branched into its own area:

A good start to helping Linux become a better user-oriented OS would be: Take X. Stab it repeatedly until it dies. Then burn it, and jump on the ashes.

Yes

You forgot to bury the ashes at a cross-roads. Unless you do that it'll come back.

I've never seen anything fill up a vacuum and still suck... -- The X-Windows Disaster, Chapter 7 of TheUnixHatersHandbook: http://www.art.net/~hopkins/Don/unix-haters/x-windows/disaster.html


And replace it with what?

A new windowing system - obviously. Without all the baggage.

Is the baggage really such a big problem? The performance isn't much of a problem any more and you're not likely to have to use the X APIs directly today.

Many of the critiques of "baggage" and "bloat" are rebutted on http://linuxfinances.info/info/xbloat.html. And that was in 2002 before the XRender extension and other enhancements became widespread.


Lots of people talk of replacing X11 as if it were a GUI system, and don't realize what they are really replacing. The argument for a *different* single-user stand-alone GUI server for workstations is completely separate. Pretending you can replace X11 with something like that is, however, ill-conceived at best. All the 'baggage' is completely necessary for many people. What the complainer is usually really saying is 'I don't want all of X11's capabilities, so why can't I have something simpler'...


SteveJobs is famous for saying "X Windows is brain-dead", way back when he was starting up NeXT, the company behind NextStep. On MacOS X today, we've got a minimum of three GUI APIs - Carbon (based on Classic MacOS APIs), Cocoa (based on NextStep / OpenStep), and AWT/Swing (Java). How's GnuStep for Linux coming along?


I should hope not. I still don't understand how the client/server nature of the whole thing works, but I do remember people trying to explain it to me, and me asking them "um, isn't the XwindowServer doing what the client usually does?"

One of the reasons they say Microsoft Windows is so crashy and kludgy is because it's based on a legacy GUI from the late 80's, and a legacy OS from the early 80's. But at least its legacy GUI and legacy OS were designed with mortal users in mind. Most X-Windows applications I've seen, with their big windows and small fonts, think they're running on a 22-inch monitor at 1280x1024. And every application seemed to use the mouse buttons for something different. I do hope this has changed since 1999.

I don't buy that excuse for Windows. Windows 2000 is steady as a rock. There is little excuse for the dreadful 9x series.


TomStambaugh proposes a SmalltalkExtensibleWindowServer as a replacement for X windows, along the lines of the NetworkExtensibleWindowSystem.


I'm hoping that the FrescoFramework (formerly known as TheBerlinProject) will eventually replace X.

<Digression> I have a feeling that the story of X is something like the story of FortranLanguage. When I had to learn Fortran II, it was a terrible letdown from AlgolLanguage -- it had no block structure, no if then else, it required line numbers (IIRC), etc. Over the years, Fortran incorporated most of the advances that other languages innovated and is now very little like the Fortran I learned, significantly better than Algol, and has characteristics of PascalLanguage, CeeLanguage, etc. (IIUC, there seems to be one defining underlying characteristic that makes people who are aware of it choose between C and Fortran, but I forget what it is.) I suspect X has gone the same way, continuing to incorporate improvements without a significant enough name change (for me) to facilitate discourse about the characteristics of X. </Digression>

IIRC, X does (or did) one thing that I think is bass-ackwards. If it needs to render, e.g., a combobox on a CRT display, the application generates the bits and X carries them from the application to the display. In today's world, where bandwidth is a bigger bottleneck than processing power on the computer driving the display, it would be more effective for the application to tell X that it needs a combobox displayed, and let X just carry that message and generate the combo box at the computer that is driving the CRT display (whether you call it the server or the client). IIUC, this is what Fresco is trying to do.

This was the fundamental premise of NeWS and STEWS -- the application requests that the display perform a graphical operation. The argument with the X community, at the time, was about what graphical vocabulary the display server offered the application. The X community insisted that the vocabulary should be fixed, that specific "extensions" could be installed into the server, and that applications choose from whatever server they were connected to. The NeWS/STEWS community argued that any application should be able to load any vocabulary it wished into the display, and then invoke it. The approach of NeWS was to use DisplayPostscript for both image model and behavior. The STEWS approach (unbuilt, though) was to specify behavior with Smalltalk and use DisplayPostscript as the image model. It looks to me as though Berlin has perhaps rediscovered the principle that it is often compellingly more efficient to move the program to the data than to attempt to deliver processed data, especially when bandwidth is more precious than processing power. It appears to me that DisplayPostscript is still a more robust and portable image model than the baroque pixel-oriented approaches we still live with at the moment. --TomStambaugh

PS: To argue that there is no room for confusion about the client/server terminology used by X is silly or worse <not sure of the word I want -- non-perceptive, intolerant, condescending, ..., or some combination thereof>. In the ordinary user's mind, the client is the computer he is using (touching, typing into, viewing the display) -- the server is a remote computer that is providing data or services for his computer. At some level of detail, the opposite terminology of the X definition of client/server is technically correct in some contexts, but just needlessly confuses everyone -- why bother -- we should be making things easier, not more difficult.

-- RandyKramer

In the best of all possible worlds, I suppose they would start calling it the "X Window Service", since it is analogous to a service in MicrosoftWindows.


"I'm hoping that Berlin will eventually replace X."

Don't bet on it. Berlin embeds the toolkit into the server. Imagine where X11 would be today if it had embedded 1980's widgets into the server. Do you really believe that X11 would have survived with Athena widgets?

[This is a misunderstanding. In Berlin, a toolkit is just an object factory. Since objects are network transparent, it doesn't need to be part of the display server, though it should be colocated on the same machine. In fact, had Berlin started with Athena widgets, it would have been easy to derive the GTK+ interface from the Athena interface and then transparently replace the toolkit for all applications without recompiling anything. It's also easy for an application to supply completely new widgets, though that could incur more network roundtrips. However, Berlin is dead (again), so none of this matters any more.]

"If it needs to render, e.g., a combobox on a CRT display, the application generates the bits and X carries them from the application to the display."

Wrong. X11 has high-level graphics primitives and display-side storage, and it is getting display-side stored vector graphics.
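For concreteness, here is a minimal Xlib sketch of that point (the window size, coordinates, and event loop are arbitrary illustration choices, not anything from the discussion above): the client issues small drawing requests and the display server does the rasterizing; no pixel data crosses the connection.

 /* Minimal Xlib sketch: the client sends high-level drawing requests over
  * the wire and the display server rasterizes them.
  * Build (assuming Xlib headers/libs are installed): cc demo.c -lX11 */
 #include <X11/Xlib.h>
 #include <stdio.h>

 int main(void) {
     Display *dpy = XOpenDisplay(NULL);          /* connect to $DISPLAY */
     if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

     int scr = DefaultScreen(dpy);
     Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 10, 10, 300, 200,
                                      1, BlackPixel(dpy, scr), WhitePixel(dpy, scr));
     XSelectInput(dpy, win, ExposureMask | KeyPressMask);
     XMapWindow(dpy, win);

     GC gc = XCreateGC(dpy, win, 0, NULL);
     XEvent ev;
     for (;;) {
         XNextEvent(dpy, &ev);
         if (ev.type == Expose) {
             /* Each call becomes one small protocol request, not a bitmap. */
             XDrawLine(dpy, win, gc, 10, 10, 290, 190);
             XDrawRectangle(dpy, win, gc, 20, 20, 100, 60);
             XFillArc(dpy, win, gc, 150, 50, 80, 80, 0, 360 * 64);
         }
         if (ev.type == KeyPress) break;
     }
     XCloseDisplay(dpy);
     return 0;
 }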

"It appears to me that DisplayPostscript is still a more robust and portable image model than the baroque pixel-oriented approaches we still live with at the moment"

Postscript's imaging model was never in doubt--it's a great imaging model, provided you have the resolution and color depth to display it; what sucks about Postscript is the language. Fortunately, X11 now has the Postscript imaging model without the language (the de facto standard Render extension). Same, incidentally, for Macintosh, which also got rid of the language and adopted PDF instead.
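For those who haven't seen it, the client side of the Render extension looks roughly like this - a hedged sketch assuming libXrender, with the color and rectangle made up for illustration, and 'win' being an already created and mapped window as in the earlier sketch:

 /* Sketch of the XRender client API: server-side alpha composition.
  * Build: cc render_demo.c -lX11 -lXrender */
 #include <X11/Xlib.h>
 #include <X11/extensions/Xrender.h>

 void draw_translucent_square(Display *dpy, Window win) {
     XRenderPictFormat *fmt =
         XRenderFindVisualFormat(dpy, DefaultVisual(dpy, DefaultScreen(dpy)));
     Picture dst = XRenderCreatePicture(dpy, win, fmt, 0, NULL);

     /* 50%-opaque red, premultiplied; XRenderColor channels are 16-bit. */
     XRenderColor red = { 0x8000, 0x0000, 0x0000, 0x8000 };

     /* The server blends this rectangle over whatever is already in 'win';
      * the client never touches the destination pixels. */
     XRenderFillRectangle(dpy, PictOpOver, dst, &red, 40, 40, 120, 80);

     XRenderFreePicture(dpy, dst);
     XFlush(dpy);
 }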

"To argue that there is no room for confusion about the client/server terminology used by X is silly"

Of course, there is room for confusion. But if you called the XwindowServer what it actually is called, namely the "DISPLAY server", you would not be confused.
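In code the terminology is less confusing than in conversation: the application (the client) connects to whatever display server $DISPLAY names, i.e. the machine whose screen and keyboard the user is actually touching. A trivial sketch (the host name is hypothetical):

 /* The "server" is the machine with the screen; the application is the client. */
 #include <X11/Xlib.h>

 Display *connect_to_display(void) {
     /* NULL means "use the DISPLAY environment variable", e.g.
      * DISPLAY=thinclient.example.org:0 -- the application may run on a big
      * compute host while the display server runs on the desk in front of
      * the user. */
     return XOpenDisplay(NULL);
 }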

Overall, X11's design decisions have stood the test of time. All its contemporaries are dead, while X11 has been able to keep up with changes in tastes, imaging models, toolkits, and hardware (all the while remaining backward compatible). GUIs built on top of X11 are every bit as modern as those built on MacOS or Windows.

Is that supposed to be a hilarious joke? Have you even heard of the developments in graphics technology in the last 25 years? I suggest educating yourself with some GTA3 or something.

[X11 has its problems, but 99% of the people who complain about it, and about 99.9% of the people who complain about the X11 protocol, have never studied the protocol in question, so typically the two sides of the discussion are talking past each other. Nor has X11 stood still over time, so implying that it is 25 years behind the times implies lack of knowledge of enhancements made over time.]

[The impressiveness of certain apps, like GTA3, doesn't mean such things are impossible under X11 with direct rendering and good vendor drivers, they're just expensive to develop, and for the most part companies don't see enough market share under Unix/Linux.]

[Most of the "Let's replace X11!!!" projects that have cropped up over time begin by deciding to outright delete important features of X11, such as network support. -- DougMerritt]

"Let's replace X" rants seem to come from one of several quarters:


People who defend X completely misunderstand how graphics work in the modern world. (Modern, as in "developments in the last 25 years".)

More precisely, you need to differentiate the separate layers of a graphics architecture:

All four parts are absolutely necessary and at the same time are built from different principles and use completely different APIs.

X tries to meld all four into one steamingly monstrous pile of crud. While this is a typical approach for a college sophomore term project like X, trying to build a modern system capable of "supporting GTA3" on top of X is simply laughable to anybody with any experience in the industry at all.

In short, if you defend X, you really do need to go educate yourself: it is a sure-fire sign of misunderstanding how multimedia software works. -- IvanTkatchev

[I don't know how old the above comments are but I just wanted to mention that I can run GTA3 fine on X (KDE) under wine, and ut2003 is playable on my old pentium 633 under fluxbox (it wouldn't even start under windows). I'm sure X has problems, but being able to support modern applications is not one of them.]

It should be pointed out that X has most of these things, in a well-defined place in the SoftwareStack?. Low-level bit-twiddling with your frame buffer, GPU, or whatever else is handled in the XwindowServer. X provides many low-level 2D primitives for things like lines, circles, etc. The Render extension, once an add-on but now part of X, allows server-side composition of more complicated documents, including full support for modern scalable, anti-aliased fonts. OpenGl handles 3-D rendering, in hardware on platforms that support it, or in software on the server if the hardware doesn't have good 3-D acceleration. The network protocol is a strength of X; the asynchronous nature of the X protocol makes X usable even over high-latency links. Numerous good GUI toolkits exist on top of Xlib (a common complaint is that X doesn't force a particular app framework on the user; instead it gives the user a choice, hence the KdeVsGnome situation). As Doug points out, X doesn't have a "physics engine", but that's probably not a major liability.

I'm curious--what do you consider a good example of a GUI architecture? The Mac? DirectX?

I haven't had much experience with the Mac, but I do consider DirectX a better alternative to X. (Though it has its own problems, of course.) That said, if somebody took the time to refactor X, throw out old cruft and integrate extensions and third-party libraries into the standard core, it just might work as a viable alternative. Oh, and the frame buffer driver must reside in the kernel; I consider this to be a major blocking point for X adoption. -- IvanTkatchev

Why must the frame buffer driver reside in the kernel? For performance reasons? Seems to me that if you want direct access to the hardware, mmap()ing it into user space does the job - which is what X does. For "normal" X clients, the frame buffer is mapped into the server process. For DRI, the clients can mmap it directly.
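For reference, mapping a frame buffer into user space looks roughly like this on Linux via the fbdev interface - a sketch only, not how the X server's own drivers reach the hardware:

 /* Sketch: mapping the Linux fbdev frame buffer into a user-space process.
  * Build: cc fbmap.c ; needs suitable permissions on /dev/fb0. */
 #include <fcntl.h>
 #include <linux/fb.h>
 #include <stdint.h>
 #include <stdio.h>
 #include <sys/ioctl.h>
 #include <sys/mman.h>
 #include <unistd.h>

 int main(void) {
     int fd = open("/dev/fb0", O_RDWR);
     if (fd < 0) { perror("open /dev/fb0"); return 1; }

     struct fb_var_screeninfo vinfo;
     struct fb_fix_screeninfo finfo;
     if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0 ||
         ioctl(fd, FBIOGET_FSCREENINFO, &finfo) < 0) { perror("ioctl"); return 1; }

     uint8_t *fb = mmap(NULL, finfo.smem_len, PROT_READ | PROT_WRITE,
                        MAP_SHARED, fd, 0);
     if (fb == MAP_FAILED) { perror("mmap"); return 1; }

     /* Paint a grey bar across the top of the screen, straight into video memory. */
     for (unsigned y = 0; y < 20 && y < vinfo.yres; y++)
         for (unsigned x = 0; x < vinfo.xres * vinfo.bits_per_pixel / 8; x++)
             fb[y * finfo.line_length + x] = 0x80;

     munmap(fb, finfo.smem_len);
     close(fd);
     return 0;
 }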

Keeping the graphics subsystem out of the kernel is rather nice for system security/stability. When Microsoft released Windows NT 4.0, system stability went down compared to NT 3.5. One major reason is that they moved the frame buffer support into kernel space, with the attendant result that buggy drivers and the like were now in kernel space and thus able to crash the whole system. X server crashes generally don't bring down Unix boxes. (That does assume that users know how to change virtual terminals and/or log in remotely, as the primary UI may be hosed.)

Are you aware that every single other driver resides in the kernel? Microkernel designs and whatnot are nice, but all important consumer OSs (including various Unixes) that I am aware of are not microkernels.

Anyways, your comment perfectly highlights the incredibly braindead design decisions of X. (Read: complete lack of any semblance of architecture.) The framebuffer/VRAM driver needs to be completely separate from the graphics engine, living in separate spaces; the framebuffer needs to be part of the kernel, whereas a graphics engine like X is simply a user process. The only way to crash a framebuffer driver is by crashing the GPU, and there is no way a user process could accomplish something like that through sending commands to the composition library.

Of course X, in their infinite wisdom, decided to act on opposing principles and amalgamated all the code they could get their hands on into one huge mass of a spaghetti-code super-user process. Brilliant, what can I say. -- IvanTkatchev

These things are not "necessary" - rather, they highlight how a proper graphics architecture stack must work. You can ignore the last 25 years of industrial experience that made modern graphics possible and persist in bolting new ugly hacks on top of old ugly hacks, (c.f. "direct rendering" or whatnot) but please be aware that in doing so you basically lag several years behind decent architectures in features while at the same time costing significantly more and requiring disproportionately larger resources.

There is a real reason why everybody in the world does things differently from X. The Unix crowd really needs to swallow its pride and face reality, otherwise the next generation of GPU technology will relegate Unix to the dustbin of history for good. -- IvanTkatchev

The Unix crowd that is living in an arrested-development fantasy of 1970's hardware "boxen". Like it or not, modern "application servers" and "office productivity machines" are running on modern hardware, and supporting modern hardware means absolutely supporting things like GPUs. Currently, it is still viable to emulate a 1970s-fantasy-land just to satisfy Unix bigots, but there will come a time when paying through the nose to placate bigotry and ill education will become unprofitable. And at that point Unix will die, unless there comes along someone smart enough to understand reality. Unfortunately, for most "geeks" wearing their unixness as a badge of merit is more important than solving real problems efficiently. Such is life, and I don't really want to dedicate my life towards the goal of saving Unix. It is high time for progress already anyways. -- IvanTkatchev

I think you've missed my point, which is that the majority of applications for which Unix/Linux/*x boxes are used do not require "next generation GPU technology." Indeed, many Unix boxen are run headless in racks. Do you plan on plugging a 27 inch plasma screen into every 1U rackmount server so the computer room techs can reconfigure Apache in full 3D? Do you believe every application demands high performance graphical capability? The world, in fact, does not consist solely of games, simulators, and scientific visualization. There is a big place outside of these domains, and most of it is non-graphical -- let alone something that demands "next generation GPU technology." Certainly modern hardware will come with such things on-board, but will it be needed for anything except displaying a minimal UI? For data processing and serving (I'll make a wild guess that's what the majority of Unix boxen are doing) X is sufficient for the task and nicely handles the common case where the host is headless and the display is geographically remote.

Now I'm not defending the limitations of X, and I agree that X is behind the times for (say) running games on Linux, but I doubt that's an area of interest except for a small handful of zealots. I hardly think failing to solve their "real problems" is going to matter even half a whit to the rest of the Unix community, and it certainly isn't about to relegate Unix to the dustbin. At some point in the future, it may come to pass that a minimal UI (for some new definition of "minimal") will require advanced GPU technology, but I'm sure Unix developers will accommodate it when it happens, just as they have accommodated real needs in the past instead of catering to unrealistic fantasies about the death of Unix because GTA3 won't run quite fast enough. -- DaveVoorhis

I think you missed my point. The reason things like GPUs exist is not for playing games or doing visualizations; they exist to:

No matter how "minimal" your UI is, it will always run more effectively on a GPU; after all, two processors are always better than one. The point is that modern computing is moving towards modern hardware architectures (read: advanced parallelism) and Unix is quickly losing pace thanks to bigotry and failure to understand basic industry practices. -- IvanTkatchev

This is patently and absurdly incorrect. Right now, all research into massive parallelization of programming languages, and exploiting GPUs to perform scientific-grade computation are all being spearheaded on Unix-based systems. --SamuelFalvo?

[Your use of rude phrasings is irritating; IsYourRudenessNecessary? You're on the verge of triggering some of my own.]

[Also, you are simply factually incorrect. First off, modern GPUs are supported with X11 (go look at Nvidia's site, for instance, amongst many other places that prove similar points). Secondly, consider that no practical GPU (read: ATI or Nvidia) is as generally capable as a general purpose CPU, which means that GPUs CANNOT run an entire UI by themselves.]

Turing says that any program is capable of supporting any other program. (More or less :)) That's not really the point, though. The point is that the X 'architecture' is not at all equipped for dealing with things like GPUs. (Read: using X comes at a significant resource cost without any gain whatsoever.) As for the second point -- well, duh. That's exactly why it's called a GPU and not a CPU.

[I happen to be pretty well acquainted with the entire history of graphics technology, hardware and software both, and you are talking about what was first called by Ivan Sutherland "the great wheel of reincarnation". If you are unaware that current GPUs are a matter of history repeating itself, or what that first history was, and therefore what inevitably follows, then you are ill-equipped to call the rest of us ignorant about the whole topic.]

Ah, the "great wheel of reincarnation". Nice name, but does having a fancy name excuse the inability to exploit modern hardware fully?

[What you are actually doing is just tossing out flaming opinions and calling them fact. You would only be doing that if you were assuming that you know more than your audience, otherwise you'd realize you can't get away with that, it won't work. -- DougMerritt]

Not really. X lacks a well-defined graphics architecture stack. I think this is a fact, no? -- IvanTkatchev

Ivan, you wrote "... using X comes at a significant resource cost without any gain whatsoever." On the notebook I'm using right now, X is consuming 3.0% CPU and 4.3% RAM. It's certainly consuming some resources, and I'll let you decide whether they are significant or not, but I consider the gain in productivity -- at least over the immediately available alternative, which is a raw text console -- to be worth it. What do you suggest I replace X with that would improve this situation? -- DaveVoorhis

No disagreement here from me. I'm speaking more along the lines of an ideal not-yet-existing system that would be designed along proper guidelines. IMO, this is a serious issue and the Unix people should start thinking about it now. -- IvanTkatchev

Nice idea, but who are "the Unix people" that should start thinking about it? Sun? The SCO Group? Redhat? XFree86.org? Linus Torvalds? IBM? An ambitious new OpenSource project inspired by this WikiPage? The phrase "uphill battle" comes to mind... This sort of "you guys oughta build a better mousetrap because the existing mousetrap sucks"-type complaint/recommendation seems to be popular these days -- especially among folks who probably can't code their way out of wet tissue paper -- but the answer is always the same: If you don't like it, you should do something about it. If you're convinced the world needs an X++ (Y?), maybe you should stop whinging and create it. -- DaveVoorhis

No, not really. DirectX suits me fine for now, thanks. It's not ideal, but at least it lets me access the hardware I bought. -- IvanTkatchev

Is anyone else bothered by this comparison? It seems rather silly. X may have problems, one of them possibly being the lack of widget standards leading to different applications looking different, depending on the API they use, but that has nothing to do with DirectX vs X. In fact the whole comparison is somewhat meaningless. DirectX is an API for accessing direct rendering capabilities, input devices and the like. In the X world the nearest comparison would be with the direct rendering interface, but even that does not make much sense. A more sensible comparison would be to SimpleDirectmediaLayer + OpenGl. Both of which work fine with X. -- KristofferLawson


The X11 design breaks a very important principle: optimize for the most common case. 99% of the computers display graphics on the local machine. Windows GDI, Direct3D, OpenGl are good designs for this situation, while X11 sucks big time. Performance numbers and market forces have spoken and consumers voted with their feet. The rest of the pro-X11 arguments are largely a matter of handwaving and sheer speculation.

The common case is rather optimized (at least on Unix machines): Unix-domain sockets, which the X11 protocol runs over when client and server are on the same machine, are a very fast IPC mechanism. OpenGL is an orthogonal issue--it happily runs on top of X. If you think that the performance bottleneck in an X deployment is the socket layer, you ought to look elsewhere. Unix-domain sockets are, AFAIK, only used for notifications if you also use the shared-memory extension. That makes things even faster still!
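For the curious, the shared-memory path (the MIT-SHM extension) looks roughly like this from the client side - a hedged sketch assuming libXext, with the image size made up and the surrounding window and GC taken as given from the earlier sketches:

 /* Sketch of the MIT-SHM fast path: pixel data lives in a shared segment that
  * the server maps directly; only a small XShmPutImage request crosses the
  * (local) socket. Build: cc shm_demo.c -lX11 -lXext */
 #include <X11/Xlib.h>
 #include <X11/extensions/XShm.h>
 #include <sys/ipc.h>
 #include <sys/shm.h>

 void put_shared_image(Display *dpy, Window win, GC gc) {
     if (!XShmQueryExtension(dpy)) return;   /* server lacks MIT-SHM */

     XShmSegmentInfo shminfo;
     XImage *img = XShmCreateImage(dpy, DefaultVisual(dpy, DefaultScreen(dpy)),
                                   DefaultDepth(dpy, DefaultScreen(dpy)),
                                   ZPixmap, NULL, &shminfo, 640, 480);

     shminfo.shmid = shmget(IPC_PRIVATE, img->bytes_per_line * img->height,
                            IPC_CREAT | 0600);
     shminfo.shmaddr = img->data = shmat(shminfo.shmid, NULL, 0);
     shminfo.readOnly = False;
     XShmAttach(dpy, &shminfo);              /* the server maps the same segment */

     /* ... client renders pixels into img->data here ... */

     XShmPutImage(dpy, win, gc, img, 0, 0, 0, 0, 640, 480, False);
     XSync(dpy, False);
 }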

Pray tell, what do "domain sockets" have to do with pumping texture data to VRAM? Welcome to the '90s, dude. We have these magical things called "GPU"s now.

You miss the issue. X lacks any concept of architecture. The issue of using or not "domain sockets" resides in a completely different and separate architectural layer. (c.f. above.) -- IvanTkatchev (Just a thought -- maybe the reason people get angry with X11 is precisely because it supports so many different architectures at once. They fail to see the forest (the meta-architecture) for all the trees (architectures). --SamuelFalvo?)

On modern Windows machines, most graphics rendering is done in the kernel--which also requires a context switch. [[ Actually, that's false. Windows rendering requires only a privilege elevation, from user to kernel mode, which is far cheaper than a context swap. The most expensive thing about a context swap is the TLB flush -- the VM hardware has to dump all of its cached VM mappings and rebuild them on each context swap. This does not occur with U/K or K/U transitions. Thus, Windows GDI U/K/U transitions cost quite a bit LESS than even a single process-to-process-and-back context swap. -- ArlieDavis ]]

Which "performance numbers" are you referring to? If you are going to mention any statistics/research, a pointer to same would be nice. As far as customers voting, might I suggest that the selection of Mac|Windows|UNIX/Linux has little to do with X, and has much more to do with other concerns, such as availability of applications, (perceived) ease of use, security, political concerns, and compatibility/familiarity? I know of nobody who says "I was thinking about Linux but switched to Windows because the GUI is slow."

Linux can hold on to this nonsense at its own peril. One can certainly claim that Gnome or KDE are good window managers, but people who tried the latest Gnome or KDE on a 1Ghz PC with lots of RAM and felt the pain of immensely bloated and performance insensitive software packages burning CPU cycles mindlessly just to move a few bytes around to no avail, surely know better.

One would be foolish to claim that either Gnome or KDE are WindowManagers--they're not, at least not as the term is defined in X11 parlance. (Both provide window management functionality, but both are intended to be session managers; with Gnome at least you can replace the default WM with one of your own.) I've got SUSE 9.3 installed on my PC, and it runs fine. Perhaps the performance issues are elsewhere?

KDE (more specifically Qt), it should be pointed out, will happily run on top of Windows as well as X.

Actually, one thing that has long caused performance problems with KDE/Gnome on Linux has nothing to do with X. Microsoft, for all their faults, has developed a very fast method of loading DLLs (at the expense of complicating their generation) and other binary components; the dynamic loader on Linux (ld.so) has been, for a while, behind the curve. Recent Linux distros have improved the toolchain greatly. However, if you think that applications take too long to start under KDE/Gnome; chances are it's ld.so and not X11 that's at fault.

One other longstanding problem with Linux is chipset manufacturers who don't support Linux with drivers. This is also going away as Linux increases its market share, but there are a few chipsets out there that still have better performance under Windows than Linux because the vendor won't write a driver or publish the specs.

How you update the screen and get user interaction at a distance is an altogether different problem, and one of the classic papers coming from Unix heritage dispels the myth that you can design the same for distributed computing as you design for local computing - AnoteOnDistributedComputing. X11 violates this design principle just the same as it breaks the other. Maybe, just maybe, X11 is a good design for the networked UI problem.

Perhaps. In the case of X; I think the correct decision was made. The distributed nature simply isn't a performance bottleneck for X applications. It's just not.

[What is the claim here, that there is an alternative for Unix workstation users? It's not just about Linux/BSD hobbyist home users, after all. What alternative is that? I don't see that a difference of opinion constitutes "trolling".]

I was unclear about one of my major points in the above: this page's title includes the word "protocol", but there's extremely little discussion of X11's protocol per se here, and what little discussion there is, is not only wrong but also essentially completely uninformed.

The rest of the discussion is about things that are not necessarily related to X11's protocol at all (except that some have expressed the clear but unsubstantiated opinion that perhaps it shouldn't even exist). There's a lot concerning the inappropriateness of X11 for top speed highest quality 3D games, and that is partially true, but not completely true, since those who say so are apparently unaware of the direct rendering interface, yet any criticism of X11 in this area should single that out directly for a technical critique.

Similarly there seems to be a misunderstanding of OpenGl versus X11, whereas actually it has been incorporated into X11. So has e.g. DisplayPostscript (which does indeed have some very nice features, and like Tom, I was sad when Sun's NeWS failed the marketing war). Etc, etc, etc. X11 has a vast number of extensions, and certainly has not stood still for 20 years.

As I keep saying, X11 does indeed have problems (from the start, I've always been unhappy with the sheer size and also clumsiness of the overall API), but it also has strengths that are misunderstood by non-X11 programmers.

And then finally, even if it did completely suck, there is no complete replacement for it suitable for the same range of purposes on Unix/Linux systems, and telling people to change their hardware platforms and OSes just doesn't cut it; on those platforms, it will not be dead unless and until there is an alternative.

Note that one of X11's strengths is that it is uniquely platform independent, and that is not similarly true of any of X11's competitors on any platform. To miss this point is to miss the entire point of X11 from the beginning of time. -- Doug


I know little of the internals of X11. My opinions of it have been formed mostly as an end user. Every instance I've worked with has seemed slow and clunky compared to other window systems on similar or identical hardware. It's hard not to conclude that the fault lies in X itself and not specific implementations or uses of X. Can anyone point to modern X implementations that don't suffer by comparison with Windows? -- EricHodges

I'm currently running Suse 9.3 on an AMD Athlon 64 of respectable-but-not-great speed. X runs fine. KDE runs fine as well; my only complaint is that OpenOffice takes forever to start up; which I suspect has little to do with X. Unfortunately, X often gets blamed for things which are not really its fault (such as slow load times for applications, poorly-tuned virtual memory systems, inefficient dynamic loaders, etc). The Linux community (and this means both the KDE and Gnome camps, as well as those working on distros, the Gnu toolchain and the kernel itself) could do quite a bit to improve the performance of interactive GUIs on Linux. I don't think that scrapping X itself is the answer. At any rate, the situation has improved quite a bit in the last couple of years; there's a night and day difference between my previous PC and my current one. Part of that may be MooresLaw; but part of that is improvements on the SW side. Of course, all of this is anecdotal evidence and speculation; but your experiences are also anecdotal. (Which doesn't make them invalid; when a customer complains it isn't good to tell him "but 80% of customers are happy; what's wrong with you?")

Is that XFree86? What graphics card? How does it compare to Windows? Ignore start up times and look at basic responsiveness. I agree that X is "fine" compared to X a few years ago, but it never seems "fine" compared to Windows today. -- EH

[At the moment, I'm using an ancient HP OmniBook XE2 with a PII 400. I have Fedora Core 4 on one partition and XP Pro on another. I mainly use it to do 'net things or program in Java using Eclipse, so I can use either XP or FC4. I have no particular operating system bias -- as far as I'm concerned, from an OS architecture point of view the similarities between XP and Linux far outweigh the differences. Yet I prefer to use FC4 for two reasons: (1) The vast gaggle of utilities, applications and goodies that come pre-installed with FC4 make it worth enduring the relatively slow start-up time and sluggish OpenOffice load-time. (2) The fonts look better under FC4. The difference in graphics performance between XP and FC4, on the built-in Silicon Motion Lynx EM, is not sufficiently different for me to notice, let alone make me choose one OS over the other. -- DaveVoorhis]

Interesting. Do you notice a difference between Java Swing apps and native Windows apps? -- EH

I'd like to hear about that, too. Also, I'm surprised that fonts in general look better under FC4; I'm aware that they could, but historically there are issues about which particular fonts are included in a distribution, so I'd like to hear more about that, too. -- Doug

[Java Swing apps are noticeably sluggish compared to native Windows apps and, for example, GTK apps under FC4. I wish I could recall which RedHat/Fedora release significantly improved the font quality, but I can't. It's been there for a while, for some unknown value of "a while". If you want to see what I see, go to http://shark.armchair.mb.ca/~dave/FC4Desktop/ and tell me whether that's better, worse, or the same as XP. Maybe I've simply gotten used to it. -- DaveVoorhis]


See also: RemoteGuiProtocols LetsBlowUpTheUniverse


JulyZeroFive


CategoryRant CategoryXwindow

