MicrosoftWindows is beyond repair. It started out with a lax security model that allowed any program to clobber any file or resource. Vista tried to tighten this up by putting up all kinds of warning messages as applications and drivers went about their usual business under the assumptions of the old, lax security model. The warning messages are nearly useless because the user cannot change the applications and drivers that make those assumptions. The user can only upgrade their applications, assuming a Vista-friendly version is available, which may not be the case. Maybe in five years or so most new applications will be Vista-friendly and won't trigger warning messages, but what is someone supposed to do in the meantime? MS is trying to remedy this by putting in all kinds of per-product exceptions. Thus, the code is full of IF statements (or the equivalent) that hard-wire product-specific exceptions, which makes it a BigBallOfMud. The smart thing to do would perhaps be to start over. However, if people have to start over anyway, they might as well switch to Linux, Mac, or the like, which MS clearly does not want. Thus, they throw armies of programmers at their BigBallOfMud to protect their turf. A switch might happen anyhow, as the cost of maintaining the ever-growing BigBallOfMud grows so large that alternatives look both cheaper and less problematic, at which point the QwertySyndrome threshold for change is reached.
I encountered this article just after writing the above:
http://arstechnica.com/articles/culture/microsoft-learn-from-apple-II.ars
Quote:
And it's not just third parties who suffer. It causes trouble for Microsoft, too. The code isn't just inconsistent and ugly on the outside; it's that way on the inside, too. There's a lot of software for Windows, a lot of business-critical software, that's not maintained [changed] any more. And that software is usually buggy. It passes bad parameters to API calls, uses memory that it has released, assumes that files live in particular hardcoded locations, all sorts of things that it shouldn't do. If the OS changes underneath—to prohibit the reuse of freed memory, to more aggressively validate parameters, to stick more closely to the documentation without making extra assumptions or causing special side-effects—then these programs break.
So Windows has all sorts of bits of code which are there to provide compatibility with these broken applications. It's hard for MS to maintain and fix this code, because it means the code no longer does what it's documented to do; it does that plus some other stuff. It's hard to test, because there's no knowing exactly what broken things programs are going to try to do. And it makes things more expensive; Microsoft has all sorts of special behaviors it needs to preserve. This means that not only can it not make the API better—it can't even easily make the API's implementation better. It's all too fragile.
(end-quote)
"How Microsoft Lost the API War"
http://www.joelonsoftware.com/articles/APIWar.html
Excerpt:
"...the hit game SimCity, who told me that there was a critical bug in his application: it used memory right after freeing it, a major no-no that happened to work OK on DOS but would not work under Windows where memory that is freed is likely to be snatched up by another running application right away. The testers on the Windows team were going through various popular applications, testing them to make sure they worked OK, but SimCity kept crashing. They reported this to the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if it did, ran the memory allocator in a special mode in which you could still use memory after freeing it. ... This was not an unusual case."
Thus, there's a lot of internal Windows code that does things like:
 if (app_executable == "foopro.exe" && app_version < 4.5) {
     special_behavior_for_foopro(...);
 } else {
     typical_behavior(...);
 }
(Kind of reminds me of developing in HTML-embedded JavaScript for different browser versions.)
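To make the pattern concrete, here is a minimal sketch of what such a quirk table might look like; the names (CompatEntry, needs_lenient_allocator, the table entries) are hypothetical illustrations, not Microsoft's actual code:

 // compat_quirks.cpp -- hypothetical sketch of hard-wired, per-product
 // compatibility exceptions; NOT actual Windows code.
 #include <iostream>
 #include <string>
 #include <vector>

 struct CompatEntry {
     std::string executable;          // which program triggers the quirk
     double      max_version;         // quirk applies to versions below this
     bool        allow_use_after_free; // SimCity-style allocator mode
 };

 // Each entry here is one more hard-wired IF buried in the platform.
 static const std::vector<CompatEntry> quirk_table = {
     {"simcity.exe", 2.0, true},   // keep freed memory usable
     {"foopro.exe",  4.5, true},   // same workaround, different app
 };

 bool needs_lenient_allocator(const std::string& exe, double version) {
     for (const auto& q : quirk_table)
         if (q.executable == exe && version < q.max_version)
             return q.allow_use_after_free;
     return false;  // well-behaved programs get the strict allocator
 }

 int main() {
     std::cout << std::boolalpha
               << needs_lenient_allocator("simcity.exe", 1.0) << "\n"  // true
               << needs_lenient_allocator("simcity.exe", 3.0) << "\n"  // false
               << needs_lenient_allocator("notepad.exe", 1.0) << "\n"; // false
 }

Every entry in such a table is a behavior the platform must preserve indefinitely, which is exactly what the quotes above mean by "too fragile."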
Yes, but everybody says web development is great, AJAX is the future, we love JavaScript, and we hate Windows because MicrosoftWindowsBeyondRepair. See the bias here? TheWebIsBeyondRepair too, but if someone complains, everybody else gives him or her that "you are crazy" look.
It goes on to say that MS has started moving away from this approach and will now willingly break compatibility and live with the consequences (see VbClassicMigrationConcerns). The consequence appears to be that it's driving away Windows developers. DotNet is popular, but it's far easier to clone because its UI uses mostly web standards not controlled by MS. And MS has to mostly give it away for free. Thus, MS is caught between a rock and a hard place. Other MS attempts at controlling post-Windows standards with semi-web approaches have so far faltered or flopped, and they put MS in direct competition with Adobe and Java applets, which are better established.
Perhaps related to issues raised in ExtremeProgrammingForPlatformSoftware - good factoring and backwards compatibility are often competitors.
On the other hand, good factoring and religious use of unit testing necessarily implies isolation from the underlying OS too. Remember, the goal of unit testing is to isolate the unit; this includes isolation from the host OS if necessary. For example, when writing networking applications, if you don't mock the sockets API, you're asking for trouble. Mocking sockets is the only way you can inject a slew of controlled network scenarios, like, "oops, I just lost network connectivity!", or, "OK, I received the data loud and clear, but I'm going to inject a 3-bit error and a matching checksum."
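As a minimal sketch of that idea -- the Socket interface and MockSocket names are hypothetical, not any particular mocking framework's API:

 // mock_socket.cpp -- hypothetical sketch of mocking a socket so tests
 // can inject failures the real network rarely produces on demand.
 #include <cstdint>
 #include <iostream>
 #include <stdexcept>
 #include <vector>

 // The unit under test talks to this interface, never to the real sockets API.
 struct Socket {
     virtual std::vector<uint8_t> receive() = 0;
     virtual ~Socket() = default;
 };

 struct MockSocket : Socket {
     std::vector<uint8_t> canned_data;
     bool drop_connection = false;  // simulate "oops, I just lost connectivity!"
     bool flip_bits = false;        // simulate corrupted-but-plausible data

     std::vector<uint8_t> receive() override {
         if (drop_connection)
             throw std::runtime_error("connection lost");
         std::vector<uint8_t> data = canned_data;
         if (flip_bits && !data.empty())
             data[0] ^= 0x07;       // inject a 3-bit error
         return data;
     }
 };

 // Stand-in "business logic": sums the payload bytes.
 int payload_sum(Socket& s) {
     int sum = 0;
     for (uint8_t b : s.receive()) sum += b;
     return sum;
 }

 int main() {
     MockSocket sock;
     sock.canned_data = {1, 2, 3};
     std::cout << payload_sum(sock) << "\n";   // normal case: 6

     sock.flip_bits = true;                    // corrupted case
     std::cout << payload_sum(sock) << "\n";   // no longer 6

     sock.drop_connection = true;              // failure case
     try { payload_sum(sock); }
     catch (const std::exception& e) { std::cout << e.what() << "\n"; }
 }

None of these failure cases can be produced on demand against a live network, which is why the mock earns its keep.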
PhlIp's book on unit testing GUIs might do something similar, but from a GUI perspective; I'm not sure, not having read the book. But conversations I recall having with him back in the CalDev days seem to suggest I'm on the right track here.
With this in mind, good factoring implies excellent forward compatibility -- if the underlying platform changes out from under you, you can write new shims that interface your cleanroom units to the filthy reality of the host OS, without having to touch the business logic of the software you're peddling. Backward compatibility, however, is something you can plan for. As long as you think in terms similar to the Iroquois nation's "seven generations" rule, you're golden.
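For instance, a minimal sketch of such a shim (the FileStore and LocalDiskStore names are hypothetical, purely for illustration):

 // shim_example.cpp -- hypothetical sketch: business logic depends on a
 // clean interface; only the shim knows about the host OS.
 #include <fstream>
 #include <iostream>
 #include <string>

 // The cleanroom interface the business logic is written against.
 struct FileStore {
     virtual void save(const std::string& name, const std::string& data) = 0;
     virtual ~FileStore() = default;
 };

 // One shim per platform generation; when the OS changes underneath,
 // only this class is rewritten -- the business logic never notices.
 struct LocalDiskStore : FileStore {
     void save(const std::string& name, const std::string& data) override {
         std::ofstream(name) << data;   // today's "filthy reality"
     }
 };

 // Business logic: knows nothing about paths, drive letters, or OS APIs.
 void save_invoice(FileStore& store, int invoice_id) {
     store.save("invoice_" + std::to_string(invoice_id) + ".txt",
                "amount due: 42");
 }

 int main() {
     LocalDiskStore store;
     save_invoice(store, 1001);
     std::cout << "saved\n";
 }

When the host OS changes its filesystem rules, only LocalDiskStore gets rewritten; save_invoice and the rest of the business logic never notice.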
In fact, I find no fault with software built on the philosophy that, for any feature F, F must be supported for no fewer than three software generations.
Note that a generation of software might correspond to a handful of versions. For example, DOS 1 was 1st-generation, DOS 2 through 4 were 2nd-generation, DOS 5 and 6 were 3rd-generation, and DOS 7 (a.k.a. Windows 95) was a 4th-generation product.
For example, we're seeing this now with HTML 1/2/3.x features being carried forward into HTML 4 and HTML 5. Another example: Commodore used this approach with its system software, and it worked astonishingly well. Some might cite the 1.3-to-2.0 transition as proof that it doesn't work; no, it's actually proof that software developers didn't believe Commodore was serious about the policy. I note that the transitions from 2.0 to 2.1, to 3.0, to 3.5, and now to 4.0 weren't nearly as rough as the 1.3-to-2.0 transition. That's because developers now heed the warnings.
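As a concrete sketch of what enforcing a three-generation rule might look like in code (the generation numbers and feature names here are invented for illustration; this is one plausible reading of the policy, not Commodore's actual mechanism):

 // deprecation_policy.cpp -- hypothetical sketch of a three-generation
 // support rule: a feature works for three generations starting with its
 // debut, warns in its last supported generation, and is refused afterward.
 #include <iostream>
 #include <string>

 const int CURRENT_GENERATION = 4;

 struct Feature {
     std::string name;
     int introduced_in;   // generation where the feature first shipped
 };

 // Returns false once the feature has outlived its three generations.
 bool use_feature(const Feature& f) {
     int age = CURRENT_GENERATION - f.introduced_in;
     if (age >= 3) {
         std::cout << f.name << ": removed, update your code\n";
         return false;
     }
     if (age == 2)
         std::cout << f.name << ": deprecated, gone next generation\n";
     return true;   // still supported
 }

 int main() {
     use_feature({"new_api", 4});       // fresh: silently works
     use_feature({"old_api", 2});       // age 2: works, but warns
     use_feature({"ancient_api", 1});   // age 3: refused
 }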
Perhaps Windows can be compared to the Mars rovers: you can still get science out of them years after their golden days have passed, but you'll have to spend ever more time diagnosing and futzing with half-working parts as time goes on, until you reach a point where the futzing costs more than it's worth. -t
Another favorite quote: "You know if half your CPU cycles are spent preventing malware takeovers, something is deeply wrong." [paraphrased] -t
Very cute, topper, but still wrong. There is a reason why the entire world revolves around Windows desktops while everything else combined is a distant 8% or so. (Kinda hard to track these numbers any more.) Futzing, malware monitoring, updating, and all other maintenance put together amount to less than a few percent of Windows-based activity on any modern Core i3 or better machine. Windows is still the desktop environment of choice for businesses around the world, and it will continue to be until a superior alternative is actively supported by second- and third-party software developers with the kind of commitment and depth that Windows-based applications get.
Oh, well.
In the above, replace "Windows" with "DOS" and "Core i3" with "486DX2", and you've got exactly what everyone was saying in 1992.
If the malware scanners are tuned just right, perhaps. But often they are not. Also, I find that the disk becomes the bottleneck when you have a gajillion cores, and malware scanners are disk-intensive. Further, even the big-name scanners often miss stuff, so one has to run multiple brands. As for Windows' "popularity": QwertySyndrome. Most would get Macs if not for the price and business-software compatibility. -t
Please, top. You are incorrect in thinking that the world would embrace Crapple's tightly closed, highly regimented, my-way-or-the-highway boxes and OSes were it not for the inflated price. In fact, you are not merely wrong but insane to think that the network-hostile, administrator-hostile, developer-hostile Crapple environment would survive in a modern market but for the hype and advertising that Crapple pours out to support these things.
See: ErrorsBecomeFeatures