After so much time and so much code, people deeply exposed to Unix tend to think about things in a UnixWay. Some programs are unixy, and some are not. PerlLanguage is (consciously) unixy, BasicLanguage is not. EmacsEditor is, MicrosoftWord isn't, Wiki is halfway.
Taken together, a few books capture most of it:
Those in this culture perceive TheNotUnixCulture as insensitive.
Programs are hardly ever ported from TheNotUnixCulture; instead, the assumption is that if you weren't some kind of weirdo you'd be using "not Unix" like everyone else.
"Unix is the latin of computer science." -- WardCunningham
Except it ain't dead yet; though there are many who wish it would be. Including yer boss. :)
Live Free Or Die
"...Armando Stettner, one of the early Unix developers, used to give out fake license plates bearing this motto under a large Unix, all in New Hampshire colors of green and white"
http://catb.org/~esr/jargon/html/L/Live-Free-Or-Die-.html
I am very pleased to have one of these, from the old days. The thing people don't understand about us UnixWeenies? is summed up, from our point of view, in this borrowed phrase. UnixOs isn't perfect, but it is freedom. All other operating systems before and since are fascist bondage and discipline "you can't get there from here" systems, no matter what individual nice features they have, or (these days) how flashy their GUI.
We value freedom, thus "Live free or die!"
I don't see this as a TermOfAbuse. There is a Unix culture, with its own conventions and design principles. Even systems like DOS and NextStep have absorbed a certain amount of the Unix culture.
The history of UnixOs is rather like the history of British-style parliamentary democracy. We sort of invaded lots of other countries and left behind similar systems of government. Perhaps "porting" has done to Unix what colonization has done for UK-style government.
For example:
Australia ... quite a lot of influence. Some Australians even want the queen to remain head of state. BsdOs is in a similar situation... almost "vanilla unix".
US ... similar in origin. e.g. adversarial system, legal precedent matters. MicrosoftWindows/MsDos is like that - similar in origin but has departed on its own path. (No offence intended here, honest.)
India ... somewhere between AUS and USA in influence, but the common influence from outside is clearly grafted on top of a different existing culture. E.g., a BSD compatibility layer on top of Mach and OpenStep.
Then again, perhaps the CeeLanguage is more influential than the OS ... Sort of the "International English" or the Latin of its day? A LinguaFranca for writing on the metal? --
"sort of invaded"? Surely we invaded, or we didn't? -- RogerLipscombe
Nice to know that my gentle understatements are appreciated. ;-) --
UnixOs in the movies (sad but true):
- Wayne's World 2 ("The Unix book? <geekish smile> Cool.") http://us.imdb.com/Title?0108525
- Jurassic Park (Even little girls hiding from dinosaurs know how to use Unix.) http://us.imdb.com/Title?0107290
- TronLegacy? (we actually watch ps, kill, whoami, uname, login, history, and /bin/LLSDLaserControl being typed. Some of it in a shell running in Emacs.) http://www.imdb.com/title/tt1104001/
- TheSocialNetwork? ("...break out Emacs and modify that Perl script...") http://www.imdb.com/title/tt1285016/
- LaFemmeNikita? (the '90s TV show, not the one from this century). Nikita had to take down the computers in the evil base or something and had the computer geek guy on the earphone. His instructions were: "type ps ax and there'll be a number on that line... OK, now type kill dash 9 and whatever the number was".
Apparently, megalomaniacs bent on world domination prefer OS/2 (if you believe the product placement in Goldeneye).
Then, we never have to ever worry about world domination, do we?
I wouldn't be so sure. There appears to be a new breed of supervillains: http://ubergeek.tv/article.php?pid=54
Unix can be evil, too! "curses(3), fold(1) again!" twiddling moustache.
It seems to me that the UnixCulture always gets excited when really cool new operating system ideas make the scene, like in PlanNineFromBellLabs. However, that excitement appears to evaporate quickly once it's discovered that it renders most of their toolbox obsolete.
If this keeps up we'll still be using UNIX for centuries to come. :-) -- MattBehrens
That implies that the UnixCulture crowd loves their tools. I'm not sure. They love the power and flexibility, but I've never heard anyone praise the elegance and style of the interface for, say, sed or find or curses!
Curses I can't speak to, but I love sed; find is not my favorite but is useful in far too many ways to count. Whenever I have to use a MicrosoftWindows box, the first thing to go onto it is CygWin. -- MattBehrens
Even better: the first thing I install is VirtualBox and Debian.
Consumer-based OSes like MacOs and MicrosoftWindows traditionally had a captive GraphicalUserInterface with no scripting (or v. poor scripting). Facilities like WindowsScriptingHost and AppleScript attempt to compensate for these weaknesses. It'll be interesting to see what happens with them. I doubt they'll have the same following: People had to learn UnixShells and got scripting as a bonus. I expect programmers will learn wsh because they just need to do some scripting. --
Unix culture is about each tool doing one thing very well and combining all these little tools to achieve your objective. Sorry, but EmacsEditor is a refugee from the long-dead culture of LispMachines. It's an asylum seeker in the UnixCulture. That's why it doesn't work like any of the other Unix tools. Although Emacs has brought some interesting ideas to UnixCulture. Yes, this is EmacsAsOperatingSystem.
It's a real shame that none of the major tools are complete enough to do ANYTHING well. Using UnixOs commands is like being handed two bits of metal with 60-degree angles milled into them and told that hey, this is a universal socket wrench, just hold them apart at the right distance and you're set! (One of my favorite Unix memories was asking an experienced (correction: highly inexperienced; see below) Unix guy what the total size of all those .java files in that directory was, and watching as four people poked at the problem for half an hour. I finally gave up, logged in to the drive from a MicrosoftWindows box, and used "DIR".)
Is there some reason why they didn't just type "du -k -c *.java"? I wouldn't call myself an experienced UnixOs guy by any means, and I can't say which Unix-like OSs this would work on other than LinuxOs. But a guy can't really be an expert if it takes him that long to think of quickly checking the ManPages. If, say, SunSolaris's version of du didn't support the -c flag, it certainly wouldn't have taken 30 minutes to use some other program to parse and sum the output of du.
Because they didn't know "du", most likely. I freely admit that as coders, these people were really great statisticians. The fact remains, however, that I didn't know about "du" either until I saw you mention it. While ManPages are a good way to learn more about a command when you only have an approximate idea of what it does, they are worthless if you don't know which one to use. (Some cross-references in the man pages would get around this... but there are none.)
You know, you could always just "man -k [command related word]," or "apropos [word]." The ManPages are not worthless if you know how/when to search them... I have found many useful commands this way. Of course, this gets into something strange, because how would you search the man pages for "how to search for a command" without knowing how to search the manuals in the first place? This is a typical Unix problem, I would agree: not knowing how to know how to do something. And this is where the oral tradition of Unix comes in, i.e., a Google search.
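For what it's worth, a sketch of that kind of search, assuming a system whose whatis database describes du along the lines of "estimate file space usage" (descriptions vary between systems, so the keyword is a guess):
apropos space | grep -i usage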
I cannot believe this story either. I am far from being a UnixOs expert, but "wc *.java" crossed my mind immediately. If you just want the relevant data, "wc -c *.java | tail -1" gets you there.
Of course the DIR command worked, because the example was trivial. But in a real-life project the source files are scattered along a directory hierarchy. In UnixOs I would do "find -name "*.java" | xargs du -b -c". And if I want the line count, I replace "du -b -c" with "wc -l". This is Unix culture at its finest.
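For the record, a sketch of the original single-directory question in the same spirit, assuming GNU du (the -b and -c flags are GNU-isms; on other Unixes the wc -c form above will do):
du -cb *.java | tail -1
The last line of output is the grand total in bytes, which is what DIR was reporting.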
This is why books like UnixPowerTools and a whole slew of other O'Reilly publications exist. I learned more about UnixOs from UnixPowerTools than any other book on Unix I've ever read. It has all the cross-references between different tools that any person could want.
Actually, it's worse than that. This story boils down to "the entire small-connectable-tools philosophy of UNIX is crap because there's one case where the default behaviour of something in UnixOs doesn't do what it does in MicrosoftWindows". This is the big problem with trying to make Windows people understand the philosophy. For almost everything you can type at a command line in UNIX, you can go download and possibly install something that will almost do it in Windows. DIR also sorts files. Not by any arbitrary criterion, just by the five built-in ones. Now they /look/ like a full set. And almost always are. It's that 1-in-100 case where the one you want is not on the list that's the killer, and it's where the tools approach saves you tons of effort. Sort the files alphabetically, by name, ignoring the first letter...
ls | sed "s/\(.\)\(.*\)/\2\1/g" | sort | sed "s/\(.*\)\(.\)/\2\1/g"
"Oh, " says the Windows user, "how often does THAT need doing?"
The answer is, actually, that things like that crop up annoyingly often if you're actually trying to work on things. I ended up with much more grossness sorting files that were named <random process number>.<date in MMDDYYYY format>.<extension> into date order.
-- KatieLucas
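For what it's worth, that date-order example yields to the same decorate-sort-undecorate trick; a rough sketch, assuming the names really do look like 1234.03152024.log and contain no spaces:
ls | sed 's/^\([^.]*\)\.\(..\)\(..\)\(....\)\..*$/\4\2\3 &/' | sort | cut -d' ' -f2-
The sed prepends a sortable YYYYMMDD key to each name, sort orders on it, and cut throws the key away again.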
The command line up there is a sort w/ SchwartzianTransform, an algorithm implementable in under 15 lines of JavaScript, with the WSH FileSystemObject? model to provide the file names. Care for me to demonstrate?
Er, what's wrong with ls | sort -k 1.2 ??
- Good point for this particular example, but only a few tools support changing the operative fields like that, and the tool's definition of "field" may differ from the input data, requiring sed-massaging. So of course KatieLucas is correct that mildly complicated pipelines along these general lines do in fact pop up quite regularly, and it is maddening to be on a system that doesn't allow one to do such things.
Agreed. I have a better example. When I take a backup of my Psion, I simply copy the contents of the disk to a directory whose name is today's date. This gives me a large collection of directories, nearly one per day, whose contents are nearly identical. I then index each one using this command:
find $new -type f -exec md5sum '{}' ';' | sort -k 2 > $new.ndx
Now I delete all the identical files from the older directory like this
diff -rqsb $old $new | gawk '/identical$/{print $2}' | xargs rm
It has a problem if there are filenames with spaces.
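A space-tolerant variant of the indexing step might look like this, assuming GNU find and xargs (the diff | gawk line would need similar surgery, since awk's default field splitting also breaks on spaces):
find "$new" -type f -print0 | xargs -0 md5sum | sort -k 2 > "$new.ndx"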
Real situation, real problem to solve, real code I use frequently (in a script). How would I do this in Windows? Perhaps people who use Windows simply don't do this sort of thing, and maybe that's because they can't.
- As a Windows person, I do not do these things. I plain don't. If I wanted to do that sort of file management, I'd use Subversion. (If I guessed the overall purpose of those voodoo incantations. SingleCharacterSwitchesSuck?.)
- You seem to be missing the point. I have a small number of tools with which I can do almost anything. The Windows domain has a vast number of tools; when you want to do something, you need to pick one. To me, having a small number of tools that I understand well and with which I can do everything is far superior to having to know vast numbers of tools, and even then be stuck if there's a task without an appropriate tool. -- AnonymousDonor
- Well said! That is precisely the point.
- [Edited to remove religious war/rant/"I don't like to learn" crud:] There's the point of "understanding well". The Unix tools do have a high learning curve - the number of tools isn't all that small, and they're anything but orthogonal to each other (TimTowTdi makes learning from examples much harder when the syntax isn't self-explanatory - and abbreviated-ad-absurdum tool names and said single-character switches AreNot?).
- The tool philosophy is one thing, but for a given problem there may be a far better, although specialized, solution. There is a trade-off in time spent between learning to use complex (and cryptic) tools that can cater to any special case (but make handling normal cases as complicated as handling the special ones) and learning specialized tools (i.e. more intuitive ones - badly designed or documented tools excepted) that work 90% of the time.
- Power comes at a price, and I just don't see how paying the price in learning time will make me that much more productive when using my computer.
- Who cares whether you want to or not? Most people say the same thing about learning advanced mathematics. And if they're a professional mechanic in a garage fixing cars, they don't need advanced math, it's true. Do I want to hear a mechanic explain why advanced math is useless? Hell no. Do what you want, and screw the religious war inflammatory rants. -- DougMerritt
- Speaking of rants, I don't quite get what you're on about right now. Please explain to me why the tool philosophy, not being useful for everyone or in all cases (as nothing can be), warrants such appreciation. Why does "You can if you want to" outweigh "You have to even if you don't want to"? And most importantly, why should I necessarily be an inflammatory ranter if I dare to point out that indeed sometimes (which might include quite prolonged periods of time) some things are not useful for someone?
- And no, I don't want to hear people insulting me just because I dared to point out there are opinions contrary to theirs on issues they're being memish about, and that claims are valid only in the context of personal experience and the current situation at hand. I didn't say the Unix approach is useless, I said it's not useful enough for everyone, and may lightning strike me if I'm lying now, it's not useful enough in a large enough number of cases to matter in this discussion. (Someone care to delete the IMO either flawed, or subtly insulting car mechanic analogy?)
- The simple Unix tools don't have a high learning curve, whereas the complex ones do. Personally, I find the number of Windows tools a high learning curve. I understand Python, gawk, diff, find, sort, regular expressions, and a couple of minor utilities or builtins (for example, echo). Between them I do everything I want. I use a very small number of tools that can be combined. Your comment (now deleted) about writing (sorry, "whack"ing off) a script seems to me to be making my point. Sometimes there are specialized tools that do what I want. Sometimes I know about them. Generally, it's faster for me to write a single, complex command line, or a small set of scripts, than to remember which tool to use and how to get past the user interface. Horses for courses, as they say. If you find that your set of specialized tools does what you want, great. If you can write a script to cover those other cases, even better. My work is sufficiently varied that the people in the office who use Windows can't tell me anything to use that will do it for me. HelpMeToLearn? - As a Windows person, how would you search a file system for files that are similar? Or even for files that are identical? -- AnonymousDonor
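(An aside on the "identical files" question, in the Unix idiom: one rough sketch, assuming GNU md5sum and uniq and filenames without embedded newlines, is to hash everything and print the lines whose first 32 characters - the MD5 - repeat:
find . -type f -exec md5sum {} + | sort | uniq -w32 -D
Similarity, as opposed to identity, is genuinely harder and would want a diff engine in a loop.)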
- I'm the single user of a machine on which I store precious little in the way of documents I haven't created myself, courtesy of google and broadband - therefore I am quite able to remember what files I've got stored where, and I can live with the overhead of scrolling around a folder for 1.2 seconds to find the right file before reading it for three hours. As to your question, I've never had a need to do that, so my answer is: I do not know, I do not care, and I'll probably die happy that way. I repeat - until I have to solve a problem, why'd I go about finding a way of solving it? (P.S.: I'd go on and find myself a diff engine, and use Python to glue that with that 10 line function that iterates through a directory structure. On a personal note - Unix would make much more sense to me if the tools were merely front-ends to well-documented functionality accessible from shared libraries to be combined by more granular / readable / efficient ways than subprocess spawning and unstructured byte streams.)
- As for the varied tools - I manage. I don't have problems remembering which tools do what, and I can use their interfaces quite efficiently. I'd say the responsibility for this is shared between faults of the two approaches, between faults of their concrete implementations, and between the differences in how we process data. YourMileageMayVary, in the same way I'm unable to bind functional meaning to abbreviations, sigils, and unstructured data, you find it difficult to bind said meaning to visual artefacts of a GUI.
- The Unix tools by themselves might be simple. By themselves, they're also nigh on useless. It's combining them to achieve practical results that's useful, and also hard, especially since command line invocations aren't self-explanatory (i.e., you have to look up man pages even to take a rough guess at their functionality).
- Absolutely. Also, people appear not to be aware that Unix tools (e.g. in the form of the MKS toolkit) have long been in heavy use internally at Microsoft (I experienced this directly when consulting there), for precisely the sorts of reasons AnonymousDonor is expressing. Furthermore, Microsoft has gradually but inexorably been borrowing Unix tools (and/or features thereof) to incorporate into its own CLI set from the very beginning of time - nor are native DOS tools like dir any kind of improvement in user interface over the Unix equivalents like ls; dir command line switches are cryptic and single letter, for instance. -- DougMerritt
- Of course, no one ever does things using the dir output, since some people don't find it natural to access the filesystem data, format the information into text, and then parse the text back into data, as opposed to having an easy-to-use way to access the filesystem data directly. The native DOS interface is not a way to make much of a point; the preferred way to automate a Windows machine (if you find the need to do so and the role of the machine requires it to run Windows - UseTheRightToolForTheRightJob?) is using the scripting host and COM components, which are a different beast altogether, and their quality might be subject to discussion and maybe many flames too. But then again, that says little about the approach in itself.
I think it's David who is so eloquently putting the Windows point of view, and I suspect that you're arguing against something I never said. I pointed out that Unix allows me to have near complete command of a small number of tools, and to glue them together to accomplish what I want. I also observed that frequently on a Windows system I simply can't do what I want because the tools I have been given don't contain the required feature(s). You said that the Windows environment suits you, and 90% of the time what you want to do is in the tools you have. You then added that the rest of the time you either don't bother, or you write scripts.
Is that a fair summary?
I also gave an example of something I had to do today, this morning, and you said that you've never had to, and if you did you were sure you could cobble something together. Noted. It seems that when you do so you are using what is often thought of as a Unix way of working, and not a Windowsy way of working.
Here, my point was that I don't see the sense in using a shell script or the like to solve a problem that a specialized tool already solves well. If I want to drive a nail into a board, I'll use a hammer without trying to find out how to do it with an electrical screwdriver and a Leatherman. Solving problems you encounter is not even remotely a Unix thing; it's common sense. It's just when I decide to solve them myself (which is when, and only when, they haven't been solved satisfactorily already) that [??] seems to be the difference.
- I completely agree. If there's already a tool - use it. I find that there is rarely a tool that does what I want without going through a full find/install/configure/learn/use once/discard loop.
- And to think that above I've been accused of, and I quote, "I don't want to learn crud". If the find/install takes you more than fifteen minutes in the days of Google and broadband connectivity, if you ever need to do any configuration of a Windows tool to solve common problems, or if it takes you any time at all to find out what to click, that's not a problem of either Windows or the tool. And if you're only performing a task once, you can just as well do it by hand that one time.
This isn't a value judgement, it is an observation. It's also not a 100% correlation. I use the Gimp, KMail, FireFox and other large applications that do a lot of what I want. My problem is that my "10%" is more like "50%", and I can never find a Windows tool to do what I need. Yes, my life is different from yours. I have different, complex problems to solve every week, and GUI designers of monolithic tools have rarely anticipated them. Hence I work in what is generally considered to be a UnixCulture way.
The tool philosophy in its core doesn't seem flawed to me; it's just the many quirks that bother me - mainly the lack of consistency and orthogonality. And why use half a zillion different tools/languages that do almost the same things but not enough to avoid having to use three of them, glued by sh, a most flaky scripting language, when a Perl/Python/Ruby script would do the same in a far less confusing manner. An extra line of code or a longer identifier never hurt anyone... In the UnixCulture, I lack a tendency of creating shared, reusable, programmatically embeddable discrete components managed on the OS level from which to build on, where the command line "tools" themselves would be front-ends to more generic functionality which I can use without the gyrations of marshalling data from/to byte streams and spawning subprocesses (a procedure call still is, and probably shall ever be more efficient and concise than spawning a subprocess). The world needs a BetterOs?, anyone? But that is a notion that is in Windows, even if in an extremely unknown way. The fact something isn't a full-blown process invokable per command line doesn't mean it isn't there, and I'd like to see a problem a combination of awk/sed/grep/find can solve, and Windows JavaScript cannot, although admittedly with a few gyrations in the form of utility functions to do what said Unix tools do. (Diffing might require the use of a diff engine wrapped in a COM component, which would take slightly more effort. Yet, that should indeed be doable.)
Oh how I hate ThreadMode. Let me try again. (Indeed, this needs a severe refactoring, alas, I don't have experience with that.)
- For most of what I do, there aren't any tools that I know of in any OS.
- Unix systems let me take some tools I know really well and do what I need.
- Windows systems force me to install Ruby, Python or similar and write programs.
- On Unix systems, you've got shell scripts, which are similar, only less powerful for more complex programming (mostly because they lack structured data) and with shorthand for job control and input/output redirection. The difference is that Unix systems come with the shells already present, but they're not in any way a part of the OS, only the culture, which is a result of historical evolution. If Guido van Rossum had come up with Python before sh, and that language had ended up with shell-like shorthand constructs, you'd probably be hacking that.
- I also believe JavaScript doesn't need installing, no idea if the WSH runtime objects come preinstalled though.
- On Unix systems, I can usually rattle off what I want on a single command line.
- See below, that too is programming. The command line constructs are indeed defined in terms of Unix kernel calls, and thus can be considered as functions or operators of the sh language. A rose by any other name... And terse doesn't imply better, more efficient; it only implies harder to read.
Then:
- For most of what you do, there are ready made, professionally finished, flawless programs that do exactly what you want.
- For the remaining things, you fire up Ruby, PHP, Python, something, write a quick program and do it that way.
I find Windows bloody annoying because I can rarely find a tool that does what I want. You complain that the Unix tools lack consistency and orthogonality. I'm not arguing with you about any of these points. It seems that the Windows way of working mostly suits you well. It doesn't suit me because most of what I do hasn't been anticipated by the suppliers of Windows programs.
Let me say it again. The Unix domain has a small number of tools with which I can do almost anything. The Windows domain has a vast number of tools, and when you want to do something you need to pick one. To me, having a small number of tools that I understand well and with which I can do everything is far superior to having to know vast numbers of tools, and even then to be stuck if there's a task without an appropriate tool.
And here comes the "far superior", which I consider strongly subjective. Since ease of use is always considered when specialized tools are being made, I still find it easier to learn a larger number of them. And when I am indeed forced to improvise, I can do that - the Windows domain -does- have tools, only their usage is different, and they don't come with the system (which is a result of historical evolution from a simple, single-user OS aimed at personal computers and marketing). Windows doesn't come with batteries included for the power user, which is indeed a shame, but nothing stops you from going out and buying some. Heck, getting them for free.
Said rattling off a command line is programming: you are transforming and processing data. Unix command line tools are functions/APIs, only more expensive to invoke, and they require marshalling the implicitly structured output of one to match the expected input of the next.
You said:
- "In the UnixCulture, I lack a tendency of creating shared, reusable, programatically embeddable discrete components managed on the OS level from which to build on, where the command line "tools" themselves would be front-ends to more generic functionality which I can use without the gyrations of marshalling data from/to byte streams and spawning subprocesses (a procedure call still is, and probably shall ever be more efficient and concise than spawning a subprocess). The world needs a BetterOs?, anyone? But that is a notion that is in Windows, even if in an extremely unknown way."
Quite frankly I can't understand what you're saying here. Perhaps you could start another page and explain what it is that Windows has that Unix doesn't - this text seems rather opaque to me.
- Frankly, Windows doesn't either in a sense; what I had in mind is Windows managing ActiveX components which you can interface with on a procedure call level, not on a byte stream level. Truth is, there isn't such a culture amongst Windows developers, and I'm pretty much ranting there. That's what I meant by a "Better OS". To make things less opaque, an OS where reusable components would be more integrated into the OS than Unix command line tools are, and one providing higher-level interfaces between these processes than just streams of bytes. Like being able to pass collections around, or exporting data structures.
Finally, you work well and are productive on Windows. Great - go for it and I wish you well. I wish I were too, but on a Windows system I find myself constantly trying to work out how to do things, and mostly give up and write a program. I'm more productive on a Unix system.
-- AnonymousDonor
I got my introduction to UnixCulture from Life with UNIX by Don Libes.
ls | sed "s/\(.\)\(.*\)/\2\1/g" | sort | sed "s/\(.*\)\(.\)/\2\1/g"
This, and its cousins in PerlLanguage and AwkLanguage, are for me the distinguishing characteristic of UnixCulture. If you find reading and writing strings like this fun, informative, exciting, rewarding, whatever, you are comfortable in the UnixCulture. If, to the contrary, you find it revolting, tedious, stupid, monotonous, opaque, whatever, you are not. The UnixCulture is created, populated, and maintained by people who belong to the first group.
YourMileageMayVary.
- Ah, but here's the question: what is the alternative to accomplish the same task? Everyone complains bitterly about how ugly this sort of thing is, and they use that as an excuse to use systems where such things can't be done at all. To my mind, it is better to be able to do things in an ugly way, than to be unable to do them at all!
- I HaveThisPattern. It's how I justify using templates in CeePlusPlus.
- Yeah, that's probably a reasonable comparison. GenericFunctions aren't as ugly in some other languages, but it's better to have them than not have them.
- I doubt anyone in UnixCulture thinks this is a particularly pretty solution, especially when dealing with sed's nasty and primitive RegularExpression syntax, and most UnixGeeks? find the task itself onerous and stupid, which is why they take delight in having tools to help automate such tasks (even if they're clunky and have mutually incompatible regex dialects).
See TermOfAbuse, TheNotUnixCulture
CategoryCulture, CategoryUnix