Analysis is about:
I recently discovered that science is the art of cutting (check your Proto-Indo-European roots if you doubt it). As is analysis. It's what it is, not what it's about. Everything else is something else. Analysis is science is cutting things into pieces. Period. -- WaldenMathews
I recently started arguing about ExtremeProgramming in an email conversation with a "Software Process" group, and found myself in an awkward position - someone tells me you can't write programs without analysis, but I don't know what analysis is! So, what is it, you software development gurus out there? -- BillTrost
If people like that are making such statements, the smart money says: neither do they :-) -- JimCoplien
Interesting question. I don't know what it is either. To find out I'm attempting to refactor this and WhatIsAnalysisContinued. (I'm afraid that the two pages as they stand are useless to someone who doesn't know. I'm sure that statement won't rub any of the contributors the wrong way - all of them are presumably people who know a lot about Analysis.) I'm putting my tentative synthesis of Analysis in WhatIsAnalysisAttempt. -- LaurentBossavit
A very simple model:
I may be oversimplifying a bit... -- BT
I'll give you my personal take on "analysis", which has stood me in good stead for some years now.
I belong to Group 0, people who say "analysis" is the study of something that already exists, to answer some questions about it (implications: something already exists to study; you have questions to ask). That means the development phase many software people call "analysis" is not analysis in my mind - in the sense I just mentioned. With some discussion I can usually get people to agree that far. However, there is this thing called "analysis", a term that has been floating around for decades now. What is it? So I go back to where I feel safe: there is RequirementsGathering - determining what the system must or should have or do. Something is or isn't a requirement [if it's a FunctionalRequirement?], and we have to get a yes/no decision on each such thing.
After requirements, EVERYTHING is design - invention and the selection between choices for satisfying the requirements. In my book, design starts right away, very early. So there are requirements, and there is design. That's all there is. (e.g., the programming language can be a requirement, or it can be a design choice.)
At this point, analysis fans go to one of four places, and that is OK by me, as long as I know where they are going.
Group 1 says, Oh, analysis is really refining the requirements. That's fine by me, that is easy to live with. To them, analysis means higher precision, more accurate requirements gathering. Good.
Group 2 says, Oh, analysis is really initial design. I am not so happy with this, because there is an implication of "truth" to the speaking of this initial design, as though it will somehow be correct without feedback. And so I discuss their use of "analysis" in the context of "early design decisions", which still need feedback, etc. And there is a lot of ground to cover in this discussion, and perhaps some hard choices.
Group 3 says, Oh, analysis is "implementation-independent design." And this is the only one I really disagree with, because I don't believe in such a thing, and especially don't trust giving it a truth-implying label. So another long conversation follows.
Group 4 says analysis is determining what factors lurk around and in between the Requirements (thanks to PeterMerel for bringing this up, below).
Group 4 falls on the side of "refined requirements"; it does imply being alert, and is valuable if done while alert (and it is rare to find people doing it).
"Analysis defines the ProblemDomain in engineering terms" carries no meaning to me. What are the proper parts of the ProblemDomain to define, and what is an engineering term? I don't know. I do know that if an Analyst labels an idea in the conceptual space, and gives that conceptual spot validity on a piece of paper, then that item blocks from consideration a lot of other, nearby spots in the conceptual space - and does so without the benefit of software design feedback - i.e., is a Group 2 or 3 use of "analysis". If an Analyst states that the domain involves an Agreement as a relation between two Parties, then a subsequent Designer will have to work extra hard to convince people that Agreement and Party are not good abstractions for the software system at hand (pretending for the moment that such is the case) - quite possibly, good software abstractions are also close at hand and also would model the domain nicely - only, the analysts didn't know enough to select them. The term "analysis" lends an unearned aura of truth and sanctioning to the abstractions. I vote for just calling that work a (domain-) design from the start, and make it pass two tests - is it a valid model of the domain, and does it make good software?
p.s. "In XP, analysis and design activities are mostly done only when..." presupposes agreement on the meaning of analysis, which we don't have. I actually do believe XP people actually do DesignBeforeCoding (see there) [historical note, this paragraph was written back in 1998, before the XP people admitted they did design ... since then they have announced loudly for all to hear that they actually do design virtually all the time]. -- AlistairCockburn
Personally, I almost never generate the term "analysis" in my own speaking. The occasions in which I say it are either: "requirements analysis", meaning, staring at the requirements, examining them for flaws and omissions, and getting extra details (RequirementsAnalysis is part of the requirements gathering work); or I say "performance analysis", "load analysis", "complexity analysis", using the original meaning of analysis as studying a design carefully to answer possibly important questions about it.
Anyhow, that's where I go with the word "analysis". And interestingly enough, I long ago had no trouble getting Jim Odell and Ivar Jacobson to agree; they said they also would prefer not to use it so casually, but had used the term analysis because of the pre-existing vocabularies of the industry. -- AlistairCockburn
In Summary,
"Analysis defines the ProblemDomain in engineering terms" might mean "Breaking the UserStorys down into EngineeringTasks. -- AnonymousDonor
A Coleman-influenced definition that I think evades Alistair's categories: Requirements defines the problem in the client's terms. Analysis defines the ProblemDomain in engineering terms. Design defines a solution to the problem in architectural terms. The problem is a surface inside the ProblemDomain that is coterminous with the external functionality of the solution.
So Analysis is a useful activity when the clients haven't got a clear notion of what they're asking you to do. The client says: "I want fingernail polish that matches the color of my eyes". The analyst says: "On average, your eyes reflect a spectroscopic wavelength of 500 nm, except in low-light conditions when they glow bright red". The designer says: "We need to combine Max Factor fingernail gloss number 3, which has that same reflective wavelength, with these handy fingernail-implanted laser beams."
Hmm. Somehow that example grew too big for its boots. Oh well. In practice you may well have the same person doing more than one of these things, but Analysis as an activity that determines what factors lurk around and in-between the Requirements seems like a worthwhile thing to me.
I believe that XP does this kind of analysis when it transforms UserStory(s) into EngineeringTask(s). -- PeterMerel
I find I'm surprisingly close to PeterMerel's description. The only tweak I'd make is that design is the transformation from the problem to the solution. I include coding as part of design. To me, analysis and design cover it all - and even they are interleaved, considered in parallel, in good designs -- JimCoplien
Coleman describes Requirements as a separate activity, and these days Architecture as something separate too. Actually in the new Fusion these are not so much interleaved as recursive. Then Coding and Testing are separate activities too.
I think I agree with Cope though - this is getting too nice about things. In practice people work in fits and starts and without any orderly progression through a set methodological spiral.
In XP, if I understand it, analysis and design activities are mostly done only when testing the deliverables actually require them. This is arsey-versey to the other methodologies, which test deliverables only when analysis/design have signed off. I really like this. -- PeterMerel
As per usual I find myself agreeing with everything Alistair says when he disagrees with me. This is regrettable, because it means I have to think. Oh well.
Engineering terms: terms that make sense to an engineer. If you have a requirement stated in terms that your engineers want clarified, then it's not stated in engineering terms. Often the process of clarification is iterative, engineers being fickle and pedantic beasts, so what was engineering terms yesterday may become hand-waving today. Oh well.
XP and test-driven analysis: DesignBeforeCoding, sure, XP seems to do that in its bite-sized chunks. But XP also does CodeUnitTestFirst. So is it chicken or egg? I think now XP goes UserStory -> PlanningGame -> Carding -> EngineeringTask(s) -> UnitTests -> Coding -> Refactoring. I understand that a step can be left out if it's not needed, and I have a sneaking suspicion that the arrows can go the other way sometimes.
But PlanningGame is a ProblemDomain clarification, hence "analysis". And Carding is clarifying the solution, hence design. And I imagine they only get done if the path through to tested deliverables isn't clear, so that sounds like testing is driving analysis and design activities to me.
If those hairs are split, I really like Alistair's point that each model entity obscures the problem space immediately around it. A model entity is a place where people have stopped thinking. Since our heads are only so big, we have to stop thinking some place. The trouble with the big models that come out of Analysis and BigDesignUpFront is that where we stopped thinking two months ago bears little relation to where we stop today. Worse, having a chart of where we stopped thinking two months ago can really screw up the understanding that we have today, because we spend more time trying to reconcile the two than we do getting closer to deliverables.
I guess what I'm saying here and on AnalysisParalysis is that Analysis qua ProblemDomain clarification is useful in small chunks, but that BigModelsAreUseless. -- PeterMerel
A "big model" is also an oxymoron, since model is our adaptation of the Italian modello, which means "a small measure", as in a small measure of something too large to deal with all at once. (Food for SoftwareEtymology.) -- WaldenMathews
I know it is often useless to apply dictionary definitions to technical terms, but in this case it would seem useful to at least remember the definition of analysis: separation of a whole into its component parts. And the etymology: New Latin, from Greek, from analyein to break up.
By adopting that definition, and considering design to be synthesis, then there is a fairly clear separation of the two activities, and they can be interleaved in a project from beginning to end.
Also, wouldn't that definition mean that requirements are design? -- Bob Haugen
I used to feel strongly that analysis is the discovery of the "conceptual model" for the software... the initial reification of "things" in the ProblemDomain. I.e., "there are accounts and schedules out there, we just need to discover them." I was a bit of a Platonist that way. PeterCoad and that early OOA school spoke of how you could "discover" things and move the initial model seamlessly from "analysis" to "design". I believe that this is just like the invention vs. discovery thing in science and engineering. From one perspective you are analyzing (discovering), and from another you are inventing (designing); it doesn't matter who gets the credit (you or the universe), you are doing the same thing.
I ended up in a course on semantic modeling this term. It is just staggering how much of OO methodology comes from there. Even more staggering is the idea that the emphasis is slightly different. Different enough to cause trouble. In a semantic database, you use categories (classes) to describe things. In OO, behavior is often more important than description as a basis for conceptualization. In other words, the initial concepts that people recognize in a problem aren't where it's at. It's in the objects that you can imagine collaborating to solve the problem.
I think that analysis is whatever process you use to know what stories to write. It's figuring out enough about the problem so as to be able to write detailed requirements. As such, it is my official opinion, which is no more likely to be correct than my everyday opinions, that XP offers few if any practices addressing analysis. Exceptions are:
Just a couple of months ago, it became clear to me what the difference between Analysis and Design is, through a concrete example of two people I know. One is a stronger Analyst than Designer and the other is a stronger Designer than Analyst. I had worked with both of them for a year and had an intuitive idea of their individual strengths, but when the label was applied, it all clicked.
Basically the Analyst had strong "people" skills -- he was the primary contact to the customer. He was also good at high level conceptual design. A particular class he designed became a cornerstone of the application, but it took me months before I really understood what the class meant because it was the interface to the business domain (which I didn't understand). The Designer was our language lawyer. He had read (and remembered!) the Java Language Specification. He had designed an elegant, extensible framework for manipulating and transforming stream based relational data, which was another cornerstone of the application. I went to the Designer when I had a question about how the code worked. I went to the Analyst when I had a question about why the code needed to work a particular way.
This distinction became even clearer on a later project these guys were involved in. The Analyst had a first run at code for a new application. It had a serious scalability problem even within the required number of threads, so it was time for some major refactoring. The Designer stepped in to help and with his greater understanding of the details of threading in Java, he offered some minor modifications to the design that allowed much better scalability.
So, in summary, I guess I subscribe to the idea that Analysis is a conceptual high level model of an application in the business domain. It can be explained to the customer in terminology they easily understand. Design is a detailed lower level model of an application suitable only for communication between people familiar with software development (and, of course, it helps to know the language used). -- GregVaughn
I asked this question on OTUG but I don't think I stated it very well and didn't really receive much enlightenment. A lot of the guys who do a lot of Up Front Analysis say they are creating a simulation of the ProblemDomain. This still doesn't make any sense to me. Is that what analysis is? I can see it might help you later if it really worked, but most analysis I have seen is wrong. Perhaps it's a case of "No analysis model survives the first contact with the user (of the subsequent system)". -- TomAyerst
Analysis in software projects is (or should be) the process of enumerating all the things about which you, as developer, have no choice. All else is design. -- JohnDaniels
(i.e. Group 1)
Dilbert online had the following appropriate comment:
Engineer: "Your req'ts doc is the biggest I've ever seen. It's too big to read but I can guess from its weight what must be in there." Customer: "A multi-user global system, right?" Engineer, shaking the document: "No, I'm not getting that."
However, if you ask JamesOdell directly, he would say that "analysis is the mapping from some perception of reality to some representation of that reality." He would then go on to say that "design is the mapping from that representation of reality to a particular implementation." Analysis and design, then, are two different mappings.
I interpret that as Group 3. There is no such mapping, except that it is an artifact of the mapper's selection process, i.e., it is very much a designing activity. And, he is asserting the truth of the mapping that is really a design activity. I, personally, have no trouble if someone does this to discover unnamed or inconsistent requirements (Group 1), but I have a great deal of trouble if someone does this activity and then passes it on to someone else as being "true". cheers -- Alistair
As Bob Haugen suggests above, I believe it's valid to start with dictionary definitions and etymologies to get at least a "ballpark" sense of the word, then proceed into specializations and excursions from there. Etymologically, analyze means to break apart. And you will find this core meaning repeated in dictionary definitions that are broken out (analyzed) by discipline:
Previously, I had said that "analysis is refactoring", but I want to back that out now, because I see it creates more confusion than understanding. What I meant was that the strong overlap may provide some insights. The word 'factor' carries the hint of this. Before you arrive at a better synthesis of your code, you have to separate its parts out. The search for the "right" separations is key, and transcends the particular application.
I think we may have forgotten that it's not just "analysis", it's "systems analysis". Systems are complex by definition, and often confusing and misleading on first impressions. They are full of non-intuitive matter and effects. If we are going to work with them, we have to understand them. Hence, I also see analysis as being primarily a means for gaining understanding, as Alistair said above. Group 0, slide over and let me in.
Alistair uses the phrase "the study of things that already exist", whose irony caught my attention. I'd like to augment the thought, and perhaps clarify it a bit. While a given product may not exist yet, some description of it may, and we can analyze that. In this information business, it's hard to imagine that which doesn't exist, because by imagining it, we make it exist. And then we have something to analyze. Even though designs and plans may be thought to be in the "solution space", their study is no less analysis. I think Alistair was implying that invention is not a species of analysis, and I agree.
Harkening back to the notion of "system", there are a lot of systems encountered in developing software. There is the system in which the client works and has a problem. There is the system of information that captures that problem. There is the system of software that can be specified and then built to change the problem. There is the system constituted by any significant attempt to describe or specify something (a spec is a system). All of these things are systems because they have parts that relate in complex ways. All of them confuse, and prompt us into analysis.
The requirements system (both the environment whence requirements spring, and the system of description that relates them) is often thought to be the most important one on the map. But it's not the only one, and hence a major ambiguity in our engineering language: when someone says "analysis", and there's no other context, I think they mean in the requirements area, but they may not. To be safe, we should cultivate the habit of qualifying the word "analysis" by at least saying what's being analyzed, and perhaps also by saying what sort of criteria will be used to form the requisite separations.
Remember in Babe, when the pig got the brown chickens to stand in one group and the white ones to stand in another? Chicken analysis: Spectral. I was impressed, but I wondered how it would help, and I often wonder that about other analyses. Clearly, some are better than others. When it comes to handling complexity, which is the sine qua non of our business, a good analysis is always one that offers a decent power ratio between the amount of "component" knowledge you have to assimilate and the number of pertinent questions you can simulate your way to answering. There is an unspoken need to get to essences, otherwise analysis fails to move us forward. The TOP-DOWN movement ran afoul of this problem.
In systems with autonomous agents and emergent behaviors, you see analytical power taken to the max. The number of different components is small; the number of interesting combinatorial and reactive effects is off the map. The system behaves almost as if it is alive, yet we can understand and control it (barely). I think somewhere around here a distinction between generative patterns and gamma patterns has been made. The same kind of classification could be used for analysis. The generative analyses are really interesting, if not sometimes a little too powerful: when the parts don't imply the results in an intuitive way, what happened to understanding?
I think there may be something between Group 0 and Group 1, or maybe an extension of Group 0, that is focused on aligning the "what is happening now" kind of analysis with "what we should be doing". This is not requirements specification but a gut-check look at the business process to make sure that the development team is not polishing the deck chairs on a sinking ship. In trying to implement XP as a methodology, I have spent a lot of time reading here, the colored books, and other XP sites trying to get a grip on the political reality that the client (internal for me) may have a vested interest in maintaining the status quo, and therefore the developer/client interaction to create UserStories may result in a less than optimal solution from the business perspective.
RonJeffries' Business Analysis in Extreme Programming at http://www.xprogramming.com/xpmag/BizAnalysis.htm provides good insight on the role of a BusinessAnalyst and on managing the interaction between the BusinessAnalyst, the programmer, and the client, but there may be a more important role for them . . . and that is as a change agent for improving the process before development begins. This kind of analysis may present "What if" scenarios to the client to help them think about alternative solutions and bridge jurisdictional boundaries where two or more organizational entities (clients) participate in the business process.
We are trying very hard to incorporate the role of BusinessAnalyst to address these issues without degrading the developer/client relationship or falling into the BigDesignUpFront trap.
-- RonDace
I used to feel strongly that analysis is the discovery of the "conceptual model" for the software... the initial reification of "things" in the ProblemDomain.
Agreed. Inspiration may come from analysis of the platonic world, but that output is really input into the blender of software development concerns (dependency management, reuse, testing, release, etc.) that may make the original inspiration unrecognizable. -- AnonymousDonor
And to synthesize Group 0 and Group 2... Analysis is the process of studying requirements to determine whether a system can be built that satisfies them. A by-product of this process is one or more conceptual models of systems that can be built that will satisfy those requirements. -- HowardFear
Analysis is a rectification of requirements - it is only needed as a separate step when you have RawRequirements. When you organise InteractiveRequirementsGathering and make people use a BusinessDomainModel? or DomainDictionary?, or especially if you use PairModeling, Analysis pretty much becomes blended into Requirements Gathering. -- AlexJouravlev
Coming from a very different background originally (military, with a strong engineering element to my job) my definition of analysis is fairly precise: Systematically evaluating criteria that balance a system or course of action against a desired outcome.
In the case of target analysis, for example, we looked at what the intent and the stated requirement were (sometimes two very different things*) and then looked across our possible courses of action and compared outcomes based on rated criteria we had for things like that. That is analysis. The "rated criteria" can be almost anything so long as they're relevant, but it's best to keep the list as short as possible (more than 3, fewer than 10 elements).
In the case of software development I replace "course of action" with "architecture" or "component" or another relevant thing, and rate the concept based on tradeoffs towards the goal. The point of analysis isn't to get "the right answer" but to discover how well an array of possibilities rates** against a defined problem and to deepen understanding of the problem itself. Sometimes the most interesting part of the whole process is finding that your intuitions about how you are rating are screwed up, and this usually indicates a misunderstanding of the intent, a misalignment between the intent and the stated requirements, or that you just don't have a proper grasp of the problem yet -- so rethinking both the rating system and the conceptual intent and requirements is in order. By far the most important result of defining and following an actual analysis process is that it kills your darlings with fire and reveals how emotional most of your (even technical) choices are -- it's also a good way to crush a religious debate brewing on a development team if you're in charge.
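Here is a minimal sketch of that kind of scored comparison, in Python. The criteria, weights, and candidate architectures are hypothetical, invented purely to illustrate the mechanics; the weighted-sum scheme is just one plausible way to do the scoring.

 # Score candidate "courses of action" (here: candidate architectures)
 # against weighted criteria. Every name and number below is made up.
 criteria = {                  # criterion -> weight (keep this list short)
     "scalability": 3,
     "time_to_first_release": 2,
     "operational_cost": 1,
 }
 candidates = {                # candidate -> score per criterion, 1 (poor) .. 5 (good)
     "monolith": {"scalability": 2, "time_to_first_release": 5, "operational_cost": 4},
     "services": {"scalability": 4, "time_to_first_release": 2, "operational_cost": 2},
 }
 def total(scores):
     """Weighted sum of one candidate's criterion scores."""
     return sum(weight * scores[name] for name, weight in criteria.items())
 for name, scores in sorted(candidates.items(), key=lambda kv: -total(kv[1])):
     print(f"{name}: weighted score {total(scores)}")
 # If the ranking feels wrong, revisit the weights and the criteria --
 # that mismatch is usually the interesting output of the exercise.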
The biggest thing to carry away from this is that analysis is a process designed specifically to remove, or at least reveal, the emotions involved in what would otherwise be snap decisions. It is not brainstorming or just "thinking really hard", which is most of what goes on when people claim to be "doing analysis". But brainstorming and deep contemplation are necessary prerequisites for actual analysis, and they drive information collection, criteria determination, scoring, scale creation, and everything else that goes into actually doing the analysis.
You have to be careful to make sure that your analysis process itself is balanced against the scale of the need or problem -- I mean, don't let the analysis process grow into some institutionally contrived beast that paralyzes you. For 95% of your decisions a quick mental, unwritten gaming of choices is all that's needed -- but when determining architecture on a large project it's a really good idea to start writing things down and scoring your available choices. There is no manual for this that fits every situation and there never can be, as each problem is different; in my view this lies at the core of why there is no silver bullet.
[* Special operations and software are often the same in that the customer rarely knows what they want or understands secondary effects.] [** Literally, like score it using points -- if it feels funny or stupid you're probably on the edge of discovering some faulty assumptions which is the whole point of the exercise.]
--CraigEverett?