March 18, 2005
Analysis vs. Evaluation vs. Assessment
As planners start to develop work plans implementing the new rule, we're thinking about the major "cost centers" of activities we're engaging in. One thing we're struggling with a bit, and on which I'd like some feedback, is the difference between "analysis," "evaluation," and "assessment." Although the 1982 rule, the 2000 rule, and the new rule all contain these technical processes, they emphasize them in different ways. In the 1982 rule, the early steps focused on "analysis," while "evaluation" was the seventh step. For most planning efforts, I think people were too exhausted as they stumbled toward the end to really do a good job with "evaluation." In the 2000 rule, we put a lot of emphasis on "assessments" in the early steps. (We even had to change some words - one of the authors told me once that we changed the name of "local assessment" to "local area analysis" to make it appear less dominant.)
The new rule, I think, properly emphasizes a planning system, and uses evaluation as a way to look at the recent past as a springboard for making changes to the plan. The draft directives also mention elements of analysis and assessment, and I wonder whether it will be important to make a distinction in how we develop planning processes.
Here's my understanding. Evaluation is a process of looking for meaning in monitoring data and detecting early warning signs. In an adaptive management approach, evaluation is sometimes used to review information against an original design. Assessment is a process of obtaining information through surveying, characterizing, synthesizing, and interpreting primary data sources. I think most of the sustainability requirements of the new rule and draft directives might be met through assessment processes. Finally, analysis is a process of searching for understanding by taking things apart and studying the parts. Analysis is problem solving - you develop a question and try to figure out an answer.
There may be important differences in these processes. They may need different expertise. They may differ in how the results are communicated and used in the process. For some phases of planning, it might not be important to separate a technical process this way. But when we start to think about the "rigor" of a process, that probably means different things depending upon whether you're looking at analysis, evaluation, or assessment.
Posted by John Rupe on March 18, 2005 at 02:49 PM | Permalink
Posted by: Dave Iverson
Adaptive Management Word Soup
Like any other organization, the Forest Service loves to co-opt words and to create meanings specific to procedures and processes. Perhaps that’s inevitable. But maybe it is too often more unfortunate than inevitable.
John says, “Evaluation [in the FS adaptive management lexicon] is a process of looking for meaning of monitoring data and detecting early warning signs. In an adaptive management approach, evaluation is sometimes used to review information according to an original design.”
In common usage, evaluation means something like: 1) To ascertain or fix the value or worth of. 2) To examine and judge carefully; appraise.
We “evaluate” things all the time.
John says, “Assessment is a process to obtain information through surveying, characterizing, synthesizing, and interpreting primary data sources. I think most of the sustainability requirements of the new rule and draft directives might be met through assessment processes.”
In common usage, assessment means something like: 1) The act of assessing; appraisal. 2) An amount assessed, as for taxation.
We “assess” things all the time.
John says, “…analysis, is a process to search for understanding, by taking things apart and studying the parts. Analysis is problem solving - you develop a question, and try to figure out an answer.”
In common usage, analysis means something like: 1) The separation of an intellectual or material whole into its constituent parts for individual study. 2) The study of such constituent parts and their interrelationships in making up a whole. 3) The separation of a substance into its constituent elements to determine either their nature (qualitative analysis) or their proportions (quantitative analysis).
We “analyze” things all the time.
And we do all three (evaluate, assess, analyze) at various times throughout any process. But we have to be careful how we describe what we do in the quasi-legal context of government regulation compliance. Remember too, that the quasi-legal context becomes the legal-context once we go to court. But that’s another topic for another time.
I like the definitions I stole from Dictionary.com better than the ones John offered up, but a blend of the two might improve upon the ones I found. Or maybe I’m enamored with the ones I found simply because I found them. We should also see what the “rule” and “directives” serve up.
I do not like the narrow focus on “evaluation” in what the FS often calls the “monitoring and evaluation” phase of, say, a forest plan—or some other big-deal process. But I guess I have to get over it, while trying to get folks to focus there AND elsewhere (practically everywhere) when attempting “evaluation.”
John, why do you “think most of the sustainability requirements of the new rule and draft directives might be met through assessment processes”? Isn’t sustainability something that we must approach through careful interweaving of all process, policy, law, action?
And why do you think that, “Analysis is problem solving - you develop a question, and try to figure out an answer”? Often analysis is a part of problem solving, but certainly not the whole of it.
To test my thinking on analysis, I grabbed one book off my shelf, THE LOGIC OF FAILURE, by Dietrich Dörner.
“Roth found that the bad problem solvers tended to use unqualified expressions: constantly, every time, without exception, absolutely, entirely, completely, totally, unequivocally, undeniably, without question, certainly, solely, nothing, nothing further, only, neither … nor, must, and have to.”
“The good problem solvers, on the other hand, tended more toward qualified expressions: now and then, in general, sometimes, ordinarily, often, a bit, in particular, somewhat, specifically, especially, to some degree, perhaps, conceivable, questionable, among other things, on the other hand, also, moreover, may, can, and be in a position to.”
“… good problem solvers favored expressions that take conditions and exceptions into account, that stress main points but don’t ignore subordinate ones, and that suggest possibilities. By contrast, bad problem solvers used "absolute" concepts that do not admit of other possibilities or circumstances.” (p. 175)
Somehow I believe there is much more than analysis going on in the minds of good problem solvers…
One last point:
Speaking of “other possibilities or circumstances,” I believe we need to talk more than we do in the FS about “compliance” (with law, regulation, policy, etc.) and about “disclosure.” But I’ve droned-on long enough here.
Dave Iverson | Mar 23, 2005 8:52:21 AM
Posted by: Sharon Friedman
As to evaluation: it, along with monitoring, needs to occur at a variety of different scales. For example, let's think about different kinds of monitoring and opportunities for evaluation and input to another process. Here are some thoughts:
Are there three adaptive-loop levels, at the LMP level and below, that an EMS would require linkages among to achieve continual improvement? This is just a first cut at these thoughts, so comments are welcome.
1. Day-to-day project administration (contractor did this, I told her not to, she didn't listen, I got another contractor) (within project, mostly informal with contract documentation)
2. Lessons learned from one project (design and implementation) inform another project (within the same type of activity). (Maybe there isn't a formal mechanism for this yet? It may depend on individuals learning by experience, with no particular documentation? An EMS would require something more structured.)
3. Lessons learned from the sum of projects, plus LMP-level monitoring, inform the need for changes to LMP components (project-plan-project linkage). (Not sure what the mechanism for the sum-of-projects-to-plan linkage is, but one is needed; the LMP-monitoring-to-plan linkage should be fairly clear.)
What do folks think of this?
Sharon Friedman | Mar 26, 2005 7:23:54 AM
Posted by: Sharon Friedman
Talking more about compliance... we are thinking that these ISO 14001 requirements will focus greater attention on legal requirements and compliance, and will make access to that information easier both internally and externally.
"4.3.2 Legal and other requirements
The organization shall establish, implement and maintain a procedure(s)
a) to identify and have access to the applicable legal requirements and other requirements to which the organization subscribes related to its environmental aspects, and
b) to determine how these requirements apply to its environmental aspects.
The organization shall ensure that these applicable legal requirements and other requirements to which the organization subscribes are taken into account in establishing, implementing and maintaining its environmental management system." and
"4.5.2 Evaluation of compliance
4.5.2.1 Consistent with its commitment to compliance, the organization shall establish, implement and maintain a procedure(s) for periodically evaluating compliance with applicable legal requirements.
The organization shall keep records of the results of the periodic evaluations.
4.5.2.2 The organization shall evaluate compliance with other requirements to which it subscribes. The organization may wish to combine this evaluation with the evaluation of legal compliance referred to in 4.5.2.1, or to establish a separate procedure(s).
The organization shall keep records of the results of the periodic evaluations."
Sharon Friedman | Mar 26, 2005 7:57:34 AM