
Evaluation Theory, Models, and Applications by Daniel L. Stufflebeam (Western Michigan University)


$51.99
Condition - Very Good
Only 1 left


Evaluation Theory, Models, and Applications Summary


The gold standard evaluation reference text

Now in its second edition, Evaluation Theory, Models, and Applications is the vital text on evaluation models, perfect for classroom use as a textbook, and as a professional evaluation reference. The book begins with an overview of the evaluation field and program evaluation standards, and proceeds to cover the most widely used evaluation approaches. With new evaluation designs and the inclusion of the latest literature from the field, this Second Edition is an essential update for professionals and students who want to stay current. Understanding and choosing evaluation approaches is critical to many professions, and Evaluation Theory, Models, and Applications, Second Edition is the benchmark evaluation guide.

Authors Daniel L. Stufflebeam and Chris L. S. Coryn, widely considered experts in the evaluation field, introduce and describe 23 program evaluation approaches, including, new to this edition, transformative evaluation, participatory evaluation, consumer feedback, and meta-analysis. Evaluation Theory, Models, and Applications, Second Edition facilitates the process of planning, conducting, and assessing program evaluations. The highlighted evaluation approaches include:

  • Experimental and quasi-experimental design evaluations
  • Daniel L. Stufflebeam's CIPP Model
  • Michael Scriven's Consumer-Oriented Evaluation
  • Michael Patton's Utilization-Focused Evaluation
  • Robert Stake's Responsive/Stakeholder-Centered Evaluation
  • Case Study Evaluation

Key readings listed at the end of each chapter direct readers to the most important references for each topic. Learning objectives, review questions, student exercises, and instructor support materials complete the collection of tools. Choosing from evaluation approaches can be an overwhelming process, but Evaluation Theory, Models, and Applications, Second Edition updates the core evaluation concepts with the latest research, making this complex field accessible in just one book.

About Daniel L. Stufflebeam (Western Michigan University)

DANIEL L. STUFFLEBEAM, PHD, is Distinguished University Professor Emeritus at Western Michigan University, Kalamazoo.

CHRIS L. S. CORYN, PHD, is director of the Interdisciplinary PhD in Evaluation (IDPE) program and assistant professor in the Evaluation, Measurement, and Research (EMR) program at Western Michigan University. He is the executive editor of the Journal of MultiDisciplinary Evaluation.

Table of Contents

List of Figures, Tables, and Exhibits xiii

Dedication xvii

Preface xix

Acknowledgments xxiii

The Author xxv

Introduction xxvii

Changes to the First Edition xxviii

Intended Audience xxviii

Overview of the Book's Contents xxix

Study Suggestions xxxii

Part One: Fundamentals of Evaluation 1

1 OVERVIEW OF THE EVALUATION FIELD 3

What Are Appropriate Objects of Evaluations and Related Subdisciplines of Evaluation? 3

Are Evaluations Enough to Control Quality, Guide Improvement, and Protect Consumers? 4

Evaluation as a Profession and Its Relationship to Other Professions 4

What Is Evaluation? 6

How Good Is Good Enough? How Bad Is Intolerable? How Are These Questions Addressed? 17

What Are Performance Standards? How Should They Be Applied? 18

Why Is It Appropriate to Consider Multiple Values? 20

Should Evaluations Be Comparative, Noncomparative, or Both? 21

How Should Evaluations Be Used? 21

Why Is It Important to Distinguish Between Informal Evaluation and Formal Evaluation? 26

How Do Service Organizations Meet Requirements for Public Accountability? 27

What Are the Methods of Formal Evaluation? 29

What Is the Evaluation Profession, and How Strong Is It? 29

What Are the Main Historical Milestones in the Evaluation Field's Development? 30

2 EVALUATION THEORY 45

General Features of Evaluation Theories 45

Theory's Role in Developing the Program Evaluation Field 47

Functional and Pragmatic Bases of Extant Program Evaluation Theory 48

A Word About Research Related to Program Evaluation Theory 49

Program Evaluation Theory Defined 50

Criteria for Judging Program Evaluation Theories 52

Theory Development as a Creative Process Subject to Review and Critique by Users 56

Status of Theory Development in the Program Evaluation Field 57

Importance and Difficulties of Considering Context in Theories of Program Evaluation 58

Need for Multiple Theories of Program Evaluation 58

Hypotheses for Research on Program Evaluation 59

Potential Utility of Grounded Theories 62

Potential Utility of Metaevaluations in Developing Theories of Program Evaluation 63

Program Evaluation Standards and Theory Development 63

3 STANDARDS FOR PROGRAM EVALUATIONS 69

The Need for Evaluation Standards 71

Background of Standards for Program Evaluations 73

Joint Committee Program Evaluation Standards 74

American Evaluation Association Guiding Principles for Evaluators 80

Government Auditing Standards 83

Using Evaluation Standards 97

Part Two: An Evaluation of Evaluation Approaches and Models 105

4 BACKGROUND FOR ASSESSING EVALUATION APPROACHES 107

Evaluation Approaches 109

Importance of Studying Alternative Evaluation Approaches 109

The Nature of Program Evaluation 110

Previous Classifications of Alternative Evaluation Approaches 110

Caveats 112

5 PSEUDOEVALUATIONS 117

Background and Introduction 117

Approach 1: Public Relations Studies 119

Approach 2: Politically Controlled Studies 120

Approach 3: Pandering Evaluations 122

Approach 4: Evaluation by Pretext 123

Approach 5: Empowerment Under the Guise of Evaluation 125

Approach 6: Customer Feedback Evaluation 127

6 QUASI-EVALUATION STUDIES 133

Quasi-Evaluation Approaches Defined 133

Functions of Quasi-Evaluation Approaches 134

General Strengths and Weaknesses of Quasi-Evaluation Approaches 134

Approach 7: Objectives-Based Studies 135

Approach 8: The Success Case Method 137

Approach 9: Outcome Evaluation as Value-Added Assessment 143

Approach 10: Experimental and Quasi-Experimental Studies 147

Approach 11: Cost Studies 152

Approach 12: Connoisseurship and Criticism 155

Approach 13: Theory-Based Evaluation 158

Approach 14: Meta-Analysis 164

7 IMPROVEMENT- AND ACCOUNTABILITY-ORIENTED EVALUATION APPROACHES 173

Improvement- and Accountability-Oriented Evaluation Defined 173

Functions of Improvement- and Accountability-Oriented Approaches 174

General Strengths and Weaknesses of Decision- and Accountability-Oriented Approaches 174

Approach 15: Decision- and Accountability-Oriented Studies 174

Approach 16: Consumer-Oriented Studies 181

Approach 17: Accreditation and Certification 184

8 SOCIAL AGENDA AND ADVOCACY EVALUATION APPROACHES 191

Overview of Social Agenda and Advocacy Approaches 191

Approach 18: Responsive or Stakeholder-Centered Evaluation 192

Approach 19: Constructivist Evaluation 197

Approach 20: Deliberative Democratic Evaluation 202

Approach 21: Transformative Evaluation 205

9 ECLECTIC EVALUATION APPROACHES 213

Overview of Eclectic Approaches 213

Approach 22: Utilization-Focused Evaluation 214

Approach 23: Participatory Evaluation 219

10 BEST APPROACHES FOR TWENTY-FIRST-CENTURY EVALUATIONS 229

Selection of Approaches for Analysis 230

Methodology for Analyzing and Evaluating the Nine Approaches 230

Our Qualifications as Raters 230

Conflicts of Interest Pertaining to the Ratings 231

Standards for Judging Evaluation Approaches 231

Comparison of 2007 and 2014 Ratings 236

Issues Related to the 2011 Program Evaluation Standards 237

Overall Observations 237

The Bottom Line 240

Part Three: Explication of Selected Evaluation Approaches 247

11 EXPERIMENTAL AND QUASI-EXPERIMENTAL DESIGN EVALUATIONS 249

Chapter Overview 249

Basic Requirements of Sound Experiments 250

Prospective Versus Retrospective Studies of Cause 251

Uses of Experimental Design 251

Randomized Controlled Experiments in Context 252

Suchman and the Scientific Approach to Evaluation 256

Contemporary Concepts Associated with the Experimental and Quasi-Experimental Design Approach to Evaluation 265

Exemplars of Large-Scale Experimental and Quasi-Experimental Design Evaluations 269

Guidelines for Designing Experiments 271

Quasi-Experimental Designs 280

12 CASE STUDY EVALUATIONS 291

Overview of the Chapter 291

Overview of the Case Study Approach 292

Case Study Research: The Views of Robert Stake 294

Case Study Research: The Views of Robert Yin 297

Particular Case Study Information Collection Methods 301

13 DANIEL STUFFLEBEAM'S CIPP MODEL FOR EVALUATION: AN IMPROVEMENT AND ACCOUNTABILITY-ORIENTED APPROACH 309

Overview of the Chapter 309

CIPP Model in Context 309

Overview of the CIPP Categories 312

Formative and Summative Uses of Context, Input, Process, and Product Evaluations 313

Philosophy and Code of Ethics Underlying the CIPP Model 314

The Model's Values Component 317

Using the CIPP Framework to Define Evaluation Questions 319

Delineation of the CIPP Categories and Relevant Procedures 319

Delineation of the CIPP Categories and Relevant Procedures 319

Use of the CIPP Model as a Systems Strategy for Improvement 332

14 MICHAEL SCRIVEN'S CONSUMER-ORIENTED APPROACH TO EVALUATION 341

Overview of Scriven's Contributions to Evaluation 341

Scriven's Background 343

Scriven's Basic Orientation to Evaluation 343

Scriven's Definition of Evaluation 343

Critique of Other Persuasions 344

Formative and Summative Evaluation 345

Amateur Versus Professional Evaluation 347

Intrinsic and Payoff Evaluation 347

Goal-Free Evaluation 347

Needs Assessment 348

Scoring, Ranking, Grading, and Apportioning 349

Checklists 352

Key Evaluation Checklist 353

The Final Synthesis 354

Metaevaluation 357

Evaluation Ideologies 357

Avenues to Causal Inference 361

Product Evaluation 363

Professionalization of Evaluation 366

Scriven's Look to Evaluation's Future 366

15 ROBERT STAKE'S RESPONSIVE OR STAKEHOLDER-CENTERED EVALUATION APPROACH 373

Stake's Professional Background 374

Factors Influencing Stake's Development of Evaluation Theory 374

Stake's 1967 Countenance of Educational Evaluation Article 375

Responsive Evaluation Approach 383

Substantive Structure of Responsive Evaluation 390

Functional Structure of Responsive Evaluation 390

An Application of Responsive Evaluation 392

Stake's Recent Rethinking of Responsive Evaluation 397

16 MICHAEL PATTON'S UTILIZATION-FOCUSED EVALUATION 403

Adherents of Utilization-Focused Evaluation 404

Some General Aspects of Patton's Utilization-Focused Evaluation 405

Intended Users of Utilization-Focused Evaluation 407

Focusing a Utilization-Focused Evaluation 407

The Personal Factor as Vital to an Evaluation's Success 408

The Evaluator's Roles 408

Utilization-Focused Evaluation and Values and Judgments 409

Employing Active-Reactive-Adaptive Processes to Negotiate with Users 410

Patton's Eclectic Approach 411

Planning Utilization-Focused Evaluations 411

Collecting and Analyzing Information and Reporting Findings 412

Summary of Premises of Utilization-Focused Evaluation 413

Strengths of the Utilization-Focused Evaluation Approach 414

Limitations of the Utilization-Focused Evaluation Approach 415

Part Four: Evaluation Tasks, Procedures, and Tools 421

17 IDENTIFYING AND ASSESSING EVALUATION OPPORTUNITIES 423

Sources of Evaluation Opportunities 423

Bidders' Conferences 431

18 FIRST STEPS IN ADDRESSING EVALUATION OPPORTUNITIES 435

Developing the Evaluation Team 436

Developing Thorough Familiarity with the Need for the Evaluation 437

Stipulating Standards for Guiding and Assessing the Evaluation 437

Establishing Institutional Support for the Projected Evaluation 437

Developing the Evaluation Proposal's Appendix 438

Planning for a Stakeholder Review Panel 439

19 DESIGNING EVALUATIONS 445

A Design Used for Evaluating the Performance Review System of a Military Organization 446

Generic Checklist for Designing Evaluations 462

20 BUDGETING EVALUATIONS 479

Ethical Imperatives in Budgeting Evaluations 480

Fixed-Price Budget for Evaluating a Personnel Evaluation System 483

Other Types of Evaluation Budgets 486

Generic Checklist for Developing Evaluation Budgets 493

21 CONTRACTING EVALUATIONS 505

Definitions of Evaluation Contracts and Memorandums of Agreement 506

Rationale for Evaluation Contracting 508

Addressing Organizational Contracting Requirements 511

Negotiating Evaluation Agreements 511

Evaluation Contracting Checklist 512

22 COLLECTING EVALUATIVE INFORMATION 519

Key Standards for Information Collection 519

An Information Collection Framework 540

Useful Methods for Collecting Information 543

23 ANALYZING AND SYNTHESIZING INFORMATION 557

General Orientation to Analyzing and Synthesizing Information 558

Principles for Analyzing and Synthesizing Information 559

Analysis of Quantitative Information 560

Analysis of Qualitative Information 575

Justified Conclusions and Decisions 580

24 COMMUNICATING EVALUATION FINDINGS 589

Review of Pertinent Analysis and Advice from Previous Chapters 590

Complex Needs and Challenges in Reporting Evaluation Findings 591

Establishing Conditions to Foster Use of Findings 592

Providing Interim Evaluative Feedback 600

Preparing and Delivering the Final Report 603

Providing Follow-Up Support to Enhance an Evaluation's Impact 619

Part Five: Metaevaluation and Institutionalizing and Mainstreaming Evaluation 629

25 METAEVALUATION: EVALUATING EVALUATIONS 631

Rationale for Metaevaluation 632

Evaluator and Client Responsibilities in Regard to Metaevaluation 634

Formative and Summative Metaevaluations 634

A Conceptual and Operational Definition of Metaevaluation 634

An Instructive Metaevaluation Case 640

Metaevaluation Tasks 643

Metaevaluation Arrangements and Procedures 647

Comparative Metaevaluations 662

Checklists for Use in Metaevaluations 664

The Role of Context and Resource Constraints 664

26 INSTITUTIONALIZING AND MAINSTREAMING EVALUATION 671

Review of this Book's Themes 671

Overview of the Remainder of the Chapter 672

Rationale and Key Principles for Institutionalizing and Mainstreaming Evaluation 673

Early Efforts to Help Organizations Institutionalize Evaluation 674

Recent Advances of Use in Institutionalizing and Mainstreaming Evaluation 675

Checklist for Use in Institutionalizing and Mainstreaming Evaluation 676

Glossary 691

References 713

Index 744

Additional information

SKU: GOR013968364
ISBN-13: 9781118074053
ISBN-10: 111807405X
Title: Evaluation Theory, Models, and Applications by Daniel L. Stufflebeam (Western Michigan University)
Condition: Used - Very Good
Binding: Hardback
Publisher: John Wiley & Sons Inc
Publication date: 2014-11-28
Pages: 800
Book picture is for illustrative purposes only, actual binding, cover or edition may vary.
This is a used book, so there is no escaping the fact that it has been read by someone else and will show signs of wear and previous use. Overall we expect it to be in very good condition, but if you are not entirely satisfied, please get in touch with us.
