
Evaluating Practice by Joel Fischer


$4.48
Condition - Good
Only 1 left


Evaluating Practice Summary

Evaluating Practice: Guidelines for the Accountable Professional (with FREE SINGWIN CD-ROM) by Joel Fischer

Now with a free SINGWIN CD-ROM, Evaluating Practice, Fourth Edition, makes understanding and applying data analysis even easier for students and instructors.

Unsurpassed among human service evaluation texts, Evaluating Practice includes the innovative SINGWIN program, created by Charles Auerbach, David Schnall, and Heidi Heft Laporte of Yeshiva University.

The text instructs students and instructors on managing cases, charting data, and filling out scales. Although the authors are best known within the social work discipline, this text can also be used in other professional programs such as nursing, counseling, psychology, and psychiatry.

Table of Contents

Each chapter begins with a "Statement of Purpose" and concludes with a "Summary," and most include an "Introduction."

Preface.


Prologue.

I. WHAT ARE YOU GETTING INTO?

1. Integrating Evaluation and Practice: Introduction to Single-System Designs.

Introduction to Single-System Designs.

What Are Single-System Designs?

Single-System Designs and Classical Research: The Knowledge-Building Context.

Single-System Evaluation, Qualitative Research, and Quantitative Research.

Advantages of Using Single-System Designs in Practice.

A Walk Through the Evaluation Process.

II. CONCEPTUALIZING AND MEASURING TARGETS AND OBJECTIVES/GOALS.

2. Basic Principles of Conceptualization and Measurement.

What Is Measurement?

Definition as a First Step in Measurement.

Can Everything Be Measured?

Key Characteristics of All Measures.

3. Specifying Problems and Goals: Targets of Intervention.

Introduction: From General Problems to Specific Targets of Intervention.

Specifying Client Concerns: Identifying and Clarifying Problems and Potentials.

Specifying Goals and Objectives.

Using Goal Attainment Scaling (GAS) to Establish Goals.

Setting Goals in Groups.

Problems and Issues in Setting Goals.

4. Developing a Measurement and Recording Plan.

Steps in Developing a Recording Plan.

Charting: Putting Your Information on Graphs.

Other Recording Methods.

Computerized Recording.

Appendix: Installing CASS.

5. Behavioral Observation.

General Guidelines for Behavioral Observation.

Sampling Behaviors.

Instruments for Recording Behaviors.

Ensuring Accurate Observations.

Methods of Recording Behavior.

Analog Situations.

Recording Behavior in Groups.

6. Individualized Rating Scales.

Uses of Individualized Rating Scales.

Constructing and Using Individualized Rating Scales.

7. Standardized Questionnaires.

What Are Standardized Questionnaires?

Selecting a Standardized Questionnaire.

Administering a Standardized Questionnaire.

Some Available Standardized Self-Report Questionnaires.

Some Available Standardized Questionnaires for Practitioners.

Some Available Standardized Questionnaires for Relevant Others.

Some Available Standardized Questionnaires for Independent Observers.

Do-It-Yourself Questionnaires.

Using Standardized Questionnaires in Groups.

Computer Management of Standardized Questionnaires.

Appendix.

Computer Assisted Assessment Package (CAAP): A User's Guide.

8. Logs.

Types of Client Logs.

Putting Qualitative and Quantitative Information Together.

Introducing Clients to Logs.

Practitioner Logs.

Maximizing and Verifying the Reliability and Validity of Logs.

9. Reactivity and Nonreactive Measures.

Reactivity of Measures.

Unobtrusive (Nonreactive) Measures.

10. Selecting a Measure.

Considerations in Deciding on a Measure.

Use of Multiple Measures.

Selecting a Measure.

III. EVALUATION DESIGNS.

11. Basic Principles of Single-System Designs.

An Example Connecting Practice and Evaluation Designs.

Purposes of Single-System Designs.

Key Characteristics of Single-System Designs.

Causality in Single-System Designs.

External Validity and Generalizability.

Overview of Single-System Designs.

12. Baselining: Collecting Information Before Intervention.

Purposes of the Baseline.

Types of Baselines.

How Long Should Baselining Continue?

When Are Baselines Not Necessary?

Issues Regarding Baselining.

13. From the Case Study to the Basic Single-System Design: A-B.

Case Studies or Predesigns.

Design A-B: The Basic Single-System Design.

14. The Experimental Single-System Designs: A-B-A, A-B-A-B, B-A-B.

Basic Experimental Designs.

15. Multiple Designs for Single Systems: Baselines, Targets, Crossovers, and Series.

Multiple Baseline Designs: Problems, Clients, or Settings.

Multiple Target Designs.

Variations of Multiple Designs.

16. Changing Intensity Designs and Successive Intervention Designs.

Changing Intensity Designs: A-B1-B2-B3.

Successive Intervention Design: A-B-C, A-B-A-C, A-B-A-C-A.

17. Complex and Combined Designs.

Alternating Intervention Design: A-B/C-(B or C).

Interaction Design: A-B-A-B-BC-B-BC.

18. Selecting a Design.

Framework for Selecting a Design.

Needed: A Design for All Seasons.

Creativity in Single-System Designs: Making Your Own Designs.

Evaluation in Minimal-Contact Situations.

Single-System Designs in Managed Care: The Stretch Design.

Trouble-Shooting.

IV. ANALYZING YOUR RESULTS.

19. Basic Principles of Analysis.

Distinguishing Effort, Effectiveness, and Efficiency.

Significance: Practical, Statistical, and Theoretical.

Evaluating Goal Achievement.

Issues in Analysis of Data.

Computer Analysis of Data for Single-System Designs.

The Issue of Autocorrelation.

Tools in Analysis of Data.

20. Visual Analysis of Single-System Design Data.

Definition of Terms.

Basic Patterns and Implications.

Visual Inspection of Raw Data.

Interpreting Ambiguous Patterns.

Problems of Visual Inspection.

Creating a Chart with SINGWIN.

21. Descriptive Statistics.

Measures of Central Tendency.

Measures of Variation.

Computing and Graphing Measures of Central Tendency and Variation with SINGWIN.

Measures of Trend.

Measures of Effect Size.

Optimal Uses and Cautions for Specific Descriptive Statistics.

22. Tests of Statistical Significance for Single-System Designs.

Proportion/Frequency Approach.

Three-Standard-Deviation-Band Approach.

Chi-Square.

t-Test.

General Considerations in Using Tests of Statistical Significance.

Optimal Uses and Cautions for Specific Analytic Procedures.

Appendix.

23. Computer Analysis of Single-System Design Data: SINGWIN User's Guide.

Chapter Overview.

Starting SINGWIN.

Exiting SINGWIN.

Getting the Big Picture.

Using Specific Procedures.

Appendix: Installing SINGWIN.

24. Selecting a Procedure for Analyzing Data.

Framework for Selecting a Procedure for Analyzing Data.

Other Statistical Considerations.

Nonstatistical Considerations.

Limitations.

V. THE CHALLENGE OF SINGLE-SYSTEM DESIGNS.

25. Not for Practitioners Alone: Evaluation for Clients, Administrators, Educators, and Students.

Special Applications of Single-System Designs.

Recent Criticisms of Single-System Evaluation.

For the Client.

For the Administrator.

For Educators and Students.

References.
Name Index.
Subject Index.

Additional information

SKU: CIN0205342612G
ISBN 13: 9780205342617
ISBN 10: 0205342612
Title: Evaluating Practice: Guidelines for the Accountable Professional (with FREE SINGWIN CD-ROM) by Joel Fischer
Condition: Used - Good
Binding type: Hardback
Publisher: Pearson Education (US)
Publication date: 2002-05-22
Number of pages: 672
Book picture is for illustrative purposes only; actual binding, cover, or edition may vary.
This is a used book. There is no escaping the fact that it has been read by someone else and it will show signs of wear and previous use. Overall we expect it to be in good condition, but if you are not entirely satisfied, please get in touch with us.
