Archive: U.S. Department of Health and Human Services
Archive: Agency for Healthcare Research and Quality (www.ahrq.gov)

This information is for reference purposes only. It was current when produced and may now be outdated. Archive material is no longer maintained, and some links may not work. Persons with disabilities having difficulty accessing this information should contact us at: https://info.ahrq.gov. Let us know the nature of the problem, the Web address of what you want, and your contact information.

Please go to www.ahrq.gov for current information.

Evaluating CAHPS® Quality Improvement Demonstrations (Text Version)

Slide presentation from the AHRQ 2008 conference showcasing Agency research and projects.

Slide Presentation from the AHRQ 2008 Annual Conference


On September 9, 2008, Donna Farley made this presentation at the 2008 Annual Conference. Select to access the PowerPoint® presentation (128 KB).


Slide 1

Evaluating Consumer Assessment of Healthcare Providers and Systems (CAHPS®) Quality Improvement (QI) Demonstrations

Donna Farley
Senior Health Policy Analyst, RAND
AHRQ Conference
September 9, 2008.

Slide 2

Overview of the Presentation

  • Goals for evaluating CAHPS® QI demonstrations.
  • Conceptual Framework to guide evaluation.
  • Process evaluation approach and methods.
  • Outcome evaluation issues and options.

Slide 3

Goals for Evaluating CAHPS® QI Demonstrations

  • Generate information on implementation experiences:
    • Use by the implementing organization to improve.
    • Use by other organizations in their QI work.
  • Assess effects of QI interventions:
    • CAHPS® scores.
    • Other outcomes and stakeholders.
  • Understand which factors contribute (or do not) to observed effects.
  • Compare results across demonstrations.

Slide 4

Major Evaluation Components

  • Process Evaluation:
    • Document and analyze QI intervention and implementation process.
    • Identify factors influencing progress in achieving desired process changes.
  • Outcome Evaluation:
    • Analyze effects of QI interventions on outcomes of interest to implementing organization.

Slide 5

How the Evaluation Addresses the Evaluation Goals

The slide maps each evaluation goal to the evaluation component that addresses it:

  • Goal 1 (implementation experiences)—Process evaluation.
  • Goal 2 (effects of interventions)—Outcome evaluation.
  • Goal 3 (factors contributing to effects)—Process and outcome evaluations.
  • Goal 4 (comparison across demonstrations)—Standard methodology.

Slide 6

Conceptual Framework—An Evaluation Guide

Slide 7

Framework: CAHPS® Quality Improvement

The slide shows two nested rectangles. The inner rectangle represents "Organization Philosophy and Capacity." This involves:

  • Executive Leadership.
  • Implementation of QI Interventions:
    • Team Leads.
    • Members.
    • Involved Staff.
  • Other Units.

The outer rectangle represents the "External Environment."

Slide 8

Framework: Implementation

  • Core activities:
    • Training.
    • Change methods used.
    • Process changes & cycles.
    • Monitoring and feedback.
    • Sustainability.
  • Implementation synergies.
  • Implementation experiences.
  • Changes to clinical and operational processes (expected and actual).

Slide 9

Framework: Key Stakeholders

  • Implementation team—champion, facilitator, team members.
  • Higher level (e.g., organization leaders).
  • Horizontal (e.g., other departments and services that coordinate with the intervention).
  • Directly affected or involved.
  • Implementers—physicians, nurses, other clinical staff, administrative staff.
  • End-users—patients, family members.

Slide 10

Framework: Organizational Philosophy

  • Policy:
    • Formal policies.
    • Human resource practices.
  • Roles/Positions:
    • Decision-making authority.
    • Reporting responsibilities.
    • Role expectations.
  • Philosophy/culture:
    • Culture of excellence.
    • Patient-centered focus.
    • Management approach and style.

Slide 11

Framework: Organizational Capacity

  • System-level:
    • Facilities.
    • Support service.
    • Coordination.
  • Position-level:
    • Supervisory.
    • Workload.
  • Individual-level:
    • Personal.
    • Performance.

Slide 12

Framework: External Environment

  • Policy:
    • Laws and regulations.
    • Credentialing policy.
    • Reporting policies.
    • Performance.
    • Payment incentives.
  • Market:
    • Competition.
    • Perceived quality, costs, access.
  • Information:
    • CAHPS® credibility.
    • Public reports.

Slide 13

Framework: Outcomes

  • Patient experience (CAHPS®).
  • Organizational change.
  • Program change.
  • Employee effects.

Slide 14

Process Evaluation Methods

Slide 15

Types of Data Collected

  • Descriptive (factual) data:
    • Organizational environment.
    • External environment.
    • Decision process leading to the QI interventions.
    • Strategy used to implement the interventions.
    • Timeline of the implementation processes.
  • Experiential data:
    • Differing views of stakeholders.
    • Perceptions of progress of the QI interventions.
    • How the QI interventions are affecting them.

Slide 16

Data Collection Instruments

  • Checklist of descriptive data to collect:
    • Structured according to the framework.
    • Multiple sources of data—written materials, discussions with QI leads, interviews.
  • Implementation timeline form:
    • Shows planned implementation schedule.
    • Updated as QI work proceeds.
  • Standard protocol for stakeholder interviews.

Slide 17

Interview Grid for Comparative Data on Stakeholders' Perspectives

The slide shows the framework for a table. The rows list the framework components:

  • Intervention.
  • Stakeholders.
  • Organization Philosophy.
  • Organization Capacity.
  • External Environment.
  • Effects on Outcomes.

The columns are:

  • Expectation.
  • Actual Progress: Successes/Challenges.
  • Effects on You.
  • Effects on Others.

Slide 18

Outcome Evaluation Issues and Design Options

Slide 19

Challenges in Measuring Effects of CAHPS® QI Interventions

  • Difficulty in "moving" CAHPS® scores:
    • Scores are composites of several items (see the sketch after this list).
    • QI interventions often address only some items.
    • Time required to make practices change.
    • Time required to change patients' perceptions.
  • Difficulty in attributing effects to QI intervention:
    • Many initiatives are in just one organization.
    • Others are spread across many sites (e.g., medical practices).
    • External control groups may not be good controls.
    • Need for process information to interpret effects.
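
The composite-score point above can be illustrated with a small worked example. The sketch below is hypothetical (the item names and score values are made up for illustration, not CAHPS® data): because a composite averages several items, a gain on one item moves the composite by only a fraction of that gain.

# Hypothetical sketch: a composite averages several item scores, so
# improving a single item shifts the composite only fractionally.
baseline = {"item_a": 80.0, "item_b": 75.0, "item_c": 82.0, "item_d": 78.0}
followup = dict(baseline, item_b=80.0)  # intervention improves one item by 5 points

composite_before = sum(baseline.values()) / len(baseline)   # 78.75
composite_after = sum(followup.values()) / len(followup)    # 80.00

print("Item improvement: +5.00 points")
print(f"Composite change: {composite_after - composite_before:+.2f} points")  # +1.25

Here a 5-point gain on one of four items moves the composite by only 1.25 points, which is why sustained change across multiple items is usually needed before CAHPS® scores visibly move.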

Slide 20

Design Options for Outcome Evaluations

  • Differences-in-Differences (see the sketch after this list):
    • Uses control groups to account for confounding factors.
    • Allows attribution of effects to the intervention.
    • Available control groups may not fully control for confounders.
  • Differences by degree of implementation:
    • Classify participating groups (e.g., practices) by degree of implementation and compare them.
    • May not measure implementation accurately.
  • Compare each entity to itself over time:
    • Controls for confounders but not for temporal changes.
    • Small N limits the analysis and statistical power.
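
A minimal sketch of the differences-in-differences calculation described in the first option above, using hypothetical pre/post composite scores for an intervention group and a comparison group (the group labels and numbers are illustrative, not taken from any CAHPS® demonstration):

# Hypothetical pre/post mean composite scores (0-100 scale).
intervention = {"pre": 76.0, "post": 81.0}  # sites receiving the QI intervention
comparison = {"pre": 75.0, "post": 77.0}    # control sites without the intervention

# Change within each group over the study period.
change_intervention = intervention["post"] - intervention["pre"]  # +5.0
change_comparison = comparison["post"] - comparison["pre"]        # +2.0

# Difference-in-differences: the intervention effect net of the secular
# trend captured by the comparison group.
did_estimate = change_intervention - change_comparison            # +3.0
print(f"Estimated intervention effect: {did_estimate:+.1f} points")

The comparison group's +2.0-point drift is subtracted out, leaving an estimated +3.0-point effect; this is the sense in which control groups support attribution, provided they are in fact comparable.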

Slide 21

Closing Observations

  • Need for multi-dimensional information leads to complex evaluation requirements.
  • Ultimate goal is to learn how QI interventions affected patient experience, as measured by CAHPS® scores.
  • But implementers also need feedback to improve intervention actions.
  • Process evaluation must collect good comparative data to serve all these needs.
Current as of February 2009
Internet Citation: Evaluating CAHPS® Quality Improvement Demonstrations (Text Version). February 2009. Agency for Healthcare Research and Quality, Rockville, MD. https://archive.ahrq.gov/news/events/conference/2008/Farley.html

 
