# Anesthesia Wellness Pulse Survey — Interim Check-in Instrument

**Purpose.** A brief quarterly pulse instrument intended to track directional change between annual deployments of a validated burnout measure. This instrument is not a validated diagnostic tool and is not a substitute for the Mini-Z 2.0, Stanford Professional Fulfillment Index, or Maslach Burnout Inventory. It is designed for trend monitoring, resource-awareness assessment, and surfacing fatigue exposure.

**Audience.** All clinical staff in the anesthesia department (CRNAs, CAAs, anesthesiologists, SRNAs on rotation, residents and fellows where applicable).

**Completion time.** 3–4 minutes.

**Deployment window.** 10 business days, with reminder communications on Day 4 and Day 8. Response-rate target: ≥ 50% of departmental clinical staff; flag any deployment under 40% as a communication gap.
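The response-rate thresholds above (≥ 50% target, flag under 40%) can be checked mid-deployment with a small helper. This is a hypothetical sketch; the function name and status labels are illustrative, not part of the instrument:

```python
def response_rate_status(responses: int, eligible_staff: int) -> str:
    """Classify a deployment against the 50% target and the 40% flag."""
    if eligible_staff <= 0:
        raise ValueError("eligible_staff must be positive")
    rate = responses / eligible_staff
    if rate >= 0.50:
        return "target met"
    if rate < 0.40:
        return "communication gap"  # per protocol, treat as a messaging issue
    return "below target"
```

For example, 30 responses from 100 eligible staff would return `"communication gap"` and prompt a review of how the deployment was announced.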

**Confidentiality statement** — include verbatim at the top of your deployment:

> Responses to this survey are anonymous. No individual response is linked to your identity, shift, subspecialty, or any demographic field. Results are reported in aggregate. This survey is not a substitute for clinical care, peer assistance, or institutional reporting pathways. If you are in crisis, contact 988 or your institution's emergency resource.

---

## Section 1 — Current state (5 items, 5-point Likert scale)

Response scale: Strongly disagree (1) — Disagree (2) — Neither agree nor disagree (3) — Agree (4) — Strongly agree (5).

1. Over the past four weeks, I have had enough energy at the end of a typical shift to engage meaningfully with my personal life.
2. Over the past four weeks, my clinical work has felt meaningful.
3. Over the past four weeks, I have felt supported by my immediate colleagues.
4. Over the past four weeks, my schedule has been sustainable given my current personal circumstances.
5. I know where I or a colleague would turn if we were in acute distress.

## Section 2 — Resource awareness (4 items, Yes / No / Unsure)

6. I am aware of the existence and contact path of our department's wellness committee.
7. I am aware of my state's peer assistance program for my licensure type (nurse peer assistance for CRNAs; physician health program for anesthesiologists).
8. I am aware of 988 as a national crisis resource and of any relevant profession-specific support lines.
9. I am aware of our department's second-victim or peer-responder team.

## Section 3 — Fatigue exposure (2 items, categorical)

10. Over the past four weeks, how many shifts have you worked while feeling fatigued to a degree that concerned you with respect to patient safety or your own safety driving home?
    - None
    - 1–2
    - 3–5
    - More than 5
    - Prefer not to answer

11. Over the past four weeks, how many call obligations have exceeded the duration threshold specified in your contract or institutional policy?
    - None
    - 1–2
    - 3–5
    - More than 5
    - Not applicable (no call obligation)

## Section 4 — Open response (2 items, optional free text)

12. What is one specific change — in scheduling, staffing, programming, or policy — that would most improve your day-to-day wellbeing over the next 90 days?

13. What is one resource you wish existed in this department that does not?

---

## Analysis protocol

### Quantitative

- **Section 1 (Current state).** Report the mean score for each item and for the section overall. Trend over time matters more than any single deployment's absolute score. An item with a consistently declining mean across three deployments warrants a programmatic response.
- **Section 2 (Awareness).** Awareness percentages below 70% on any item indicate an internal communication gap, not a wellbeing problem. Address through targeted communication rather than programming.
- **Section 3 (Fatigue).** Any response of "3–5" or "More than 5" on Item 10 should trigger a schedule audit. Item 11 responses are cross-referenced with the quarterly call-density audit described in the meeting agenda template.
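The quantitative rules above can be sketched as a few small checks. This is a minimal illustration, assuming Likert responses arrive as integer scores per deployment and categorical responses as label strings; the function names, threshold constants, and bucket encodings (e.g. `"3-5"`) are assumptions, not part of the instrument:

```python
from statistics import mean

DECLINE_WINDOW = 3               # consecutive declining means that warrant a programmatic response
AWARENESS_CUTOFF = 0.70          # Section 2 communication-gap threshold
FATIGUE_FLAG_BUCKETS = {"3-5", "More than 5"}  # Item 10 responses above "1-2"

def item_mean(scores: list[int]) -> float:
    """Mean Likert score for one item in one deployment."""
    return mean(scores)

def declining_trend(item_means: list[float]) -> bool:
    """True if the item mean has declined across the last three deployments."""
    recent = item_means[-DECLINE_WINDOW:]
    return (len(recent) == DECLINE_WINDOW
            and all(a > b for a, b in zip(recent, recent[1:])))

def awareness_gap(yes_count: int, total: int) -> bool:
    """True if the share answering 'Yes' falls below the 70% cutoff."""
    return total > 0 and yes_count / total < AWARENESS_CUTOFF

def fatigue_audit_needed(item10_responses: list[str]) -> bool:
    """True if any respondent reported more than '1-2' concerning shifts."""
    return any(r in FATIGUE_FLAG_BUCKETS for r in item10_responses)
```

For instance, Item 2 means of 4.1, 3.9, and 3.6 across three successive deployments would return `True` from `declining_trend` and warrant a programmatic response, while a single low deployment would not.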

### Qualitative

- Free-text responses (Items 12 and 13) are coded into themes quarterly using an inductive approach. Two committee members perform independent coding; discrepancies are resolved by discussion.
- Report themes only, never verbatim quotes, to leadership. Verbatim quotes risk identification and undermine the anonymity commitment.

### Reporting

- Preliminary results to the full Committee within 14 days of survey close.
- Aggregate report to department clinical staff within 30 days of survey close.
- The department-wide report must name what the Committee will change based on results. Surveying without reporting back destroys trust faster than not surveying at all.

## Deployment cadence and relationship to annual measurement

| Month | Instrument |
|-------|------------|
| Month 0 (baseline) | Mini-Z 2.0 or Stanford PFI |
| Month 3 | Pulse survey |
| Month 6 | Pulse survey |
| Month 9 | Pulse survey |
| Month 12 | Mini-Z 2.0 or Stanford PFI (year-over-year comparison) |

Avoid deploying this pulse survey in the same 60-day window as the annual validated instrument. Survey fatigue is real and depresses response rates for subsequent deployments.

## Known limitations of this instrument

- Not psychometrically validated. Cannot be used to assign individual diagnostic labels or to benchmark against published prevalence.
- Self-report instruments underdetect severe distress, particularly in populations with licensing-reporting concerns. Low Section-1 scores likely reflect a more severe reality than the numbers indicate.
- Response rates typically decline over successive deployments. Rotate delivery channels and deployment messaging to counteract this decline.

## Before using this instrument

- Confirm with institutional privacy / compliance officers that the deployment platform (SurveyMonkey, Qualtrics, REDCap, etc.) is approved for this purpose.
- Secure department chief endorsement of the deployment: announcements signed by the department chief produce notably higher response rates than announcements signed by the wellness committee chair alone.
- Decide in advance how you will report results if they are unflattering. Committing to transparent reporting before data collection is the single most important step toward institutional credibility.
