
There's a specific kind of exhaustion that comes not from hard clinical decisions, but from the volume of small, repetitive tasks that pile up around them.
Documenting the same discharge instruction set for the fourth time this shift. Drafting a prior authorization letter for a medication you've requested a hundred times before. Summarizing a 40-page transfer packet from an outside hospital before rounds. Writing a professional but firm reply to a patient message that needs careful wording.
None of these tasks require clinical expertise. They just require time — time you don't have.
This is where AI tools are showing real, practical value for clinicians right now. Not in replacing clinical judgment. Not in diagnosing patients. In handling the repetitive cognitive overhead that surrounds patient care.
Here's how to use it well — and how to do it safely.
Think about what you actually spend time on in a typical shift. A lot of it is communication and documentation that follows a familiar pattern but demands enough of your attention that you can't truly multitask.
Drafting routine communications is probably the highest-value use case for most clinicians. Referral letters. Prior authorization justifications. Care coordination emails. Patient portal message responses. These follow predictable formats — you know the content that needs to be in them — but writing each one from scratch pulls you out of clinical focus.
AI can draft a solid first version in seconds. You review it, adjust the clinical details, and send it. What used to take 8 minutes takes 90 seconds.
Summarizing information is another strong use case. Lengthy transfer records. Discharge summaries from outside facilities. Long policy documents or clinical guidelines you need to understand quickly. Paste in the text (see the privacy rules below before you do this) and ask for a structured summary of key findings, active medications, or relevant history.
Creating templates and education materials is underused. If you regularly explain a diagnosis, procedure, or medication to patients, you've probably given a version of the same explanation dozens of times. Use AI to draft a written version — a patient-facing handout, an after-visit summary template, a pre-procedure instruction sheet. You build it once, customize per patient, and stop reinventing the wheel each time.
Drafting policies, protocols, and training content takes a disproportionate amount of time relative to its actual complexity. The structure is always similar. If you're a charge nurse, informatics analyst, or unit educator, AI can handle the scaffold — headings, standard sections, boilerplate language — so you can focus on the clinical specifics.
This is the part that determines whether your AI use is an asset or a liability.
Consumer AI tools — ChatGPT, Claude.ai, Gemini, and others accessed through a standard web browser or app — are not HIPAA-compliant by default. Even if the company has good intentions, the standard terms of service for consumer products do not include the business associate agreement your institution needs before patient data can touch the tool.
The rule is simple: no patient information in public AI tools.
That means:
No names, medical record numbers, dates of birth, or other identifiers.
No pasting in clinical notes, lab results, medication lists, or discharge summaries.
No patient portal messages, even with the name deleted; the clinical details themselves can be identifying.
A useful test: if the information could appear in a medical record or would be covered by HIPAA, it doesn't go into a consumer AI tool.
This is not a theoretical concern. A single inadvertent disclosure — a pharmacy student pasting a medication list into ChatGPT, a nurse copying clinical notes to get help with a response — creates a reportable breach. You may not face personal consequences, but the institutional consequences are real, and so is the patient harm.
The practical workaround is straightforward: use AI to build the template, then fill in the clinical details yourself.
Need to write a prior authorization for a specialty medication? Ask AI: "Draft a prior authorization letter for a specialty medication requiring medical necessity documentation. Include sections for diagnosis, clinical rationale, treatment history, and requested medication." You get the structure and professional language. You add the patient-specific clinical information yourself, in your EHR or in a document that never passes through an external tool.
Need to respond to a complicated patient portal message? Ask AI: "Draft a professional, empathetic response to a patient message expressing concern about a medication side effect, where the clinical team will follow up at next visit." You get the tone and framing. You don't paste the patient's actual message in.
This approach captures nearly all the time savings while keeping patient data where it belongs.
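If you work on the informatics side, the same separation can be expressed in a few lines of code. This is a minimal sketch, not a production tool: the template wording and field names below are invented for illustration. The only point it demonstrates is that the AI-generated scaffold and the patient-specific values live in different places, and the merge happens locally.

```python
from string import Template

# An AI-drafted prior authorization scaffold with placeholders.
# The generic structure comes from the AI; every patient-specific
# value is merged locally and never sent to an external tool.
PRIOR_AUTH_TEMPLATE = Template("""\
RE: Prior Authorization Request for $medication

Diagnosis: $diagnosis
Clinical rationale: $rationale
Treatment history: $treatment_history

We request coverage of $medication based on documented medical
necessity. Please contact our office with any questions.
""")

# Patient details stay in this dict, on your machine or inside an
# approved environment. Values here are placeholders, not real data.
patient_fields = {
    "medication": "example-specialty-drug",
    "diagnosis": "example diagnosis",
    "rationale": "example clinical rationale",
    "treatment_history": "example prior therapies and outcomes",
}

letter = PRIOR_AUTH_TEMPLATE.substitute(patient_fields)
print(letter)
```

One nice property of this pattern: Template.substitute raises a KeyError if any placeholder goes unfilled, which is a cheap guard against sending out a letter with a blank clinical field.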
Beyond HIPAA, your institution almost certainly has its own policies on AI tool use — and those policies exist for good reasons beyond patient privacy.
Many health systems have approved specific AI tools for clinical use, often through enterprise agreements that include the appropriate business associate agreements and data governance controls. These are the tools you should be using for any work that touches patient information. Check with your informatics team, compliance department, or IT governance committee before using any AI tool for clinical work.
Some things worth understanding about your institution's stance:
What tools are approved. If your hospital has an enterprise Microsoft 365 Copilot deployment, for example, that operates under your institution's data agreements — meaningfully different from using ChatGPT in a personal browser tab. The tool may look similar. The legal and technical guardrails are not.
What's explicitly prohibited. Some institutions prohibit use of any external AI tool on hospital networks or devices, regardless of the data involved. Know the policy before you find out the hard way.
How to flag problems. If you discover that an AI tool is generating inaccurate clinical information — wrong drug doses in a template, incorrect guideline references, inappropriate advice — that's worth reporting through the same channels you'd use for any clinical safety concern. Informal workarounds that nobody knows about can't be improved or governed.
Before using an AI tool for a clinical task, run through three questions:
1. Does this involve patient information? If yes, is it in an approved, HIPAA-compliant tool your institution has authorized? If not, stop and use the template approach: build the structure with AI, fill in the clinical details yourself in your EHR or approved environment.
2. Is the AI output going to a patient or into a clinical record? If yes, you are responsible for its accuracy. Read every word before it leaves your hands. AI tools make confident-sounding errors. A hallucinated drug interaction, an incorrect dosing note, or a subtly wrong clinical statement can cause harm. Review it the same way you'd review work done by a new graduate — with the understanding that they did their best but need your clinical judgment before it goes anywhere real.
3. Does this fall within my institution's policies? If you're not sure, the answer is to find out before you proceed, not to assume it's fine. Informatics teams and compliance departments generally want to help clinicians use these tools well — they're not looking to police reasonable use, but they need to know what's happening.
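For teams that want to bake this checklist into tooling, say, as a pre-flight step in an internal AI gateway, the logic is small enough to sketch. Everything here is an assumption for illustration (the function name, the field names, the wording of the blockers); the three conditions simply map one-to-one onto the questions above.

```python
from dataclasses import dataclass

@dataclass
class TaskCheck:
    """One clinician's answers to the three pre-flight questions."""
    involves_patient_info: bool
    tool_is_institution_approved: bool
    output_reaches_patient_or_record: bool
    output_reviewed_by_clinician: bool
    within_institutional_policy: bool

def preflight(check: TaskCheck) -> list[str]:
    """Return the list of blockers; an empty list means proceed."""
    blockers = []
    if check.involves_patient_info and not check.tool_is_institution_approved:
        blockers.append("Patient info in an unapproved tool: use the "
                        "template approach instead.")
    if check.output_reaches_patient_or_record and not check.output_reviewed_by_clinician:
        blockers.append("Output reaches a patient or the record: review "
                        "every word first.")
    if not check.within_institutional_policy:
        blockers.append("Policy status unclear: ask informatics or "
                        "compliance before proceeding.")
    return blockers

# Example: drafting a portal reply in a consumer tool, template-only,
# with the clinician reviewing the final text. No blockers fire.
print(preflight(TaskCheck(
    involves_patient_info=False,
    tool_is_institution_approved=False,
    output_reaches_patient_or_record=True,
    output_reviewed_by_clinician=True,
    within_institutional_policy=True,
)))  # -> []
```

The value of encoding it isn't the code itself; it's that the checklist becomes something a governance team can review, version, and enforce consistently rather than leaving it to memory at 3 a.m.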
AI is already saving clinicians meaningful time on tasks that shouldn't require a nurse's or pharmacist's full cognitive bandwidth. The institutions and clinicians who figure out how to use it thoughtfully — protecting privacy, staying within governance frameworks, reviewing outputs critically — will have a real advantage in managing workload and reducing the kind of administrative exhaustion that drives people out of clinical careers.
The tools will keep improving. The privacy rules won't change. The critical thinking requirement never will.
Use AI for the structure and the scaffold. Keep the clinical expertise where it belongs — with you.
Jason Potts, PharmD
Hospital pharmacist and health IT product manager. Writing about the intersection of clinical practice and technology at Clinical to Code.