What To Automate Vs What To Keep Manual
Examples showing where AI can safely prepare work and where a person should keep the decision.
Direct Answer
AI should usually prepare, summarize, classify, draft, check, or route. A person should approve customer commitments, pricing, legal language, account ownership, sensitive data actions, and final high-impact decisions.
How to use this
- Step 1: List every action in the workflow.
- Step 2: Mark preparation actions separately from final decisions.
- Step 3: Keep high-impact decisions under review until the workflow has real correction history.
- Step 4: Name the accountable manual owner for every customer-visible, financial, legal, data, or record-changing step.
- Step 5: Revisit the boundary only after the workflow has performance data.
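The five steps above can be captured in a small boundary register the team reviews before implementation. This is a minimal sketch under assumed conventions; the action names, mode labels, and owner roles are illustrative, not a prescribed schema.

```python
# Sketch of an automation-boundary register: each workflow action is
# tagged "automate" or "manual", and every manual step must name an
# accountable owner (step 4). All names below are hypothetical examples.

WORKFLOW_BOUNDARY = [
    # (action, mode, owner) — owner is required when mode is "manual"
    ("summarize_lead", "automate", None),
    ("recommend_routing", "automate", None),
    ("disqualify_lead", "manual", "sales_manager"),
    ("approve_pricing", "manual", "account_owner"),
]

def boundary_violations(boundary):
    """Return manual steps that lack a named owner."""
    return [action for action, mode, owner in boundary
            if mode == "manual" and owner is None]

print(boundary_violations(WORKFLOW_BOUNDARY))  # [] means every manual step has an owner
```

A non-empty result is a signal to stop and finish step 4 before building anything, which keeps the boundary written before implementation starts.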
Decision examples
- Lead intake: Automate summary and routing recommendation. Keep disqualification and strategic account handling under review.
- Proposal review: Automate checks for missing sections and risky language. Keep price, scope, and send approval manual.
- Customer escalation: Automate context summary and risk flags. Keep final response and concessions manual.
- Reporting: Automate draft brief and variance detection. Keep interpretation, budget, staffing, and public claims manual.
- CRM cleanup: Automate the duplicate-detection queue. Keep merges, deletions, ownership changes, and lifecycle changes under review.
- Vendor evaluation: Automate evidence packet and scorecard draft. Keep criteria, weights, risk acceptance, and selection manual.
Worksheet prompts
- Safe automation candidate: Summarize, classify, compare, draft, check, route, flag, extract, enrich, or prepare a packet.
- Keep manual by default: Approve, promise, price, contract, delete, merge, assign ownership, reject customer, or make a final high-impact decision.
- Conditional automation: Send reminders, update low-risk fields, create tasks, or draft messages only when evidence and review rules are clear.
- Stop condition: Missing evidence, conflicting records, unclear customer intent, sensitive data, high-value account, or unusual request.
- Review history: What corrections have owners made, and what do those corrections teach the next version?
- Expansion rule: Only reduce manual review after repeated outputs are accurate, explainable, and low risk.
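The stop conditions above can be sketched as a simple guard that routes a task to manual review whenever any condition fires, and automates only when none do. The condition names and task fields here are assumptions for illustration, not a required schema.

```python
# Sketch of a stop-condition guard. A task is a dict of flags; if any
# stop condition is set, the task goes to manual review with the list
# of conditions that fired. Field names are illustrative assumptions.

STOP_CONDITIONS = (
    "missing_evidence",
    "conflicting_records",
    "unclear_customer_intent",
    "sensitive_data",
    "high_value_account",
    "unusual_request",
)

def route(task):
    """Return ('manual_review', fired) if any stop condition applies, else ('automate', [])."""
    fired = [c for c in STOP_CONDITIONS if task.get(c)]
    return ("manual_review", fired) if fired else ("automate", [])

print(route({"missing_evidence": True}))  # ('manual_review', ['missing_evidence'])
print(route({}))                          # ('automate', [])
```

Logging which conditions fire, and what reviewers then correct, is one way to build the correction history the expansion rule depends on.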
The simplest rule
Let AI prepare work when the source evidence exists. Keep people in control when the action changes expectations, records, money, legal exposure, or customer trust.
Why this improves trust
Buyers do not need to believe AI is perfect. They need to see that the workflow has review points, stop rules, and accountable owners.
How to use these examples
Pick the closest workflow, copy the automate/manual split, then adapt it to the specific data, owner, risk, and output.
Quality bar
- The boundary is written before implementation starts.
- The team can explain why each manual step stays manual.
- The first version makes people faster without hiding judgment.
- The workflow stops when evidence is missing or risk is unclear.
- Expansion depends on correction history, not optimism.
Research basis
- OECD AI Principles: Supports human oversight and safeguards appropriate to context.
- Microsoft Responsible AI Resources: Supports human review, quality, safety, and impact assessment.
- NIST AI RMF Core: Supports managing likelihood, impact, measurement, and ongoing controls.
FAQ
- What should AI automate first?: Preparation work: summaries, classifications, draft outputs, checks, comparisons, routing suggestions, and evidence packets.
- What should stay manual?: Final decisions involving money, legal exposure, customer commitments, sensitive data, record changes, or low-confidence evidence.
- Can the manual step be reduced later?: Yes, but only after the workflow has evidence, correction history, owner trust, and clear performance data.