
Function: Follow-up

AI Workflow for Proposal Follow-Up

Deployment Brief

Start with proposal sent date, scope summary, open question, decision date, engagement signal, owner, and a draft that requires review before any commercial change.

Related Field Report

  • Speed-to-lead AI workflow: A field report on faster lead response without losing evidence, routing, consent, or owner review.

Quick Answer

Proposal follow-up turns a sent proposal into a useful next-step task with scope, assumptions, exclusions, pricing basis, decision date, engagement signal, and owner review. AI should prepare the context summary and draft a helpful message. A person should review discounts, scope changes, timeline promises, legal terms, strategic accounts, and any follow-up that changes the commercial offer.

TL;DR

Proposal follow-up should help the buyer decide without changing the deal by accident. The workflow should keep scope, assumptions, exclusions, pricing basis, decision date, and engagement signals visible before a message goes out.

What is proposal follow-up?

Proposal follow-up is the process of moving a sent proposal toward a clear decision, question, revision, or close-loop outcome.

Who is this workflow for?

  • Service businesses, consulting firms, construction companies, SaaS teams, agencies, and professional firms with commercial follow-up volume.
  • Teams where good conversations still go stale because next steps are not owned.
  • Companies that need helpful follow-up without pressure, spam, or accidental promises.
  • Operators who want buyer context and stop rules before adding more automation.

What breaks in the manual process?

The manual process usually breaks when context disappears between the buyer signal and the next message:

  • the message ignores the actual proposal;
  • scope and exclusions disappear from the conversation;
  • tracking data is used in a way that feels invasive;
  • discounting starts before the buyer asks;
  • the decision date passes with no owner action;
  • commercial changes are made in a casual email.

The workflow should make the next action useful, specific, and reviewable.

How does the AI-enabled process work?

The workflow gathers proposal details, engagement signal, decision date, stakeholder status, prior attempts, and approval rules. It drafts a useful follow-up and flags any commercial change for review.

AI prepares the work. The accountable owner still approves anything that changes pricing, scope, timing, terms, ownership, or expectations.

What does this look like in practice?

Example scenario: A proposal has been viewed twice after the decision date, but the buyer has not replied. The workflow checks proposal scope, assumptions, exclusions, pricing basis, engagement signal, decision date, and stakeholder status. It prepares a draft follow-up, the open question to raise, an owner review task, and a flag for any scope, pricing, or timeline change.

What decision rules should govern this workflow?

  • Follow up when the decision date is approaching, the proposal is viewed, or an open question remains.
  • Use proposal engagement as internal context; never surface it in customer-facing language that feels invasive.
  • Route discounts, scope changes, legal terms, and timeline promises to review.
  • Stop after explicit decline, opt-out, expired opportunity, or confirmed competitor decision.
  • Move future-fit opportunities to nurture instead of repeating the same proposal prompt.
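The decision rules above can be sketched as a single prioritized function. This is a minimal illustration, not a real system's API; the `Proposal` record and all field names are hypothetical, and the three-day "decision date approaching" window is an assumed threshold.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical proposal record; field names are illustrative, not a real schema.
@dataclass
class Proposal:
    decision_date: date
    viewed: bool
    open_question: bool
    declined: bool
    opted_out: bool
    expired: bool
    competitor_confirmed: bool
    future_fit: bool
    requests_commercial_change: bool  # discount, scope, terms, or timeline

def next_action(p: Proposal, today: date) -> str:
    """Apply the follow-up decision rules in priority order."""
    # Stop rules first: explicit decline, opt-out, expiry, or confirmed competitor.
    if p.declined or p.opted_out or p.expired or p.competitor_confirmed:
        return "stop"
    # Future-fit opportunities move to nurture instead of repeat follow-up.
    if p.future_fit:
        return "nurture"
    # Any commercial change routes to owner review; it is never auto-sent.
    if p.requests_commercial_change:
        return "route_to_owner_review"
    # Follow up when the decision date is near, the proposal was viewed,
    # or an open question remains.
    decision_near = today >= p.decision_date - timedelta(days=3)
    if decision_near or p.viewed or p.open_question:
        return "draft_follow_up"
    return "wait"
```

Ordering matters here: stop and nurture rules run before any follow-up trigger, so an opted-out buyer never receives a "decision date approaching" message.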

What are the implementation steps?

1. Trigger: A proposal has been sent and the buyer has not accepted, declined, asked a question, reached a decision date, or completed the next milestone.

2. Inputs collected: proposal sent date and delivery channel; scope, assumptions, exclusions, and pricing basis; proposal viewed or engagement signal; buyer objection or open question; decision date and stakeholder status; prior follow-up attempts; account owner and approval rules; approved follow-up language and commercial boundaries.

3. AI/system action: The system checks the required evidence, summarizes the buyer context, applies the follow-up rule, and prepares the next action.

4. Human review point: The account owner reviews discounts, scope changes, timeline promises, legal language, objections, strategic accounts, and any follow-up that changes the proposal or customer expectation.

5. Output generated: a proposal follow-up task with owner and due date; a context summary with scope and open questions; a draft follow-up message for review; a commercial-change exception note; measurement events for proposal response, stale proposal count, and decision reason capture.

6. Follow-up or next action: The owner approves, sends, routes, suppresses, nurtures, or closes the loop based on the evidence.
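The trigger-to-output flow above can be sketched as one function. Everything here is an assumed shape for illustration: the dict keys, status values, and return actions are hypothetical, not a real system's schema.

```python
def run_follow_up_workflow(proposal: dict) -> dict:
    """Sketch of the trigger -> inputs -> AI action -> review -> output flow."""
    # 1. Trigger: a proposal was sent and no decision has been reached.
    if proposal.get("status") != "sent":
        return {"action": "none", "reason": "no open proposal"}

    # 2. Inputs collected: refuse to draft without the required evidence.
    required = ["sent_date", "scope", "pricing_basis", "decision_date", "owner"]
    missing = [f for f in required if not proposal.get(f)]
    if missing:
        return {"action": "collect_inputs", "missing": missing}

    # 3. AI/system action: summarize buyer context and prepare a draft.
    summary = {
        "scope": proposal["scope"],
        "open_question": proposal.get("open_question"),
        "decision_date": proposal["decision_date"],
    }
    draft = f"Draft follow-up re: {proposal['scope']}"

    # 4. Human review point: anything touching commercial terms is flagged.
    needs_review = bool(proposal.get("commercial_change"))

    # 5. Output generated: task, context summary, draft, and exception flag.
    return {
        "action": "owner_review" if needs_review else "send_after_approval",
        "owner": proposal["owner"],
        "summary": summary,
        "draft": draft,
    }
```

The key design choice the steps imply: missing inputs halt the workflow rather than letting the draft go out with scope or pricing context absent.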

Required inputs

  • proposal sent date and delivery channel.
  • scope, assumptions, exclusions, and pricing basis.
  • proposal viewed or engagement signal.
  • buyer objection or open question.
  • decision date and stakeholder status.
  • prior follow-up attempts.
  • account owner and approval rules.
  • approved follow-up language and commercial boundaries.

Expected outputs

  • proposal follow-up task with owner and due date.
  • context summary with scope and open questions.
  • draft follow-up message for review.
  • commercial-change exception note.
  • measurement event for proposal response, stale proposal count, and decision reason capture.

Human review point

The account owner reviews discounts, scope changes, timeline promises, legal language, objections, strategic accounts, and any follow-up that changes the proposal or customer expectation.

Risks and stop rules

Stop when consent is unclear, the buyer declined, the lead opted out, the record conflicts with existing ownership, the follow-up would change commercial terms, or there is no useful reason to contact the buyer.
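These stop rules reduce to a single guard that runs before any message is drafted. A minimal sketch, assuming a flat record; the field names are illustrative, and the conservative default is that a missing field counts toward stopping.

```python
def should_stop(record: dict) -> bool:
    """True when any stop rule applies; field names are hypothetical.

    Conservative by default: unclear consent or a missing contact
    reason stops the workflow rather than letting it proceed.
    """
    return any([
        record.get("consent") != "clear",          # consent unclear or absent
        record.get("declined", False),             # buyer declined
        record.get("opted_out", False),            # lead opted out
        record.get("ownership_conflict", False),   # conflicts with existing owner
        record.get("changes_commercial_terms", False),  # would change the deal
        not record.get("contact_reason"),          # no useful reason to contact
    ])
```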

Best first version

Start with proposal sent date, scope summary, open question, decision date, engagement signal, owner, and a draft that requires review before any commercial change.

Advanced version

Add buyer engagement signals, account-level suppression, stakeholder mapping, nurture paths, manager review dashboards, and monthly exception review after the basic owner workflow is reliable.

Measurement plan

  • Time from proposal sent to first follow-up.
  • Proposal response rate by follow-up reason.
  • Stale proposal count.
  • Open question resolution rate.
  • Commercial-change exception rate.
  • Win/loss or decision reason capture rate.
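Most of these metrics fall out of simple aggregation over proposal records. A sketch using Python's standard library only; the two sample records and their field names are invented for illustration.

```python
from datetime import date
from statistics import mean

# Hypothetical proposal records; field names are illustrative.
proposals = [
    {"sent": date(2024, 5, 1), "first_follow_up": date(2024, 5, 3),
     "responded": True, "stale": False, "exception": False},
    {"sent": date(2024, 5, 2), "first_follow_up": date(2024, 5, 8),
     "responded": False, "stale": True, "exception": True},
]

# Time from proposal sent to first follow-up, averaged in days.
days_to_follow_up = mean(
    (p["first_follow_up"] - p["sent"]).days for p in proposals
)
# Proposal response rate across all follow-ups.
response_rate = sum(p["responded"] for p in proposals) / len(proposals)
# Stale proposal count.
stale_count = sum(p["stale"] for p in proposals)
# Commercial-change exception rate.
exception_rate = sum(p["exception"] for p in proposals) / len(proposals)
```

Per-reason response rates and decision-reason capture would follow the same pattern, grouping records by a follow-up reason or decision field before aggregating.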

FAQ

What is proposal follow-up?

Proposal follow-up is the process of helping a buyer move from proposal review to a clear decision, question, revision, or close-loop outcome.

What should AI check before proposal follow-up?

AI should check proposal scope, assumptions, exclusions, pricing basis, engagement signal, decision date, stakeholder status, prior attempts, and approval rules.

What should stay under human control?

Discounts, scope changes, timeline promises, legal language, strategic accounts, and commercial terms should stay under account owner review.

What is the simplest first version?

Start with proposal sent date, scope summary, open question, decision date, engagement signal, owner, and review flag for commercial changes.

How should proposal follow-up be measured?

Track time to first follow-up, response rate by reason, stale proposals, open question resolution, exception rate, and decision reason capture.