
✨ Prompt optimization guide

Tips and examples for writing clear, effective prompts that guide Scratchpad AI to deliver high-quality outputs


Write better prompts. Get better results.


Whether you're guiding AI to update Salesforce fields, extract insights from calls, or generate sales assets, strong prompts are key.

This guide walks through how to craft clear, specific prompts that improve accuracy, reduce errors, and deliver consistent output across your team - starting with how to test your prompt and troubleshoot common issues to get it working just right.

→ Learn more about AI CRM Update prompts here and Ask AI prompts here

Test and troubleshoot


Use the Test Prompt feature in Scratchpad to validate your instructions. Try different deal types, call structures, and field edge cases. Watch for consistency, clarity, and alignment with your intended output.

If your prompt isn’t working as expected, jump to the Common issues and fixes section at the end of this guide to troubleshoot and improve results.

Best practices


1. Clarify what to include and exclude

Avoid ambiguity by defining exactly what should be captured and what should be left out. This improves precision and reduces errors.

Instead of a vague instruction like:

Capture the next step.

Use a more complete structure:

What to include:

  • Action and owner (e.g. rep will send revised proposal)

  • Timing (e.g. by 5/30)

Avoid including:

  • Action items owned by the prospect

  • Past events

  • Vague phrases like "follow up"

2. Provide specific examples

Prompts work better when you show - not just tell - what the output should look like.

Use examples (like the good vs. bad pairs shown below) within your prompt to guide the AI toward the tone, structure, and level of detail you expect.

Field: Next Step

Good: Send revised proposal to Anji by 7/30/25

Bad: Follow-up next week

Field: Decision Criteria

Good: Needs to integrate with Salesforce, support Slack notifications, and have strong mobile UX for field reps

Bad: Wants something easy to use

Field: Pain

Good: Reps are wasting 2+ hours/week updating Salesforce manually; pipeline reviews lack consistent data quality.

Bad: Salesforce is annoying
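
Within your prompt, these examples can sit directly under the field instructions. As an illustration, the Next Step pair above might be embedded like this:

Examples:

  • Good: "Send revised proposal to Anji by 7/30/25"

  • Bad: "Follow-up next week" (no owner, deliverable, or date)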

3. Define output format and limits

Set constraints so the AI produces information in a consistent and usable format.

Examples of helpful constraints:

  • Limit to one sentence

  • Maximum of 200 characters

  • Use 2–3 bullets

  • If nothing is mentioned, respond with: Not discussed

  • Follow this format: [MM/DD] - [Initials] - [Update]

    • Example: 08/04 - AP - Send pricing follow-up
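
Put together, the output-format portion of a prompt might read like this (an illustrative sketch you can adapt, not a required template):

Output format:

  • Keep it to one sentence, maximum 200 characters.

  • Follow this format: [MM/DD] - [Initials] - [Update] (e.g. 08/04 - AP - Send pricing follow-up).

  • If nothing is mentioned, respond with: Not discussed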

4. Focus on what the prospect said

Prompts should reinforce that you're only capturing what the prospect explicitly stated - not the rep’s interpretation or assumptions.

Instead of a prompt that allows for rep assumption:

Capture the decision criteria for choosing a vendor, including important features and requirements.

Use a prompt that enforces prospect-stated context only:

Capture the specific, prospect-stated benchmarks or decision criteria they'll use to choose a vendor.

What to include:

  • Concrete Metrics: Explicit thresholds (e.g. "<5% data sync errors," "<2-second page load")

  • Required Features: Must-have capabilities the prospect named (e.g. "real-time reporting," "API integration with our ERP")

  • Business Outcomes: Outcomes they linked to success (e.g. "20% reduction in processing costs")

Avoid including:

  • Sales-rep promises or recommendations: Don't list features that the selling team (1) plans to deliver or (2) encourages the prospect to consider

  • Unstated Assumptions: Never infer criteria the prospect didn't explicitly mention

5. Add context when needed

Help AI understand your internal language and structure by embedding helpful background.

Examples of useful context:

  • Definitions for field labels like “Economic Buyer”

  • Picklist value explanations like "Upsell" or "Expansion"

  • Call timing cues like “Decision Criteria is often discussed during pricing or evaluation conversations”
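
For example, context for a hypothetical picklist field such as Opportunity Type could be embedded directly in the prompt. The values and definitions below are illustrative placeholders - replace them with your own:

Opportunity Type (picklist) - choose the value that best matches what the prospect stated:

  • New Business - the prospect is not a current customer.

  • Upsell - an existing customer shows explicit interest in an additional product presented by the sales team.

  • Expansion - an existing customer wants to add seats or usage for a product they already own.

Note: Opportunity Type is typically discussed during pricing or contract-scope conversations.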

Example prompts


Each example below pairs a CRM field with a complete prompt you can adapt.

Next Step

Description:

Enter the specific, sales-rep–owned action required to advance this deal.

What to capture:

  • Action & Owner: Clearly state what the rep will do next (e.g. “Send revised proposal,” “Schedule technical demo”).

  • Timing: Include a target date or time frame (e.g. “by 5/30,” “next week”).

  • Context (optional): Brief supporting context if needed (e.g. “after receiving legal’s redlines”).

Avoid capturing:

  • Prospect’s To-Dos: Don’t record tasks the customer must complete.

  • Vague Language: Avoid “follow up” or “check in” without detail.

  • Past Events: Do not summarize what’s already happened.

Output format:

  • Keep it concise - no more than one sentence or 200 characters.

Pain / Challenge

Description:

The prospect’s core business challenges driving their evaluation of our solution.

What to capture:

  • Impactful Challenges: State the pain in business terms (e.g. “50% manual data entry errors,” “20% drop in renewal rates”).

  • Root Cause: Briefly note underlying issues (e.g. “lack of automation,” “siloed reporting”).

  • Scope, Severity, and Timing: Highlight scale or urgency (e.g. “across 3 regional teams,” “blocking Q3 targets”).

Avoid capturing:

  • Minor or Cosmetic Issues: Skip trivial annoyances.

  • Solutions or Speculation: Don’t prescribe fixes or guess at causes not mentioned.

  • Prospect Praise or Chitchat: Exclude compliments or small talk.

  • Past Resolved Problems: Avoid pains they’ve already fixed.

Output format:

  • Present your response as a concise bulleted list (a few bullets at most).

  • If no legitimate pains are discussed, respond "Not discussed".

Economic Buyer

Description:

Identify the person(s) with ultimate budget authority and contract-signing power discussed on the call.

What to capture:

  • Name & Title: Full name and official role (e.g. “Jane Doe, VP of Finance”).

  • Decision Authority: One sentence on why they’re the economic buyer (e.g. “She approves all vendor contracts over $100K”).

Avoid capturing:

  • Speculation: Don’t guess at authority if not stated.

  • Secondary Influencers: Skip people who advise but can’t sign.

  • Groups or Committees: Avoid listing multiple stakeholders unless each has clear signing power.

Output format:

  • Provide the name, title, and a one-sentence rationale.

  • If no clear buyer was mentioned, respond “No economic buyer identified.”

Decision Process

Description:

Document the concrete evaluation steps the prospect’s team will follow to choose a vendor.

What to capture:

  • Defined Phases: Exact stages or steps named by the prospect (e.g. “technical pilot,” “legal review,” “executive sign-off”).

  • Participants: Who is involved at each stage (e.g. “IT team,” “procurement,” “CFO”).

  • Criteria & Timing: Specific evaluation criteria (e.g. “performance benchmarks,” “ROI analysis”) and any dates or deadlines mentioned (e.g. “pilot by June 15”).

Avoid capturing:

  • Sales-Rep Actions: Don’t include steps the rep will take, or actions recommended by the sales rep.

  • Vague References: Avoid “discuss with team” or “get feedback” without prospect context.

  • Internal Processes: Exclude your company’s approval steps.

  • Speculation: Don’t infer steps the prospect didn’t state.

Output format:

  • Present your response as a concise bulleted list (a few bullets at most).

  • If the decision process is not discussed, reply "Not discussed".

Decision Criteria

Description:

Capture the specific, prospect-stated benchmarks or requirements they’ll use to choose a vendor.

What to capture:

  • Concrete Metrics: Explicit thresholds (e.g. “<5% data sync errors,” “<2-second page load”).

  • Required Features: Must-have capabilities the prospect named (e.g. “real-time reporting,” “API integration with our ERP”).

  • Business Outcomes: Outcomes they linked to success (e.g. “20% reduction in processing costs,” “improve customer NPS by 10 points”).

  • Priority Order (if given): Any ranking or “must have vs. nice to have” distinctions.

Avoid capturing:

  • Vague Preferences: “Easy to use” or “good support” without detail.

  • Sales-Rep promises or recommendations: Don’t list features that the selling team (1) plans to deliver or (2) encourages the prospect to consider.

  • Unstated Assumptions: Never infer criteria the prospect didn’t explicitly mention.

Output format:

  • Present the decision criteria as a concise bulleted list (a few bullets at most).

  • If no decision criteria are mentioned by the prospect, reply "Not discussed".

Common issues and fixes


Seeing one of these issues? Try the corresponding fix.

Issue: Incorrect info or misinterpretation, like "this was not their pain point"

Fix: Add explicit field definitions and counter-examples:

"Pain points are ONLY challenges the prospect states, NOT features they ask about or solutions discussed."

Issue: Missing key information from calls

Fix: Add prompts for multiple passes:

"Look for [X] throughout the entire conversation, including when discussing [related topic Y]."

Also specify where the info typically appears (e.g., "often mentioned during pricing discussions").

Issue: Incorrect dates or timing

Fix: Add temporal anchors:

"Next steps must be FUTURE actions. If a date is mentioned, verify it's after [today's date]."

Issue: Vague or generic outputs

Fix: Ask for specificity:

"Must include concrete details: numbers, percentages, specific systems/tools mentioned, exact titles, precise timelines."

In your prompt, also tell the AI to avoid generic phrases.

Issue: Rep and prospect actions are mixed up

Fix: Add ownership clarity:

"Only capture PROSPECT-owned steps in Decision Process. Rep actions go in Next Steps. Use 'prospect will...' vs 'rep will...' to differentiate."

Issue: Incorrect picklist values selected

Fix: Provide context on each picklist option, including explanations of any internal vocabulary and how the topic tends to come up in conversation:

“Upsell - select this option if the prospect shows explicit interest in a new product when presented by the sales team member. Products include…”


Curious to learn more? Schedule a 1:1 session with our team!
