
How to Write AI Prompts: 10 Techniques That Actually Work (2026)

Updated April 2026 · 11 min read

What Makes a Good AI Prompt?

A good prompt is like a clear brief to a colleague—it specifies what, how, and why, leaving nothing to interpretation. Most people treat AI like a search engine: they ask a vague question and accept whatever they get. But AI thrives on structure.

The difference between a prompt that gets mediocre results and one that gets exceptional results often comes down to clarity, context, and constraints. A bad prompt wastes your time iterating. A good prompt nails it in one or two tries.

This guide covers 10 techniques that consistently unlock better outputs across ChatGPT, Claude, Gemini, and every major AI model.

Technique 1: The CRISP Framework

CRISP is a mental model that covers everything a good prompt needs:

  • Context: What's the situation? What's the background?
  • Role: Who am I, and what expertise should you adopt?
  • Instructions: What exact task do you need to do?
  • Style: What tone, format, or approach?
  • Parameters: What are the constraints (length, audience, depth)?

Example: Weak → Strong

Weak: "Write about remote work"

Strong (CRISP):

  • Context: I'm a startup founder writing an article for first-time remote workers
  • Role: You're an experienced tech HR manager who's onboarded 200+ remote employees
  • Instructions: Write an intro section that covers the top 3 challenges new remote workers face and how to overcome them
  • Style: Conversational, encouraging, practical (not preachy)
  • Parameters: 400-500 words, use bullet points for the three challenges, include a real-world example

The second prompt almost always produces better output. Here's why: the AI knows exactly who it's writing for, what problem it's solving, and where the boundaries are.

CRISP Template

Context: [Situation and background]
Role: [Persona or expertise]
Instructions: [Specific task]
Style: [Tone, format, approach]
Parameters: [Constraints: length, audience, depth, format]

You don't need to label everything—just make sure each element is clear.
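If you reuse the framework often, it can help to script it. Here's a minimal Python sketch of a CRISP prompt builder; the function name and field order are my own choices, not a standard:

```python
def crisp_prompt(context, role, instructions, style, parameters):
    """Assemble the five CRISP parts into one labeled prompt string."""
    return "\n".join([
        f"Context: {context}",
        f"Role: {role}",
        f"Instructions: {instructions}",
        f"Style: {style}",
        f"Parameters: {parameters}",
    ])

prompt = crisp_prompt(
    context="I'm a startup founder writing for first-time remote workers",
    role="You're an experienced tech HR manager who's onboarded 200+ remote employees",
    instructions="Write an intro covering the top 3 challenges new remote workers face",
    style="Conversational, encouraging, practical (not preachy)",
    parameters="400-500 words, bullet points for the three challenges",
)
```

The labels are optional in practice, as noted above; what matters is that each field forces you to fill in an element you might otherwise skip.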

Technique 2: Few-Shot Examples (Show, Don't Tell)

Telling an AI "be more concise" rarely works. Showing it an example of concise work is incredibly effective.

Example: Summarization

Weak: "Summarize this article in a concise way"

Strong (with examples): "Summarize the following article in 2-3 sentences. Here's the style I want:

Example 1 (article about AI in healthcare): Instead of: 'Artificial intelligence is being integrated into many hospitals across the country to improve patient outcomes through better diagnostic tools and operational efficiency.' Do this: 'AI helps hospitals diagnose diseases faster and run more efficiently.'

Example 2 (article about remote work): Instead of: 'Companies are increasingly allowing their workforce to work from home, which has led to benefits like reduced overhead and challenges like decreased team cohesion.' Do this: 'Remote work cuts costs but can hurt team connection.'

Now summarize this article: [paste article]"

The AI now understands exactly the brevity level and tone you want. Few-shot examples are one of the most underrated techniques in prompting.
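Few-shot prompts follow a predictable shape, so they're easy to assemble programmatically. A sketch (helper name and "Instead of / Do this" framing taken from the example above; everything else is an assumption):

```python
def few_shot_prompt(task, examples, query):
    """Build a few-shot prompt: task description, worked examples, then the real input."""
    parts = [task]
    for i, (before, after) in enumerate(examples, 1):
        parts.append(f"Example {i}:\nInstead of: {before}\nDo this: {after}")
    parts.append(f"Now do the same for:\n{query}")
    return "\n\n".join(parts)

examples = [
    ("Artificial intelligence is being integrated into many hospitals across "
     "the country to improve patient outcomes.",
     "AI helps hospitals diagnose diseases faster and run more efficiently."),
    ("Companies are increasingly allowing their workforce to work from home, "
     "which has led to benefits and challenges.",
     "Remote work cuts costs but can hurt team connection."),
]
prompt = few_shot_prompt(
    "Summarize the following article in 2-3 sentences. Here's the style I want:",
    examples,
    "[paste article]",
)
```

Two or three examples are usually enough; past that, you're spending context window for diminishing returns.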

Technique 3: Chain-of-Thought (Ask It to Think Step-by-Step)

Chain-of-thought prompting forces the AI to break down reasoning before answering. It dramatically improves accuracy on complex tasks.

Example: Problem-Solving

Weak: "Should I raise prices on my SaaS product?"

Strong (chain-of-thought): "I'm considering raising prices on my SaaS product. Before you give a recommendation, think through these questions step-by-step:

  1. What information do you need to make this decision?
  2. What are the potential upsides and downsides of a price increase?
  3. What risks should I be aware of?
  4. What questions should I ask before deciding?

Once you've thought through those, give me a recommendation."

By asking the AI to show its thinking, you get better reasoning and more defensible conclusions.
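The same scaffold works for any complex decision: state the question, list the reasoning steps, then ask for the recommendation. A small sketch (the wrapper function is hypothetical, but the structure mirrors the prompt above):

```python
def chain_of_thought(question, steps):
    """Wrap a question with explicit reasoning steps the model must work through first."""
    numbered = "\n".join(f"{i}. {s}" for i, s in enumerate(steps, 1))
    return (
        f"{question}\n\n"
        "Before you give a recommendation, think through these questions step-by-step:\n"
        f"{numbered}\n\n"
        "Once you've thought through those, give me a recommendation."
    )

prompt = chain_of_thought(
    "I'm considering raising prices on my SaaS product.",
    [
        "What information do you need to make this decision?",
        "What are the potential upsides and downsides of a price increase?",
        "What risks should I be aware of?",
        "What questions should I ask before deciding?",
    ],
)
```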

Technique 4: System Prompts and Custom Instructions

A system prompt is standing context, set once, that shapes how an AI behaves across all your conversations. Most people don't use this, but it's incredibly powerful.

Where to Use System Prompts

ChatGPT: Settings → Custom Instructions → "How would you like ChatGPT to respond?"

Claude (Web): Settings → Custom Instructions

Gemini: Settings → Custom Instructions (rolling out in 2026)

Real Examples

For executives: "You are an executive advisor. I ask you questions about business decisions. Always structure your advice as: 1) what I should do, 2) why, 3) risks, 4) next steps. Be direct and concise—never use more than 200 words."

For developers: "You are a senior software engineer reviewing code. When I paste code, always: 1) identify any bugs or performance issues, 2) suggest the clearest fix, 3) explain why it matters. Use TypeScript and modern best practices."

For writers: "You are a copywriter specializing in SaaS marketing. I'll give you product descriptions, and you'll make them punchy, benefit-focused, and conversational. Avoid buzzwords like 'revolutionary' or 'cutting-edge.'"

Setting this once saves you from repeating context in every single prompt.
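If you use a chat API rather than the web settings, the same idea applies: most chat APIs accept a list of role-tagged messages, and the `system` message carries the standing instructions. A sketch assuming a generic OpenAI-style message format (the builder function is my own):

```python
SYSTEM_PROMPT = (
    "You are an executive advisor. Always structure your advice as: "
    "1) what I should do, 2) why, 3) risks, 4) next steps. "
    "Be direct and concise -- never use more than 200 words."
)

def build_messages(user_question, history=None):
    """Prepend the system prompt so it applies to every turn of the conversation."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    messages.extend(history or [])
    messages.append({"role": "user", "content": user_question})
    return messages

msgs = build_messages("Should I raise prices on my SaaS product?")
```

Because the system message rides along with every request, you write the persona once instead of repeating it in each prompt.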

Technique 5: Structured Output Formats

Tell the AI exactly how to format its response. This is especially useful for data, comparisons, or lists.

Example: Comparison Table

Weak: "Compare these three AI tools"

Strong: "Compare these three AI tools in a markdown table with these columns: | Tool | Best For | Price | Speed | Accuracy | Unique Strength | ...

Base your comparison on actual 2026 benchmarks, not marketing claims."

By specifying the format upfront, you get structured data you can copy into a spreadsheet or document instantly.

Other Structured Formats

  • JSON: "Format your response as JSON with keys: tool, pros[], cons[], pricing"
  • Outline: "Use a numbered outline with main points and 2-3 sub-bullets each"
  • CSV: "Output as comma-separated values: name, email, company, role"
  • Markdown: "Use H2 headings for sections, bold for key terms, bullet lists for details"

Technique 6: Constraint-Based Prompting

Constraints force creativity and focus. Instead of "write about X," try "write about X in exactly 100 words" or "using only these three ideas."

Example: Marketing Copy

Weak: "Write a headline for my SaaS product"

Strong: "Write 5 headline options for my SaaS product (TurboAnalytics). Constraints:

  • Exactly 6-8 words per headline
  • Focus on the benefit (save time analyzing data), not the feature
  • Use a verb in the first two options, a question in the next two, a statement in the last one
  • No exclamation marks
  • Avoid clichés like 'revolutionary' or 'next-gen'"

This produces five distinct, high-quality options instead of vague alternatives.
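Hard constraints like these are also checkable, so you can verify the output instead of eyeballing it. A sketch that checks three of the constraints above (word count, no exclamation marks, no banned clichés); the helper name is my own:

```python
def check_headline(headline):
    """Return a list of constraint violations for one headline (empty list = passes)."""
    problems = []
    words = headline.split()
    if not 6 <= len(words) <= 8:
        problems.append(f"{len(words)} words (need 6-8)")
    if "!" in headline:
        problems.append("contains an exclamation mark")
    banned = {"revolutionary", "next-gen"}
    if any(w.lower().strip(".,!") in banned for w in words):
        problems.append("uses a banned cliché")
    return problems
```

Run each of the five headlines through the checker and ask the model to regenerate any that fail.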

Technique 7: Persona-Based Prompting

Assigning a persona makes the AI adopt specific behavior and knowledge.

| Persona | Example Prompt |
|---|---|
| Expert practitioner | "You are a senior data scientist with 15 years of industry experience. Explain how to build a recommender system for a music streaming app." |
| Journalist | "You are an investigative journalist. What questions would you ask about [topic]?" |
| Skeptic/critic | "You are a skeptical investor evaluating a business pitch. What are the biggest flaws in this idea?" |
| Beginner | "You are explaining this to a high school student with no background in the topic. How would you introduce it?" |
| Storyteller | "You are a screenwriter. Pitch me this product or idea as a compelling story." |

The persona triggers different reasoning styles. A skeptic gives you vulnerabilities. A storyteller gives you narrative hooks.

Technique 8: Negative Examples (What NOT to Do)

Show the AI what you don't want. This is sometimes more effective than describing what you do want.

Example: Tone

Instead of: "Write in a professional but warm tone"

Better: "Write in a professional but warm tone. Here's what NOT to do:

  • Don't be stiff: 'We kindly request your attention to the following matter'
  • Don't be casual: 'Hey, just wanted to give you a heads-up about this thing'
  • Don't be passive: 'It has been determined that action is required'

Do this: 'We wanted to make sure you're aware of this change and how it affects you.'"

Negative examples clarify your intent faster than positive ones alone.

Technique 9: Tool-Specific Optimization

Each AI model has strengths. Adapt your prompts accordingly.

ChatGPT

  • Strength: Conversational, good at explaining concepts
  • Prompt tip: ChatGPT likes being friendly. Try: "Let's brainstorm together" instead of "Generate ideas"
  • Weak area: Reasoning on novel problems
  • Hack: Use examples and ask it to identify the pattern before solving your problem

Claude (Anthropic)

  • Strength: Nuanced reasoning, long documents (200k token context)
  • Prompt tip: Claude responds well to explicit constraints. Be very specific about edge cases
  • Weak area: Rapid-fire back-and-forth conversations
  • Hack: Paste entire docs and ask Claude to analyze or edit them—it handles long context better

Gemini (Google)

  • Strength: Real-time web search, Google integration, image understanding
  • Prompt tip: Gemini appreciates specificity about sources. Try: "Answer based on current 2026 data"
  • Weak area: Complex coding or math
  • Hack: Ask it to search the web for context before answering complex questions

| Model | Best For | Avoid |
|---|---|---|
| ChatGPT | Quick answers, brainstorming, explanations | Long documents (context limit), novel reasoning |
| Claude | Deep analysis, long content, reasoning, coding | Real-time data, speed (slower processing) |
| Gemini | Web research, Google integration, current data | Specialized coding tasks, reasoning chains |

Technique 10: Iterative Refinement With Feedback

Most people write one prompt and accept the result. Better approach: treat it as a conversation.

Example Iteration

Prompt 1: "Write a job description for a marketing manager" Result: Generic, templated

Prompt 2: "That's too generic. Make it more specific to our company culture (startup, fast-paced, data-driven). What details are missing?" Result: AI identifies what's vague and asks clarifying questions

Prompt 3: "We're a series B SaaS startup, 50 people, focus on developer tools. This person owns demand gen. They'll report to the VP of Growth. Make it reflect our casual but professional tone." Result: Much better—specific, cultural, actionable

The iteration loop gets you from 60% to 95% in 2-3 refinements.

Common Mistakes That Kill Prompt Quality

  1. Being too vague: "Tell me about productivity" vs. "How can a solo founder manage task switching and maintain focus while handling sales, support, and product?"

  2. Forgetting context: The AI doesn't know your industry, audience, or constraints unless you say so.

  3. Asking for too much at once: Break complex tasks into steps instead of asking for a 2000-word essay plus a table plus code examples.

  4. Ignoring the output format: Always specify how you want the answer structured (list, table, essay, JSON, etc.)

  5. Not leveraging examples: Showing one good example beats describing quality in 100 words.

  6. Using vague adjectives: "Make it better," "more concise," "professional" are meaningless without context. Use "cut it to 150 words" or "explain it to a 5th grader."

  7. Treating it like a search engine: Asking "what is X?" gets surface-level answers. Ask "explain X in the context of my situation" for relevant output.

Pro Tips You Rarely See Shared

  • Use "Let me show you what I mean": Pasting context, examples, or draft work often produces better results than describing it
  • Ask the AI to find the gap: "What information am I missing to make this decision?" gets the AI to identify blind spots
  • Request iteration formats: "Generate 3 versions: one for executives, one for the team, one for customers" produces multiple angles efficiently
  • Use "What if" prompts for brainstorming: "What if we removed [constraint]? How would that change the strategy?"
  • Ask for a rubric: Before getting an output, ask "What would an excellent [output] include?" to align expectations
  • Combine prompt techniques: Use CRISP + few-shot examples + chain-of-thought together for complex tasks

The Prompt Evolution Framework

Think of prompting as iterating toward clarity:

  1. First draft: Simple request
  2. Add context: What's the background and goal?
  3. Add constraints: Length, format, audience, tone
  4. Add examples: Show what good looks like
  5. Ask for reasoning: Chain-of-thought before the answer

Each step compounds. A bare request becomes a precise, unambiguous specification.

Related Reading