What is prompt engineering?
Prompt engineering involves structuring your requests to AI models in ways that optimize their responses. Rather than treating the AI as a search engine, effective prompting treats it as a collaborative partner that needs clear context, goals, and constraints.

Core principles of effective prompts
Be specific and clear
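For instance, compare a vague request with a more specific one (the project details here are invented for illustration):

```text
Vague:    Write some code to process my data.
Specific: Write a Python function that reads a CSV of daily sales,
          drops rows with missing prices, and returns total revenue
          per month as a dictionary.
```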
Vague prompts lead to vague responses. Provide concrete details about what you need.

Provide context
Help the AI understand the broader situation and constraints.

Example with context
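A sketch of what a context-rich prompt might look like (the scenario, stack, and team details are hypothetical):

```text
I'm building a REST API for a small bookstore. We use Node.js 20
with Express, and the team prefers TypeScript. Given that, suggest
a folder structure for routes, controllers, and tests, and briefly
explain the trade-offs of your layout.
```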
Specify the desired format
Tell the AI how you want the response structured.
- Code output
- Step-by-step
- Comparative
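For example, a prompt that pins down the response format (the topic is arbitrary):

```text
Compare PostgreSQL and SQLite for a desktop note-taking app.
Respond as a two-column table (criterion | recommendation),
followed by a one-sentence verdict. Keep it under 150 words.
```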
Set the appropriate tone
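A hedged example of stating tone and expertise level up front:

```text
Explain how HTTPS certificates work to a junior developer.
Use a friendly, informal tone, avoid jargon where possible,
and include one concrete analogy.
```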
Indicate the expertise level and style you need.

Prompt components in ZeroTwo
ZeroTwo constructs the full prompt sent to AI models by combining several elements:

1. System prompt
The base instructions that define the AI’s behavior, capabilities, and constraints. This is set automatically but can be customized through custom instructions or assistants.

2. Custom instructions
Your personal or project-level instructions that persist across conversations. These augment the system prompt with your preferences.

3. Conversation history
Recent messages that provide context for continuing the conversation coherently.

4. Your current message
The specific request or question you’re asking right now.

5. Tool and file context
Additional context from attached files, images, or enabled tools like web search or code interpreter.
Prompt strategies by task type
For code generation
Specify language and framework
Always mention the programming language, framework versions, and any relevant libraries.
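For example (the library versions and column names are placeholders):

```text
Write a function in Python 3.12 using pandas 2.x that merges two
DataFrames on a shared "user_id" column and fills missing values
with zeros. Avoid deprecated APIs.
```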
Include error handling requirements
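A sketch of spelling out error-handling expectations (the specific behaviors named are examples, not requirements):

```text
Add error handling to the function: raise a ValueError for empty
input, wrap file access in try/except, and log failures instead
of silently returning None.
```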
Specify code style preferences
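For instance, stating style conventions up front (the conventions listed are illustrative):

```text
Follow PEP 8, use type hints on all public functions, prefer
f-strings over format(), and keep functions under 30 lines.
```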
For analysis and research
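A focused analysis prompt might look like this (the scenario is hypothetical):

```text
Analyze the pros and cons of migrating our monolith to
microservices. Focus on deployment complexity and team size;
assume a five-person team. End with a recommendation and its
single biggest risk.
```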
Use the Deep Research tool for comprehensive research, or structure your prompt for focused analysis.

For creative content
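For creative work, a prompt might combine audience, tone, and key messages like this (the product is invented for illustration):

```text
Write a 100-word product announcement for a budgeting app aimed
at college students. Tone: upbeat but not salesy. Must mention
the free tier and end with a call to action.
```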
Provide style references, target audience, and key messaging.

For documentation
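A documentation prompt could name the standard and the reader explicitly, for example:

```text
Write a README "Getting Started" section for a CLI tool.
Audience: developers new to the project. Use numbered steps and
include the expected terminal output of each command.
```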
Specify documentation standards and audience.

Advanced prompting techniques
Chain of thought
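A chain-of-thought prompt might look like this (the puzzle is arbitrary):

```text
A train leaves at 9:05 and arrives at 11:50, including a
20-minute stop. What is the actual travel time? Work through the
steps one at a time and show your reasoning before giving the
final answer.
```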
Ask the AI to show its reasoning process.

Role-based prompting
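A role-based prompt, for instance (the role and scenario are illustrative):

```text
You are a senior security engineer reviewing a pull request.
Examine this authentication code for common vulnerabilities and
explain each finding as you would to a mid-level developer.
```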
Assign the AI a specific expertise role.

Iterative refinement
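In practice, iterative refinement might look like this sequence of messages (the topic is invented):

```text
First prompt: Draft an outline for a blog post on caching strategies.
Follow-up:    Good - expand section 2 with a Redis example.
Follow-up:    Now rewrite the intro to hook backend developers.
```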
Build on previous responses.

Model-specific considerations
Different models excel at different tasks. ZeroTwo supports multiple AI providers with varying strengths.

OpenAI GPT-4
Excellent for complex reasoning, code generation, and instruction following. Works well with structured prompts.
Claude (Anthropic)
Superior for long-form content, analysis, and nuanced understanding. Responds well to conversational prompts.
Gemini (Google)
Strong multimodal capabilities. Excels at tasks involving images, videos, and code analysis.
Specialized models
Use reasoning models (o1, o3) for complex problem-solving. Use fast models (GPT-4o-mini, Claude Haiku) for simple tasks.
Common pitfalls to avoid
Watch for the inverse of the principles above: vague requests, missing context, no specified output format, and packing several unrelated tasks into a single prompt.
Testing and refining prompts
1. Start with a clear goal
Define exactly what you want to achieve before writing your prompt.

2. Try your prompt
Submit your prompt and evaluate the response quality.

3. Identify gaps
What’s missing? What’s unclear? What’s incorrect?

4. Refine and retry
Adjust your prompt based on the gaps you identified. Add more context, constraints, or examples.

5. Save successful patterns
Use custom instructions or create an assistant to preserve effective prompt patterns.
Next steps
Structured techniques
Learn advanced structuring methods for complex tasks
Few-shot learning
Improve results by providing examples
Custom instructions
Create reusable prompt patterns
Token limits
Understand context windows and token usage
Additional resources
For assistant-specific prompting, see Assistant Behavior and Tone. For setting persistent preferences, explore Custom Instructions.

