Prompt Engineering & Frameworks: A Practical Guide
- Prompt Engineering helps you structure instructions for AI models so they give you accurate, relevant answers.
- Frameworks like TCREI, CRISP, SCQA, RTF, CoT, and RASCEF offer different ways to organize your prompts.
- These methods help you conserve tokens, save time, and ensure your AI outputs align with your goals.
- Best practices include focusing on complex tasks, integrating roles, and providing comprehensive context.

By Javi, Developer and Community Organizer at GDG Central Florida
This blog is based on my presentation, “Gemini Prompting: AI For Data Mining,” given on January 23, 2025, at the GDG Central Florida meetup. If you want to explore the Python notebook code and presentation documents for these frameworks, check out the GitHub repository. Feel free to experiment and refine these prompts to see how each framework can help.
What is Prompt Engineering?
First, let’s define Prompt Engineering. Essentially, it’s about creating clear, targeted instructions—known as prompts. Because models like Gemini, GPT, or Llama 2 interpret our words quite literally, writing a well-structured prompt can be the difference between an answer that’s vague or off-topic and one that’s accurate and actionable.
We use Prompt Frameworks to add clarity and consistency, acting as a roadmap that steers the AI toward the intended response. It’s both an art and a science, where the quality of the prompt directly influences the accuracy and relevance of the AI’s output. These frameworks remind us to include crucial information—such as the role the AI should assume, the context or constraints, the desired style or format, and the overall purpose of our request. This ensures we leave nothing out. Plus, correctly phrasing the prompt can help conserve tokens, which potentially reduces costs and time.
AI companies like Google (with their Gemini models) often charge for usage under a pay-as-you-go model, tied to the number of tokens processed, while AI services that are free may still impose limitations on how many tokens you can use.
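Because billing is tied to tokens, it helps to check how many tokens a prompt uses before sending it. Here’s a minimal sketch using the google-generativeai Python package; the model name and the GEMINI_API_KEY environment variable are assumptions for illustration, not requirements of any particular plan.
# Minimal sketch: estimate a prompt's token count before sending it.
# Assumes the google-generativeai package is installed and an API key
# is stored in the GEMINI_API_KEY environment variable (both assumptions).
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # model name is illustrative

prompt = "Summarize the key differences between REST and GraphQL in three bullets."
print(model.count_tokens(prompt))  # prints the token count before any paid call
Counting tokens first is a cheap way to sanity-check a long prompt before it counts against your quota.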
Prompting Beyond Text-Based AI
While most of us associate prompt engineering with large language models like Gemini or GPT, the concept actually spans all types of AI—from image generation to music composition, code development, and more. Just as with text-based LLMs, well-crafted prompts are crucial for giving these models the context and direction they need. By specifying details such as scene descriptors for images, musical elements for songs, or core functionalities for code, we can guide AI toward exactly the output we envision. And, much like coding or design, the process is iterative—we prompt, evaluate, and refine until we achieve the result we’re after. This makes prompting a foundational skill for anyone eager to tap into the broader possibilities of artificial intelligence.
Why Prompt Engineering Matters
Imagine trying to solve a puzzle without knowing the rules. That’s often how AI feels when it receives unclear or incomplete prompts. Prompt Engineering is the antidote—it’s the practice of crafting well-structured instructions so the AI can respond with clarity and relevance. Additionally, using a well-defined prompt framework can significantly reduce AI hallucinations—inaccuracies or made-up details—by giving the model the right context and constraints.
When It’s Useful
- Complex coding tasks: Debugging or explaining advanced algorithms.
- Content generation: Writing technical docs, emails, or user manuals.
- Data analysis: Summarizing complex data sets or generating insights quickly (see the sketch after this list).
The goal is to maximize the AI’s ability to help, whether you’re a seasoned developer or just dabbling in creative projects.
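To make the data-analysis case concrete, here is a hedged sketch in the same style as the code examples later in this post; the dataset and column names are invented for illustration.
# Hypothetical data-analysis prompt; the columns and dataset are invented.
prompt_analysis = """
Role: You are a data analyst.
Input: A CSV sample with columns order_id, order_date, region, revenue. (include sample rows)
Task: Summarize monthly revenue trends by region and flag any anomalies.
Format: A short bulleted summary followed by a table.
"""
print(prompt_analysis)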
The Top 6 Prompt Frameworks
| Framework (Abbrev.) | Strengths | Weaknesses / Limitations | Good for Dev? | Good for Everyday? | Example Usage |
|---|---|---|---|---|---|
| TCREI | Structured, iterative approach; emphasis on evaluation & improvement; reference-based | Requires high-quality references; can be time-consuming; may limit creativity | Yes | Yes | Task: “Create a REST API endpoint.” Context: “E-commerce system.” Reference: “Here’s our API structure.” Evaluate: “Check best practices.” Iterate: “Refine based on feedback.” |
| CRISP | Context, Role, Input, Steps, Purpose; purpose-driven | Can be too detailed; requires clarity | Yes | Possibly | “You are a backend engineer. Given the system architecture, fix the login bug. Outline steps to debug.” |
| SCQA | Narrative structure; problem-solving stories | Less explicit about format; not ideal for technical tasks | Maybe | Yes | “Our payroll system is outdated. We’re missing deadlines. How to reduce errors? Provide an upgrade plan.” |
| RTF | Straightforward; quick | Minimal context; may need extra details | Yes | Yes | Role: “You’re a personal assistant.” Task: “Draft an email.” Format: “Short paragraph.” |
| Chain-of-Thought (CoT) | Explicit reasoning; complex processes | Can become verbose; may slow response | Yes | Possibly | “Explain how you arrived at each conclusion for optimizing this code, step by step. Then provide the final solution.” |
| RASCEF | Comprehensive; multi-stakeholder tasks | Can be lengthy; might over-specify | Maybe | Yes | Role: “You’re a career counselor.” Audience: “Recent grads.” Style: “Friendly.” Context: “Tips for a job fair.” Examples: “Here’s a sample intro.” Format: “Numbered list.” |
Practical Code Examples
1. A Quick RTF Prompt
prompt_rtf = """
Role: You are a senior front-end engineer.
Task: Refactor this login page code to improve accessibility.
Format: Provide a concise explanation and code snippet.
"""
print(prompt_rtf)
In this scenario, RTF quickly defines the role, the actual task, and the output format. Perfect for everyday code adjustments.
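If you want to actually run the prompt instead of just printing it, a minimal sketch using the google-generativeai package might look like the following; the model name and environment variable are assumptions, and the snippet reuses prompt_rtf from above.
# Minimal sketch: send prompt_rtf (defined above) to a Gemini model.
# Assumes google-generativeai is installed and GEMINI_API_KEY is set.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # model name is illustrative

response = model.generate_content(prompt_rtf)
print(response.text)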
2. A CRISP Prompt for Debugging
prompt_crisp = """
Context: We have a Node.js API returning intermittent 500 errors.
Role: You are a back-end engineer.
Input: Here's the middleware code... (include snippet)
Steps: Detail how you would debug it and propose a fix.
Purpose: Eliminate 500 errors and ensure logs capture errors properly.
"""
print(prompt_crisp)
With CRISP, we ensure we’re providing enough background (Context), specifying the Role clearly, and stating the Steps and Purpose to fix the error effectively.
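3. A Chain-of-Thought Prompt
This one isn’t from the original notebook; it’s an additional sketch showing how the Chain-of-Thought framework from the table above can be written in the same style.
prompt_cot = """
Task: Optimize this Python function that deduplicates a large list of records... (include snippet)
Instructions: Reason step by step: identify the bottleneck, consider alternative
data structures, and explain each decision before moving on.
Finally, provide the optimized code.
"""
print(prompt_cot)
Chain-of-Thought prompts trade longer responses for visible reasoning, which is useful when you need to audit how the model arrived at a fix.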
Best Practices for Optimal Usage
Actionable Tips for Effective Usage
- Focus on Complex Tasks: Rely on the model for large-scale, intricate projects rather than simple or routine queries.
- Integrate Roles: For tasks like coding, consider combining multiple roles (e.g., architect and engineer) to streamline workflows and boost efficiency; see the sketch after this list.
- Provide Comprehensive Context: Supply detailed instructions upfront to minimize clarification needs and ensure accurate results.
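As an illustration of the first two tips, here’s a hedged sketch in the style of the earlier examples that combines two roles and front-loads context; the project details are invented.
# Hypothetical example combining roles and up-front context; details are invented.
prompt_roles = """
Role: You are both a software architect and a senior backend engineer.
Context: We are splitting a monolithic Flask app into services; the orders module
carries most of the traffic and has no test coverage yet.
Task: Propose a migration plan, then outline the first service's API.
Format: A short numbered plan followed by an endpoint table.
"""
print(prompt_roles)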
Additional Strategies
- Master Prompt Engineering: Invest time in learning how to structure inputs effectively so the model delivers accurate and actionable outputs.
- Set Realistic Expectations: Complex tasks may take longer to process; plan accordingly to avoid unnecessary delays.
- Use It Collaboratively: The model is designed to augment human expertise rather than replace it—think of it as a partner in problem-solving.
Helpful Resources
- Gemini Cookbook by Google – Great for advanced usage patterns.
- ZDNet’s Best AI Chatbots – Overview of popular AI tools with pros and cons.
- Watch the Google Prompting Essentials Course 1-4 – Learn how to craft clear, effective instructions for generative AI tools.
(Remember that token limits and usage policies can change over time.)
Final Thoughts
Prompt frameworks are a powerful ally. Whether you’re debugging an authentication issue or generating an outline for your next DevFest talk, having a structured prompt can be a game-changer. It saves tokens, focuses the conversation, and boosts your productivity.
If you’re just starting out, don’t get discouraged. You’ll get better at it with practice—just like any other coding skill. I love seeing people succeed, so if you have any questions or want to share your own experiences with prompt engineering, feel free to reach out at any GDG Central Florida meetup.
Keep the conversation going by joining our GDG Central Florida Discord server, and be sure to keep an eye on our upcoming events at GDG Central Florida!
Stay curious and keep on creating! ¡Hasta la próxima!
– Javi (Software Engineer & GDG Community Organizer)