
Getting Started with PromptCask

PromptCask Team


Whether you're a solo developer experimenting with large language models or part of a 50-person AI team, PromptCask gives you a structured way to write, test, and manage your prompts. This guide walks you through everything you need to know to get up and running.

Creating Your First Workspace

After signing up, you'll land on the dashboard. The first thing to do is create a workspace — think of it as a shared folder for your team's prompts.

  1. Click New Workspace in the top-left corner.
  2. Give it a descriptive name (e.g., "Customer Support Bot" or "Content Generation").
  3. Choose a visibility setting: Private (invite-only) or Team (visible to all org members).
  4. Hit Create.

Your workspace is now ready. Every prompt, template, and version history entry will live inside this workspace, making it easy to keep things organized even as your library grows.

Writing Your First Prompt

Navigate into your workspace and click New Prompt. You'll be dropped into PromptCask's rich editor, which supports:

  • Slash commands — type / to insert variables, templates, or formatting blocks
  • Template variables — use {{variable_name}} syntax to create reusable placeholders
  • Markdown formatting — headers, bold, italic, lists, tables, and code blocks
  • Version comments — annotate each save with a note about what changed

Here's a simple example to get started:

You are a helpful customer support agent for {{company_name}}.

The customer's name is {{customer_name}} and they are asking about: {{topic}}.

Guidelines:
- Be friendly and professional
- If you don't know the answer, say so honestly
- Always offer to escalate to a human agent if the issue is complex

Respond in {{language}}.

Once you've written your prompt, click Save (or press Cmd+S). PromptCask automatically creates a version snapshot so you can always roll back later.
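Under the hood, template variables are plain text substitution: each {{variable_name}} placeholder is swapped for the value you supply. Here's a minimal illustrative sketch of that idea in Python (this is not PromptCask's actual renderer, just the general technique):

```python
import re

def render_prompt(template: str, variables: dict) -> str:
    """Replace each {{name}} placeholder with its value.

    Raises KeyError if the template references a variable that
    wasn't supplied -- a cheap way to catch typos early.
    """
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        return str(variables[name])  # KeyError on a missing variable

    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

rendered = render_prompt(
    "You are a helpful customer support agent for {{company_name}}.",
    {"company_name": "Acme"},
)
print(rendered)  # → You are a helpful customer support agent for Acme.
```

Failing loudly on a missing variable is usually what you want here: a half-rendered prompt with a literal {{customer_name}} in it is worse than an error at render time.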

Organizing with Folders and Tags

As your prompt library grows, organization becomes critical. PromptCask offers two complementary systems:

Folders

Create nested folders inside a workspace to group related prompts. For example:

Customer Support/
  ├── Greeting Templates/
  ├── Escalation Prompts/
  └── FAQ Generators/

Tags

Tags cut across folder boundaries. Tag a prompt with production, draft, or experimental to quickly filter across your entire workspace. You can also create custom tags like gpt-4o, claude, or high-priority.
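Because tags cut across folders, filtering by tag is just a membership check over your prompt records. A hypothetical sketch (the record shape and field names here are illustrative, not PromptCask's data model):

```python
# Each prompt carries a set of tags, independent of its folder
prompts = [
    {"name": "Greeting v2", "tags": {"production", "gpt-4o"}},
    {"name": "Refund flow", "tags": {"draft"}},
    {"name": "Escalation", "tags": {"production", "high-priority"}},
]

# Every production prompt, regardless of which folder it lives in
production = [p["name"] for p in prompts if "production" in p["tags"]]
print(production)  # → ['Greeting v2', 'Escalation']
```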

Collaborating with Your Team

PromptCask is built for teams. Here's how to bring your colleagues on board:

  1. Invite members — Go to Workspace Settings and invite teammates by email. Choose between Viewer, Editor, and Admin roles.
  2. Real-time editing — Multiple people can work on the same prompt simultaneously. You'll see live cursors and changes appear in real time.
  3. Comments and reviews — Leave inline comments on specific parts of a prompt. Use the review workflow to request approval before promoting a prompt to production.
  4. Activity feed — Every change is logged. See who modified what, when, and why (if they left a version comment).

Testing Prompts

Before deploying a prompt, you'll want to test it. PromptCask's built-in playground lets you:

  • Fill in template variables with sample values
  • Choose which LLM provider and model to test against
  • Compare outputs side-by-side across different model configurations
  • Save test results as snapshots for regression testing

To run a test, click the Play button in the editor toolbar, fill in your variables, and hit Run. Results appear in a split pane to the right.
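The snapshot idea generalizes beyond the playground: once you've saved a known-good result, later runs can be compared against it to catch unintended drift. A hedged sketch of that pattern (the file layout and function here are illustrative, not PromptCask's snapshot API):

```python
import json
import tempfile
from pathlib import Path

def check_against_snapshot(name: str, rendered: str, snapshot_dir: Path) -> bool:
    """Compare a rendered prompt against its stored snapshot.

    Saves the snapshot on first run; on later runs, returns False
    if the rendered text has drifted from the saved version.
    """
    snapshot = snapshot_dir / f"{name}.json"
    if not snapshot.exists():
        snapshot.write_text(json.dumps({"rendered": rendered}))
        return True
    saved = json.loads(snapshot.read_text())
    return saved["rendered"] == rendered

snap_dir = Path(tempfile.mkdtemp())
print(check_against_snapshot("greeting", "Hello, Jane!", snap_dir))  # → True (saved)
print(check_against_snapshot("greeting", "Hello, Jane.", snap_dir))  # → False (drift)
```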

Deploying via the API

Once your prompt is tested and approved, you can serve it through PromptCask's REST API:

curl -X POST https://api.promptcask.com/v1/prompts/your-prompt-slug \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"variables": {"company_name": "Acme", "customer_name": "Jane"}}'

The API returns the rendered prompt text with all variables filled in, ready to pass to any LLM. You can also pin a specific version to ensure production stability while your team continues iterating on the next draft.
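From application code, the same call can be built without any third-party dependencies. A minimal Python sketch against the endpoint shown above (the POST method and request shape follow the curl example; the response format is an assumption, so check the API reference for the real contract):

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder -- use your real key

def build_render_request(slug: str, variables: dict) -> urllib.request.Request:
    """Build the HTTP request that fetches a rendered prompt."""
    body = json.dumps({"variables": variables}).encode()
    return urllib.request.Request(
        url=f"https://api.promptcask.com/v1/prompts/{slug}",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_render_request(
    "your-prompt-slug",
    {"company_name": "Acme", "customer_name": "Jane"},
)
# response = urllib.request.urlopen(req)  # performs the actual network call
print(req.method, req.full_url)
```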

What's Next?

Now that you have the basics down, explore these features:

  • Prompt chains — link multiple prompts together into workflows
  • A/B testing — compare prompt variants with real traffic
  • Analytics — track token usage, latency, and output quality over time
  • Integrations — connect PromptCask to Slack, GitHub, and your CI/CD pipeline

Welcome to PromptCask. We can't wait to see what you build.