Copilot Prompt Templates: Unit Test Generation
Table of Contents
- Overview
- File Location
- Prerequisites: Setting Up Copilot in VS Code
- Prompt Template Format
- Unit Test Prompt Format
- What Happens Next: How Copilot Responds
- Example Workflow
- External Resources
Overview
The file located at .github/prompts/generate-unit-tests.prompt.md serves as a Prompt Template. Unlike the global instructions file, this is a specialized “recipe” used to execute a specific task—in this case, generating robust unit tests.
This technique is often referred to as “Prompt Engineering.” It provides the AI with a structured workflow and examples (few-shot prompting) to ensure that generated tests match the project’s quality standards.
File Location
This prompt file is located in the .github/prompts/ directory at the repository root.
Prerequisites: Setting Up Copilot in VS Code
View the setup instructions in the CopilotSetup.md file.
Prompt Template Format
See why we use templates and how to structure them in the CopilotPromptTemplate.md file.
Unit Test Prompt Format
The generate-unit-tests.prompt.md file is structured around three steps:
Step 1: Understand the Code
Copilot first reads the selected class/function and identifies test-relevant behavior:
- Public API surface to test (methods, inputs, return values).
- Preconditions and invariants that should hold before and after operations.
- Observable behaviors versus implementation details to avoid brittle tests.
- Error paths and boundary conditions (empty collections, min/max values, invalid inputs).
- Dependencies or collaborators that may require fixtures or controlled setup.
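As a sketch of what this analysis step surfaces, consider a small hypothetical class (the `Counter` name and API below are illustrative, not part of Graphitti):

```cpp
#include <cstddef>
#include <stdexcept>

// Hypothetical class, used only to illustrate what Step 1's analysis identifies.
class Counter {
public:
    // Public API surface: the methods, inputs, and return values to test.
    void increment() { ++count_; }

    // Error path and boundary condition: decrementing past zero should be
    // covered explicitly by a test.
    void decrement() {
        if (count_ == 0) throw std::underflow_error("Counter is already zero");
        --count_;
    }

    // Observable behavior; tests should assert on this, not on private state.
    std::size_t value() const { return count_; }

private:
    std::size_t count_ = 0;  // Invariant: the count never goes below zero.
};
```

From a class like this, the analysis would note the public methods, the invariant on the internal count, and the throwing error path as the behaviors worth testing.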
Step 2: Create Scenario
Next, Copilot creates a concrete scenario set for unit testing rather than root-cause tracing:
- Happy path behavior.
- Boundary and edge-case coverage.
- Error handling and failure expectations.
- State preservation across operations.
- Repeated-call/idempotency behavior.
- Method interaction sequences (for example, add/remove or increment/decrement flows).
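To make the scenario categories concrete, here is a minimal sketch of how they map onto checks for a hypothetical stack class (plain `assert` stands in for Google Test macros; the `SmallStack` class is invented for illustration):

```cpp
#include <cassert>
#include <stdexcept>
#include <vector>

// Hypothetical stack used only to show how scenario categories become checks.
class SmallStack {
public:
    void push(int v) { data_.push_back(v); }
    int pop() {
        if (data_.empty()) throw std::out_of_range("pop on empty stack");
        int v = data_.back();
        data_.pop_back();
        return v;
    }
    bool empty() const { return data_.empty(); }

private:
    std::vector<int> data_;
};

// One check per scenario category from the list above.
void exerciseScenarios() {
    SmallStack s;

    // Happy path: push then pop returns the same value.
    s.push(7);
    assert(s.pop() == 7);

    // State preservation: the stack is empty again after the round trip.
    assert(s.empty());

    // Error handling: popping an empty (boundary-case) stack fails loudly.
    bool threw = false;
    try { s.pop(); } catch (const std::out_of_range&) { threw = true; }
    assert(threw);

    // Method interaction: push/pop sequences observe LIFO order.
    s.push(1);
    s.push(2);
    assert(s.pop() == 2);
    assert(s.pop() == 1);
}
```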
Step 3: Generate Test Cases
Finally, Copilot generates test code that follows Graphitti testing conventions:
- Uses Google Test with clear `TEST`/`TEST_F` names in PascalCase.
- Prefers behavior-focused assertions (`EXPECT_*`/`ASSERT_*`) and AAA-style (Arrange-Act-Assert) structure.
- Places or appends tests under `Testing/UnitTesting/`.
- Uses fixtures only where setup complexity justifies them.
- Produces tests that are isolated and deterministic.
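A generated test following these conventions might look like the sketch below. The `Counter` class and its `Counter.h` header are hypothetical stand-ins, not actual Graphitti code:

```cpp
#include <gtest/gtest.h>

#include "Counter.h"  // Hypothetical class under test.

// Fixture used only because several tests share the same setup.
class CounterTest : public ::testing::Test {
protected:
    Counter counter_;
};

// PascalCase names; Arrange-Act-Assert structure inside each test.
TEST_F(CounterTest, StartsAtZero) {
    EXPECT_EQ(counter_.value(), 0u);
}

TEST_F(CounterTest, IncrementRaisesValueByOne) {
    counter_.increment();             // Act.
    EXPECT_EQ(counter_.value(), 1u);  // Assert on observable behavior.
}

TEST_F(CounterTest, DecrementOnZeroThrows) {
    EXPECT_THROW(counter_.decrement(), std::underflow_error);
}
```

Each test asserts on one observable behavior, keeping the suite isolated and deterministic.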
What Happens Next: How Copilot Responds
Because this prompt is configured with `mode: agent`, Copilot proposes direct workspace edits instead of only chat text.
You should expect:
- Creation or modification of unit-test files (typically under `Testing/UnitTesting/`).
- A diff you can review and accept or discard in VS Code.
- Follow-up refinement by prompting for additional scenarios (for example, edge cases or regressions).
Example Workflow
Scenario: Generate tests for the Vertex class.
- Open `Simulator/Core/Vertex.cpp` in the editor.
- Select the class methods you want tested (or press `Ctrl+A` to select all).
- Open Copilot Chat and type `/generate-unit-tests`.
- Copilot reads the selected code via the `${selection}` variable, follows the prompt's three-step workflow (analyze → plan → generate), and creates a new file at `Testing/UnitTesting/VertexTests.cpp` containing Google Test cases.
- Review the generated diff. Click Accept to save, or type follow-up instructions in the chat to refine.