
Hands-On Scenarios

This section pairs every concept from the study guide with a real, runnable exercise using a local SQL Server instance. You'll see actual timing numbers, real execution plans, and the satisfaction of watching logical reads drop from thousands to dozens.


Prerequisites

Step 1 – Start SQL Server via Docker:

cd docker
cp .env.example .env          # optional: customize the SA password
docker compose up -d

Wait ~20 seconds for SQL Server to initialize, then run the init scripts.

bash init-db.sh

Step 2 – Verify it's working (VS Code MSSQL extension or DataGrip):

Server:   localhost,1433
Login:    sa
Password: InterviewDemo@2026

Step 3 – Run the C# demo app:

cd src/SqlDemos
dotnet run              # interactive menu
dotnet run -- all       # run all 6 scenarios and print timing table
dotnet run -- 2         # run scenario 2 only

The Database: InterviewDemoDB

Table          Rows (approx.)   Purpose
Customers      1,000            1 whale (50k orders), 999 normal
Products       25               Used in OrderItems
Orders         ~55,000          Skewed data – key to parameter sniffing
OrderItems     ~120,000         Line items per order
Employees      15               Org chart for SELF JOIN demo
OrderAuditLog  ~14,000          Wide table for SELECT * demo

The whale: CustomerID=1 ("BigCorp Ltd") has ~50,000 orders. Every other customer has 1–8. This skew is what makes Scenario 2 (Parameter Sniffing) interesting.
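To see why the skew bites, compare what a parameterized query does for the whale versus a normal customer. This is a sketch, not the demo app's actual code: the procedure name is hypothetical, and it assumes an index on Orders(CustomerID) and an OrderDate column — adjust to match the schema init-db.sh actually creates.

    -- Hypothetical repro: the first call caches a plan shaped for its parameter.
    CREATE OR ALTER PROCEDURE dbo.GetOrdersForCustomer @CustomerID INT
    AS
    BEGIN
        SELECT OrderID, OrderDate          -- column names assumed
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID;
    END;
    GO

    EXEC dbo.GetOrdersForCustomer @CustomerID = 2;  -- normal customer: a seek plan is cached
    EXEC dbo.GetOrdersForCustomer @CustomerID = 1;  -- the whale reuses that seek plan: slow

    -- One fix: recompile per call so each execution gets a plan for its own value.
    CREATE OR ALTER PROCEDURE dbo.GetOrdersForCustomer @CustomerID INT
    AS
    BEGIN
        SELECT OrderID, OrderDate
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID
        OPTION (RECOMPILE);
    END;
    GO

OPTION (RECOMPILE) trades plan-cache reuse for a per-execution plan; Scenario 2 walks through when that trade is worth it.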


The 6 Scenarios

#  Name                      Antipattern                               Typical Speedup
1  Cursor Catastrophe        Row-by-row UPDATE in a loop               100–1000×
2  Parameter Sniffing Ghost  Cached plan wrong for data distribution   10–50×
3  Non-SARGable Date Trap    Functions on columns in WHERE             10–20×
4  SELECT * Flood            Pulling fat NVARCHAR(MAX) columns         5–30×
5  Key Lookup Tax            Non-covering index                        5–15×
6  Scalar UDF Killer         Scalar function in WHERE clause           20–100×
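Scenario 3 in miniature: wrapping a column in a function hides it from the optimizer, while an equivalent range predicate stays index-friendly. A sketch only — it assumes Orders has an OrderDate column with an index on it:

    -- Non-SARGable: the function on the column forces a scan of every row.
    SELECT COUNT(*)
    FROM dbo.Orders
    WHERE YEAR(OrderDate) = 2025;        -- an index on OrderDate cannot be seeked

    -- SARGable rewrite: a half-open range lets the optimizer seek the index.
    SELECT COUNT(*)
    FROM dbo.Orders
    WHERE OrderDate >= '2025-01-01'
      AND OrderDate <  '2026-01-01';

Both queries return the same count; only the second one can use an Index Seek, which is exactly the difference you'll measure in logical reads below.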

How to Measure on macOS

Every scenario has a query exercise block. The SQL commands themselves (SET STATISTICS IO ON, EXEC ...) are identical across both tools. What differs is how you enable the execution plan. Pick the one you have:

Note: Azure Data Studio was retired on February 28, 2026. Use one of the two tools below instead.


Option A – VS Code with the MSSQL Extension

  1. Install SQL Server (mssql) by Microsoft (ms-mssql.mssql) from the Extensions panel if you haven't already.
  2. Open a .sql file, connect to localhost,1433 when prompted (or use the status-bar connection picker at the bottom).
  3. Paste the statistics preamble at the top of your query:
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;
    
  4. To get the actual execution plan, right-click anywhere in the editor → "Run Query with Actual Execution Plan" (or click the dropdown arrow next to the Run button ▶ and select that option).
  5. The results pane shows three tabs:
     - Results – the row data
     - Messages – STATISTICS IO / STATISTICS TIME output (logical reads, CPU time, elapsed time)
     - Execution Plan – graphical plan (click any operator node to see estimated vs actual rows, cost %, output column list)

Tip: The VS Code MSSQL extension uses the same .sqlplan XML format as SSMS, so all operators (Index Seek, Key Lookup, Hash Match, etc.) use identical names.


Option B – JetBrains DataGrip

  1. Connect to localhost,1433 with the SQL Server driver (Login: sa, Password: InterviewDemo@2026, Database: InterviewDemoDB).
  2. Open a query console for that data source.
  3. Paste the statistics preamble at the top:
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;
    
  4. To get the actual execution plan, right-click in the editor → Explain Plan → Explain Analyzed. This runs the query and captures live row counts – equivalent to SSMS's actual execution plan.
  5. For a quick estimated plan without running the query: right-click → Explain Plan → Explain Plan (or Cmd+Shift+E).
  6. Results appear in separate panels at the bottom:
     - Output – the STATISTICS IO / STATISTICS TIME text (logical reads, CPU, elapsed)
     - Plan – DataGrip's graphical plan tree. Click any node to expand its full properties (actual rows, estimated rows, cost, output columns).

Tip: DataGrip's Explain Analyzed view highlights the costliest nodes in orange/red. Start your investigation there.


The SQL Commands (Same in All Tools)

-- Paste at the top of every exercise query window:
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

After running, look for this pattern in the Messages/Output tab:

Table 'Orders'. Scan count 1, logical reads 3842, lob logical reads 0, ...
SQL Server Execution Times: CPU time = 234 ms, elapsed time = 412 ms.

Logical reads is your primary metric. 1 logical read = 1 × 8 KB page read from the buffer cache. Dropping logical reads from thousands to dozens is the goal for most of these scenarios.
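Here is the whole measurement loop in one window, using a Scenario-5-style comparison. The index names are hypothetical, and it assumes the CustomerID and OrderDate columns described above; the point is the pattern, not the exact schema:

    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    -- Non-covering index: each matching row costs an extra Key Lookup into the
    -- clustered index to fetch OrderDate, inflating logical reads.
    CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

    SELECT CustomerID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = 1;        -- whale: note the logical reads in Messages/Output

    -- Covering index: INCLUDE carries OrderDate in the leaf pages, so the
    -- lookups disappear.
    CREATE INDEX IX_Orders_CustomerID_Covering
        ON dbo.Orders (CustomerID) INCLUDE (OrderDate);

    SELECT CustomerID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = 1;        -- rerun and compare the 'Table Orders' line

Run the first SELECT, write down the logical reads from the Messages/Output tab, create the covering index, rerun, and compare the two 'Table Orders' lines side by side.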

Good habit

Always run the bad version first, write down the logical reads, then run the fixed version. The contrast is what makes it stick.