sqldw-authoring-cli

microsoft/skills-for-fabric/skills/sqldw-authoring-cli · commit 339c328
---
name: sqldw-authoring-cli
description: >
  Execute authoring T-SQL (DDL, DML, data ingestion, transactions, schema changes)
  against Microsoft Fabric Data Warehouse and SQL endpoints from agentic CLI
  environments. Use when the user wants to: (1) create/alter/drop tables from
  terminal, (2) insert/update/delete/merge data via CLI, (3) run COPY INTO or
  OPENROWSET ingestion, (4) manage transactions or stored procedures, (5) perform
  schema evolution, (6) use time travel or snapshots, (7) generate ETL/ELT shell
  scripts, (8) create views/functions/procedures on Lakehouse SQLEP. Triggers:
  "create table in warehouse", "insert data via T-SQL", "load from ADLS",
  "COPY INTO", "run ETL with T-SQL", "alter warehouse table", "upsert with T-SQL",
  "merge into warehouse", "create T-SQL procedure", "warehouse time travel",
  "recover deleted warehouse data", "create warehouse schema", "deploy warehouse",
  "transaction conflict", "snapshot isolation error".
---

Update Check — ONCE PER SESSION (mandatory)

The first time this skill is used in a session, run the check-updates skill before proceeding.

  • GitHub Copilot CLI / VS Code: invoke the check-updates skill.
  • Claude Code / Cowork / Cursor / Windsurf / Codex: compare local vs remote package.json version.
  • Skip if the check was already performed earlier in this session.
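For environments that do the package.json comparison manually, a minimal sketch (the file paths are assumptions, and downloading the remote copy is left to the caller since the exact URL depends on how the skill was installed):

```shell
# Compare the "version" field of a local and an already-downloaded remote
# package.json. Assumes jq is available (it is part of this skill's tool stack).
compare_skill_versions() {
  local local_json="$1" remote_json="$2" lv rv
  lv=$(jq -r .version "$local_json")
  rv=$(jq -r .version "$remote_json")
  if [ "$lv" = "$rv" ]; then
    echo "up to date ($lv)"
  else
    echo "update available: $lv -> $rv"
  fi
}
```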

CRITICAL NOTES

  1. To find workspace details (including its ID) from a workspace name: list all workspaces, then filter with JMESPath.
  2. To find item details (including its ID) from a workspace ID, item type, and item name: list all items of that type in that workspace, then filter with JMESPath.
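Both lookups can be sketched as shell helpers. This is a sketch, not part of the skill itself: it assumes `az login` has already succeeded, uses the list endpoints documented in COMMON-CLI.md, and the display names passed in are exact-match placeholders.

```shell
# Look up a workspace ID by display name; JMESPath filtering happens
# client-side via --query on the list response.
find_workspace_id() {
  az rest --method get \
    --resource "https://api.fabric.microsoft.com" \
    --url "https://api.fabric.microsoft.com/v1/workspaces" \
    --query "value[?displayName=='$1'].id | [0]" --output tsv
}

# Look up a warehouse ID within a workspace.
# $1 = workspace ID, $2 = warehouse display name
find_warehouse_id() {
  az rest --method get \
    --resource "https://api.fabric.microsoft.com" \
    --url "https://api.fabric.microsoft.com/v1/workspaces/$1/warehouses" \
    --query "value[?displayName=='$2'].id | [0]" --output tsv
}

# Usage:
# WS_ID=$(find_workspace_id "My Workspace")
# ITEM_ID=$(find_warehouse_id "$WS_ID" "MyWarehouse")
```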

SQL Endpoint Authoring — CLI Skill

Table of Contents

| Task | Reference | Notes |
| --- | --- | --- |
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — read link first (needed for finding workspace ID by name, or item ID by name, item type, and workspace ID) |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | |
| Environment URLs | COMMON-CORE.md § Environment URLs | |
| Authentication & Token Acquisition | COMMON-CORE.md § Authentication & Token Acquisition | Wrong audience = 401; read before any auth issue |
| Core Control-Plane REST APIs | COMMON-CORE.md § Core Control-Plane REST APIs | Includes pagination, LRO polling, and rate-limiting patterns |
| OneLake Data Access | COMMON-CORE.md § OneLake Data Access | Requires storage.azure.com token, not Fabric token |
| Definition Envelope | ITEM-DEFINITIONS-CORE.md § Definition Envelope | Definition payload structure |
| Per-Item-Type Definitions | ITEM-DEFINITIONS-CORE.md § Per-Item-Type Definitions | Support matrix, decoded content, part paths — REST specs, CLI recipes |
| Job Execution | COMMON-CORE.md § Job Execution | |
| Capacity Management | COMMON-CORE.md § Capacity Management | |
| Gotchas, Best Practices & Troubleshooting (Platform) | COMMON-CORE.md § Gotchas, Best Practices & Troubleshooting | |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | az login flows and token acquisition |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § Fabric Control-Plane API via az rest | Always pass --resource; includes pagination and LRO helpers |
| OneLake Data Access via curl | COMMON-CLI.md § OneLake Data Access via curl | Use curl, not az rest (different token audience) |
| SQL / TDS Data-Plane Access | COMMON-CLI.md § SQL / TDS Data-Plane Access | sqlcmd (Go) connect, query, CSV export |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | |
| OneLake Shortcuts | COMMON-CLI.md § OneLake Shortcuts | |
| Capacity Management (CLI) | COMMON-CLI.md § Capacity Management | |
| Composite Recipes | COMMON-CLI.md § Composite Recipes | |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | az rest audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | az rest template + token audience/tool matrix |
| Item-Type Capability Matrix | SQLDW-CONSUMPTION-CORE.md § Item-Type Capability Matrix | Shows read-only (SQLEP) vs read-write (DW) |
| Connection Fundamentals | SQLDW-CONSUMPTION-CORE.md § Connection Fundamentals | TDS, port 1433, Entra-only, no MARS |
| Supported T-SQL Surface Area (Consumption Focus) | SQLDW-CONSUMPTION-CORE.md § Supported T-SQL Surface Area | Read before writing T-SQL — includes data types (no nvarchar/datetime/money) |
| Read-Side Objects You Can Create | SQLDW-CONSUMPTION-CORE.md § Read-Side Objects You Can Create | Views, TVFs, scalar UDFs, procedures |
| Temporary Tables | SQLDW-CONSUMPTION-CORE.md § Temporary Tables | |
| Cross-Database Queries | SQLDW-CONSUMPTION-CORE.md § Cross-Database Queries | 3-part naming, same workspace only |
| Security for Consumption | SQLDW-CONSUMPTION-CORE.md § Security for Consumption | GRANT/DENY, RLS, CLS, DDM |
| Monitoring and Diagnostics | SQLDW-CONSUMPTION-CORE.md § Monitoring and Diagnostics | Includes query labels; DMVs (live) + queryinsights.* (30-day history) |
| Performance: Best Practices and Troubleshooting | SQLDW-CONSUMPTION-CORE.md § Performance: Best Practices and Troubleshooting | Statistics, caching, clustering, query tips |
| REST API: Refresh SQL Endpoint Metadata | SQLDW-CONSUMPTION-CORE.md § REST API: Refresh SQL Endpoint Metadata | Force metadata sync when SQLEP is stale after ETL |
| System Catalog Queries (Metadata Exploration) | SQLDW-CONSUMPTION-CORE.md § System Catalog Queries | sys.tables, sys.columns, sys.views, sys.stats |
| Common Consumption Patterns | SQLDW-CONSUMPTION-CORE.md § Common Consumption Patterns | Reporting views, cross-DB analytics, temp table staging |
| Gotchas and Troubleshooting (Consumption) | SQLDW-CONSUMPTION-CORE.md § Gotchas and Troubleshooting Reference | 18 numbered issues with cause + resolution |
| Quick Reference: Consumption Capabilities | SQLDW-CONSUMPTION-CORE.md § Quick Reference: Consumption Capabilities | |
| Authoring Capability Matrix | SQLDW-AUTHORING-CORE.md § Authoring Capability Matrix | Read first — DW vs SQLEP authoring scope |
| Table DDL (DW Only) | SQLDW-AUTHORING-CORE.md § Table DDL (DW Only) | CREATE, CTAS, ALTER, sp_rename, DROP, constraints, schema evolution, IDENTITY |
| DML Operations (DW Only) | SQLDW-AUTHORING-CORE.md § DML Operations (DW Only) | INSERT...SELECT, UPDATE, DELETE, TRUNCATE, MERGE |
| Data Ingestion (DW Only) | SQLDW-AUTHORING-CORE.md § Data Ingestion (DW Only) | COPY INTO, OPENROWSET, method comparison |
| Transactions (DW Only) | SQLDW-AUTHORING-CORE.md § Transactions (DW Only) | Snapshot isolation only; write-write conflict rules |
| Stored Procedures (Authoring Patterns) | SQLDW-AUTHORING-CORE.md § Stored Procedures (Authoring Patterns) | ETL procs, upsert, CTAS swap, cursor replacement |
| Time Travel and Warehouse Snapshots | SQLDW-AUTHORING-CORE.md § Time Travel and Warehouse Snapshots (DW Only) | FOR TIMESTAMP AS OF; 30-day retention; snapshots GA |
| Source Control and CI/CD | SQLDW-AUTHORING-CORE.md § Source Control and CI/CD (DW Only — Preview) | Git integration, SQL DB projects, deployment pipelines |
| Authoring Permission Model | SQLDW-AUTHORING-CORE.md § Authoring Permission Model | Contributor minimum for DDL/DML; Admin for GRANT |
| Authoring Gotchas and Troubleshooting | SQLDW-AUTHORING-CORE.md § Authoring Gotchas and Troubleshooting | 17-row issue/cause/resolution table |
| Common Authoring Patterns | SQLDW-AUTHORING-CORE.md § Common Authoring Patterns | Incremental load, SCD Type 1, SQLEP view layer |
| Quick Reference: Authoring Decision Guide | SQLDW-AUTHORING-CORE.md § Quick Reference: Authoring Decision Guide | Scenario → recommended approach lookup |
| Core Authoring via CLI | authoring-cli-quickref.md § Core Authoring via CLI | Table DDL, DML, data ingestion sqlcmd one-liners |
| Advanced Authoring Patterns via CLI | authoring-cli-quickref.md § Advanced Authoring Patterns via CLI | Transactions, schema evolution, stored procedures, time travel |
| Bash Templates | authoring-script-templates.md § Bash Templates | COPY INTO, ELT pipeline, upsert with retry, schema migration, time travel recovery, stored procedure |
| PowerShell Templates | authoring-script-templates.md § PowerShell Templates | COPY INTO ingestion, incremental upsert with retry |
| Tool Stack | SKILL.md § Tool Stack | sqlcmd (Go) + az CLI + jq; verify before first op |
| Connection | SKILL.md § Connection | FQDN discovery, reusable vars, PowerShell |
| Script Generation | authoring-cli-quickref.md § Script Generation | sqlcmd output flags, piped input, parameterized queries |
| Agentic Workflows | SKILL.md § Agentic Workflows | Start here — discover schema before any write |
| Monitoring Authoring Operations | authoring-cli-quickref.md § Monitoring Authoring Operations | Active DML/DDL, recent ETL, failed writes |
| Gotchas, Rules, Troubleshooting | SKILL.md § Gotchas, Rules, Troubleshooting | MUST DO / AVOID / PREFER checklists |
| Agent Integration Notes | authoring-cli-quickref.md § Agent Integration Notes | Platform-specific tips (Copilot CLI, Claude Code) |

Tool Stack

| Tool | Role | Install |
| --- | --- | --- |
| sqlcmd (Go) | Primary: execute DDL/DML T-SQL. Standalone binary, no ODBC, built-in Entra ID auth. | winget install sqlcmd / brew install sqlcmd / apt-get install sqlcmd |
| az CLI | Auth (az login), token acquisition, Fabric REST for endpoint discovery, snapshot management. | Pre-installed in most dev environments |
| jq | Parse JSON from az rest | Pre-installed or trivial |

Agent check — verify before first operation:

sqlcmd --version 2>/dev/null || echo "INSTALL: winget install sqlcmd OR brew install sqlcmd"

Authoring Scope by Item Type

| Capability | Warehouse (DW) | Lakehouse/Mirrored DB SQLEP |
| --- | --- | --- |
| Table DDL (CREATE/ALTER/DROP) | ✅ | ❌ |
| DML (INSERT/UPDATE/DELETE/MERGE) | ✅ | ❌ |
| COPY INTO, OPENROWSET (ingest) | ✅ | ❌ (OPENROWSET read-only) |
| Transactions | ✅ | ❌ |
| Time travel, snapshots | ✅ | ❌ |
| CREATE VIEW/FUNCTION/PROCEDURE | ✅ | ✅ |
| CREATE SCHEMA | ✅ | ✅ |

Connection

Discover the SQL Endpoint FQDN

Per COMMON-CLI.md § Discovering Connection Parameters via REST:

WS_ID="<workspaceId>"
ITEM_ID="<warehouseOrLakehouseId>"

# Warehouse
az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/warehouses/$ITEM_ID" \
  --query "properties.connectionString" --output tsv

# Lakehouse SQL endpoint
az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/lakehouses/$ITEM_ID" \
  --query "properties.sqlEndpointProperties.connectionString" --output tsv

Result: <uniqueId>.datawarehouse.fabric.microsoft.com
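A quick sanity check on the discovered value can catch a mis-copied connection string before it reaches sqlcmd. This helper is a sketch (not part of the skill) and only validates the host-name shape stated above:

```shell
# Returns 0 only for a bare *.datawarehouse.fabric.microsoft.com host name —
# not a URL, not an empty string, no embedded whitespace.
is_dw_fqdn() {
  case "$1" in
    "" ) return 1 ;;
    *"/"* | *" "* ) return 1 ;;
    *.datawarehouse.fabric.microsoft.com ) return 0 ;;
    * ) return 1 ;;
  esac
}

# Usage: is_dw_fqdn "$FABRIC_SERVER" || echo "re-discover the FQDN via REST"
```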

Connect with sqlcmd (Go)

# Non-interactive one-shot query
sqlcmd -S "<endpoint>.datawarehouse.fabric.microsoft.com" -d "<DatabaseName>" -G \
  -Q "SELECT TOP 10 * FROM dbo.FactSales"

# Service principal (CI/CD)
SQLCMDPASSWORD="<clientSecret>" \
sqlcmd -S "<endpoint>.datawarehouse.fabric.microsoft.com" -d "<DatabaseName>" \
  --authentication-method ActiveDirectoryServicePrincipal \
  -U "<appId>" \
  -Q "SELECT COUNT(*) FROM dbo.FactSales"

Reusable Connection Variables

# Set once at script top
FABRIC_SERVER="<endpoint>.datawarehouse.fabric.microsoft.com"
FABRIC_DB="<DatabaseName>"
SQLCMD="sqlcmd -S $FABRIC_SERVER -d $FABRIC_DB -G"

# Use throughout
$SQLCMD -Q "SELECT TOP 5 * FROM dbo.DimProduct"
$SQLCMD -i myscript.sql

PowerShell / Windows CMD

$s = "<endpoint>.datawarehouse.fabric.microsoft.com"; $db = "<DatabaseName>"
sqlcmd -S $s -d $db -G -Q "SELECT TOP 10 * FROM dbo.FactSales"
# CMD: use set S=... and %S% / %DB% instead of $variables

Agentic Workflows

Schema Discovery Before Authoring

Before any write operation, discover the target schema:

# 1. List tables
$SQLCMD -Q "SELECT table_schema, table_name FROM information_schema.tables ORDER BY 1,2" -W

# 2. Check columns
$SQLCMD -Q "SELECT column_name, data_type, is_nullable FROM information_schema.columns WHERE table_name='FactSales' ORDER BY ordinal_position" -W

# 3. Sample data
$SQLCMD -Q "SELECT TOP 5 * FROM dbo.FactSales" -W

# 4. Check constraints
$SQLCMD -Q "SELECT constraint_name, constraint_type FROM information_schema.table_constraints WHERE table_name='FactSales'" -W

# 5. Row counts
$SQLCMD -Q "SELECT s.name AS [schema], t.name AS [table], SUM(p.rows) AS row_count FROM sys.tables t JOIN sys.schemas s ON t.schema_id=s.schema_id JOIN sys.partitions p ON t.object_id=p.object_id AND p.index_id IN (0,1) GROUP BY s.name, t.name ORDER BY row_count DESC" -W

# 6. Programmability objects
$SQLCMD -Q "SELECT name, type_desc FROM sys.objects WHERE type IN ('V','FN','IF','P','TF') ORDER BY type_desc, name" -W

Agentic Workflow

  1. Discover → Run steps 1–4 to understand available tables/columns.
  2. Sample → SELECT TOP 5 on relevant tables.
  3. Formulate → Select a pattern from SQLDW-AUTHORING-CORE.md (Table DDL through Common Authoring Patterns).
  4. Execute → $SQLCMD -Q "..." or $SQLCMD -i file.sql for multi-statement batches.
  5. Verify → Query the affected table (SELECT COUNT(*), SELECT TOP 5).
  6. Optionally script → Generate a reusable .sh or .ps1 using references/authoring-script-templates.md.
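The discover → execute → verify loop can be condensed into one sketch. The table name and statement are caller-supplied placeholders, and it assumes $SQLCMD holds the connection command from the Connection section:

```shell
# Run one write against one table, bracketed by discovery and verification.
author_step() {
  local table="$1" sql="$2"
  # 1. Discover: columns of the target table
  $SQLCMD -Q "SELECT column_name, data_type FROM information_schema.columns WHERE table_name='$table' ORDER BY ordinal_position" -W
  # 2. Execute: the caller-supplied statement
  $SQLCMD -Q "$sql"
  # 3. Verify: row count after the write
  $SQLCMD -Q "SELECT COUNT(*) AS rows_after FROM dbo.$table" -W
}

# Usage: author_step FactSales "TRUNCATE TABLE dbo.FactSales"
```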

Gotchas, Rules, Troubleshooting

For full authoring gotchas: SQLDW-AUTHORING-CORE.md Authoring Gotchas and Troubleshooting. For CLI-specific issues: COMMON-CLI.md Gotchas & Troubleshooting (CLI-Specific).

MUST DO

  • Verify workspace has capacity before creating warehouse — call GET /v1/workspaces/{id} and check capacityId.
  • Always -d <DatabaseName> — FQDN alone is insufficient.
  • Always -G or --authentication-method — SQL auth not supported on Fabric.
  • az login first — ActiveDirectoryDefault uses the az session. No session → cryptic failure.
  • SET NOCOUNT ON; in scripts — suppresses row-count messages that corrupt output.
  • Use -i file.sql for multi-statement batches (CREATE PROCEDURE, transactions with GO separators).
  • Label authoring queries with OPTION (LABEL = 'ETL_description').
  • Use explicit CAST() in CTAS to control output types.
  • Keep transactions short — long transactions increase conflict window.
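Since snapshot-conflict errors (24556/24706) are retriable, a minimal backoff wrapper helps keep write serialization honest. This is a sketch; the retry count and delay are arbitrary defaults, overridable via RETRY_MAX / RETRY_DELAY:

```shell
# Retry a command (e.g. a $SQLCMD write) with exponential backoff.
retry_sql() {
  local max="${RETRY_MAX:-3}" delay="${RETRY_DELAY:-2}" attempt=1
  while true; do
    "$@" && return 0
    if [ "$attempt" -ge "$max" ]; then
      echo "retry_sql: giving up after $max attempts" >&2
      return 1
    fi
    sleep "$delay"
    delay=$((delay * 2))        # exponential backoff
    attempt=$((attempt + 1))
  done
}

# Usage: retry_sql $SQLCMD -Q "DELETE FROM dbo.FactSales WHERE LoadDate = '2024-06-01'"
```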

AVOID

  • ODBC sqlcmd (/opt/mssql-tools/bin/sqlcmd) — requires ODBC driver. Use Go version.
  • Omitting -W in scripts — trailing spaces corrupt CSV.
  • Singleton INSERT ... VALUES at scale — creates tiny Parquet files. Use INSERT...SELECT, CTAS, or COPY INTO.
  • DROP TABLE IF EXISTS + CREATE TABLE to refresh — loses time-travel history. Use TRUNCATE TABLE + INSERT INTO.
  • MERGE in production — preview, table-level conflict detection. Use DELETE + INSERT.
  • ALTER COLUMN — not supported. Use CTAS workaround (Schema Evolution).
  • Variables in CTAS — not allowed. Wrap in dynamic SQL: EXEC sp_executesql N'CREATE TABLE ...'.
  • DML on Lakehouse/Mirrored DB SQLEP — read-only for table data. Only views/funcs/procs can be authored.
  • Concurrent UPDATE/DELETE on same table — snapshot isolation conflicts at table level. Serialize writes.
  • Hardcoded FQDNs — discover via REST API (Connection section).
  • MARS — not supported. Remove MultipleActiveResultSets from connection strings.
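The dynamic-SQL workaround for variables in CTAS, driven from the shell, looks roughly like this. The table name FactSales_recent and the cutoff parameter are illustrative, and the function form keeps the connection command configurable:

```shell
# CTAS cannot reference T-SQL variables directly, so the statement is built
# as a string and run via sp_executesql. The shell interpolates $cutoff into
# the batch before it is sent; the doubled single quotes survive as T-SQL
# string delimiters.
ctas_recent() {
  local cutoff="$1"   # e.g. 2024-01-01
  $SQLCMD -Q "EXEC sp_executesql N'CREATE TABLE dbo.FactSales_recent AS SELECT * FROM dbo.FactSales WHERE OrderDate >= ''$cutoff''';"
}

# Usage: ctas_recent 2024-01-01
```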

PREFER

  • CTAS over CREATE TABLE + INSERT — parallel, single-operation.
  • INSERT ... SELECT over singleton INSERTs.
  • COPY INTO for external file ingestion — highest throughput.
  • DELETE + INSERT over MERGE for upserts in production.
  • TRUNCATE TABLE over DELETE FROM without WHERE — faster, preserves history.
  • -i file.sql over -Q "..." for anything beyond simple one-liners.
  • Piped here-doc for multi-statement batches without GO requirements.
  • CTAS + sp_rename for large-scale transforms instead of UPDATE.
  • sqlcmd (Go) -G over curl+token for SQL queries.
  • -Q (non-interactive exit) for agentic use.
  • -F vertical for exploration of wide tables.
  • Env vars (FABRIC_SERVER, FABRIC_DB) for script reuse.
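The here-doc preference can be sketched as follows. The table names are illustrative, and the batch follows the DELETE + INSERT upsert preference above:

```shell
# Emit a multi-statement batch from a quoted here-doc (no temp file, and the
# quoted 'SQL' delimiter prevents shell interpolation inside the SQL), then
# pipe it to sqlcmd in real use.
emit_upsert_batch() {
  cat <<'SQL'
SET NOCOUNT ON;
BEGIN TRANSACTION;
DELETE FROM dbo.DimProduct
WHERE ProductKey IN (SELECT ProductKey FROM stg.DimProduct);
INSERT INTO dbo.DimProduct
SELECT * FROM stg.DimProduct;
COMMIT;
SQL
}

# Usage: emit_upsert_batch | $SQLCMD
```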

TROUBLESHOOTING

| Symptom | Fix |
| --- | --- |
| Error 24556/24706 snapshot conflict | Serialize writes to the same table; retry with backoff |
| COPY INTO auth error | Grant Storage Blob Data Reader on ADLS, or use a SAS in CREDENTIAL |
| COPY INTO from OneLake fails | Provision workspace identity; check firewall rules |
| CTAS unexpected types | Use explicit CAST() in the SELECT |
| Singleton INSERT poor perf | Remediate: CTAS + drop + rename to consolidate Parquet |
| Proc CREATE fails with -Q | Use -i file.sql (GO separators needed) |
| sp_rename on SQLEP fails | Only available on Warehouse, not Lakehouse/Mirrored DB |
| Deploy drops/recreates table | Avoid ALTER TABLE in DB project; apply manually |
| Login failed for user | Verify -d matches the item name exactly (case-sensitive) |
| Cannot open server / Login timeout expired | Re-discover the FQDN via REST API; check port 1433 / firewall |
| ActiveDirectoryDefault failure | az login expired — run az login --tenant <tenantId> |
| Garbled CSV / (N rows affected) in file | Add -W -s"," -w 4000; prepend SET NOCOUNT ON; |
| sqlcmd not found | Install the Go version: winget install sqlcmd |