mirror of
https://github.com/github/awesome-copilot.git
synced 2026-05-15 19:21:45 +00:00
Update FlowStudio Power Automate skills (#1664)

* feat(flowstudio): align Power Automate skills with MCP server v1.1.6

  Foundation skill (flowstudio-power-automate-mcp) rewritten to use the server's
  new tool_search and list_skills meta-tools (v1.1.5+) for discovery instead of
  cataloging every tool by hand. Cut from 519 to 295 lines. New "Which Skill to
  Use When" intent-keyed decision tree points at the four specialized skills.
  Build/debug/governance/monitoring updated for use-case framing. Tools that
  genuinely cross tiers (e.g. the debug skill borrowing get_store_flow_summary)
  are correct when the workflow needs them — the split between skills is by
  use-case intent, not by tool partition.

  Build skill: new Step 3a "Resolving Dynamic Connector Values" covers the
  get_live_dynamic_options outer-parameter auto-bridge (v1.1.6+) and the
  AadGraph user-picker fallback via shared_office365users.SearchUserV2
  (replaces the broken builtInOperation:AadGraph.GetUsers).

  Debug skill: Outlook user-picker failure note pointing at the fallback.

  Monitoring skill: description disambiguates from the server's monitor-flow
  tool bundle (runtime control of a single flow) — this skill is tenant-wide
  health analytics over the cached store.

  All 5 skills validate via npm run skill:validate; line endings LF only;
  codespell clean; auto-regenerated docs/README.skills.md included.

* fix(flowstudio): remove deprecated tool references

  The v1.1.5 MCP server release marked 5 tools [DEPRECATED] but the previous
  alignment commit missed them. Replacements per server source:

  - get_live_flow_http_schema → read trigger.inputs.schema from get_live_flow
  - get_live_flow_trigger_url → read trigger.metadata.callbackUrl from get_live_flow
  - get_store_flow_trigger_url → get_store_flow.triggerUrl field
  - get_store_flow_errors → get_store_flow_runs(status=["Failed"])
  - set_store_flow_state → set_live_flow_state

  Touches build, debug, governance, monitoring SKILL.md and the foundation
  skill's tool-reference.md. Remaining mentions of the deprecated names are
  intentional — they live in deprecation notices naming the obsolete wrapper
  alongside its replacement.

* Update FlowStudio Power Automate skills
* Cover latest FlowStudio MCP actions
* Trim FlowStudio Power Automate skills
* Number FlowStudio build workflow steps

@@ -132,6 +132,10 @@ Result reference: `@body('Get_SP_Item')?['FieldName']`
}
```

> `PatchItem` can validate required SharePoint columns even when you are not
> changing those fields. Echo unchanged required fields from the trigger or a
> prior Get Item action, for example `item/Title`, and use internal field names.
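
A minimal sketch of that echo pattern (the action name, site URL, and `Status`
field are illustrative; only `Status` actually changes, `Title` is echoed):

```json
// Echo the required Title column unchanged so PatchItem validation passes
"Update_SP_Item": {
  "type": "OpenApiConnection",
  "inputs": {
    "parameters": {
      "dataset": "https://contoso.sharepoint.com/sites/Ops",
      "table": "<list-guid>",
      "id": "@triggerOutputs()?['body/ID']",
      "item/Title": "@triggerOutputs()?['body/Title']",
      "item/Status": "Approved"
    }
  }
}
```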

---

### SharePoint — File Upsert (Create or Overwrite in Document Library)

@@ -286,6 +290,10 @@ SharePoint REST API via the `HttpRequest` operation:

> The `HttpRequest` operation reuses the existing SharePoint connection — no extra
> authentication needed. Use this when the standard Update Item connector can't
> reach the target list (different site collection, or you need raw REST control).
> Keep the connector-specific parameter names exactly as shown:
> `parameters/method`, `parameters/uri`, `parameters/headers`, and
> `parameters/body`. The body is a JSON string, and `parameters/uri` is relative
> to the SharePoint `dataset`.
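
A sketch of that parameter shape: a hypothetical MERGE update of list item 42
(site URL, list title, and the `SP.Data.TasksListItem` type are placeholders):

```json
"Send_SP_REST": {
  "type": "OpenApiConnection",
  "inputs": {
    "parameters": {
      "dataset": "https://contoso.sharepoint.com/sites/OtherSite",
      "parameters/method": "POST",
      "parameters/uri": "_api/web/lists/getbytitle('Tasks')/items(42)",
      "parameters/headers": {
        "IF-MATCH": "*",
        "X-HTTP-Method": "MERGE",
        "Content-Type": "application/json;odata=verbose"
      },
      "parameters/body": "{\"__metadata\":{\"type\":\"SP.Data.TasksListItem\"},\"Status\":\"Approved\"}"
    }
  }
}
```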

---
@@ -340,6 +348,22 @@ the file; the flow downloads and filters it for before/after comparisons.

---

## Excel Online

### Excel — Run Office Script

Office Script actions require real workbook and script identifiers at save time.
Do not deploy placeholder `scriptId` values; `update_live_flow` can fail during
dynamic operation validation even before a test run exists.

Use `describe_live_connector` or `get_live_dynamic_options` when available, or
ask the user for the workbook and script if they are not discoverable. If a real
`scriptId` still cannot be resolved, ask the user to add the Run script action
once in the designer, then read the flow definition and preserve the resolved
parameters.
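
Once resolved, the saved action looks roughly like this. This is a sketch: the
`source`/`drive`/`file`/`scriptId` parameter names are assumed from the Excel
Online (Business) connector, the `operationId` should be verified via
`describe_live_connector`, and every value is a placeholder:

```json
// All identifiers are placeholders; resolve real values before deploying
"Run_Office_Script": {
  "type": "OpenApiConnection",
  "inputs": {
    "host": {
      "connectionName": "shared_excelonlinebusiness",
      "operationId": "RunScriptProd"
    },
    "parameters": {
      "source": "<site-or-onedrive-location>",
      "drive": "<drive-id>",
      "file": "<file-id>",
      "scriptId": "<real-script-id>"
    }
  }
}
```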

---

## Outlook

### Outlook — Send Email

@@ -479,6 +503,20 @@ For 1:1 ("Chat with Flow bot"), use `"location": "Chat with Flow bot"` and set

---

## Copilot Studio

### Copilot Studio — Invoke Agent

When using the Copilot Studio connector, publish the agent before running the
flow. Draft/test agents can exist in the studio canvas but still be unavailable
or stale through the flow connector endpoint.

If a connector action fails with an unavailable-agent or endpoint-style error,
publish the agent, wait briefly for propagation, then resubmit the same flow run
before changing the flow definition.

---

## Approvals

### Split Approval (Create → Wait)

@@ -337,6 +337,23 @@ walking a time range, polling until a status changes).

---

### Agent Retry Loop

When a flow calls an AI or Copilot-style agent until it reaches a terminal
outcome, keep the loop state explicit (see the sketch after this list):

- Initialize variables such as `agentStatus`, `attempt`, and `finalPayload`
  before the `Until`.
- Inside the loop, call the agent, validate the response, update the status, and
  delay/retry only when the status is non-terminal.
- Put final dispatch actions such as email, SharePoint update, or Teams post
  after the loop so retries do not duplicate side effects.
- If the platform rejects a complex `Switch` nested inside `Until`, keep the
  loop body to simple validation and state updates, then route with `Switch`
  after the loop.
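
A minimal `Until` skeleton for this pattern (a sketch; the agent call is elided
and all action names, status values, and limits are illustrative):

```json
"Init_agentStatus": {
  "type": "InitializeVariable",
  "inputs": {
    "variables": [ { "name": "agentStatus", "type": "string", "value": "Pending" } ]
  }
},
"Retry_Until_Terminal": {
  "type": "Until",
  "expression": "@or(equals(variables('agentStatus'), 'Done'), equals(variables('agentStatus'), 'Failed'))",
  "limit": { "count": 10, "timeout": "PT1H" },
  "actions": {
    "Call_Agent": { "...": "..." },
    "Set_agentStatus": {
      "type": "SetVariable",
      "inputs": {
        "name": "agentStatus",
        "value": "@{coalesce(body('Call_Agent')?['status'], 'Pending')}"
      },
      "runAfter": { "Call_Agent": [ "Succeeded" ] }
    },
    "Delay_If_Pending": {
      "type": "If",
      "expression": "@equals(variables('agentStatus'), 'Pending')",
      "actions": {
        "Delay": { "type": "Wait", "inputs": { "interval": { "count": 30, "unit": "Second" } } }
      },
      "runAfter": { "Set_agentStatus": [ "Succeeded" ] }
    }
  }
}
```

Final dispatch (email, SharePoint update, Teams post) then hangs off
`Retry_Until_Terminal` via `runAfter`, outside the loop body.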

---

### Async Polling with RequestId Correlation

When an API starts a long-running job asynchronously (e.g. Power BI dataset refresh,

@@ -486,6 +503,19 @@ Normalize before compare: @replace(coalesce(outputs('Value'),''),'_',' ')
Robust non-empty check: @greater(length(trim(coalesce(string(outputs('Val')), ''))), 0)
```

### Unsupported / Risky Expression Assumptions

Power Automate expressions are Workflow Definition Language, not JavaScript.
These patterns often look plausible but do not deploy or do not behave as agents
expect:

| Goal | Avoid | Use instead |
|---|---|---|
| Build an object inline | `createObject(...)` | A Compose action with a JSON object literal |
| Transform an array inline | `select(...)` inside an expression | Data Operations `Select` action |
| Filter an array inline | `filter(...)` inside an expression | Data Operations `Filter array` action |
| Find an array item index | `indexOf(array, item)` | Foreach with a counter variable, or build a keyed object map |
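
For the first row, a sketch of the Compose alternative (field names are
illustrative):

```json
// A Compose with a literal object replaces the unsupported createObject()
"Build_Payload": {
  "type": "Compose",
  "inputs": {
    "id": "@triggerBody()?['itemId']",
    "status": "Queued",
    "submittedAt": "@utcNow()"
  }
}
```

Downstream actions then reference `@outputs('Build_Payload')`.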

### Newlines in Expressions

> **`\n` does NOT produce a newline inside Power Automate expressions.** It is
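
One reliable way to get a literal newline is `decodeUriComponent('%0A')` (a
sketch; `Select_Lines` is a hypothetical upstream action):

```json
// decodeUriComponent('%0A') yields a real newline character
"Join_With_Newlines": {
  "type": "Compose",
  "inputs": "@join(body('Select_Lines'), decodeUriComponent('%0A'))"
}
```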

@@ -142,24 +142,8 @@ without a loop:

Result: `@body('Generate_Date_Series')` → `["2025-01-06", "2025-01-07", …, "2025-01-19"]`
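
The `Generate_Date_Series` action itself sits outside this hunk; a sketch of one
way to produce that result, assuming a fixed 14-day window from 2025-01-06:

```json
// range() + addDays(): one formatted date per offset, no loop
"Generate_Date_Series": {
  "type": "Select",
  "inputs": {
    "from": "@range(0, 14)",
    "select": "@addDays('2025-01-06', item(), 'yyyy-MM-dd')"
  }
}
```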

```json
// Flatten a 2D array (rows × cols) into 1D using arithmetic indexing
"Flatten_Grid": {
  "type": "Select",
  "inputs": {
    "from": "@range(0, mul(length(outputs('Rows')), length(outputs('Cols'))))",
    "select": {
      "row": "@outputs('Rows')[div(item(), length(outputs('Cols')))]",
      "col": "@outputs('Cols')[mod(item(), length(outputs('Cols')))]"
    }
  }
}
```

> `range()` is zero-based. The Cartesian product pattern above uses `div(i, cols)`
> for the row index and `mod(i, cols)` for the column index — equivalent to a
> nested for-loop flattened into a single pass. Useful for generating time-slot ×
> date grids, shift × location assignments, etc.

---

@@ -184,23 +168,6 @@ dictionary type, build one from an array using Select + join + json:

Lookup: `@outputs('Assemble_Dictionary')?['myKey']`

```json
// Practical example: date → rate-code lookup for business rules
"Build_Holiday_Rates": {
  "type": "Select",
  "inputs": {
    "from": "@body('Get_Holidays')?['value']",
    "select": "@concat('\"', formatDateTime(item()?['Date'], 'yyyy-MM-dd'), '\":\"', item()?['RateCode'], '\"')"
  }
},
"Holiday_Dict": {
  "type": "Compose",
  "inputs": "@json(concat('{', join(body('Build_Holiday_Rates'), ','), '}'))"
}
```

Then inside a loop: `@coalesce(outputs('Holiday_Dict')?[item()?['Date']], 'Standard')`

> The `json(concat('{', join(...), '}'))` pattern works for string values. For numeric
> or boolean values, omit the inner escaped quotes around the value portion.
> Keys must be unique — duplicate keys silently overwrite earlier ones.
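
For example, a numeric-value variant of the `select` above (a sketch; `Rate` is
a hypothetical numeric column):

```json
// Numeric values: quote the key, leave the value bare
"select": "@concat('\"', formatDateTime(item()?['Date'], 'yyyy-MM-dd'), '\":', string(item()?['Rate']))"
```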

@@ -280,111 +247,20 @@ CSV → database), avoid nested `Apply to each` loops to find changed records.

Instead, **project flat key arrays** and use `contains()` to perform set operations —
zero nested loops, and the final loop only touches changed items.

**Insert/update/delete sync recipe:**

1. `Select_Dest_Keys` from destination rows.
2. `Filter_To_Insert`: source rows whose key is not in destination keys.
3. `Filter_Already_Exists`: source rows whose key is in destination keys.
4. For each compared field, run `Filter_<Field>_Changed`; combine them with
   `union()` into `Union_Changed`.
5. `Select_Changed_Keys` from `Union_Changed`, then filter destination rows to
   only those keys before updating.
6. `Select_Source_Keys`, then `Filter_To_Delete` destination rows whose key is
   not in source keys.

**Full insert/update/delete sync pattern:**

```json
// Step 1 — Project a flat key array from the DESTINATION (e.g. SharePoint)
"Select_Dest_Keys": {
  "type": "Select",
  "inputs": {
    "from": "@outputs('Get_Dest_Items')?['body/value']",
    "select": "@item()?['Title']"
  }
}
// → ["KEY1", "KEY2", "KEY3", ...]

// Step 2 — INSERT: source rows whose key is NOT in destination
"Filter_To_Insert": {
  "type": "Query",
  "inputs": {
    "from": "@body('Source_Array')",
    "where": "@not(contains(body('Select_Dest_Keys'), item()?['key']))"
  }
}
// → Apply to each Filter_To_Insert → CreateItem

// Step 3 — INNER JOIN: source rows that exist in destination
"Filter_Already_Exists": {
  "type": "Query",
  "inputs": {
    "from": "@body('Source_Array')",
    "where": "@contains(body('Select_Dest_Keys'), item()?['key'])"
  }
}

// Step 4 — UPDATE: one Filter per tracked field, then union them
"Filter_Field1_Changed": {
  "type": "Query",
  "inputs": {
    "from": "@body('Filter_Already_Exists')",
    "where": "@not(equals(item()?['field1'], item()?['dest_field1']))"
  }
}
"Filter_Field2_Changed": {
  "type": "Query",
  "inputs": {
    "from": "@body('Filter_Already_Exists')",
    "where": "@not(equals(item()?['field2'], item()?['dest_field2']))"
  }
}
"Union_Changed": {
  "type": "Compose",
  "inputs": "@union(body('Filter_Field1_Changed'), body('Filter_Field2_Changed'))"
}
// → rows where ANY tracked field differs

// Step 5 — Resolve destination IDs for changed rows (no nested loop)
"Select_Changed_Keys": {
  "type": "Select",
  "inputs": { "from": "@outputs('Union_Changed')", "select": "@item()?['key']" }
}
"Filter_Dest_Items_To_Update": {
  "type": "Query",
  "inputs": {
    "from": "@outputs('Get_Dest_Items')?['body/value']",
    "where": "@contains(body('Select_Changed_Keys'), item()?['Title'])"
  }
}

// Step 6 — Single loop over changed items only
"Apply_to_each_Update": {
  "type": "Foreach",
  "foreach": "@body('Filter_Dest_Items_To_Update')",
  "actions": {
    "Get_Source_Row": {
      "type": "Query",
      "inputs": {
        "from": "@outputs('Union_Changed')",
        "where": "@equals(item()?['key'], items('Apply_to_each_Update')?['Title'])"
      }
    },
    "Update_Item": {
      "...": "...",
      "id": "@items('Apply_to_each_Update')?['ID']",
      "item/field1": "@first(body('Get_Source_Row'))?['field1']"
    }
  }
}

// Step 7 — DELETE: destination keys NOT in source
"Select_Source_Keys": {
  "type": "Select",
  "inputs": { "from": "@body('Source_Array')", "select": "@item()?['key']" }
}
"Filter_To_Delete": {
  "type": "Query",
  "inputs": {
    "from": "@outputs('Get_Dest_Items')?['body/value']",
    "where": "@not(contains(body('Select_Source_Keys'), item()?['Title']))"
  }
}
// → Apply to each Filter_To_Delete → DeleteItem
```

> **Why this beats nested loops**: the naive approach (for each dest item, scan source)
> is O(n × m) and hits Power Automate's 100k-action run limit fast on large lists.
> This pattern is O(n + m): one pass to build key arrays, one pass per filter.
> The update loop in Step 6 only iterates *changed* records — often a tiny fraction
> of the full collection. Run Steps 2/4/7 in **parallel Scopes** for further speed.

---

@@ -649,14 +525,8 @@ Parse a raw CSV string into an array of objects using only built-in expressions.

Avoids the premium "Parse CSV" connector action.

```json
"Delimiter": {
  "type": "Compose",
  "inputs": ","
},
"Strip_Quotes": {
  "type": "Compose",
  "inputs": "@replace(body('Get_File_Content'), '\"', '')"
},
"Detect_Line_Ending": {
  "type": "Compose",
  "inputs": "@if(equals(indexOf(outputs('Strip_Quotes'), decodeUriComponent('%0D%0A')), -1), if(equals(indexOf(outputs('Strip_Quotes'), decodeUriComponent('%0A')), -1), decodeUriComponent('%0D'), decodeUriComponent('%0A')), decodeUriComponent('%0D%0A'))"
```

@@ -665,10 +535,7 @@ Avoids the premium "Parse CSV" connector action.

```json
"Headers": {
  "type": "Compose",
  "inputs": "@split(first(split(outputs('Strip_Quotes'), outputs('Detect_Line_Ending'))), outputs('Delimiter'))"
},
"Data_Rows": {
  "type": "Compose",
  "inputs": "@skip(split(outputs('Strip_Quotes'), outputs('Detect_Line_Ending')), 1)"
},
"Select_CSV_Body": {
  "type": "Select",
  "inputs": {
```

@@ -691,16 +558,9 @@ Avoids the premium "Parse CSV" connector action.

Result: `@body('Filter_Empty_Rows')` — array of objects with header names as keys.

> **`Detect_Line_Ending`** handles CRLF (Windows), LF (Unix), and CR (old Mac) automatically
> using `indexOf()` with `decodeUriComponent('%0D%0A' / '%0A' / '%0D')`.
>
> **Dynamic key names in `Select`**: `@{outputs('Headers')[0]}` as a JSON key in a
> `Select` shape sets the output property name at runtime from the header row —
> this works as long as the expression is in `@{...}` interpolation syntax.
>
> **Columns with embedded commas**: this simple pattern does not safely parse
> quoted fields with embedded delimiters; for those, use a dedicated parser or
> custom action.

---

@@ -59,6 +59,15 @@ Beyond the required `type`, `runAfter`, and `inputs`, actions can include:

| `runtimeConfiguration` | Pagination, concurrency, secure data, chunked transfer |
| `operationOptions` | `"Sequential"` for Foreach, `"DisableAsyncPattern"` for HTTP |
| `limit` | Timeout override (e.g. `{"timeout": "PT2H"}`) |
| `metadata` | Designer metadata such as `operationMetadataId` |

#### Designer Metadata

For existing connector actions, preserve `metadata.operationMetadataId` when you
edit the definition. For new connector actions or Skills/HTTP response actions,
generate a GUID once and keep it stable across updates. Do not regenerate these
IDs on every deploy; the designer and some run-only surfaces use them to keep
action identity consistent.

#### `runtimeConfiguration` Variants
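
A hedged sketch of common `runtimeConfiguration` shapes (values are
placeholders; tune per action):

```json
// Pagination: fetch past the connector's default page size
"runtimeConfiguration": { "paginationPolicy": { "minimumItemCount": 5000 } }

// Concurrency: cap parallel Foreach iterations (1 = sequential)
"runtimeConfiguration": { "concurrency": { "repetitions": 1 } }

// Secure data: hide inputs/outputs in run history
"runtimeConfiguration": { "secureData": { "properties": [ "inputs", "outputs" ] } }
```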

@@ -93,6 +93,40 @@ Access any field dynamically: `@triggerBody()?['anyField']`

---

## Manual (Copilot Studio Skills)

Use the Skills trigger when the flow is meant to be called by a Copilot Studio
agent tool. Keep the trigger schema explicit so the agent receives predictable
input names and types.

```json
"manual": {
  "type": "Request",
  "kind": "Skills",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "itemId": { "type": "string" },
        "notes": { "type": "string" }
      },
      "required": ["itemId"]
    }
  },
  "metadata": {
    "operationMetadataId": "<stable-guid>"
  }
}
```

After deploying a production Skills-triggered flow, call
`add_live_flow_to_solution` with the target `solutionId`; Copilot Studio agent
tool discovery expects the flow to be solution-aware. For MCP-driven testing,
use a temporary HTTP twin with the same actions and payload shape, then restore
the Skills trigger after the actions are verified.
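
A sketch of such an HTTP twin, assuming only the trigger `kind` changes while
the schema and downstream actions stay identical:

```json
// Temporary twin for MCP-driven testing; swap back to "kind": "Skills" after
"manual": {
  "type": "Request",
  "kind": "Http",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "itemId": { "type": "string" },
        "notes": { "type": "string" }
      },
      "required": ["itemId"]
    }
  }
}
```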

---

## Automated (SharePoint Item Created)

```json