Task-Boundary Mode
How task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
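A minimal sketch of one way such boundaries could be drawn, assuming each instance opens at an anchor event (for example the "View SAE Case" click) and closes at the next anchor or after a long idle gap. The field names and the 15-minute gap are illustrative assumptions, not the product's actual segmentation rules.

```python
from datetime import datetime, timedelta

# Assumed idle threshold; the real tool's boundary rules may differ.
GAP = timedelta(minutes=15)

def segment(events, anchor_label):
    """Split a time-sorted event stream into task instances.

    events: list of dicts with "ts" (datetime) and "label" (str).
    An instance starts at an anchor event and ends at the next anchor
    or after an idle gap longer than GAP.
    """
    instances, current = [], None
    for ev in events:
        new_anchor = ev["label"] == anchor_label
        idle = current and (ev["ts"] - current[-1]["ts"]) > GAP
        if current and (new_anchor or idle):
            instances.append(current)
            current = None
        if current is None:
            if not new_anchor:
                continue  # events before any anchor are out of scope
            current = []
        current.append(ev)
    if current:
        instances.append(current)
    return instances
```

With this rule, a second anchor click starts a new instance rather than extending the first, which is why back-to-back case reviews count separately.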
77.8 · Avg Adherence Score (0-100) · Median 82.5 · Range 31.4–97.5
15.6 min · Avg Task Duration · 0.8 min median
4.5 · Avg Sub-Steps per Instance · 64.6% have multiple sub-steps
1.3% · Reference Path Coverage · Instances following the mined SoP exactly
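A sketch of how the two headline metrics could be computed, assuming adherence is sequence similarity to the reference path scaled to 0-100 and coverage is the exact-match fraction. The product's real scoring formula is not specified here, so treat this as one plausible reading.

```python
from difflib import SequenceMatcher

# The mined reference path from the panel below (two sub-steps).
REFERENCE = ["View SAE Case", "Microsoft Excel"]

def adherence(seq, ref=REFERENCE):
    """Similarity of one instance's sub-step sequence to the reference, 0-100."""
    return round(100 * SequenceMatcher(None, seq, ref).ratio(), 1)

def coverage(instances, ref=REFERENCE):
    """Percent of instances whose sequence matches the reference exactly."""
    exact = sum(1 for seq in instances if seq == ref)
    return round(100 * exact / len(instances), 1)
```

Under this reading, a low coverage figure (1.3%) with a high average adherence (77.8) means most instances are near the reference path without matching it step for step.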

Review SAE Case — Mined SoP

The most common sub-step sequence observed across 373 task instances. Deviations in the Variant Explorer are compared against this baseline.

1. View SAE Case · 0.6s · 0.2 edits · Veeva Safety · 99%
2. Microsoft Excel · 0.0s · Microsoft Excel · 6%
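One way a "mined SoP" can fall out of the data is as the modal sub-step sequence across instances; a minimal sketch, with invented instance data and no claim that this is the tool's actual mining algorithm:

```python
from collections import Counter

def mine_reference(instances):
    """Return the most frequent sub-step sequence and its support count.

    instances: iterable of sub-step label lists, one per task instance.
    """
    counts = Counter(tuple(seq) for seq in instances)
    path, support = counts.most_common(1)[0]
    return list(path), support
```

Real miners typically add frequency thresholds and noise filtering, but the modal-sequence idea is the core.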
Variant Explorer
Genuine Complexity · Complex-Case Execution Path · 132 instances · 10 analysts · Δ adherence +5.1
Observed Sequence (vs Reference SoP on left)
1. View SAE Case
× Microsoft Excel — skipped: this reference step was not executed
Genuine Complexity — this sequence differs from the reference path (1 sub-step vs 2 in the reference) but shows no strong pain-point signals: adherence score 83.0 vs reference 77.8 (delta +5.1). Likely a legitimate case-complexity path — not an automation candidate, but worth cataloging.
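The triage rule this verdict implies can be sketched as: a variant that deviates from the reference but matches or beats baseline adherence, with no pain signals, is genuine complexity rather than an automation candidate. The threshold and labels below are illustrative assumptions.

```python
# Baseline from the KPI cards above.
BASELINE_ADHERENCE = 77.8

def triage(variant_adherence, has_pain_signals):
    """Classify a deviating variant by adherence delta and pain signals."""
    delta = variant_adherence - BASELINE_ADHERENCE
    if has_pain_signals or delta < 0:
        return "automation-candidate"
    return "genuine-complexity"
```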

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application Events Event Share Dwell Share
Veeva Safety 1,236 73.9% 87.0%
collaboration.merck.com 70 4.2% 0.0%
Acrobat 53 3.2% 0.0%
Microsoft Excel 52 3.1% 0.0%
Microsoft Word 51 3.1% 0.0%
usc-excel.officeapps.live.com 43 2.6% 0.0%
Microsoft Teams 42 2.5% 0.0%
Microsoft Outlook 17 1.0% 0.0%
apps.powerapps.com 15 0.9% 0.0%
runtime-app.powerplatform.com 13 0.8% 0.0%
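The two share columns above can be derived from raw events; a minimal sketch, assuming each event carries an application name and an active-dwell duration in seconds (the input shape is an assumption):

```python
def app_mix(events):
    """Compute (event share %, dwell share %) per application.

    events: iterable of (app_name, dwell_seconds) tuples.
    """
    counts, dwell = {}, {}
    for app, secs in events:
        counts[app] = counts.get(app, 0) + 1
        dwell[app] = dwell.get(app, 0.0) + secs
    n, total = sum(counts.values()), sum(dwell.values())
    return {
        app: (round(100 * counts[app] / n, 1),
              round(100 * dwell[app] / total, 1))
        for app in counts
    }
```

Note the two shares can diverge sharply, as in the table: many quick events in a browser app yield a high event share but near-zero dwell share.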

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From To Transitions
Veeva Safety Microsoft Excel 34
collaboration.merck.com usc-excel.officeapps.live.com 34
usc-excel.officeapps.live.com collaboration.merck.com 33
Microsoft Word Veeva Safety 22
Veeva Safety Microsoft Teams 20
Veeva Safety collaboration.merck.com 20
Veeva Safety Acrobat 20
Microsoft Excel Veeva Safety 19
Veeva Safety Microsoft Word 18
Acrobat Veeva Safety 16
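The transition counts above reduce to tallying consecutive event pairs whose applications differ; a sketch, assuming events are already grouped per task instance and ordered in time:

```python
from collections import Counter

def cross_app_transitions(instance_apps):
    """Count app-boundary crossings in one instance's ordered app sequence.

    instance_apps: list of application names, one per event, in time order.
    Same-app consecutive pairs are ignored; only crossings are counted.
    """
    crossings = Counter()
    for a, b in zip(instance_apps, instance_apps[1:]):
        if a != b:
            crossings[(a, b)] += 1
    return crossings
```

Summing these counters over all 373 instances would reproduce a table like the one above.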

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
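A sketch of the windowed join behind these cards, assuming each event carries analyst, case, app, and timestamp, and each (analyst, case) pair has one anchor time. Cases land in every app's set they touched, which matches the non-exclusive counting described above.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)  # ±15 min around the task anchor

def on_case_journey(events, anchors):
    """Count distinct cases per app within ±WINDOW of each case's anchor.

    events: iterable of (analyst, case, app, ts) tuples.
    anchors: dict mapping (analyst, case) -> anchor timestamp.
    """
    cases_per_app = {}
    for analyst, case, app, ts in events:
        anchor = anchors.get((analyst, case))
        if anchor is None or abs(ts - anchor) > WINDOW:
            continue  # outside the journey window, or no anchor on this case
        cases_per_app.setdefault(app, set()).add(case)
    return {app: len(cases) for app, cases in cases_per_app.items()}
```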
Veeva Safety · DOM · 4,539 events · 168 cases · 49.1 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <span> "Complete" 273×
pyzeClick <a> "Show more" 196×
pyzeClick <li> "Narrative" 173×
pyzeClick <span> "Continue" 161×
pyzeClick <li> "Details" 157×
pyzeClick <li> "Delete" 143×
pyzeClick <li> "Unknown" 133×
pyzeClick <span> "Adverse Events" 127×
Phobos · DOM · 739 events · 84 cases · 19.2 min
pyzeClick <button> "Proceed" 76×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 52×
pyzeClick <button> "Advance" 48×
pyzeClick <button> "Save Comments" 44×
inputId <select> "Select Attachment description · Legal · Legal Pre-Sca" 38×
pyzeClick <a> "Cases" 35×
pyzeClick <div> "Please reply to “Request Information” using dropd" 34×
pyzeClick <button> "Send to process" 28×
Microsoft Word · Document · 144 events · 43 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 552979 (English) (26689105_0_2_348832) 1453×
file V-Narrative 549707 (English) (26645831_0_1_347236) 636×
file V-Narrative 549550 (English) (26643764_0_1_347593) 526×
file V-Narrative 531284 (English) (26355806_0_2_340584) 225×
file V-Narrative 525025 (English) (26253159_0_3_342205) 211×
file V-Narrative 546801 (English) (26597391_0_2_345463) 187×
file V-Narrative 541572 (English) (26508302_0_1_342423) 174×
file V-Narrative 542162 (English) (26518018_0_1_342783) 168×
Microsoft Teams · App · 136 events · 49 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
1,688 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Acrobat · App · 123 events · 42 cases
1,574 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Excel · Document · 105 events · 48 cases
file 2355984 (v0.1) 469×
file Case tracking list 450×
file Priorización 407×
file I-0000354334 153×
file I-0000353124 142×
file I-0000350382 142×
file Daily tracking list (PV-SMART) copy 125×
file I-0000355770 93×
collaboration.merck.com · App · 103 events · 40 cases
Microsoft Outlook · Message · 56 events · 33 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
usc-excel.officeapps.live.com · Document · 44 events · 6 cases
ONENOTE · Document · 26 events · 17 cases