Task-Boundary Mode
How task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
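As a rough illustration of how task-instance boundaries can be drawn from an event stream, the sketch below segments a time-ordered stream on inactivity gaps. The event shape and the 5-minute gap threshold are assumptions for illustration, not the tool's actual boundary rules.

```python
from datetime import datetime, timedelta

# Assumed gap threshold; the real boundary logic may differ.
GAP = timedelta(minutes=5)

def segment_instances(events):
    """Split a time-ordered event stream into task instances on idle gaps."""
    instances, current = [], []
    for ev in events:
        if current and ev["ts"] - current[-1]["ts"] > GAP:
            instances.append(current)
            current = []
        current.append(ev)
    if current:
        instances.append(current)
    return instances

# Illustrative events only (not real data).
events = [
    {"ts": datetime(2024, 1, 1, 9, 0), "step": "Open Narrative Section"},
    {"ts": datetime(2024, 1, 1, 9, 1), "step": "Show More Details"},
    {"ts": datetime(2024, 1, 1, 9, 30), "step": "Open Narrative Section"},
]
print(len(segment_instances(events)))  # 2 — the 29-minute gap splits the stream
```

Gap-based segmentation is one common heuristic; anchor-event-based segmentation (cutting on a known "anchor click") is another, and either could underlie the counts in this view.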
80.9 · Avg Adherence Score (0–100) · Median 82.5 · Range 40.4–97.5
11.9 min · Avg Task Duration · 0.4 min median
2.9 · Avg Sub-Steps per Instance · 63.6% have multiple sub-steps
0.1% · Reference Path Coverage · Instances following mined SoP exactly
Reference Baseline: 0.1% of instances follow this exactly
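The "Reference Path Coverage" metric can be read as the share of instances whose sub-step sequence equals the mined reference path exactly. A minimal sketch, using illustrative sequences rather than real data:

```python
# Hypothetical example sequences; the real reference path has 10 sub-steps.
reference = ["Open Narrative Section", "Acrobat", "Show More Details"]

instances = [
    ["Open Narrative Section", "Acrobat", "Show More Details"],  # exact match
    ["Open Narrative Section"],
    ["Open Narrative Section", "Show More Details"],
]

exact = sum(1 for seq in instances if seq == reference)
coverage = exact / len(instances)
print(f"{coverage:.1%}")  # 33.3%
```

With a strict equality test, a 10-step reference path is easy to miss, which is consistent with coverage as low as 0.1%.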

Open Narrative Workspace — Mined SoP

The most common sub-step sequence observed across 764 task instances. Use the dropdown on the right to compare any deviation to this baseline.

1. Open Narrative Section · 0.4s · 0.1 edits · Veeva Safety · 100%
2. Acrobat · 0.0s · Acrobat · 1%
3. Show More Details · 0.1s · 0.0 edits · Veeva Safety · 5%
4. Microsoft Excel · 0.0s · Microsoft Excel · 0%
5. Open Action Items · 0.0s · Veeva Safety · 0%
6. Microsoft Excel · 0.0s · Microsoft Excel · 0%
7. Acrobat · 0.0s · Acrobat · 1%
8. Open System Section · 0.0s · Veeva Safety · 0%
9. Acrobat · 0.0s · Acrobat · 0%
10. Microsoft Outlook · 0.0s · Microsoft Outlook · 0%
Variant Explorer
Genuine Complexity: Complex-Case Execution Path · 278 instances · 10 analysts · Δ adherence +3.8
Observed Sequence (vs Reference SoP on left)
Legend: differs from reference · matches reference
1. Open Narrative Section
× Acrobat (skipped: reference step not executed)
× Show More Details (skipped: reference step not executed)
× Microsoft Excel (skipped: reference step not executed)
× Open Action Items (skipped: reference step not executed)
× Microsoft Excel (skipped: reference step not executed)
× Acrobat (skipped: reference step not executed)
× Open System Section (skipped: reference step not executed)
× Acrobat (skipped: reference step not executed)
× Microsoft Outlook (skipped: reference step not executed)
Genuine Complexity — this sequence differs from the reference path (1 sub-step vs 10 in the reference) but shows no strong pain-point signals: adherence score 84.6 vs reference 80.9 (delta +3.8). Likely a legitimate case-complexity path — not an automation candidate, but worth cataloging.
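The skipped/matched markers in the variant view can be derived by walking the reference path and greedily matching the variant's executed steps in order. A minimal sketch, with a shortened hypothetical reference path:

```python
# Illustrative 5-step reference; the real mined reference has 10 sub-steps.
reference = ["Open Narrative Section", "Acrobat", "Show More Details",
             "Microsoft Excel", "Open Action Items"]
variant = ["Open Narrative Section"]  # a 1-sub-step variant, as above

def diff_vs_reference(reference, variant):
    """Label each reference step as matched or skipped by this variant."""
    out, i = [], 0
    for step in reference:
        if i < len(variant) and variant[i] == step:
            out.append((step, "matches reference"))
            i += 1
        else:
            out.append((step, "skipped"))
    return out

for step, status in diff_vs_reference(reference, variant):
    print(f"{step}: {status}")
```

Greedy in-order matching is one simple alignment choice; edit-distance alignment would handle reordered steps more gracefully.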

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application · Events · Event Share · Dwell Share
Veeva Safety · 1,854 · 82.8% · 96.0%
Microsoft Word · 123 · 5.5% · 0.0%
Acrobat · 71 · 3.2% · 0.0%
Microsoft Teams · 51 · 2.3% · 0.0%
Microsoft Excel · 40 · 1.8% · 0.0%
Microsoft Outlook · 26 · 1.2% · 0.0%
ONENOTE · 18 · 0.8% · 0.0%
gpteal.merck.com · 6 · 0.3% · 0.0%
newtab · 5 · 0.2% · 0.0%
collaboration.merck.com · 5 · 0.2% · 0.0%
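Event share and dwell share can be computed from the raw event log as each application's fraction of total event count and total active dwell time. A minimal sketch with made-up numbers:

```python
from collections import Counter

# Illustrative (app, dwell_seconds) events, not real data.
events = [("Veeva Safety", 12.0), ("Veeva Safety", 8.0),
          ("Microsoft Word", 0.5), ("Acrobat", 0.5)]

event_counts = Counter(app for app, _ in events)
dwell_totals = Counter()
for app, dwell in events:
    dwell_totals[app] += dwell

total_events = sum(event_counts.values())
total_dwell = sum(dwell_totals.values())
for app in event_counts:
    print(app,
          f"{event_counts[app] / total_events:.1%}",   # event share
          f"{dwell_totals[app] / total_dwell:.1%}")    # dwell share
```

The divergence between the two shares (as in the table, where Veeva Safety holds 82.8% of events but 96.0% of dwell) is exactly why both are reported.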

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From → To · Transitions
Veeva Safety → Microsoft Word · 67
Microsoft Word → Veeva Safety · 46
Veeva Safety → Microsoft Teams · 35
Veeva Safety → Acrobat · 31
Microsoft Word → Acrobat · 30
Veeva Safety → Microsoft Excel · 29
Acrobat → Microsoft Word · 27
Acrobat → Veeva Safety · 25
Microsoft Teams → Veeva Safety · 20
Veeva Safety → Microsoft Outlook · 20
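Counting cross-app transitions amounts to scanning consecutive event pairs within each task instance and keeping only the pairs where the application changes. A minimal sketch with illustrative instances:

```python
from collections import Counter

# Illustrative per-instance app sequences, not real data.
instances = [
    ["Veeva Safety", "Veeva Safety", "Microsoft Word", "Veeva Safety"],
    ["Veeva Safety", "Microsoft Word", "Acrobat"],
]

transitions = Counter()
for apps in instances:
    for a, b in zip(apps, apps[1:]):
        if a != b:  # same-app pairs are not cross-app transitions
            transitions[(a, b)] += 1

for (a, b), n in transitions.most_common():
    print(f"{a} → {b}: {n}")
```

Note the counts are directional: Veeva → Word and Word → Veeva are separate rows, which is what surfaces round-trip swivel-chair loops.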

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
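The ±15 minute journey window can be built by filtering the full event log to the same analyst and case, keeping events within the window around the anchor. A minimal sketch, with an assumed event shape:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)  # matches the ±15 min window above

def journey_apps(anchor_ts, case_id, analyst, events):
    """Apps the analyst touched on this case within ±WINDOW of the anchor."""
    return sorted({
        ev["app"] for ev in events
        if ev["case"] == case_id
        and ev["analyst"] == analyst
        and abs(ev["ts"] - anchor_ts) <= WINDOW
    })

# Illustrative events only.
events = [
    {"ts": datetime(2024, 1, 1, 9, 0),  "case": "C1", "analyst": "a1", "app": "Veeva Safety"},
    {"ts": datetime(2024, 1, 1, 9, 10), "case": "C1", "analyst": "a1", "app": "Acrobat"},
    {"ts": datetime(2024, 1, 1, 10, 0), "case": "C1", "analyst": "a1", "app": "Microsoft Outlook"},
]
print(journey_apps(datetime(2024, 1, 1, 9, 0), "C1", "a1", events))
# ['Acrobat', 'Veeva Safety'] — Outlook falls outside the window
```

Because each app's row counts every case whose window contains that app, one case can appear in several rows, which is the non-mutual-exclusivity the note above describes.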
Veeva Safety · DOM · 8,854 events · 319 cases · 100.5 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <span> "Narrative" 860×
pyzeClick <span> "Complete" 835×
pyzeClick <a> "Show more" 603×
pyzeClick <span> "Adverse Events" 388×
pyzeClick <span> "Products" 235×
pyzeClick <span> "Submissions & Distributions" 205×
pyzeClick <span> "Documents" 205×
pyzeClick <button> "Save" 188×
Phobos · DOM · 1,127 events · 157 cases · 31.5 min
pyzeClick <button> "Proceed" 137×
pyzeClick <button> "Save Comments" 93×
pyzeClick <button> "Advance" 82×
pyzeClick <button> "Send to process" 69×
pyzeClick <select> "Select Compound · - sitagliptin phosphate Tablet (G" 49×
pyzeClick <a> "Cases" 45×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 38×
pyzeClick <a> "Cases Coordinator" 34×
Microsoft Word · Document · 421 events · 104 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 543823 (English) (26544362_0_2_343646) 873×
file V-Narrative 533164 (English) (26378749_0_4_342805) 659×
file V-Narrative 533533 (English) (26384397_0_2_337752) 361×
file V-Narrative 551229 (English) (26664901_0_1_349465) 326×
file V-Narrative 549707 (English) (26645831_0_1_347236) 324×
file V-Narrative 552979 (English) (26689105_0_2_348832) 307×
file V-Narrative 542512 (English) (26522671_0_2_344761) 255×
file V-Narrative 530932 (English) (26349278_0_2_339374) 248×
Acrobat · App · 256 events · 62 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
3,090 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams · App · 197 events · 77 cases
2,449 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Excel · Document · 146 events · 48 cases
file Case tracking list 867×
file I-0000346875 161×
file I-0000354334 153×
file I-0000353124 142×
file I-0000350382 132×
file 2352822 (v0.1) 131×
file I-0000340178 109×
file I-0000348506 95×
Microsoft Outlook · Message · 112 events · 58 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
collaboration.merck.com · App · 63 events · 32 cases
ONENOTE · Document · 39 events · 22 cases
RIM Vault · DOM · 21 events · 3 cases · 0.2 min