Task-Boundary Mode
How task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
Sub-Task Filter
Restricts every tab below to just this anchor sub-task within the composite task.
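How boundaries might be drawn can be sketched in code. This is a minimal illustration, not the product's actual rule: it assumes a new task instance starts when the case id changes or the idle gap between consecutive events exceeds a threshold. All records and the 15-minute gap are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical event records: (timestamp, case_id, sub_step)
events = [
    (datetime(2024, 1, 1, 9, 0), "C1", "View Processing Routed"),
    (datetime(2024, 1, 1, 9, 2), "C1", "Filter Cases"),
    (datetime(2024, 1, 1, 9, 40), "C1", "View Processing Routed"),  # long gap -> new instance
    (datetime(2024, 1, 1, 9, 41), "C2", "View Processing Routed"),  # case change -> new instance
]

def segment_instances(events, max_gap=timedelta(minutes=15)):
    """Split an ordered event stream into task instances.

    Illustrative rule: a new instance starts when the case id changes
    or the idle gap between consecutive events exceeds max_gap.
    """
    instances, current = [], []
    for ev in events:
        if current:
            prev = current[-1]
            if ev[1] != prev[1] or ev[0] - prev[0] > max_gap:
                instances.append(current)
                current = []
        current.append(ev)
    if current:
        instances.append(current)
    return instances

print(len(segment_instances(events)))  # 3 instances under these rules
```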
Avg Adherence Score (0–100): 80.9 · median 82.5 · range 39.4–97.5
Avg Task Duration: 8.0 min · median 0.0 min
Avg Sub-Steps per Instance: 1.9 · 35.3% have multiple sub-steps
Reference Path Coverage: 58.4% · instances following the mined SoP exactly (reference baseline)
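The coverage metric is straightforward to reproduce in principle: the mined SoP is the modal sub-step sequence, and coverage is the share of instances matching it exactly. A small sketch with made-up sequences (the real pipeline's tie-breaking and normalization are not specified here):

```python
from collections import Counter

# Hypothetical sub-step sequences, one tuple per task instance
sequences = [
    ("View Processing Routed",),
    ("View Processing Routed",),
    ("Filter Cases",),
    ("View Processing Routed", "Filter Cases"),
]

# The mined SoP is taken as the modal sequence; coverage is the share
# of instances that match it exactly.
reference, ref_count = Counter(sequences).most_common(1)[0]
coverage = ref_count / len(sequences)
print(reference, f"{coverage:.1%}")  # ('View Processing Routed',) 50.0%
```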

Manage Work List — Mined SoP

The most common sub-step sequence observed across 685 task instances. Use the dropdown on the right to compare any deviation to this baseline.

1. View Processing Routed · 0.0s · Phobos 67%
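How an adherence score in the 0–100 range could be computed is illustrated below. This is an assumption, not the dashboard's documented formula: it scores a sequence as 100 times one minus the normalized Levenshtein distance to the reference path.

```python
def adherence(observed, reference):
    """Illustrative adherence score: 100 * (1 - normalized Levenshtein
    distance) between the observed and reference sub-step sequences.
    The dashboard's exact scoring formula is not specified here.
    """
    m, n = len(observed), len(reference)
    # Classic dynamic-programming edit distance over sub-step lists
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if observed[i - 1] == reference[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1, dp[i][j - 1] + 1,
                           dp[i - 1][j - 1] + cost)
    return 100 * (1 - dp[m][n] / max(m, n, 1))

ref = ["View Processing Routed"]
print(adherence(ref, ref))                # 100.0
print(adherence(["Filter Cases"], ref))   # 0.0
```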
Variant Explorer
Genuine Complexity · Complex-Case Execution Path · 74 instances · 6 analysts · Δ adherence +4.8
Observed Sequence (vs Reference SoP on left)
Legend: differs from reference · matches reference
1. Filter Cases · reference at this step: View Processing Routed
Genuine Complexity — this sequence differs from the reference path (1 sub-step vs 1 in reference) but shows no strong pain-point signals: adherence score 85.7 vs reference 80.9 (delta +4.8). Likely a legitimate case-complexity path — not an automation candidate, but worth cataloging.
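The triage logic in that note can be expressed as a simple rule. This is a hypothetical restatement of the classification described above, not the Variant Explorer's actual implementation: a deviating sequence with no pain-point signals and adherence at or above baseline is filed as genuine complexity; deviations carrying pain signals become automation candidates.

```python
def classify_variant(adherence_delta, pain_signals):
    """Hypothetical triage rule mirroring the Variant Explorer note."""
    if pain_signals:
        return "automation candidate"
    if adherence_delta >= 0:
        return "genuine complexity"
    return "needs review"

# The 'Filter Cases' variant: adherence 85.7 vs baseline 80.9, no pain signals
print(classify_variant(85.7 - 80.9, pain_signals=[]))  # genuine complexity
```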

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application Events Event Share Dwell Share
Phobos 1,031 80.7% 87.4%
Veeva Safety 187 14.6% 12.6%
Microsoft Teams 15 1.2% 0.0%
collaboration.merck.com 10 0.8% 0.0%
Acrobat 9 0.7% 0.0%
Microsoft Excel 7 0.5% 0.0%
Microsoft Word 5 0.4% 0.0%
Microsoft Outlook 3 0.2% 0.0%
usc-excel.officeapps.live.com 2 0.2% 0.0%
YoudaoDict 2 0.2% 0.0%
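The event-share and dwell-share columns are simple aggregations over the per-event log. A minimal sketch with invented dwell figures, assuming each event record carries an app name and an active-dwell duration:

```python
from collections import defaultdict

# Hypothetical per-event records: (app, dwell_seconds)
events = [("Phobos", 12.0), ("Phobos", 8.0), ("Veeva Safety", 5.0)]

counts = defaultdict(int)
dwell = defaultdict(float)
for app, secs in events:
    counts[app] += 1
    dwell[app] += secs

total_events = sum(counts.values())
total_dwell = sum(dwell.values())
for app in counts:
    # Event share and dwell share, as in the Application Mix table
    print(app, counts[app],
          f"{counts[app] / total_events:.1%}",
          f"{dwell[app] / total_dwell:.1%}")
```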

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From To Transitions
Phobos Veeva Safety 59
Phobos Microsoft Teams 8
Veeva Safety Microsoft Teams 6
Veeva Safety Phobos 5
Phobos Microsoft Excel 5
Phobos collaboration.merck.com 4
Acrobat Veeva Safety 4
Microsoft Teams Veeva Safety 4
Veeva Safety Microsoft Word 3
Veeva Safety Acrobat 3
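Counting these transitions amounts to scanning consecutive event pairs within an instance and keeping only the pairs that cross an application boundary. A sketch with a made-up app sequence:

```python
from collections import Counter

# Hypothetical ordered app sequence within one task instance
apps = ["Phobos", "Phobos", "Veeva Safety", "Microsoft Teams", "Veeva Safety"]

# Count only consecutive pairs that cross an application boundary;
# same-app pairs (Phobos -> Phobos) are not transitions.
transitions = Counter(
    (a, b) for a, b in zip(apps, apps[1:]) if a != b
)
print(transitions.most_common())
```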

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
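The windowing and the non-exclusive counting can be sketched as follows. All records here are hypothetical; the point is that each app accumulates the set of cases it touched within ±15 minutes of the anchor, so one case can appear under several apps.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)

# Hypothetical events: (timestamp, analyst, case_id, app)
events = [
    (datetime(2024, 1, 1, 9, 0), "a1", "C1", "Phobos"),
    (datetime(2024, 1, 1, 9, 5), "a1", "C1", "Veeva Safety"),
    (datetime(2024, 1, 1, 10, 0), "a1", "C1", "Acrobat"),  # outside window
]
anchor = (datetime(2024, 1, 1, 9, 2), "a1", "C1")  # the task anchor click

cases_per_app = {}
for ts, analyst, case, app in events:
    same_context = analyst == anchor[1] and case == anchor[2]
    if same_context and abs(ts - anchor[0]) <= WINDOW:
        cases_per_app.setdefault(app, set()).add(case)

# A case can appear under several apps: counts are not mutually exclusive
print({app: len(c) for app, c in cases_per_app.items()})
```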
Veeva Safety · DOM · 3,669 events · 239 cases · 44.4 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <span> "Complete" 452×
pyzeClick <a> "Show more" 165×
pyzeClick <span> "Adverse Events" 117×
pyzeClick <span> "Narrative" 108×
pyzeClick <span> "Submissions & Distributions" 93×
pyzeClick <li> "Narrative" 92×
pyzeClick <span> "Products" 86×
pyzeClick <span> "Case Assessment Results" 83×
Phobos · DOM · 3,345 events · 275 cases · 83.3 min
DOM-level capture
pyzeClick <button> "Proceed" 316×
pyzeClick <button> "Send to process" 219×
pyzeClick <a> "Cases" 180×
pyzeClick <button> "Save Comments" 172×
pyzeClick <button> "Advance" 140×
pyzeClick <a> "Cases Coordinator" 129×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 127×
pyzeClick <a> "Inbox" 103×
Microsoft Word · Document · 106 events · 43 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 533164 (English) (26378749_0_4_342805) 659×
file V-Narrative 545329 (English) (26567747_0_2_344796) 215×
file V-Narrative 538518 (English) (26459147_0_1_344628) 198×
file V-Narrative 525091 (English) (26253532_0_1_338428) 197×
file V-Narrative 542172 (English) (26517690_0_2_345412) 181×
file V-Narrative 542162 (English) (26518018_0_1_342783) 168×
file V-Narrative 530716 (English) (26341165_0_2_337732) 159×
file V-Narrative 524911 (English) (26250161_0_1_339456) 142×
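Since file names encode case context, the case number can often be recovered with a pattern match. The regex below is an assumption inferred from the listed names (narrative id, language tag, then a parenthesized token whose leading digits look like a case number); the real encoding may differ.

```python
import re

# Hypothetical pattern inferred from names like
# "V-Narrative 533164 (English) (26378749_0_4_342805)"
NAME_RE = re.compile(r"V-Narrative (\d+) \((\w+)\) \((\d+)_")

name = "V-Narrative 533164 (English) (26378749_0_4_342805)"
m = NAME_RE.search(name)
if m:
    narrative_id, language, case_no = m.groups()
    print(narrative_id, language, case_no)  # 533164 English 26378749
```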
Acrobat · App · 103 events · 39 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
1,283 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams · App · 96 events · 50 cases
App-level capture
990 granular events captured, but no DOM, document, or message-level metadata is available for this application.
collaboration.merck.com · App · 90 events · 41 cases
App-level capture
775 granular events captured, but no DOM, document, or message-level metadata is available for this application.
Microsoft Excel · Document · 71 events · 33 cases
Document-level capture
Microsoft Outlook · Message · 59 events · 34 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
usc-excel.officeapps.live.com · Document · 34 events · 6 cases
Document-level capture
ONENOTE · Document · 15 events · 8 cases
Document-level capture