Task-Boundary Mode: how task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
Sub-Task Filter
Narrows every tab below to this anchor sub-task within the composite task.
Avg Adherence Score (0–100): 90.0 · median 97.5 · range 45.2–97.5
Avg Task Duration: 19.1 min · median 0.1 min
Avg Sub-Steps per Instance: 1.3 · 15.8% have multiple sub-steps
Reference Path Coverage: 1.0% of instances follow the mined SoP exactly (the reference baseline)
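The Reference Path Coverage metric reduces to a simple order-sensitive comparison: the fraction of instances whose sub-step sequence equals the mined reference exactly. A minimal sketch in Python; the sequences below are hypothetical stand-ins, not the 702 real instances:

```python
def reference_path_coverage(instances, reference):
    """Fraction of task instances whose sub-step sequence matches
    the mined reference SoP exactly (same steps, same order)."""
    exact = sum(1 for seq in instances if list(seq) == list(reference))
    return exact / len(instances) if instances else 0.0

# hypothetical sub-step sequences for three instances
instances = [
    ["Enter Case Type", "Navigate to Cases"],  # matches reference
    ["Enter Case Type"],                       # skipped a step
    ["Navigate to Cases", "Enter Case Type"],  # reordered
]
reference = ["Enter Case Type", "Navigate to Cases"]
print(reference_path_coverage(instances, reference))  # 1 of 3 matches
```

Any deviation, whether a skip, a reorder, or an extra step, drops an instance out of the exact-match numerator, which is why coverage can be very low even when adherence scores stay high.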

Categorize Case — Mined SoP

The most common sub-step sequence observed across 702 task instances. Use the dropdown on the right to compare any deviation to this baseline.

1. Enter Case Type · 2.3s · Phobos · 56%
2. Navigate to Cases · 0.0s · Veeva Safety · 1%
Variant Explorer
Genuine Complexity: Complex-Case Execution Path · 338 instances · 8 analysts · Δ adherence +1.3
Observed Sequence (vs Reference SoP on left)
1. Enter Case Type
× Navigate to Cases (skipped — this reference step was not executed)
Genuine Complexity — this sequence differs from the reference path (1 sub-step vs 2 in the reference) but shows no strong pain-point signals: adherence score 91.3 vs reference 90.0 (delta +1.3). Likely a legitimate case-complexity path — not an automation candidate, but worth cataloging.
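The triage logic described above can be sketched as a threshold on the adherence delta. The threshold value below is an assumption for illustration, not the product's actual rule:

```python
def classify_variant(ref_adherence, variant_adherence, pain_threshold=-5.0):
    """Hypothetical triage rule (the -5.0 threshold is an assumption):
    a deviating sequence whose adherence delta stays at or above the
    threshold is treated as genuine case complexity; a large negative
    delta flags a potential pain point / automation candidate."""
    delta = variant_adherence - ref_adherence
    return "genuine complexity" if delta >= pain_threshold else "pain point"

# the variant above: 91.3 vs reference 90.0, delta +1.3
print(classify_variant(90.0, 91.3))
```

The key point is that deviation alone is not a signal; a variant only becomes an automation candidate when it deviates *and* scores materially worse than the reference.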

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application Events Event Share Dwell Share
Phobos 748 83.0% 95.8%
Veeva Safety 120 13.3% 2.5%
Microsoft Teams 8 0.9% 0.0%
Acrobat 7 0.8% 0.0%
collaboration.merck.com 5 0.6% 0.0%
Microsoft Word 3 0.3% 0.0%
Microsoft Excel 3 0.3% 0.0%
Quality Vault 2 0.2% 1.7%
Microsoft Outlook 1 0.1% 0.0%
Notepad 1 0.1% 0.0%
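The two share columns above come from the same aggregation: group the event stream by application, count events, and sum dwell time, then normalize each against its total. A minimal sketch, assuming events arrive as (app, dwell_seconds) pairs; the demo stream is hypothetical:

```python
def app_mix(events):
    """Aggregate an (app, dwell_seconds) event stream into the
    Application Mix columns: event count, event share, dwell share."""
    counts, dwell = {}, {}
    for app, secs in events:
        counts[app] = counts.get(app, 0) + 1
        dwell[app] = dwell.get(app, 0.0) + secs
    n, d = sum(counts.values()), sum(dwell.values())
    return {app: (counts[app],
                  counts[app] / n,
                  (dwell[app] / d) if d else 0.0)
            for app in counts}

# hypothetical stream: Phobos dominates dwell even more than event count
demo = [("Phobos", 10.0), ("Phobos", 5.0), ("Veeva Safety", 1.0)]
for app, (n_events, ev_share, dw_share) in app_mix(demo).items():
    print(f"{app}: {n_events} events · {ev_share:.1%} events · {dw_share:.1%} dwell")
```

Comparing the two shares is the useful part: Phobos at 83.0% of events but 95.8% of dwell means its interactions are longer than average, while Veeva Safety's 13.3% of events collapsing to 2.5% of dwell suggests many quick clicks.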

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From To Transitions
Phobos Veeva Safety 55
Phobos Microsoft Teams 5
Microsoft Word Acrobat 3
Phobos collaboration.merck.com 3
Acrobat Veeva Safety 3
Phobos Microsoft Excel 2
Veeva Safety Phobos 2
Phobos Acrobat 2
Veeva Safety Microsoft Teams 2
Veeva Safety Microsoft Word 2
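Counting these transitions is a pass over consecutive event pairs, scoped so that pairs never span a task-instance boundary. A minimal sketch, assuming each instance is a list of application names in event order (the demo data is hypothetical):

```python
from collections import Counter

def cross_app_transitions(instances):
    """Count consecutive same-instance event pairs whose application
    changes. Pairing within each instance list guarantees no pair
    spans a task-instance boundary."""
    counts = Counter()
    for apps in instances:  # one list of app names per task instance
        for a, b in zip(apps, apps[1:]):
            if a != b:
                counts[(a, b)] += 1
    return counts

demo = [["Phobos", "Phobos", "Veeva Safety", "Phobos"],
        ["Veeva Safety", "Microsoft Teams"]]
for (src, dst), n in cross_app_transitions(demo).most_common():
    print(f"{src} -> {dst}: {n}")
```

Note the counts are directional: the table's Phobos → Veeva Safety (55) and Veeva Safety → Phobos (2) asymmetry is itself informative, suggesting analysts mostly hop out of Phobos and return by some other route.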

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
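The journey join described above can be sketched as a windowed filter: an event belongs to the journey when it is on the same case and within ±15 minutes of that case's anchor, and each app's case count is the size of its (deliberately overlapping) case set. Field names and the demo data are illustrative assumptions:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)

def journey_case_counts(events, anchors):
    """Per-app case counts for the on-case journey. `anchors` maps
    case id -> anchor timestamp. A case touching several apps lands
    in every matching app's set, so the counts overlap by design."""
    cases_by_app = {}
    for ev in events:
        anchor = anchors.get(ev["case"])
        if anchor is not None and abs(ev["ts"] - anchor) <= WINDOW:
            cases_by_app.setdefault(ev["app"], set()).add(ev["case"])
    return {app: len(cases) for app, cases in cases_by_app.items()}

t0 = datetime(2024, 1, 1, 12, 0)
anchors = {"C1": t0, "C2": t0}
events = [
    {"case": "C1", "app": "Phobos", "ts": t0 + timedelta(minutes=2)},
    {"case": "C1", "app": "Veeva Safety", "ts": t0 + timedelta(minutes=14)},
    {"case": "C2", "app": "Phobos", "ts": t0 - timedelta(minutes=5)},
    {"case": "C2", "app": "Phobos", "ts": t0 + timedelta(minutes=40)},  # outside window
]
print(journey_case_counts(events, anchors))  # C1 counted under both apps
```

Because the counts come from sets keyed per app, summing the "cases" column across rows overstates the number of distinct cases; the overlap is the cross-app signal, not an error.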
Phobos (DOM) · 4,209 events · 297 cases · 109.1 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <button> "Proceed" 462×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 451×
pyzeClick <button> "Save Comments" 328×
inputId <select> "Select Attachment description · Legal · Legal Pre-Sca" 310×
pyzeClick <button> "Advance" 216×
pyzeClick <button> "Send to process" 208×
pyzeClick <select> "Select Compound · - sitagliptin phosphate Tablet (G" 152×
pyzeClick <a> "Cases" 145×
Veeva Safety (DOM) · 3,897 events · 262 cases · 47.7 min
pyzeClick <span> "Complete" 539×
pyzeClick <a> "Show more" 180×
pyzeClick <span> "Adverse Events" 125×
pyzeClick <span> "Narrative" 116×
pyzeClick <li> "Narrative" 105×
pyzeClick <li> "Details" 90×
pyzeClick <span> "Approval Complete" 85×
pyzeClick <span> "Products" 82×
Microsoft Word (Document) · 101 events · 46 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 538518 (English) (26459147_0_1_344628) 198×
file V-Narrative 542172 (English) (26517690_0_2_345412) 181×
file V-Narrative 542162 (English) (26518018_0_1_342783) 170×
file V-Narrative 538589 (English) (26461195_0_1_346748) 160×
file V-Narrative 527377 (English) (26286859_0_1_339572) 113×
file V-Narrative 525025 (English) (26253159_0_3_342205) 109×
file V-Narrative 537919 (English) (26452534_0_2_341686) 99×
file V-Narrative 541519 (English) (26506626_0_2_342362) 75×
Acrobat (App) · 99 events · 36 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
1,357 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams (App) · 90 events · 55 cases
1,141 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
collaboration.merck.com (App) · 84 events · 40 cases
810 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Excel (Document) · 71 events · 34 cases
Microsoft Outlook (Message) · 56 events · 31 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
RIM Vault (DOM) · 39 events · 5 cases · 0.2 min
usc-excel.officeapps.live.com (Document) · 32 events · 7 cases