Task-Boundary Mode: controls how task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
Sub-Task Filter: restricts every tab below to this anchor sub-task within the composite task.
80.8 · Avg Adherence Score (0–100) · median 82.5 · range 35.2–97.5
26.3 min · Avg Task Duration · median 0.5 min
2.6 · Avg Sub-Steps per Instance · 46.9% have multiple sub-steps
42.8% · Reference Path Coverage · instances following the mined SoP exactly
Reference Baseline: 42.8% of instances follow this path exactly

Process Submission — Mined SoP

The most common sub-step sequence observed across 848 task instances. Use the dropdown on the right to compare any deviation to this baseline.

Step 1 · View Localized Cases · 0.0s · Phobos · 43%
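The mined SoP above is, in effect, the modal sub-step sequence across the 848 task instances, and Reference Path Coverage is the share of instances that match it exactly. A minimal sketch of how such a reference path could be derived, assuming each instance arrives as an ordered list of sub-step names (a hypothetical shape, not the product's internal model):

```python
from collections import Counter

def mine_reference_path(instances):
    """Pick the modal sub-step sequence across task instances.

    `instances` is a list of task instances, each an ordered list
    of sub-step names. Returns the most common sequence and the
    share of instances that follow it exactly (the coverage metric).
    """
    counts = Counter(tuple(steps) for steps in instances)
    reference, n = counts.most_common(1)[0]
    return list(reference), n / len(instances)

# Toy data: two of three instances follow the same single-step path.
instances = [
    ["View Localized Cases"],
    ["View Localized Cases"],
    ["Open Submissions Section"],
]
ref, cov = mine_reference_path(instances)
# ref → ["View Localized Cases"], cov → 2/3
```

With real data, ties between equally common sequences would need a deterministic tie-break (e.g. shortest sequence first) before publishing one path as the baseline.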
Variant Explorer
Genuine Complexity · Complex-Case Execution Path · 172 instances · 12 analysts · Δ adherence +2.3
Observed Sequence (vs Reference SoP on left)
Legend: differs from reference · matches reference
Step 1 · Open Submissions Section (reference at this step: View Localized Cases)
Genuine Complexity — this sequence differs from the reference path (1 sub-step vs 1 in reference) but shows no strong pain-point signals: adherence score 83.1 vs reference 80.8 (delta +2.3). Likely a legitimate case-complexity path — not an automation candidate, but worth cataloging.
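The Genuine Complexity label follows a simple triage rule: the variant deviates from the reference path, but its adherence score matches or beats the baseline and no pain-point signals fire. A hedged sketch of that rule; the threshold and function names are assumptions for illustration, not the product's actual logic:

```python
def classify_variant(variant_adherence, reference_adherence,
                     pain_signals=0, threshold=0.0):
    """Label a deviating sequence by its adherence delta.

    Assumed rule: no pain-point signals and an adherence delta at
    or above `threshold` means the deviation reflects genuine case
    complexity; otherwise it is a pain-point (automation) candidate.
    """
    delta = variant_adherence - reference_adherence
    if pain_signals == 0 and delta >= threshold:
        return "genuine-complexity", delta
    return "pain-point-candidate", delta

# The variant above: adherence 83.1 vs reference 80.8, no pain signals.
label, delta = classify_variant(83.1, 80.8)
# label → "genuine-complexity", delta → +2.3
```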

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application Events Event Share Dwell Share
Veeva Safety 1,478 66.3% 98.9%
Phobos 451 20.2% 0.0%
Acrobat 66 3.0% 0.0%
Microsoft Word 46 2.1% 0.0%
Microsoft Excel 40 1.8% 0.0%
Microsoft Teams 34 1.5% 0.0%
collaboration.merck.com 31 1.4% 0.0%
Microsoft Outlook 21 0.9% 0.0%
usc-excel.officeapps.live.com 13 0.6% 0.0%
apps.powerapps.com 8 0.4% 0.0%
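Event Share and Dwell Share in the table are per-app aggregates over the captured event stream, each normalized by its own total. A sketch, assuming events flatten to (app, dwell-seconds) pairs — a hypothetical shape for illustration:

```python
def app_mix(events):
    """Aggregate per-app event counts and dwell time into shares.

    `events` is a list of (app, dwell_seconds) tuples. Returns
    {app: (event_share, dwell_share)} with each share normalized
    by the grand total, so shares in each column sum to 1.0.
    """
    counts, dwell = {}, {}
    for app, secs in events:
        counts[app] = counts.get(app, 0) + 1
        dwell[app] = dwell.get(app, 0.0) + secs
    total_n = sum(counts.values())
    total_d = sum(dwell.values())
    return {app: (counts[app] / total_n, dwell[app] / total_d)
            for app in counts}

shares = app_mix([("Veeva Safety", 12.0), ("Phobos", 0.5),
                  ("Veeva Safety", 7.5)])
# Veeva Safety: 2/3 of events, 19.5/20.0 of dwell
```

This normalization is also why a transient app like Phobos can hold 20.2% of events but round to 0.0% of dwell: many short interactions, almost no active time.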

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From To Transitions
Veeva Safety Acrobat 28
Veeva Safety Microsoft Excel 24
Veeva Safety Microsoft Word 22
Veeva Safety Microsoft Teams 21
Acrobat Veeva Safety 20
Microsoft Excel Veeva Safety 17
Acrobat Microsoft Word 17
Microsoft Word Acrobat 17
Microsoft Word Veeva Safety 17
Veeva Safety collaboration.merck.com 13
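Counting these transitions amounts to scanning each instance's events in timestamp order and tallying consecutive pairs whose applications differ. A minimal sketch under that assumption (the event shape is hypothetical):

```python
from collections import Counter

def cross_app_transitions(instance_events):
    """Count consecutive event pairs that cross app boundaries.

    `instance_events` maps a task-instance id to its events in
    timestamp order; each event is an (app, action) tuple. Pairs
    within the same app are ignored — only swivel-chair hops count.
    """
    counts = Counter()
    for events in instance_events.values():
        for (app_a, _), (app_b, _) in zip(events, events[1:]):
            if app_a != app_b:
                counts[(app_a, app_b)] += 1
    return counts

t = cross_app_transitions({
    "case-1": [("Veeva Safety", "click"), ("Acrobat", "open"),
               ("Veeva Safety", "click")],
})
# → one Veeva Safety → Acrobat hop and one Acrobat → Veeva Safety hop
```

Because counting is scoped per instance, a hop from the last event of one task instance to the first event of the next never inflates the table.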

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
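The ±15-minute journey reduces to a window filter around the anchor timestamp; because the filter runs per app, one case can legitimately land in several rows. A sketch assuming events arrive as (timestamp, app) pairs for one analyst on one case — field shapes are assumptions:

```python
from datetime import datetime, timedelta

def on_case_apps(anchor_ts, case_events, window_min=15):
    """Apps touched on the same case within ±window of the anchor.

    Returns the set of applications whose events fall inside the
    window. A case counts toward every app in this set, which is
    why per-app case counts are not mutually exclusive.
    """
    window = timedelta(minutes=window_min)
    return {app for ts, app in case_events
            if abs(ts - anchor_ts) <= window}

anchor = datetime(2024, 5, 1, 10, 0)
apps = on_case_apps(anchor, [
    (datetime(2024, 5, 1, 9, 50), "Veeva Safety"),
    (datetime(2024, 5, 1, 10, 10), "Acrobat"),
    (datetime(2024, 5, 1, 11, 0), "Microsoft Outlook"),  # outside ±15 min
])
# → {"Veeva Safety", "Acrobat"}
```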
Veeva Safety DOM 7,182 events 333 cases 84.8 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <span> "Submissions & Distributions" 536×
pyzeClick <span> "Complete" 504×
pyzeClick <a> "Show more" 365×
pyzeClick <span> "Narrative" 300×
pyzeClick <span> "Adverse Events" 280×
pyzeClick <span> "Products" 196×
pyzeClick <span> "Reference Numbers" 184×
pyzeClick <li> "Narrative" 180×
Phobos DOM 1,842 events 195 cases 42.1 min
DOM-level capture
pyzeClick <button> "Proceed" 125×
pyzeClick <button> "Send to process" 103×
pyzeClick <button> "Save Comments" 72×
pyzeClick <button> "Advance" 54×
pyzeClick <td> "2352671 (v0.1) Japanese (Japan)" 40×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 32×
pyzeClick <a> "Cases" 31×
pyzeClick <button> "Advanced" 26×
Acrobat App 240 events 61 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
3,051 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Word Document 212 events 61 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 543823 (English) (26544362_0_2_343646) 775×
file V-Narrative 540591 (English) (26495494_0_2_345212) 723×
file V-Narrative 528191 (English) (26300681_0_1_338349) 382×
file V-Narrative 549707 (English) (26645831_0_1_347236) 313×
file V-Narrative 514031 (English) (26063743_0_1_327856) 286×
file V-Narrative 533882 (English) (26397511_0_1_345751) 230×
file V-Narrative 525091 (English) (26253532_0_1_338428) 201×
file Narrative 526208 (English) 190×
Microsoft Excel Document 168 events 62 cases
Document-level capture
file Case tracking list 699×
file 2355984 (v0.1) 469×
file Priorización 351×
file 2329445 (v2.1) 339×
file 2352822 (v0.1) 234×
file Daily tracking list (PV-SMART) copy 225×
file 1761620 (v2.1) 186×
file I-0000354334 153×
Microsoft Teams App 163 events 65 cases
App-level capture
2,356 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
collaboration.merck.com App 78 events 34 cases
App-level capture
Microsoft Outlook Message 67 events 41 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
ONENOTE Document 36 events 18 cases
Document-level capture
YoudaoDict App 24 events 12 cases
App-level capture