Task-Boundary Mode

How task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
Avg Adherence Score (0-100): 80.3 · median 82.5 · range 37.0–97.5
Avg Task Duration: 17.6 min · median 0.6 min
Avg Sub-Steps per Instance: 3.0 · 57.4% of instances have multiple sub-steps
Reference Path Coverage: 42.6% of instances follow the mined SoP exactly (reference baseline)
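The metrics above fall out of one segmentation pass: cut the event stream into task instances at each anchor event, then summarize per instance. A minimal sketch, assuming a hypothetical event list and an assumed anchor action (mirroring the mined SoP's first sub-step; the real boundary logic in the tool is not shown here):

```python
from statistics import median

# Hypothetical event stream: (timestamp_minutes, app, action).
# A new task instance begins at each anchor event.
events = [
    (0.0, "Veeva Safety", "Open Adverse Events Section"),
    (0.3, "Veeva Safety", "Open Case Assessment Results"),
    (5.0, "Veeva Safety", "Open Adverse Events Section"),
    (5.2, "Microsoft Word", "Edit Narrative"),
    (9.1, "Veeva Safety", "Open Adverse Events Section"),
]

ANCHOR = "Open Adverse Events Section"  # assumed anchor action

def segment(events, anchor=ANCHOR):
    """Split the stream into task instances, cutting at each anchor event."""
    instances, current = [], []
    for ev in events:
        if ev[2] == anchor and current:
            instances.append(current)
            current = []
        current.append(ev)
    if current:
        instances.append(current)
    return instances

instances = segment(events)
durations = [inst[-1][0] - inst[0][0] for inst in instances]
print(len(instances), round(median(durations), 2))
```

With boundaries drawn this way, per-instance duration and sub-step counts (and the skew between average and median duration) come straight from the segmented groups.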

Open Adverse Event Review — Mined SoP

The most common sub-step sequence observed across 856 task instances. Use the dropdown on the right to compare any deviation to this baseline.

1. Open Adverse Events Section (0.3s · 0.1 edits · Veeva Safety · 100%)
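Mining the reference SoP reduces to finding the modal sub-step sequence and its coverage. A sketch with hypothetical per-instance sequences standing in for the 856 observed instances:

```python
from collections import Counter

# Hypothetical sub-step sequences, one tuple per task instance.
sequences = [
    ("Open Adverse Events Section",),
    ("Open Adverse Events Section",),
    ("Open Adverse Events Section", "Open Case Assessment Results"),
    ("Open Adverse Events Section",),
    ("Open Adverse Events Section", "Open Case Assessment Results"),
]

counts = Counter(sequences)
reference, ref_count = counts.most_common(1)[0]  # modal sequence = mined SoP
coverage = ref_count / len(sequences)            # share following it exactly
print(reference, f"{coverage:.1%}")
```

Coverage computed this way is what the Reference Path Coverage card reports: the fraction of instances whose sequence equals the reference exactly, not approximately.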
Variant Explorer
Genuine Complexity · Complex-Case Execution Path · 21 instances · 4 analysts · Δ adherence +2.2
Observed Sequence (vs Reference SoP on left)
Legend: differs from reference / matches reference
1. Open Adverse Events Section (matches reference)
2. Open Case Assessment Results (extra step; reference ends at step 1)
Genuine Complexity — this sequence differs from the reference path (2 sub-steps vs 1 in reference) but shows no strong pain-point signals: adherence score 82.5 vs reference 80.3 (delta +2.2). Likely a legitimate case-complexity path — not an automation candidate, but worth cataloging.
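The triage logic described above can be sketched as a small classifier. The reference sequence, the −5.0 pain threshold, and the function itself are illustrative assumptions; only the 80.3 and 82.5 adherence scores come from the panel:

```python
# Assumed reference path (the mined SoP's single sub-step) and its score.
REFERENCE = ["Open Adverse Events Section"]
REF_ADHERENCE = 80.3

def classify_variant(steps, adherence, ref_score=REF_ADHERENCE,
                     pain_threshold=-5.0):
    """Label a variant: deviating sequences that hold adherence near the
    reference are treated as genuine case complexity, not pain points."""
    if steps == REFERENCE:
        return "matches reference"
    delta = adherence - ref_score
    return "pain point" if delta <= pain_threshold else "genuine complexity"

label = classify_variant(
    ["Open Adverse Events Section", "Open Case Assessment Results"], 82.5)
print(label)  # → genuine complexity (Δ adherence +2.2)
```

The design point is that sequence deviation alone is not a pain-point signal; the adherence delta decides whether a longer path is worth automating or just cataloging.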

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application               Events   Event Share   Dwell Share
Veeva Safety               2,186         83.8%         97.9%
Microsoft Word                78          3.0%          0.0%
Acrobat                       66          2.5%          0.0%
Microsoft Teams               64          2.5%          0.0%
Microsoft Excel               58          2.2%          0.0%
Microsoft Outlook             37          1.4%          0.0%
collaboration.merck.com       18          0.7%          0.0%
milkeyway.merck.com           16          0.6%          0.0%
YoudaoDict                     9          0.3%          0.0%
ONENOTE                        8          0.3%          0.0%
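Both share columns come from simple per-app aggregation over the event stream: event share by count, dwell share by active time. A sketch over hypothetical (app, dwell_seconds) records:

```python
from collections import defaultdict

# Hypothetical (app, dwell_seconds) event records.
events = [
    ("Veeva Safety", 12.0), ("Veeva Safety", 30.0), ("Veeva Safety", 8.0),
    ("Microsoft Word", 0.4), ("Acrobat", 0.2),
]

counts, dwell = defaultdict(int), defaultdict(float)
for app, secs in events:
    counts[app] += 1      # event-share numerator
    dwell[app] += secs    # dwell-share numerator

total_n, total_d = len(events), sum(dwell.values())
mix = {app: (counts[app] / total_n, dwell[app] / total_d) for app in counts}
for app, (ev_share, dw_share) in sorted(mix.items(), key=lambda kv: -kv[1][0]):
    print(f"{app:24s} {ev_share:6.1%} {dw_share:6.1%}")
```

The divergence between the two columns is the point: an app can generate a meaningful share of events while accounting for almost no dwell time, as the off-Veeva rows in the table show.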

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From               To                  Transitions
Veeva Safety       Acrobat                      49
Veeva Safety       Microsoft Teams              42
Veeva Safety       Microsoft Word               41
Veeva Safety       Microsoft Excel              37
Veeva Safety       Microsoft Outlook            31
Acrobat            Veeva Safety                 30
Microsoft Word     Veeva Safety                 29
Microsoft Excel    Veeva Safety                 19
Microsoft Teams    Veeva Safety                 18
Microsoft Teams    Microsoft Word               12
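Counting these transitions is a pairwise scan over the ordered events of each instance, keeping only consecutive pairs that cross an application boundary. A sketch over one hypothetical in-instance app trace:

```python
from collections import Counter

# Hypothetical ordered app sequence within a single task instance.
trace = ["Veeva Safety", "Veeva Safety", "Acrobat", "Veeva Safety",
         "Microsoft Teams", "Microsoft Word", "Veeva Safety"]

# Consecutive pairs where the app changes; same-app pairs are dropped.
transitions = Counter((a, b) for a, b in zip(trace, trace[1:]) if a != b)
for (src, dst), n in transitions.most_common():
    print(f"{src} -> {dst}: {n}")
```

Summing these counters across all instances yields the table above; high counts in both directions between the same pair of apps (Veeva ↔ Acrobat, Veeva ↔ Word) are the swivel-chair signature.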

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
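The windowed join described above can be sketched as follows, using hypothetical case IDs and offsets. Note how the same case lands in more than one app's count, which is exactly the non-mutually-exclusive behavior the caveat describes:

```python
# Hypothetical on-case events: (case_id, app, minutes_from_anchor).
events = [
    ("C1", "Veeva Safety", -2.0), ("C1", "Phobos", 4.0),
    ("C2", "Veeva Safety", 1.0),  ("C2", "Acrobat", 20.0),  # outside window
    ("C3", "Phobos", -14.0),
]

WINDOW = 15.0  # minutes either side of the task anchor

cases_per_app = {}
for case, app, offset in events:
    if abs(offset) <= WINDOW:
        cases_per_app.setdefault(app, set()).add(case)

# A case touching two apps in-window is counted under both rows.
counts = {app: len(cs) for app, cs in cases_per_app.items()}
print(counts)  # → {'Veeva Safety': 2, 'Phobos': 2}
```

Using sets keyed by case ID means each case is counted once per app regardless of how many events it produced there, while still appearing under every app it touched.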
Veeva Safety · DOM · 8,858 events · 459 cases · 106.1 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <span> "Adverse Events" 964×
pyzeClick <span> "Complete" 549×
pyzeClick <span> "Narrative" 400×
pyzeClick <a> "Show more" 361×
pyzeClick <span> "Products" 294×
pyzeClick <span> "Submissions & Distributions" 224×
pyzeClick <span> "Patient" 224×
pyzeClick <span> "Documents" 206×
Phobos · DOM · 1,214 events · 173 cases · 34.3 min
pyzeClick <button> "Proceed" 152×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 129×
pyzeClick <button> "Save Comments" 113×
pyzeClick <button> "Advance" 92×
inputId <select> "Select Attachment description · Legal · Legal Pre-Sca" 74×
pyzeClick <button> "Send to process" 70×
pyzeClick <select> "Select Compound · - sitagliptin phosphate Tablet (G" 62×
pyzeClick <div> "Edit info and/or add comments · × · Compound · Select C" 38×
Microsoft Word · Document · 329 events · 90 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 552979 (English) (26689105_0_2_348832) 1076×
file V-Narrative 543823 (English) (26544362_0_2_343646) 895×
file V-Narrative 549707 (English) (26645831_0_1_347236) 556×
file V-Narrative 549550 (English) (26643764_0_1_347593) 526×
file V-Narrative 534252 (English) (26399499_0_2_343143) 301×
file V-Narrative 533936 (English) (26397592_0_1_338360) 264×
file V-Narrative 542512 (English) (26522671_0_2_344761) 255×
file V-Narrative 540358 (English) (26494608_0_2_342069) 228×
Acrobat · App · 267 events · 66 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
4,151 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams · App · 231 events · 100 cases
2,848 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Excel · Document · 212 events · 85 cases
file Case tracking list 973×
file 2329445 (v2.1) 370×
file Priorización 234×
file I-0000346875 167×
file I-0000354334 153×
file I-0000350382 142×
file I-0000353124 142×
file I-0000340178 109×
Microsoft Outlook · Message · 121 events · 71 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
collaboration.merck.com · App · 90 events · 46 cases
ONENOTE · Document · 40 events · 24 cases
RIM Vault · DOM · 38 events · 5 cases · 0.3 min