Task-Boundary Mode: How task-instance boundaries are drawn from the event stream. Applies to every Task SoP, Step SoP, and Variants view.
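The dashboard does not spell out the boundary rule itself. One common approach, shown here purely as an illustrative assumption (the cutoff value and the idle-gap heuristic are hypothetical, not the tool's actual logic), is to split an analyst's ordered event stream into task instances wherever the idle gap between consecutive events exceeds a threshold:

```python
from datetime import datetime, timedelta

# Hypothetical idle-gap segmentation; the real boundary rule is not documented here.
IDLE_CUTOFF = timedelta(minutes=15)  # illustrative threshold

def segment_into_instances(timestamps):
    """Group sorted event timestamps into task instances split at long idle gaps."""
    instances, current = [], []
    for ts in timestamps:
        if current and ts - current[-1] > IDLE_CUTOFF:
            # Gap exceeds cutoff: close the current instance, start a new one.
            instances.append(current)
            current = []
        current.append(ts)
    if current:
        instances.append(current)
    return instances
```

With this rule, two events 25 minutes apart fall into separate task instances, while a burst of clicks a few minutes apart stays in one.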
Avg Adherence Score (0–100): 76.8 (median 82.5 · range 31.1–97.5)
Avg Task Duration: 17.7 min (median 1.7 min)
Avg Sub-Steps per Instance: 4.7 (68.0% have multiple sub-steps)
Reference Path Coverage: 0.4% (instances following the mined SoP exactly; this is the reference baseline)

Review SUSAR Case — Mined SoP

The most common sub-step sequence observed across 269 task instances. Use the Variant Explorer to compare any deviating sequence to this baseline.

1. View SUSAR Case (0.5s · 0.2 edits · Veeva Safety · 100%)
2. Open Study Section (0.0s · Veeva Safety · 0%)
Variant Explorer
Genuine Complexity · Complex-Case Execution Path · 86 instances · 8 analysts · Δ adherence +5.9
Observed Sequence (vs. Reference SoP)
Legend: differs from reference · matches reference
1. View SUSAR Case
✕ Open Study Section (skipped: this reference step was not executed)
Genuine Complexity: this sequence differs from the reference path (1 sub-step vs. 2 in the reference) but shows no strong pain-point signals: adherence score 82.7 vs. the reference's 76.8 (delta +5.9). Likely a legitimate case-complexity path; not an automation candidate, but worth cataloging.
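The triage rule described above can be sketched as a small classifier. The thresholds and function names here are illustrative assumptions; only the reference sequence, the 76.8 baseline, and the 82.7 variant score come from the data above:

```python
# Sketch of the variant-triage rule: a sequence that deviates from the reference
# but shows a non-negative adherence delta and no pain-point signals is labeled
# "genuine complexity" rather than an automation candidate. Thresholds are hypothetical.
REFERENCE_SEQUENCE = ["View SUSAR Case", "Open Study Section"]
REFERENCE_ADHERENCE = 76.8  # baseline avg adherence from the SoP stats above

def classify_variant(sequence, adherence, pain_signals=0):
    """Classify an observed sub-step sequence against the reference SoP."""
    if sequence == REFERENCE_SEQUENCE:
        return "matches reference"
    delta = adherence - REFERENCE_ADHERENCE
    # Deviates, but adherence is no worse than baseline and no pain signals:
    # likely a legitimate complex-case path.
    if delta >= 0 and pain_signals == 0:
        return "genuine complexity"
    return "automation candidate"

print(classify_variant(["View SUSAR Case"], adherence=82.7))  # the 86-instance variant
```

Applied to the 86-instance variant (one sub-step, adherence 82.7, delta +5.9), this rule yields "genuine complexity", matching the label above.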

Application Mix

Which applications analysts touch while executing this task, measured by event count and active dwell time.

Application Events Event Share Dwell Share
Veeva Safety 911 72.1% 96.1%
collaboration.merck.com 64 5.1% 0.0%
Acrobat 62 4.9% 0.0%
usc-excel.officeapps.live.com 49 3.9% 0.0%
Microsoft Excel 40 3.2% 0.0%
Microsoft Word 37 2.9% 0.0%
Microsoft Teams 36 2.8% 0.0%
YoudaoDict 13 1.0% 0.0%
MEDDRABROWSERWIN 8 0.6% 0.0%
Phobos 7 0.6% 0.0%
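The two share columns above can be derived directly from an event log. A minimal sketch, assuming an illustrative record shape (`app`, `dwell_s`) rather than the tool's actual schema:

```python
from collections import Counter

# Toy event log; field names are illustrative, not the actual capture schema.
events = [
    {"app": "Veeva Safety", "dwell_s": 12.0},
    {"app": "Veeva Safety", "dwell_s": 30.0},
    {"app": "Acrobat", "dwell_s": 2.0},
]

# Event Share: each app's fraction of all events in the task.
event_counts = Counter(e["app"] for e in events)

# Dwell Share: each app's fraction of total active dwell time.
dwell_totals = Counter()
for e in events:
    dwell_totals[e["app"]] += e["dwell_s"]

total_events = sum(event_counts.values())
total_dwell = sum(dwell_totals.values())
for app in event_counts:
    print(f"{app}: {event_counts[app] / total_events:.1%} of events, "
          f"{dwell_totals[app] / total_dwell:.1%} of dwell")
```

The same pattern explains why the two columns can diverge sharply, as with Veeva Safety above (72.1% of events but 96.1% of dwell): many quick clicks elsewhere, long focused stretches in one app.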

Top Cross-App Transitions (within task instance)

Consecutive events that cross application boundaries within a task instance — the signal where swivel-chair patterns live.

From To Transitions
collaboration.merck.com usc-excel.officeapps.live.com 39
usc-excel.officeapps.live.com collaboration.merck.com 36
Veeva Safety Microsoft Excel 28
Veeva Safety Acrobat 22
Acrobat Microsoft Word 21
Acrobat Veeva Safety 20
Veeva Safety Microsoft Teams 19
Microsoft Word Acrobat 19
Veeva Safety collaboration.merck.com 14
Microsoft Excel Veeva Safety 14

On-Case Application Journey (±15 min around anchor)

Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.

Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
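The non-exclusive counting described above can be sketched as a windowed set lookup. The record shapes (`analyst`, `case_id`, `ts`, `app`) are illustrative assumptions, not the capture schema:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)  # the ±15 min window around the task anchor

def apps_near_anchor(events, analyst, case_id, anchor_ts):
    """Apps this analyst touched on this case within the window around the anchor."""
    return {
        e["app"] for e in events
        if e["analyst"] == analyst
        and e["case_id"] == case_id
        and abs(e["ts"] - anchor_ts) <= WINDOW
    }

def cases_per_app(anchors, events):
    """Non-exclusive case counts: one case may increment several app rows."""
    counts = {}
    for a in anchors:
        for app in apps_near_anchor(events, a["analyst"], a["case_id"], a["ts"]):
            counts.setdefault(app, set()).add(a["case_id"])
    return {app: len(cases) for app, cases in counts.items()}
```

A case whose window contains both Veeva and Phobos events increments both rows, which is exactly the overlap the note above warns about.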
Veeva Safety DOM 2,972 events 91 cases 47.8 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick <span> "Complete" 179×
pyzeClick <span> "Narrative" 123×
pyzeClick <span> "Adverse Events" 105×
pyzeClick <a> "Show more" 91×
pyzeClick <span> "Submissions & Distributions" 72×
pyzeClick <button> "Create" 66×
pyzeClick <li> "Narrative" 64×
pyzeClick <span> "Reference Numbers" 60×
Phobos DOM 526 events 49 cases 12.8 min
DOM-level capture
pyzeClick <button> "Proceed" 65×
pyzeClick <button> "Save Comments" 56×
pyzeClick <select> "Select Attachment description · Legal · Legal Pre-Sca" 42×
pyzeClick <button> "Send to process" 33×
inputId <select> "Select Attachment description · Legal · Legal Pre-Sca" 32×
pyzeClick <div> "Please reply to "Request Information" using dropd" 30×
pyzeClick <a> "Cases" 23×
pyzeClick <button> "Advance" 23×
Microsoft Word Document 140 events 38 cases
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
file V-Narrative 533164 (English) (26378749_0_4_342805) 659×
file V-Narrative 534252 (English) (26399499_0_2_343143) 515×
file V-Narrative 543823 (English) (26544362_0_2_343646) 514×
file V-Narrative 540591 (English) (26495494_0_2_345212) 429×
file V-Narrative 553016 (English) (26689302_0_2_348778) 392×
file V-Narrative 528191 (English) (26300681_0_1_338349) 377×
file V-Narrative 530932 (English) (26349278_0_2_339374) 366×
file V-Narrative 555782 (English) (26733642_0_2_350941) 205×
Acrobat App 133 events 31 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
2,046 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
collaboration.merck.com App 114 events 23 cases
App-level capture
363 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams App 106 events 38 cases
App-level capture
1,233 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Excel Document 93 events 35 cases
Document-level capture
usc-excel.officeapps.live.com Document 70 events 10 cases
Document-level capture
Microsoft Outlook Message 31 events 18 cases
Message-level capture
Task Mining captures email subjects. Case numbers frequently appear in the subject line.
apps.powerapps.com App 23 events 4 cases
App-level capture