This is the mined reference path — 55 instances across 2 analysts follow this exact sequence. Treat it as the proposed Standard Operating Procedure (SoP) for Open Combined Review.
Application Mix
Which applications analysts touch while executing this task, measured by event count and active dwell time.
Application   Events   Event Share   Dwell Share
Phobos        55       100.0%        0.0%
On-Case Application Journey (±15 min around anchor)
Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.
Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
Veeva Safety (DOM) · 219 events · 15 cases · 1.6 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick<span>"Complete"24×
pyzeClick<span>"Narrative"17×
pyzeClick<a>"Show more"17×
pyzeClick<span>"Submissions & Distributions"12×
pyzeClick<span>"Products"11×
pyzeClick<span>"Workflow Timeline"11×
pyzeClick<span>"Study"10×
pyzeClick<span>"Reference Numbers"10×
Phobos (DOM) · 175 events · 20 cases · 2.4 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick<a>"Cases"50×
pyzeClick<a>"Review"46×
pyzeClick<a>"Cases Coordinator"41×
pyzeClick<a>"Review Coordinator"40×
pyzeClick<button>"Proceed"40×
pyzeClick<button>"Send to process"38×
pyzeClick<a>"Inbox"37×
pyzeClick<button>"Save Comments"36×
Acrobat (App) · 7 events · 2 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
128 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams (App) · 5 events · 3 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
55 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
ONENOTE (Document) · 3 events · 1 case
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
MEDDRABROWSERWIN (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
43 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
YoudaoDict (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
sync.merck.com (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
collaboration.merck.com (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
gpteal.merck.com (DOM) · 1 event · 1 case
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
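As a rough illustration, the journey above can be assembled from a flat event log by windowing each analyst's events on the same case around the task anchor. The sketch below assumes a pandas DataFrame with hypothetical analyst, case_id, app, and timestamp columns plus a separate frame of anchor events; it is not the product's implementation.

    import pandas as pd

    WINDOW = pd.Timedelta(minutes=15)

    def on_case_journey(events: pd.DataFrame, anchors: pd.DataFrame) -> pd.DataFrame:
        # Apps touched by the same analyst on the same case within +/-15 min of each anchor.
        rows = []
        for _, a in anchors.iterrows():
            in_window = events["timestamp"].between(a["timestamp"] - WINDOW, a["timestamp"] + WINDOW)
            same_context = (events["analyst"] == a["analyst"]) & (events["case_id"] == a["case_id"])
            rows.append(events.loc[in_window & same_context, ["app", "case_id"]])
        journey = pd.concat(rows, ignore_index=True)
        # A case that touched Veeva AND Phobos in the window lands in both app rows,
        # so per-app case counts are intentionally not mutually exclusive.
        return journey.groupby("app").agg(events=("app", "size"), cases=("case_id", "nunique"))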
Adherence Vectors
Each task instance is scored on four behavioral vectors. Lower values are more adherent to the mined SoP.
Swivel Rate: 0.00 (events in non-primary app / total)
Zero-Edit Visit Rate: 1.00 (inspection-only clicks)
Avg Revisit Depth: 1.0 (max repeats of any activity)
Cross-App Time Share: 0.00 (time outside primary app)
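For clarity, a minimal sketch of how these four vectors might be computed for one task instance, assuming hypothetical event fields (app, activity, is_edit, duration_s) and the simple definitions quoted above; the product's exact scoring may differ.

    from collections import Counter

    def adherence_vectors(events: list[dict], primary_app: str) -> dict[str, float]:
        # One task instance's events; each event carries app, activity, is_edit, duration_s.
        total = len(events) or 1
        non_primary = [e for e in events if e["app"] != primary_app]
        edits = [e for e in events if e["app"] == primary_app and e.get("is_edit")]
        repeats = Counter(e["activity"] for e in events)
        total_time = sum(e["duration_s"] for e in events) or 1
        return {
            "swivel_rate": len(non_primary) / total,                  # events in non-primary app / total
            "zero_edit_visit_rate": 0.0 if edits else 1.0,            # inspection-only: no edit event seen
            "revisit_depth": float(max(repeats.values(), default=0)), # max repeats of any activity
            "cross_app_time_share": sum(e["duration_s"] for e in non_primary) / total_time,
        }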
Meta Score Distribution
Instance count in each 10-point adherence bucket. Right-skewed = healthy execution; long left tail = concentrated pain points.
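The bucketing itself is straightforward; a trivial sketch, assuming a list of per-instance meta scores:

    from collections import Counter

    def meta_score_histogram(scores: list[float]) -> Counter:
        # Count instances per 10-point adherence bucket (0, 10, ..., 90).
        return Counter(min(int(s // 10) * 10, 90) for s in scores)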
Top Concern Vector
Zero-Edit Visit Rate — current value 1.00
Investigate why analysts inspect-and-leave without editing — usually missing upstream data or unclear UI affordances.
Vector Contribution to Adherence Loss
Each vector's weighted contribution to lost adherence. The top row is your highest-leverage fix.
Zero-Edit Visit Rate (value 1.00): Investigate why analysts inspect-and-leave without editing — usually missing upstream data or unclear UI affordances.
Revisit Depth (value 1.00): High revisit depth = mid-flow rework. Trace which sub-step is being repeated; usually a data-readiness gap upstream.
Swivel Rate (value 0.00): Reduce cross-app hops by surfacing the secondary-app data inline (API integration or pre-fetched context panel).
Cross-App Time Share (value 0.00): Time outside primary app dominates this score. Identify which secondary app(s) and whether their function can be in-lined.
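One plausible way to derive a table like this: scale each vector to a penalty, weight it, and rank the weighted penalties. The weights and the revisit-depth normalisation below are assumptions for illustration, not the report's actual model.

    # Illustrative weights only; the report does not disclose the model's actual weighting.
    WEIGHTS = {
        "zero_edit_visit_rate": 0.40,
        "revisit_depth": 0.30,
        "swivel_rate": 0.15,
        "cross_app_time_share": 0.15,
    }

    def adherence_loss(vectors: dict[str, float]) -> dict[str, float]:
        # Normalise revisit depth so that a single pass (depth 1) contributes no penalty.
        norm = dict(vectors)
        norm["revisit_depth"] = max(vectors["revisit_depth"] - 1.0, 0.0)
        # Weighted penalty per vector; a meta score could then be 100 * (1 - sum of penalties).
        penalties = {name: WEIGHTS[name] * min(value, 1.0) for name, value in norm.items()}
        return dict(sorted(penalties.items(), key=lambda kv: kv[1], reverse=True))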
Lowest-Adherence Instances
The 5 task instances that scored lowest. These are the concrete cases worth investigating to understand the worst patterns.
Case      Analyst      Score   Duration   Sub-Steps   Swivel   X-App   Revisit
1757023   Analyst 05   82.5    0.0m       1           0.00     0.00    1
2335737   Analyst 05   82.5    0.1m       1           0.00     0.00    1
2335737   Analyst 05   82.5    0.0m       1           0.00     0.00    1
2087257   Analyst 09   82.5    0.0m       1           0.00     0.00    1
2087257   Analyst 09   82.5    0.0m       1           0.00     0.00    1
Users on This Task
2 analysts executed this task during the pilot. The scatter plots productivity (instances per active day) against adherence (meta score). Top-right = healthy high-volume analysts; bottom-right = firefighters; top-left = careful low-volume; bottom-left = struggling.
[Scatter plot: Adherence (meta score) vs Productivity (instances / active day); points plotted for Analyst 09 and Analyst 05]
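A sketch of the quadrant logic described above, using median splits as the cut-offs (the chart's actual thresholds are not stated):

    from statistics import median

    def classify_analysts(stats: dict[str, tuple[float, float]]) -> dict[str, str]:
        # stats maps analyst -> (instances per active day, meta score).
        prod_cut = median(p for p, _ in stats.values())
        adh_cut = median(a for _, a in stats.values())
        labels = {
            (True, True): "healthy high-volume",
            (True, False): "firefighter",        # high volume, low adherence
            (False, True): "careful low-volume",
            (False, False): "struggling",
        }
        return {
            name: labels[(prod >= prod_cut, adh >= adh_cut)]
            for name, (prod, adh) in stats.items()
        }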
User-by-User Breakdown
Click any analyst to see their dominant sequence vs the reference SoP.
Analyst      Instances   Inst / Day   Adherence   Weakest Vector     Swivel   X-App   Min
Analyst 09   51          4.6          82.5        Zero-Edit Visits   0.00     0.00    825
  Reference SoP: 1. View Combined Review
  Analyst 09's Dominant Sequence (51 of 51 instances): 1. View Combined Review
Analyst 05   4           2.0          82.5        Zero-Edit Visits   0.00     0.00    0
  Reference SoP: 1. View Combined Review
  Analyst 05's Dominant Sequence (4 of 4 instances): 1. View Combined Review
Avg Adherence Score (0-100): 82.5 (Median 82.5 · Range 82.5–82.5)
Avg Task Duration: 15.0 min (median 0.0 min)
Avg Sub-Steps per Instance: 1.0 (0.0% have multiple sub-steps)
Reference Path Coverage: 100.0% (instances following mined SoP exactly)
Reference Baseline: 100.0% of instances follow this exactly
Open Combined Review — Mined SoP
The most common sub-step sequence observed across 55 task instances. Use the dropdown on the right to compare any deviation to this baseline.
Step 1: View Combined Review (0.0s · Phobos · 100%)
No deviation variants observed for this task in the current cut — every instance follows the reference path.
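For reference, mining a baseline like this reduces to counting ordered sub-step sequences and measuring exact-match coverage; a minimal sketch with a hypothetical instance_steps input:

    from collections import Counter

    def mine_reference_path(instance_steps: list[list[str]]) -> tuple[tuple[str, ...], float]:
        # The dominant sub-step sequence and the share of instances that follow it exactly.
        sequences = Counter(tuple(steps) for steps in instance_steps)
        reference, hits = sequences.most_common(1)[0]
        return reference, hits / len(instance_steps)

    # For this task every instance is the single step "View Combined Review", so
    # mine_reference_path([["View Combined Review"]] * 55) returns 100% coverage.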
Avg Adherence Score (0-100): 82.4 (Median 82.5 · Range 80.0–82.5)
Avg Task Duration: 15.6 min (median 0.0 min)
Avg Sub-Steps per Instance: 1.0 (3.8% have multiple sub-steps)
Reference Path Coverage: 100.0% (instances following mined SoP exactly)
Reference Baseline: 100.0% of instances follow this exactly
Open Combined Review — Mined SoP
The most common sub-step sequence observed across 53 task instances. Use the dropdown on the right to compare any deviation to this baseline.
Step 1: View Combined Review (0.0s · Phobos · 100%)
No deviation variants observed for this task in the current cut — every instance follows the reference path.
This is the mined reference path — 53 instances across 2 analysts follow this exact sequence. Treat it as the proposed Standard Operating Procedure for Open Combined Review.
Application Mix
Which applications analysts touch while executing this task, measured by event count and active dwell time.
Application   Events   Event Share   Dwell Share
Phobos        55       100.0%        0.0%
On-Case Application Journey (±15 min around anchor)
Apps the same analyst touched on the same case within ±15 minutes of the task anchor. Captures work that spans task boundaries — the Veeva/Acrobat/Outlook/Word context surrounding the anchor click.
Case counts per app are not mutually exclusive — a case that touched Veeva AND Phobos in this window is counted in both rows. Overlap reflects real cross-app work, not double-counted cases.
Veeva Safety (DOM) · 219 events · 15 cases · 1.6 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick<span>"Complete"24×
pyzeClick<span>"Narrative"17×
pyzeClick<a>"Show more"17×
pyzeClick<span>"Submissions & Distributions"12×
pyzeClick<span>"Workflow Timeline"11×
pyzeClick<span>"Products"11×
pyzeClick<span>"Reference Numbers"10×
pyzeClick<span>"Study"10×
Phobos (DOM) · 175 events · 20 cases · 2.4 min
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
pyzeClick<a>"Cases"50×
pyzeClick<a>"Review"46×
pyzeClick<a>"Cases Coordinator"41×
pyzeClick<button>"Proceed"40×
pyzeClick<a>"Review Coordinator"40×
pyzeClick<button>"Send to process"38×
pyzeClick<a>"Inbox"37×
pyzeClick<a>"Inbox Coordinator"36×
Acrobat (App) · 7 events · 2 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
128 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
Microsoft Teams (App) · 5 events · 3 cases
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
55 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
ONENOTE (Document) · 3 events · 1 case
Document-level capture
Task Mining captures the specific file open during work — file names often encode case context.
MEDDRABROWSERWIN (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
43 granular events captured, but no DOM, document, or message-level metadata is available for this application. Instrumenting this app directly with Pyze would unlock element-level detail.
YoudaoDict (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
sync.merck.com (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
collaboration.merck.com (App) · 1 event · 1 case
App-level capture
Only application name and dwell time available. Direct instrumentation would unlock deeper detail.
gpteal.merck.com (DOM) · 1 event · 1 case
DOM-level capture
Pyze JS captures every element interaction — button labels, field IDs, section panels.
Adherence Vectors
Each task instance is scored on four behavioral vectors. Lower values are more adherent to the mined SoP.
Swivel Rate: 0.00 (events in non-primary app / total)
Zero-Edit Visit Rate: 1.00 (inspection-only clicks)
Avg Revisit Depth: 1.04 (max repeats of any activity)
Cross-App Time Share: 0.00 (time outside primary app)
Meta Score Distribution
Instance count in each 10-point adherence bucket. Right-skewed = healthy execution; long left tail = concentrated pain points.
Top Concern Vector
Zero-Edit Visit Rate — current value 1.00
Investigate why analysts inspect-and-leave without editing — usually missing upstream data or unclear UI affordances.
Vector Contribution to Adherence Loss
Each vector's weighted contribution to lost adherence. The top row is your highest-leverage fix.
Zero-Edit Visit Rate (value 1.00): Investigate why analysts inspect-and-leave without editing — usually missing upstream data or unclear UI affordances.
Revisit Depth (value 1.04): High revisit depth = mid-flow rework. Trace which sub-step is being repeated; usually a data-readiness gap upstream.
Swivel Rate (value 0.00): Reduce cross-app hops by surfacing the secondary-app data inline (API integration or pre-fetched context panel).
Cross-App Time Share (value 0.00): Time outside primary app dominates this score. Identify which secondary app(s) and whether their function can be in-lined.
Lowest-Adherence Instances
The 5 task instances that scored lowest. These are the concrete cases worth investigating to understand the worst patterns.
Case      Analyst      Score   Duration   Sub-Steps   Swivel   X-App   Revisit
2349786   Analyst 09   80.0    40.8m      2           0.00     0.00    2
2264672   Analyst 09   80.0    0.6m       2           0.00     0.00    2
2240449   Analyst 09   82.5    0.0m       1           0.00     0.00    1
2335737   Analyst 05   82.5    0.0m       1           0.00     0.00    1
2087257   Analyst 09   82.5    0.0m       1           0.00     0.00    1
Users on This Task
2 analysts executed this task during the pilot. The scatter plots productivity (instances per active day) against adherence (meta score). Top-right = healthy high-volume analysts; bottom-right = firefighters; top-left = careful low-volume; bottom-left = struggling.
[Scatter plot: Adherence (meta score) vs Productivity (instances / active day); points plotted for Analyst 05 and Analyst 09]
User-by-User Breakdown
Click any analyst to see their dominant sequence vs the reference SoP.
Analyst      Instances   Inst / Day   Adherence   Weakest Vector     Swivel   X-App   Min
Analyst 05   4           2.0          82.5        Zero-Edit Visits   0.00     0.00    0
  Reference SoP: 1. View Combined Review
  Analyst 05's Dominant Sequence (4 of 4 instances): 1. View Combined Review
Analyst 09   49          4.5          82.4        Zero-Edit Visits   0.00     0.00    825
  Reference SoP: 1. View Combined Review
  Analyst 09's Dominant Sequence (49 of 49 instances): 1. View Combined Review