Analytics Leadership
The leadership craft a Cohere-style Lead DS is judged on — bar-setting, prioritization, narrative, mentoring, and the most underrated skill: killing work that shouldn't ship. Skim if HeyGen-only.
When this chapter applies
This chapter targets the Cohere Lead Data Scientist role, where the JD explicitly says: "manage a team of analysts and data scientists. Set the technical bar, mentor aggressively, and create an environment where exceptional people do their best work." Read this chapter cold if you're interviewing for that loop.
If you're targeting HeyGen (founding-DS, no reports), skim — the cross-functional sections still apply, but the management-of-reports content is out of scope.
If you haven't formally managed reports before, the strongest play is honesty: "I've led projects across teams but haven't been a direct manager. I'd want to be honest about that — the people-leadership craft is something I'd be learning on the job. What I bring is the technical bar-setting and the cross-functional fluency from leading complex analytical work." Then have one story per management dimension below — even if it's "as a tech lead, I did X."
Setting the technical bar
"Setting the bar" is leadership jargon. Concretely, a lead DS sets the bar by:
Reviewing analytical work in detail
Read the SQL. Run the notebook. Ask "what would change my mind?" — and accept that being the most senior reader of every analysis is part of the job for the first 6 months.
Defining what "done" looks like
A reusable rubric for analytical artifacts:
- Question stated up front in one sentence.
- Data sources and definitions explicit.
- Method choice defended (or simplest reasonable method used and noted).
- Uncertainty quantified (CIs, sensitivity).
- Recommendation, not just findings — and one "what would change my mind" sentence.
- Limitations called out by the author, not waiting to be found by a reviewer.
Establishing review rituals
Most leads pick one of: (a) weekly analytics review with all members presenting one piece of work, (b) async PR-style review on every major artifact, or (c) pair review on the most important work. Pick one, hold the line, iterate.
Prioritization across workstreams
The Cohere JD says you'll "define analytical priorities, allocate resources, and push initiatives from question to production." This is the hardest part of the job and the most likely failure mode.
The mistake new leads make
Saying yes to everything. Every PM, every exec, every engineering lead has an analytical question they want answered. If you take them all, the team becomes a ticket queue and stops producing strategic work.
A working framework
- Tier 1 (strategic): 60% of capacity. Tied to company OKRs. Long-horizon analytical bets — pricing tests, new-market sizing, churn-prediction model, etc.
- Tier 2 (operational): 30% of capacity. Recurring stakeholder needs that don't move quarter-to-quarter — board-prep refreshes, regional health metrics.
- Tier 3 (ad-hoc): 10% of capacity. One-off requests. Many go through a self-serve dashboard or get politely declined.
Defending this split with leadership early is the move that earns you the room to do strategic work later.
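The split can be made concrete as a simple budget calculation. A minimal sketch, where the function name and the example team size are illustrative assumptions, not from the source:

```python
# Illustrative sketch of the 60/30/10 capacity split described above.
# The team size and hours-per-week figures are hypothetical examples.

def tier_budgets(weekly_hours: float) -> dict:
    """Split a team's weekly analyst-hours across the three tiers."""
    split = {"strategic": 0.60, "operational": 0.30, "ad_hoc": 0.10}
    return {tier: round(weekly_hours * share, 1) for tier, share in split.items()}

# e.g. a 4-person team at ~35 focused hours each: 140 analyst-hours/week
budgets = tier_budgets(140)
print(budgets)  # {'strategic': 84.0, 'operational': 42.0, 'ad_hoc': 14.0}
```

The point of writing the numbers down is that the ad-hoc budget becomes visibly small — 14 hours a week for a four-person team — which makes "politely declined" a defensible default rather than a personal judgment call.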
Building narrative
Analytical work is only as good as the narrative that surrounds it. The Cohere JD: "distilling insights into a concise, actionable narrative."
The shape of a strong narrative
- The decision the work informs. One sentence.
- The recommendation. Pick one. Don't hedge with five options.
- The headline evidence: one or two charts that make the recommendation obvious to a smart non-expert.
- The "what would change my mind" sentence. Specific. ("If churn in segment X next month is above 6%, I'd revisit this.")
- The risks and what we don't know. Named honestly.
- The appendix: methods, sensitivity, full charts, code.
Junior analysts write decks where every chart is interesting. Senior leads cut to the two charts that drive the recommendation, and move everything else to the appendix. The skill of cutting is most of the craft.
Mentorship and growth
"Mentor aggressively, create an environment where exceptional people do their best work." Operationalize as:
1:1s that actually matter
Weekly. Their agenda first. Two questions you should ask in every one: "what's been frustrating?" and "what would you do if you had more capacity?" Both surface signal that doesn't come up otherwise.
Stretch assignments
Identify the next-level work each report needs to grow. For a strong IC, that often means leading a project end-to-end, including stakeholder management — which they'll be bad at the first time.
Feedback discipline
Positive feedback in public, fast and specific. Corrective feedback in private, fast and specific. Not "do better" but "in the X analysis, the assumption Z wasn't called out — that's the line we hold."
Career conversations
At least quarterly. What's the trajectory they want? Where's the gap between current work and that trajectory? What do the next six months prioritize?
Killing work
Underrated. The most senior thing a Lead DS can do is identify when a workstream isn't going to produce decision-grade output and kill it cleanly — including their own.
Signs to look for
- The decision the work was meant to inform has been made (or rendered moot) and we're still building.
- The data required for credible inference doesn't exist and isn't going to.
- Stakeholder priority has shifted; the artifact won't be consumed even if produced.
- The method's assumptions are too brittle to be honest about with the consumer.
How to kill cleanly
- Name what you've learned to date. Half-finished work usually has a finding.
- State why you're killing — specifically, not vaguely.
- Identify what would unblock if it appeared (better data, different stakeholder).
- Reallocate the team's time visibly to something leadership values.
The stories you need ready
Senior DS loops always have a behavioral round. Have a story ready for each of these prompts. STAR format (situation, task, action, result), 90 seconds each:
- A time you led a complex analytical project across teams.
- A time you made an analytical recommendation that was wrong, and what you did.
- A time you killed work that wasn't going to ship. (Underrated — most candidates don't have this story; having it is differentiating.)
- A time you mentored someone through a stretch they were initially bad at.
- A time you pushed back on a stakeholder. Specific stakeholder, specific push, specific outcome.
- A time the data wasn't what you expected and the recommendation changed.
- A time you simplified work that had become over-engineered.
Interview probes
Probe 1: "How do you set the technical bar for a team?"
Three pieces: (1) a reusable rubric for what 'done' means — question stated, methods defended, uncertainty quantified, recommendation explicit, limitations self-called; (2) a review ritual that holds the line — async PR-style review on major work, weekly analytics review on the rest; (3) being the most senior reader on every important piece for the first 6 months, even when it's slow. The bar only sticks if the lead enforces it consistently in the first quarter; after that it becomes culture.
Probe 2: "How do you prioritize when every team is asking for help?"
Defend a capacity split — 60% strategic OKR work, 30% operational recurring, 10% ad-hoc — with leadership early. Use a self-serve dashboard layer for the ad-hoc bucket so PMs can answer their own questions. For everything else, force the requester to name the decision the analysis would inform. Half the requests evaporate; the other half become real projects.
Probe 3: "Tell me about a time you killed work."
Have one ready. Format: 'We were six weeks into a churn-prediction model for segment X. As we got into validation, it became clear segment X was so heterogeneous internally that a single model wouldn't beat heuristics. Rather than push through, I killed it, documented what we'd learned about the segment's structure, and reallocated the team to an opportunity-sizing analysis that the GTM org needed before quarter-end. The hard part was the conversation with the original requester — I led with what we'd learned about their problem, not with the kill, and proposed a smaller follow-up that could be done in two weeks instead of twelve.' Anything in this shape works; the point is the kill, the learning, and the cleaner reallocation.
Probe 4: "What's your management philosophy?"
Don't recite generic principles. Pick three specific commitments and explain why. Example: '(1) 1:1s are the report's agenda, weekly, never canceled — because trust compounds and the easy time to skip is the time skipping does the most damage; (2) feedback is given within 48 hours, specific to the artifact, never vague — because vague feedback feels like character judgment and specific feedback feels like teaching; (3) I aim to give every report one stretch project per half — because the alternative is people staying in their comfort zone and stalling.'
Probe 5: "How do you handle a senior IC who's technically excellent but doesn't communicate well?"
Name the gap specifically, and frame it as growth into staff-level scope rather than a deficit. 'You're producing strong work, and to move from senior to staff, the differentiator isn't technical — it's stakeholder communication and the ability to influence decisions outside the data team. Here are two concrete commitments for this half: present at the monthly product review with leadership; rewrite one analysis per month as a one-pager that a VP could act on.' Pair with explicit support — help them prep the first presentation, edit the first one-pager.