Review process: Bringing structure and flexibility to experiment reviews

At Benchling, the leading life sciences R&D platform, I led the design of Review Process, the company’s first structured review system. Launched in 2023 and expanded in 2024, it introduced workflows and role-based review steps where none existed before. In its first year it unlocked $9.8 million, and it has since grown into a foundation for reviews across products.

The broader context

Benchling’s growth strategy and Review Process’s role

I led the design for Notebook, Benchling’s flagship product: a collaborative digital lab notebook where scientists plan experiments, capture data, and build the foundation of their IP. More than 90% of users rely on it daily, making it central to a $150 million ARR business line and tightly integrated across the platform. As Benchling expanded upmarket and into development and manufacturing, stronger compliance and auditability became essential. Review Process was a key step in that journey, giving Notebook the structure enterprises required while laying the groundwork for downstream expansion.

The problem

Broken reviews cost scientists time and Benchling revenue

From the start, leadership flagged that the existing review feature was too loose, and customers were asking for something more structured and compliant. It had also become Benchling’s top sales blocker, putting more than $15 million in deals at risk. Designing Review Process meant aligning the needs of scientists, admins, quality teams, and compliance/legal. Through interviews and workflow mapping, we uncovered pain points in the legacy system that both confirmed and challenged our assumptions. The challenge was not simply structure versus flexibility, but whether the system could provide both in the right ways.

Our research revealed two major breakdowns in the legacy system:

No structure to scale. There were no standardized workflows. Each project relied on one-off setups, and without multi-step approvals, teams had to manage scientific processes outside the system.

No flexibility to adapt. Reviews couldn’t reflect team norms, and ambiguous terms like “Approve” implied legal sign-off. There was no way to separate tasks such as peer review and PI review. And without an alternative way to close entries when formal sign-off wasn’t required, scientists often triggered unnecessary reviews just to move work forward, creating friction with compliance and legal.

The result: frustrated scientists, delayed IP filings, and lost enterprise deals.

The solution

Flexible in practice, structured by design

Based on the research insights, Review Process addressed the breakdowns on two fronts:

Structure that scales. Admins defined review processes in global settings so they could be reused across projects. Project owners assigned these processes up front, ensuring every entry followed the right structure. Progress was visible in the summary panel, and every step was traceable for compliance.

Flexibility to adapt. Admins could customize stage names, review actions, and completion states. Authors could select the right process and reviewers for their work, with self-review available when formal oversight wasn’t required. This made it easy to reflect team norms without compromising scientific integrity or compliance.

From the outset, I designed the framework with extensibility in mind. Today it powers Worksheets and Studies, with plans to expand into Templates and Schemas. The result: a platform-wide framework that made reviews flexible for science and structured for compliance.
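
To make the configuration model concrete, the sketch below shows one way such a system could be represented. It is a simplified illustration under my own assumptions; the type and field names are hypothetical, not Benchling’s actual data model.

```typescript
// Hypothetical sketch of the review configuration model; names are
// illustrative simplifications, not Benchling's schema.

// One step in a review process, with a customizable action label so that
// peer review ("Accept") reads differently from PI sign-off ("Approve").
interface ReviewStage {
  name: string;         // e.g. "Peer review", "PI review"
  reviewerRole: string; // who is expected to act at this stage
  actionLabel: string;  // e.g. "Accept", "Approve"
}

// An admin-defined, reusable process managed in global settings.
interface ReviewProcess {
  id: string;
  name: string;               // e.g. "Standard IP review"
  stages: ReviewStage[];      // multi-step approvals, completed in order
  completionStates: string[]; // customizable terminal states
  allowSelfReview: boolean;   // lightweight close-out when no sign-off is needed
}

// Project owners assign processes up front so every entry in the project
// follows the right structure, and authors pick from this allowed set.
interface ProjectReviewSettings {
  projectId: string;
  allowedProcessIds: string[];
  defaultProcessId?: string;
}
```

Separating the globally defined process from its project-level assignment is what lets admins standardize once while project owners and authors keep the flexibility described above.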

In review

Global settings

Admins set standardized review processes in global settings.

Project settings

Assigning review processes to projects ensures each entry has the right process and reviewers.

Send for review

Authors pick from the project’s pre-defined processes and assign reviewers.

Summary panel

The status badge shows progress at a glance, and the summary panel centralizes details and actions.

Self-review

Self-review gives scientists a lightweight way to wrap up entries without triggering a full review process.

Extensible framework

Designed for reuse, Review Process now powers features beyond Notebook, including Bioprocess Worksheets (shown here).

The impact

Driving adoption, satisfaction, and measurable results

By optimizing for both structure and flexibility, Review Process unlocked $9.8 million in its first year while meeting 93% of review requirements in enterprise software evaluations. Just as importantly, it received strong and consistent user feedback:

Before Review Process, we had to chase down who reviewed what. Now it’s all right there, so we don’t have to follow up constantly.

Startup customer

The flexibility has been a game-changer. We can keep things lightweight when we need to or add structure for high-stakes reviews.

Enterprise customer

When I see ‘Approve’ and my team sees ‘Accept,’ we all know our roles. It’s a small detail that’s made a big difference.

Enterprise customer

Future directions

Toward a smarter, faster review experience

Next, Review Process needs to become both faster and smarter. Two priorities stand out:

In-app notifications to keep entries from stalling. Since launch, completion rates have improved from 40% to 68% within 30 days of the last edit, but still miss the 75% goal. Timely reminders would reduce compliance risk and keep reviews moving.

Migration into Workflows to enable branching and conditional logic. A key need is “review by exception,” where only out-of-spec results go to QC. We shipped Review Process as a standalone system to unblock millions in sales because, at the time, Workflows lacked basics like custom stages and reassignment.
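
To illustrate the conditional logic that “review by exception” implies, here is a minimal sketch of routing only out-of-spec results to QC. The types and function are hypothetical, written only to show the branching a Workflows migration would need to support; they are not Benchling’s API.

```typescript
// Hypothetical illustration of "review by exception": only entries with
// out-of-spec results are routed to QC; in-spec work auto-completes.
interface Result {
  parameter: string; // e.g. "pH"
  value: number;
  specMin: number;
  specMax: number;
}

type ReviewRoute = "qc-review" | "auto-complete";

function routeEntry(results: Result[]): ReviewRoute {
  const outOfSpec = results.some(
    (r) => r.value < r.specMin || r.value > r.specMax
  );
  return outOfSpec ? "qc-review" : "auto-complete";
}

// A pH reading outside its spec range triggers QC review;
// an in-spec batch would return "auto-complete" instead.
routeEntry([{ parameter: "pH", value: 8.2, specMin: 6.5, specMax: 7.5 }]); // "qc-review"
```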

To build momentum, I prototyped a modern review flow and shared it with 14 field representatives and product partners during the “envision the future of Notebook” initiative, aligning them around a shared vision for scaling reviews across enterprise R&D.

This is really exciting! It gave us a clear picture of how reviews on Benchling could evolve. Our customers would absolutely love it!

Benchling Implementation Manager, after seeing an early iteration

Modern review flow

Yi-Ying Lin, 2025
