Improving Expert Testimony

Andrew Baker

Expert testimony plays a crucial role in modern litigation, but the complexity of legal disputes, particularly in commercial litigation, demands that judges engage deeply with technical methodologies, an expectation that strains judicial capacity and resources. In a new article in the Berkeley Business Law Journal, Professor Andrew C. Baker proposes reforms that align evidentiary standards with judicial goals to create more transparent, consistent, and equitable outcomes, strengthen the judiciary’s gatekeeping role, and foster greater trust in the legal process.

Building on Baker’s previous co-authored work on securities litigation and valuation, “Statistical Learning Can Help the Judiciary Fulfill Its Gatekeeping Role Over Expert Witnesses” makes the case for using statistical learning techniques in expert testimony. Expert testimony is increasingly pivotal in commercial litigation, and it often relies on techniques well beyond the grasp of a typical judge or jury.

“Fields such as antitrust, securities litigation, and employment discrimination increasingly rely on expert analyses to bridge the gap between legal principles and technical realities,” Baker writes. “However, this reliance has amplified concerns about the reliability of expert evidence.”

The standard for admitting expert evidence, established in Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993), requires rigorous scrutiny but places a significant burden on judges. Baker proposes integrating statistical learning and algorithmic modeling to create a more objective and reliable blueprint for expert analysis.

“This approach not only enhances the credibility of expert testimony but also provides judges with a more transparent and administrable tool for evaluating complex evidence,” he writes.