Empirical Legal Research in Singapore: Its Uses, Its Limitations, and the Way Forward
Darren Ang*
I. Introduction
A. Lawyers against scientists
Let us begin with a simple proposition: the methods used to reach conclusions in law are inherently different from those used in the hard sciences.[1]
We can explain this with a quick thought experiment. Ask a lawyer to prove a point, and they would probably point to an authoritative source that is broadly related to the conclusion sought to be reached, then hammer in their conclusion with arguments from principle and logic (and sometimes, rhetoric).
In contrast, ask a scientist to prove a point, and their approach is quite different. They would probably make a hypothesis, then run experiments to collect data before analysing that data using statistical methods. Their conclusions are reached through careful observation, and are accompanied by quantified degrees of confidence in their results.
If these contrasting methodologies had to be summarised in one word each, it could be said that methods in law are “argumentative”, while those in the hard sciences are “empirical”. More broadly, it could also be said that law is “qualitative” while the hard sciences are “quantitative”.
B. The landscape of empirical legal research
However, over the past half-century, empirical methodologies have increasingly found their way into legal scholarship, particularly in the United States.[2] Entire textbooks have been written on the subject,[3] and legal studies with empirical components have come to take on bolder inquiries—including assessing judges’ behaviour and explaining their individual writing styles.[4]
In Singapore, the empirical charge has just begun to take hold. Over the past decade, at least six studies have employed some form of empirical methodology, and all of them have relied on published judicial decisions as their source of quantitative data.[5]
There is, in fact, a well-developed discipline in communications research that neatly encapsulates methodologies which “proceed from text to results”: its name is “content analysis”.[6] However, among the six empirical legal studies in Singapore, only the study by Lo et al has expressly acknowledged that it was adopting a content analysis methodology.[7]
In light of this recent charge towards empirical legal research, this article seeks to shed some light on the nature of content analysis methodologies, their limitations in the context of analysing published judicial decisions, and some potential workarounds to those limitations. It concludes with a brief suggestion that, in such contexts, the “empirical” methodologies of content analysis achieve their objectives best when paired with conventional, “argumentative” legal analysis.
II. What is content analysis, and why is it appropriate for legal research?
A. Content analysis and its methodology
Content analysis is a research technique that seeks to make replicable and valid inferences from texts to the contexts of their use.[8] That is, it seeks to draw meaningful conclusions through analysing large bodies of text quantitatively, such that future researchers applying the same methodology would reach the same conclusions.
The methodology of content analysis is robust—in a leading text on the discipline, Krippendorff identified six “components” of content analysis, which are as follows:[9]
1) Unitizing: distinguishing segments of text which are of interest to an analysis;
2) Sampling: limiting observations to a manageable subset of units that is statistically or conceptually representative of the set of all possible units;
3) Recording/Coding: interpreting the unitized data and stating one’s experiences either in the formal terms of an analysis (recording) or according to observer-independent rules (coding);
4) Reducing: using established statistical techniques or other methods for summarising or simplifying data;
5) Abductively inferring contextual phenomena: bridging the gap between texts and what the texts imply using analytical constructs;
6) Narrating: making the results comprehensible to others.
While a full exposition of each of these six components is out of the scope of this article, some discussion of the component of “abductive inference” is apposite. This component is said to “distinguish content analysis from other modes of inquiry”,[10] and it materialises as “analytical constructs” which function as “the best hypothesis or explanation that the analyst can imagine or defend … backed by knowledge of the context of the analysed texts”.[11]
The distinctive element of “abductive inference” makes content analysis methodologies particularly appropriate for the analysis of published judicial decisions—in this context, the “analytical constructs” can take the form of legal principles derived from conventional legal analysis, and these can be employed to justify quantitative findings.[12] To illustrate this with an example from an upcoming empirical legal study involving the author, a quantitative finding that a traffic offender’s plea of guilt is given mitigatory weight about 80% of the time may be explained with an argument from the sentencing objective of specific deterrence.
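To make the arithmetic behind such a finding concrete, the sketch below shows, in Python, how the “recording/coding” and “reducing” components might produce a proportion of this kind. The coded records, field names, and figures are entirely hypothetical and are not drawn from the study mentioned above.

```python
# Hypothetical coded data: each record represents one published sentencing
# decision, coded (by a human reader) for whether a plea of guilt was raised
# and whether the court treated it as mitigating.
coded_cases = [
    {"case_id": "TC-001", "plea_raised": True,  "treated_as_mitigating": True},
    {"case_id": "TC-002", "plea_raised": True,  "treated_as_mitigating": False},
    {"case_id": "TC-003", "plea_raised": True,  "treated_as_mitigating": True},
    {"case_id": "TC-004", "plea_raised": False, "treated_as_mitigating": False},
    {"case_id": "TC-005", "plea_raised": True,  "treated_as_mitigating": True},
]

# "Reducing": summarise the coded units into a single quantitative finding.
relevant = [c for c in coded_cases if c["plea_raised"]]
mitigated = [c for c in relevant if c["treated_as_mitigating"]]
proportion = len(mitigated) / len(relevant)

# The "abductive inference" step happens outside the code: the analyst
# explains the observed proportion (here 3 of 4, i.e. 75%) by reference to a
# legal analytical construct, such as a relevant sentencing objective.
print(f"Plea of guilt treated as mitigating in {proportion:.0%} of relevant cases")
```

The abductive step, that is, explaining why the proportion looks the way it does, remains a matter of legal argument rather than computation.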
B. The history and development of content analysis in legal scholarship
The earliest examples of content analysis are found in the Church’s quantitative analyses of printed matter in the 17th century, from which the Church concluded that the printing of non-religious materials was a threat to its authority. This work went on to inspire a significant 20th-century movement in which various researchers quantitatively analysed newspapers and propaganda publications in an attempt to uncover, among other things, the profit motives behind newspapers and their negative effects on society.[13] Around that time, the seeds of the content analysis movement in legal scholarship were sown.
It has been said that “[t]he epistemological roots of content analysis [in legal scholarship] lie in Legal Realism”.[14] Legal Realism was a movement within the American legal academic circle that first gained traction in the 1920s,[15] and while the Realists departed from each other at various points, they shared a common scepticism towards conventional legal theories and zeal for reform.[16] Among them, a significant faction of the Realists sought to predict judges’ decisions with some degree of certainty,[17] and the empirical analysis of recorded judicial opinions was a particularly appropriate means towards that end.[18]
While the full extent of Realist thought has since lost most of its force,[19] the Realists’ clarion call to empiricism survived:[20] following the explosion of quantitative studies involving the content analysis of published judicial decisions in the United States in the 1990s-2000s,[21] the systematic content analysis of published judicial decisions is now “a mainstay of legal and political science scholarship”.[22]
C. The place of content analysis in modern legal scholarship
What, then, is the place of content analysis in modern legal scholarship? It is said that content analysis “trades the pretence of ontological certainty for a more provisional understanding of case law”.[23] That is, conventional legal analysis requires the subjective, “deeply reflective” interpretation of a narrower area of the law,[24] while content analysis reaches an objective, “thinner” understanding of a large number of decisions.[25] They are different tools within the toolbox of legal analysis.
It follows that the role of content analysis in legal scholarship is not to supersede conventional legal analysis; instead, its role is to complement and augment conventional analysis.[26] For example, while conventional legal analysis is best suited for landmark judgments with great legal and cultural significance,[27] content analysis is particularly useful for “proving a negative”: if Principle Y states that Factor X will not be given weight except in exceptional cases, that principle can only be tested by looking through a sample of cases where Factor X was brought to the court’s attention and finding that Factor X was given no weight in almost all of the sampled cases. Both tools can thus be used in tandem to reach more robust conclusions.[28] Continuing the above example, if the content analysis reveals that Factor X is actually given weight in a significant proportion of cases, this would strongly support an argument for Principle Y to be reformed (or repealed).
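As a purely illustrative sketch of how such a “proving a negative” exercise might be quantified, the Python snippet below compares an observed rate against a hypothetical reading of “exceptional cases” as meaning no more than 5% of cases. All figures, and that 5% threshold, are invented assumptions rather than findings from any of the studies cited here.

```python
import math

# Hypothetical coded sample: for every sampled case in which Factor X was
# brought to the court's attention, record whether it was actually given
# weight. All figures below are invented for illustration.
n_sampled = 120        # sampled cases where Factor X was raised
n_given_weight = 14    # of those, cases where Factor X was given weight

# Assume, purely for this sketch, that "exceptional cases" is read as meaning
# Factor X should be given weight in at most 5% of cases.
claimed_rate = 0.05

def binom_upper_tail(k: int, n: int, p: float) -> float:
    """Exact probability of observing k or more successes out of n at rate p."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

observed_rate = n_given_weight / n_sampled
p_value = binom_upper_tail(n_given_weight, n_sampled, claimed_rate)

print(f"Observed rate: {observed_rate:.1%}")  # 11.7% on these invented numbers
print(f"Probability of a result this extreme if the true rate were 5%: {p_value:.4f}")
```

On these invented numbers, a small tail probability would suggest that the sampled pattern is hard to reconcile with the stated principle, though any such inference remains subject to the limitations discussed in Part III below.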
However, there exists a more compelling reason to use content analysis in tandem with conventional legal analysis—there are several limitations inherent in published judicial decisions which, in most cases, render content analysis methodologies incapable of reaching robust conclusions in and of themselves.
III. Limitations and potential workarounds
A. The problem of unpublished decisions
Not every dispute goes to court, and those that do are often resolved without written or published opinions. In Singapore, the existence of unpublished decisions is well known:[29] judges generally do not owe a duty to issue written grounds of decision, and the most common situation in which such a duty arises, in both civil and criminal cases, is when a notice of appeal is filed.[30]
From these circumstances alone, it could be assumed that published decisions are likely to involve the more contentious cases, while straightforward cases are more likely to go unpublished.[31] This forms an insurmountable hurdle for aspiring researchers at the “sampling” component of content analysis, as the sample of published judicial decisions would never be representative of the whole population of interest of an empirical legal study.[32]
One workaround suggested by Hall & Wright is to acknowledge this hurdle and explicitly limit the scope of the study to published judicial decisions.[33] For example, in the study on the development of Singapore law by Goh & Tan, the authors limited their sample to reported cases, justifying this by arguing that reported cases “perhaps provide more significant influence on our local jurisprudence”.[34] In the author’s view, Goh & Tan’s argument sufficiently addresses the problem of unreported cases while also providing a positive justification for their sample.
However, even in studies where limiting the sample to published or reported cases cannot be similarly justified, it is said that a “skewed view” is better than having no view on the matter, and published decisions, as one of the significant sources of law for lawyers in the common law tradition, remain a “highly valuable source for systematic study”.[35] These limitations only mean that researchers must be “less expansive … in drawing conclusions from their findings”.[36] For example, while empirical studies measuring the effects of extra-legal factors on appellate decision-making in the United States have been subject to harsh attacks on their accuracy, more nuanced empirical legal studies that acknowledge the limitations of their methodologies and employ more sophisticated techniques have been received more kindly.[37]
B. The problem with analysing causative relationships between facts and decisions
In addition to the problem of unpublished decisions, Hall & Wright argue that a “circularity problem” arises when content analysis is employed to find causative relationships between legally relevant factors and judicial opinions, as the written facts and opinions may not fully capture the “real world facts” or the entirety of the case process.[38] This raises an issue at the “abductive inference” component of content analysis—abductive inference contemplates finding the best explanation to a particular set of facts, but if the facts themselves are incomplete, any inferences made from them will be similarly imperfect.
Unfortunately, to the author’s knowledge, no workaround is available to deal with this problem, and various empirical legal studies have faced harsh attacks on their validity for failing to take it into account.[39] While some empirical legal researchers have resorted to gathering data by physically attending court hearings,[40] and those studies have a stronger claim to validity, no empirical study could possibly account for the closed-door and confidential nature of judicial decision-making.[41] A similar attitude must therefore be adopted as with unpublished decisions: the problem must be acknowledged, and the conclusions sought to be reached must be restricted accordingly.
To that end, it is suggested that any empirical legal study that seeks to find the “weight” or “significance” attached to factors considered in judicial decisions may overreach the boundaries of content analysis, since “weight” or “significance” is a qualitative inquiry best suited for conventional legal analysis. Most of the empirical legal studies in Singapore appear to have recognised this: they have generally involved the counting of factors without any evaluation of causative significance, coupled with qualitative analyses of the findings using more conventional techniques of legal analysis.[42] In the author’s view, this combination strikes the best balance between depth and objectivity of understanding.[43]
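The “counting of factors without any evaluation of causative significance” described above can be sketched very simply. The factor labels and coded records in the following Python snippet are hypothetical and serve only to illustrate the form such a tally might take.

```python
from collections import Counter

# Hypothetical coding sheet: each entry lists the mitigating factors that a
# coder identified as having been raised in one published decision. No
# attempt is made to score the "weight" the court attached to each factor.
coded_factors = [
    ["plea_of_guilt", "cooperation_with_authorities"],
    ["plea_of_guilt"],
    ["ill_health", "plea_of_guilt"],
    ["cooperation_with_authorities"],
]

# "Reducing": a simple frequency tally of how often each factor was raised.
tally = Counter(factor for case in coded_factors for factor in case)

for factor, count in tally.most_common():
    print(f"{factor}: raised in {count} of {len(coded_factors)} coded decisions")
```

The qualitative evaluation of what such a tally means is then left to conventional legal analysis.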
IV. Conclusion
This article has set out the basic methodology and historical development of content analysis as a methodology in empirical legal scholarship, as well as its uses, limitations, and potential workarounds to those limitations. In summary, it has been argued that in the context of analysing published judicial decisions, content analysis methodologies achieve their objectives best when paired with conventional legal analysis, and while the problems with unpublished decisions and analysing causative relationships can be mitigated to some degree by such a pairing, they should be explicitly dealt with (or simply acknowledged) where they arise.
The empirical legal research charge has already begun—it should be welcomed with open arms. It is hoped that aspiring empirical legal researchers remain cognisant of the unique capabilities and limitations of their methodologies, and that the introduction of empirical methodologies into legal scholarship in Singapore will be the catalyst for fruitful discussions and developments in the law.
* LL.B. (Hons.) Candidate, National University of Singapore. The author wishes to express his most heartfelt gratitude to Mr. Benny Tan (Sheridan Fellow, National University of Singapore) for leading the empirical legal research charge within the Singapore Law Review, and for the many illuminating discussions about empirical methodologies in law, from which the seeds of a forthcoming empirical legal research paper and this companion piece were sown.
[1] For more variants on this illustration and the inspiration for this section, see Robert Lawless et al, Empirical Methods in Law, 2nd ed (Alphen aan den Rijn: Wolters Kluwer, 2016) at 7-20.
[2] Mark A Hall & Ronald F Wright, “Systematic Content Analysis of Judicial Opinions” (2008) 96:1 Cal L Rev 63 at 72.
[3] See eg, Lawless et al, supra note 1.
[4] James C Phillips & Edward L Carter, “Oral Argument in the Early Roberts Court: A Qualitative and Quantitative Analysis of Individual Justice Behaviour” (2010) 11:2 J App Pr & Pro 325; Keith Carlson et al, “A Quantitative Analysis of Writing Style on the U.S. Supreme Court” (2016) 93:6 Wash U L Rev 1461.
[5] These are: Goh Yihan & Paul Tan, “An Empirical Study on the Development of Singapore Law” (2011) 23 SAcLJ 176; Lee Zhe Xu et al, “The Use of Academic Scholarship in Singapore Supreme Court Judgments” (2015) 33 Sing L Rev 25; Cheah W L & Goh Yihan, “An Empirical Study on the Singapore Court of Appeal’s Citation of Academic Works: Reflections on the Relationship Between Singapore’s Judiciary and Academia” (2017) 29 SAcLJ 75; Agnes Lo et al, “The Evaluation of Medical Expert Opinions in Litigation: An Empirical Study” (2018-2019) 36 Sing L Rev 247; Jerrold Soh, “A Network Analysis of the Singapore Court of Appeal’s Citations of Precedent” (2019) 31 SAcLJ 246; and Professor Gary Chan’s monograph: Gary Chan Kok Yew, Tort of defamation before the Singapore Courts, 1965-2015: A comparative and empirical study (Singapore: Academy Publishing, 2017).
[6] Klaus Krippendorff, Content analysis: an introduction to its methodology, 2nd ed (California: Sage Publications, Inc., 2004) at 83.
[7] Agnes Lo et al, “The Evaluation of Medical Expert Opinions in Litigation: An Empirical Study” (2018-2019) 36 Sing L Rev 247 at 254, 255.
[8] Krippendorff, supra note 6 at 18.
[9] Ibid at 83-85, 126, 171. Also note that these components do not have to be organised linearly, and a content analysis design can contain iterative loops: see ibid at 85.
[10] Ibid.
[11] Ibid at 171.
[12] See ibid at 90 for a research design framework that seeks to “operationalise expert knowledge”; this directly supports the use of legal principles as analytical constructs.
[13] Ibid at 3-6.
[14] Hall & Wright, supra note 2 at 76.
[15] Michael Freeman, Lloyd’s Introduction to Jurisprudence, 9th ed (London: Sweet & Maxwell, 2014) at 845.
[16] Ibid.
[17] Brian Z Tamanaha, Law as a Means to an End: Threat to the Rule of Law (Cambridge: Cambridge University Press, 2006) at 70.
[18] Karl Llewellyn famously read thousands of cases randomly selected from various American appellate courts to determine the factors which would influence a judge’s decision. See Karl N Llewellyn, The Common Law Tradition: Deciding Appeals (Boston: Little, Brown & Co, 1960).
[19] Tamanaha, supra note 17 at 1, 72.
[20] Hall & Wright, supra note 2 at 76.
[21] See Table 1 in ibid at 72.
[22] Carlson et al, supra note 4 at 1466.
[23] Hall & Wright, supra note 2 at 87.
[24] Ibid.
[25] Ibid at 78.
[26] Ibid at 88. See also the concept of “triangulation” in the social sciences, in ibid at 83.
[27] Ibid at 84, citing the well-known case of Roe v Wade 410 U.S. 113 (1973).
[28] Ibid at 81.
[29] For a readily observable example, it appears that the Singapore courts have added the word “unreported” in brackets to indicate that a case cited as authority is an unpublished decision.
[30] For the position in the civil law, see Rules of Court (Cap 322, s 80, 2014 Rev Ed Sing), O 42 r 8(1). For the position in the criminal law, see Criminal Procedure Code (Cap 68, 2012 Rev Ed Sing) at ss 377(5) and (7). See also ss 394A-B and 397(3A) of the Criminal Procedure Code for more exceptional situations where a written grounds of decision must be issued.
[31] This assumption is backed by some literature. Judge Edwards wrote that “any assessment of the work of the courts of appeals that does not include unpublished decisions cannot be seen as complete”, and that according to official statistics, less than 17 percent of all opinions in courts of appeals were published. See Harry T Edwards & Michael A Livermore, “Pitfalls of Empirical Studies that Attempt to Understand the Factors Affecting Appellate Decisionmaking” (2009) 58:8 Duke LJ 1895 at 1923. In Singapore, this is likely to be the case as well: as an illustration, the State Courts heard 303,487 criminal cases in the year 2018 alone, while searching the term “Public Prosecutor” in LawNet yields 15,482 results across all years. See “One Judiciary Annual Report 2018”, Supreme Court Singapore, online: <https://www.supremecourt.gov.sg/docs/default-source/default-document-library/ojar_full-8.pdf>.
[32] In Singapore, there is at least one channel for researchers to access court archives: the Empirical Judicial Research Programme. However, there appear to be no channels for application to this programme that are available to the general public. See “About the Empirical Judicial Research Programme”, Singapore Judicial College, online: <https://www.supremecourt.gov.sg/sjc/empirical-judicial-research>.
[33] Hall & Wright, supra note 2 at 92.
[34] Goh Yihan & Paul Tan, “An Empirical Study on the Development of Singapore Law” (2011) 23 SAcLJ 176 at para 43.
[35] Hall & Wright, supra note 2 at 92.
[36] Harry T Edwards & Michael A Livermore, “Pitfalls of Empirical Studies that Attempt to Understand the Factors Affecting Appellate Decisionmaking” (2009) 58:8 Duke LJ 1895 at 1907.
[37] Ibid at 1904-1905.
[38] Hall & Wright, supra note 2 at 95.
[39] See Edwards & Livermore, supra note 36 at 1930-1944.
[40] See eg, Jessica Jacobson & Mike Hough, “Personal Mitigation: An Empirical Analysis in England and Wales” in Julian V Roberts, ed, Mitigation and Aggravation at Sentencing (Cambridge: Cambridge University Press, 2011) at 146-167.
[41] Edwards & Livermore, supra note 36 at 1903.
[42] A notable exception is Lo et al, supra note 7, which the author is immensely grateful to have been involved in: the methodology of that study involved various coders identifying certain factors and assigning “scores” from 1 to 5 to them based on their “significance”. While a critical examination of the methodology used is beyond the scope of this article, it suffices to say that this is a novel methodology which warrants much closer examination. See also Jerrold Soh, “A Network Analysis of the Singapore Court of Appeal’s Citations of Precedent” (2019) 31 SAcLJ 246, the first empirical legal study to adopt network analysis in Singapore.
[43] Hall & Wright, supra note 2 at 88.