CALL member Erica Friesen, Research and Instruction Librarian & Online Learning Specialist at the Lederman Law Library at Queen's University in Kingston, Ontario, recently wrote an article titled "Thoughts on the New Lexis+ Brief Analysis Tool: For Law Students and Novice Researchers". It originally appeared on Slaw.ca on September 28, 2022.
It is republished here with permission of the author.
New academic year; new legal research tools. Something new always comes out right as another cohort of students is gearing up to begin their law degree. And, as with many new product launches these days, "artificial intelligence" is often prominently displayed in the accompanying materials. As legal publishers continue to launch AI-driven research tools in Canada, what do students and other novice researchers need to know to be prepared for their first forays into legal research?
Lexis recently launched the latest version of their legal research platform, Lexis+ Canada, for Canadian law schools. It features significant updates that employ artificial intelligence, including a “Brief Analysis” tool in line with similar products offered by Westlaw Edge (on the US platform) and vLex (already available in Canada). According to Lexis+, this tool helps researchers “expedite completion of briefs and other legal documents.” Once your document is uploaded, it spits out a report including the following categories: 1) recommended cases, 2) cases and legislation cited in your document, 3) jurisdiction, and 4) extracted and recommended concepts.
While I do not intend to comment on the utility of this tool for practitioners, I have no doubt that students beginning law school or working in summer or articling positions are bound to encounter this new feature sooner or later. When you do, you may have questions: should I be using this tool? If so, how? The landing page for the tool in Lexis+ offers no information on how it works, and the Lexis+ Canada "Help" database so far contains minimal documentation. With such opacity, confusion or uncertainty is understandable.
To gain some insight, let's run an experiment: take two case comments, written by different authors, on the same Supreme Court case from 2004. An older decision means that we should retrieve an abundance of recommendations from the intervening 18 years. Both comments include an overview of the area of law at the time of the decision as well as an analysis of how this particular decision would change that area of law. After Lexis analyzed these documents, I exported the resulting reports and compared the results for each comment.
Here are the two documents, in case you’d like to run the experiment yourself:
Comment A: Teresa Scassa, “Recalibrating Copyright Law?: A Comment on the Supreme Court of Canada’s Decision in CCH Canadian Limited et al. v. Law Society of Upper Canada”, Case Comment, (2004) 3:2 CJLT 89.
Comment B: WL Hayhurst, “The Canadian Supreme Court on Copyright: CCH Canadian Ltd. v. Law Society of Upper Canada”, Case Comment, (2004-2005) 41 Can Bus LJ 134.
Recommendations
"Recommendations" is the section of the Brief Analysis report that lists additional case law for you to consider adding to your legal document. Brief Analysis recommended 58 unique cases in total between Comment A and Comment B. However, only 10 of those cases were the same between the two documents (after adjusting for "recommended" cases that were actually cited in the other document). See Figure 1.
Key Takeaway: Legal research and writing is an art, not a science. AI-driven tools like these run on the material that you upload, so you can expect substantially different results depending on the quality of analysis and the decisions made in producing the original document. This does not mean that there is no value in using the tool, but you cannot expect it to help you build something from nothing. Recommendations are based on the content of the document, so if that content is not already strong, you may find yourself headed down the wrong path. Garbage in, garbage out, as they say.
Extracted Concepts
Lexis currently provides almost no information on what is meant by "extracted concepts" in Brief Analysis. But it sure looks cool to upload your document and instantly see a list of legal topics associated with your writing.
Here, too, there is substantial difference between the two documents. Comment A and Comment B had an overlap of 18 common extracted concepts. Comment A had an additional 22 unique extracted concepts while Comment B had an additional 21 unique extracted concepts. See Figure 2 for this data.
The utility of these concepts is highly variable. One illuminating instance: Brief Analysis believed that "et al" was a legal concept relevant to Comment B. Does this simply mean that the phrase "et al" appears many times in the document?
Key Takeaway: For novice researchers, this tool has limited utility if used in isolation. While an experienced practitioner may quickly be able to understand which extracted concepts are relevant, it is unlikely that a novice researcher will have the necessary context to make these determinations given how little transparency the tool itself provides.
For example, Brief Analysis extracted “strike” as a relevant concept for Comment B. While “strike” may indeed seem like a potentially important legal concept, it seems to have been picked up here because of an emphasis on “striking a balance” between the rights of authors and the rights of the public. Brief Analysis allows you to edit and reconfigure these concepts on your results page, but novice researchers will need to put in additional research outside of the tool to familiarize themselves with these concepts before they can properly make these assessments.
Cited in Your Document
This part of the report identifies caselaw and legislation already cited in your document. Presumably the intention is to warn you of potential negative history or treatment via the QuickCITE signals that the report pulls in as a visual indicator.
Brief Analysis was fairly accurate in identifying Canadian case citations in the two documents, but still missed a number of Federal Court and Federal Court of Appeal decisions, as well as decisions that were cited as part of a case history. Interestingly, only one Canadian legislation citation was correctly identified as legislation; others were identified as caselaw or not identified at all. Almost no foreign or international citations were correctly identified (legislation, case law, treaties, etc). See Figure 3 for a breakdown of how many citations were correctly identified versus missed.
Key Takeaway: Like any other research platform, AI-driven tools are limited based on the data that serves as their input. If Brief Analysis only runs on Lexis’ Canadian content, then it won’t pick up US or UK citations as part of its analysis. This is a limitation to be aware of in the context of its other features too. None of the recommended caselaw or legislation was foreign or international either, but I might not have noticed this if I hadn’t critically assessed this part of the report. Again, familiarity with an area of law is necessary to use the tool properly and know if it is relevant to go to sources beyond Canadian law for your particular research question.
Conclusion
New tools like Lexis' Brief Analysis may be aimed at experienced practitioners, but they are often rolled out first to law students and other less experienced legal researchers with minimal guidance or documentation on how to effectively use them. This experiment is intended to illustrate the potential pitfalls of a brief analysis tool like the one newly available on Lexis+ for law students. Only by using these tools with a critical eye can we truly get a sense for where such a product might fit in our legal research process.
-- Erica Friesen, Research and Instruction Librarian & Online Learning Specialist, Lederman Law Library, Queen's Law
CCH Canadian Ltd v Law Society of Upper Canada, 2004 SCC 13.