“Which journals are you interested in?”
It’s a question on 1Ls’ lips in recent weeks (when we’re not talking about class or where to get drinks after class). It’s also a question that presumes everyone is, or should be, involved in at least one journal. Journals here seem to have grown to include three parts: affinity groups, blogs, and the academic journal itself. I want to focus on the last element, which still seems to be the core: the “journal” part of a journal.
Is there really any reason for a 1L to get involved in producing a journal, especially at the subciting level? Many of the explanations I hear (e.g., “They’re a good place to meet people outside of your section”) apply to the “affinity group” part of a journal far better than the “subciting” part. Others beg the question (e.g., “You could be promoted to a Line Editor so you can supervise subciters!” or “You’ll learn to Bluebook perfectly!”). But by far the most frequent line I hear is that working on a journal is somehow mandatory for your résumé.
I see two reasons employers might care about a 1L subciting for a journal: (1) it proves commitment to a subject area, and (2) it proves commitment to jumping through any and all hoops put in front of you. Working for an employer that selects on the latter criterion seems like a recipe for utter misery, and surely there are ways of proving interest that might be more useful to the world in general.
At this point in my writing, I was concerned I was being too harsh on journals, so I decided to do some research to find out who they’re useful to.
They’re certainly useful to professors. Promotion and tenure require publishing in law journals. But law is exceptional in academic scholarship: not only are the vast majority of law journals edited and produced by students (including all of the HLS journals), but the journals are rarely peer reviewed (none at HLS are).
The consequences of this are well known enough to make the pages of the New York Times. One author looked at 25,000 law review articles and found clear bias in article selection arising from the publication process. He writes:
Examining published articles in law reviews — the dominant venue for scholarship — and subsequent citations to these articles, we find that, with few exceptions, law reviews publish more articles from faculty at their own institution than from faculty at other law schools. Law review publications of their own faculty are cited less frequently than publications of outside faculty. This disparity is more pronounced among higher-ranked law reviews, but occurs across the entire distribution of journals. We correspondingly find that law faculty publish their lesser-cited articles in their own law review relative to their articles published in other law reviews. These findings suggest that legal scholarship, in contrast to other academic disciplines, exhibits bias in article selection at the expense of lower quality. (From http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2336775.)
Despite this shortcoming, law review articles could be valuable to judges, to advocates, or simply as a way of pushing forward humanity’s understanding of the world. Unfortunately, as the Times article describes, the influence of law reviews on actual judgments is declining. Even law reviews’ influence within academia is debatable: Smith (2007) found that “43% of [articles in Shepard’s LAWREV database] have not been cited at all, and about 79% get ten or fewer citations”. In itself, that’s not surprising: the web of citations is a scale-free network, like the Internet or Wikipedia, in which a few papers receive the bulk of the citations. An image from Smith’s paper illustrates the distribution.
Smith compares this distribution to the High Energy Physics literature (citing Lehmann et al., 2003), where 29% of papers have never been cited and 74% have fewer than eleven citations. So perhaps law journals are not significantly worse than the rest of academia.
This data covers the whole universe of law reviews, but 1Ls here can only choose from journals at HLS. Among law reviews, the Harvard Law Review remains elite: ranked first by Google Scholar Metrics, second to the Stanford Law Review in the Washington & Lee law journal rankings, second to the Columbia Law Review by Eigenfactor, and seventh by Article Influence. These last two rankings come from Eigenfactor.org (published by the University of Washington), which uses a PageRank-like algorithm: the Eigenfactor measures the value of the journal as a whole, and Article Influence estimates the expected value of an individual article. Of course, 1Ls can’t work on the Harvard Law Review.
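The PageRank-like idea behind Eigenfactor can be sketched in a few lines: a journal scores highly when it is cited by journals that themselves score highly, with each citer’s influence split proportionally across everything it cites. The citation matrix below is made up purely for illustration, and the real Eigenfactor calculation is more involved (it excludes self-citations and uses a five-year citation window, among other refinements), so treat this as a sketch of the principle, not the published method:

```python
def influence_scores(citations, damping=0.85, iters=100):
    """Power-iteration sketch of a PageRank-style journal ranking.

    citations[i][j] = number of citations from journal i to journal j.
    Returns one score per journal; scores sum to 1.
    """
    n = len(citations)
    outgoing = [sum(row) for row in citations]  # total citations each journal gives
    scores = [1.0 / n] * n                      # start from a uniform distribution
    for _ in range(iters):
        new = []
        for j in range(n):
            # Each citing journal i passes score to j in proportion to
            # the share of i's citations that point at j.
            inflow = sum(
                scores[i] * citations[i][j] / outgoing[i]
                for i in range(n)
                if outgoing[i]  # skip journals that cite nothing
            )
            new.append((1 - damping) / n + damping * inflow)
        scores = new
    return scores

# Toy three-journal example: journal 2 receives the largest share of
# both other journals' citations, so it ends up with the top score.
toy = [
    [0, 1, 3],
    [2, 0, 4],
    [1, 1, 0],
]
print(influence_scores(toy))
```

The function name, damping value, and matrix here are all assumptions for the sketch; the point is only that the ranking emerges from the whole web of citations at once, rather than from raw citation counts.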
What about the rest of them? If your work scales per article, or you’ll be working on particular articles (e.g., subciting or article editing), then the importance of your work is probably best measured by Washington & Lee’s Impact Factor or Eigenfactor.org’s Article Influence, despite the latter’s small dataset. The Harvard Environmental Law Review (HELR) and Harvard International Law Journal (HILJ) compare well on both metrics. Based on Impact Factor, online-only and newer journals appear to have less impact than better-established ones. Surprisingly, JOLT, HILJ, the Harvard Journal on Legislation, and HELR are all ranked above the Harvard Law Review itself; a colleague of mine suggested this was due to the “Notes” in the Review.
Here are the results:
Article Influence and Impact Factor have the advantage of integrating the entire web of citations, and they’re useful for distinguishing between journals, but it’s difficult to tell whether an Article Influence of 0.8 means many people will read an article you work on. To get a better sense, I used Google Scholar to count the citations for each article published since 2009 in each of the Harvard law journals.
Only in JOLT and HILJ are more than half of the articles cited at least ten times. If we ignore for a moment the delay between publication and citation (an admittedly dubious move), we can put it another way: if you work on an article in any other journal, that article will more likely than not be cited fewer than ten times. Moreover, according to this data, if you work on an article in the Latino Law Review or Unbound, it’s more likely than not that no one will cite it at all. More than a quarter of articles in the Business Law Review, the Journal of Racial and Ethnic Justice, and the National Security Journal are also likely never to be cited.
If you’re working on the production side and the amount of work you do doesn’t vary with the number of articles (e.g., Marketing Director), then you’ll probably care more about the impact of the journal as a whole. This is best measured by the total number of times the journal is cited in other journals and in cases, or, more abstractly, by the journal’s Eigenfactor. Both metrics show the Harvard Law Review is the most important (from 2006 to 2013, Review articles were cited 6,799 times in journals and 176 times in cases), followed distantly by the Journal of Law and Public Policy (1,779/59). The Negotiation Law Review (462/8) and Human Rights Journal (448/2) sit around the middle of the HLS pack.
The full results are presented below (source data from the Washington & Lee rankings and Eigenfactor.org, respectively). Note that the scale in the first graph is logarithmic.
What does all this tell you? Maybe it tells you which journals you want to focus your energies on. Maybe it tells you that your dream journal could be improved and that you could be the one to improve it. Maybe it tells you to take a second look at that writer application for your journal’s website. I merely hope it might help you decide where you want to spend your attention and time.