Leiter Reports: A Philosophy Blog

News and views about philosophy, the academic profession, academic freedom, intellectual culture, and other topics. The world’s most popular philosophy blog, since 2003.


European Science Foundation’s Ranking of Philosophy Journals

Philosopher Gualtiero Piccinini (Missouri/St. Louis) has the details, links and a short explanation of the rating scale, as well as some comments on the significance of this list for European philosophers.  The philosophy list is here.  And a longer explanation of the rating scale is here.

The rankings consist of three gradations:  A being the highest, C the lowest.  The ratings strike me as fairly plausible, though there are some odd results.  (Thom Brooks (Newcastle) also comments on some oddities in the ranking.)  European Journal of Philosophy, for example, should clearly be an A, now that it is the most important forum for historical work on post-Kantian Continental philosophy, as well as publishing important articles in many contemporary areas.  And why is British Journal for the History of Philosophy an A, while History of Philosophy Quarterly gets a B?  All the Kluwer X and Philosophy journals (where X is Biology, Economics etc.) got an A, which may explain how Law and Philosophy got an A, while the journal I co-edit, Legal Theory, got a B, even though we reject many papers that end up being accepted at Law and Philosophy.

One explanation for the handful of peculiarities may have to do with the committee that oversaw the philosophy journal rankings, which consisted of Francois Recanati, Manuel Garcia-Carpintero, Diego Marconi, Kevin Mulligan, and Barry Smith–a strong group of philosophers, but none of whom work mainly in moral, political, or legal philosophy, or in the history of philosophy.  (Marconi and Mulligan do some work in these areas.)  Overall, though, they deserve credit for coming up with a fairly reasonable listing, which, as Professor Piccinini suggests, may exert a positive influence on professional standards and practices in philosophy in Europe.

What do readers think?  Post only once (comments may take a while to appear), and non-anonymous comments will, as usual, be strongly preferred.  Since this appears to be an "initial" list, it may be that feedback from philosophers will influence the final listing.


44 responses to “European Science Foundation’s Ranking of Philosophy Journals”

  1. The key question regarding this rating is methodology. Were they going by reputation? Acceptance rate? Citation frequency? Refereeing practices? Nothing in the document provides a clue. Given the mystery of method, two things jumped out at me. First, Archiv fur Geschichte der Philosophie was rated a B. Archiv and Journal of the History of Philosophy are, to my mind, the two premier history journals, with BJHP a very strong journal as well. They should all be on a par. Second, there is a clear English-language bias. I was surprised that the two French journals I have familiarity with (Archives de Philosophie and Revue de Metaphysique et de Morale) both rated Cs. They are at least as good as the Review of Metaphysics or the Southern Journal of Philosophy in what gets published. Most non-English journals have C rankings. Why?

  2. David Velleman

    Are there any online journals on the list?

  3. Dan López de Sa

    I guess that online journals were not considered, for otherwise it would be quite surprising that *Philosophers' Imprint* was not listed.

  4. Roberta Millstein

    Just a few quick notes – first, there is a separate ranking for History and Philosophy of Science (the link is available from Piccinini's blog). Second, this introduces some oddities: Biology and Philosophy is an 'A' Philosophy journal, but only a 'B' History and Philosophy of Science journal? How can that be? Third, this seems to be part of a general trend for the History and Philosophy of Science rankings, namely that the biology-related journals are ranked lower than the general-science-related and physics-related ones, e.g., Studies in History and Philosophy of Science and Studies in History and Philosophy of Modern Physics both get an 'A,' whereas the corresponding Studies in History and Philosophy of Biological and Biomedical Sciences gets a 'B'. Finally, 'B' seems to be a huge "grab bag" of a category, encompassing journals that many would draw distinctions among.

  5. Lisa: As Gualtiero points out, the A, B and C category system is given the following explanation:

    A: Journals that "make the discipline".
    B: Journals with international audiences, authors, and editorial boards.
    C: Other European journals

    Journals in French do not tend to "make the discipline" or have international audiences.

  6. The Canadian J. Phil is a 'B' and Erkenntnis an 'A'? Absurd. Being a European journal seems to be worth something, independently of quality.

  7. Brit,
    I think that French philosophers might disagree, at least with the 'making the discipline' part, though those journals are probably read by those in francophone countries. The ranking is a European one, and I would imagine that one of the issues within the European context is precisely how to define 'the profession.' Different nations have different schemes for moving grads through the ranks. This is particularly true of France: it is near impossible to get a job teaching in France at any level from high school on up without taking the CAPES (and you won't be prepared to do that without having received a French education). (My guess is there are analogous issues in countries like Spain, Germany and Italy. And I would also wager that how to integrate European university systems is part of the background to this ranking.) It was in this regard that I made my initial remark. And it is still unclear what criteria are required for 'making the discipline'.

  8. "Being a European journal seems to be worth something, independently of quality"

    Neil: if you were right, then it would be difficult to explain why so many European journals are ranked lower than A.

    Lisa: though I am not an expert, I think the hope was that the ESF's ranking of the journals would break old habits. As Gualtiero put it, "by creating a ranking of journals based on international reputation, the ESF is putting long-term pressure on the relevant European academic communities to break their bad habits" (e.g. the habit of publishing only in one's native language in journals published by one's local friends).

  9. I clicked through the last link on Piccinini's post (the FAQ link), and found a relevant question and answer:
    "Does the categorisation A, B and C reflect differences in quality?
    The distinction between the categories A,B and C is not primarily qualitative; rather, the categorisation is determined by a combination of characteristics related to scope and audience (see the guidelines (PDF 79.3 KB) for definition).[My coding skills are not up to incorporating the link to the file]. Journals have different profiles, and their scope of audience and field vary. Papers in journals with wide international prestige are not automatically of higher quality than papers in journals which are known and read only in a very specialised field. Similarly, a paper published in the native language in a journal which has only local importance can lead to a strong impact for a certain type of research. The Expert Panels emphasise that high quality research appears in journals throughout all three categories."

  10. Expanding on Lisa Shapiro's point at 8:32 PM:

    If "the discipline" includes what people read in non-English oriented philosophy departments in post-colonial countries (i.e., countries in Latin America, the Caribbean, Africa, Asia), then I would guess you would see the influence of French, German, Italian, and Spanish journals trickling down that way as well.

    To be clear: German is included more for the reason that it is considered a scholarly language you have to master to read Kant, Hegel, Marx, Nietzsche, Husserl, Gadamer, Heidegger, Wittgenstein, and Habermas, rather than because of colonial influence in those countries. For better or worse, this tradition has influenced contemporary philosophy in those countries more than the tradition of Russell, Frege, Carnap, Quine, Strawson, Dummett, and Kripke has. The same would hold true in post-Soviet Eastern European countries, I think.

  11. This methodology is indeed curious. Those of us who work in what has generally come to be understood as 'applied ethics' are completely without ground here. Almost all of the premier journals in applied ethics — environmental ethics, medical ethics, bioethics, feminist philosophy, business ethics — receive scores of B or lower. Indeed, the ranking falls to shreds in these subfields. Almost every branch of applied ethics has _at least one_ leading journal, and these journals are instrumental in shaping the contours of that branch. Surely the criterion of "making the discipline" is important, even for a discipline as diverse as philosophy; but philosophy's diversity, particularly when it snakes its way into the interdisciplinary arena, makes it pretty difficult to establish just what qualifies as having made what aspects of the discipline.

  12. I find it somewhat disingenuous how the ERIH rankings have immediately come under fire by some commentators. Professional philosophers, and readers of this blog especially, should be the first to acknowledge that rankings can at best give an indication of the relative merits of a journal (or department, or graduate program…), and are hardly the only relevant factor. Any ranking is bound to contain some minor (or perhaps even major) glitches, and declaring the whole project "absurd" merely on the basis that journal X got an "A" and journal Y got a "B", when it should be the other way around, is itself highly dubious. Note that the ERIH/ESF team has dubbed these rankings "initial lists" and specifically asks for feedback from researchers in order to fine-tune the lists (the first update of which is expected as early as 2008). As for some of the more specific questions that have been raised: Yes, it does seem that some (in fact, a very small number of) European journals are ranked higher than comparable non-European titles. But, then, why shouldn't it be that way? The European Science Foundation's aim in preparing the rankings is to raise the "visibility of European Humanities research" (quote from their website). In other words, the rankings are part of broader policy to promote European research in the humanities, and it seems entirely acceptable that this policy should also extend to promoting European publications. None of the individual cases that have been criticized are clear-cut (for example I do think that the BJHP is, on average, slightly better than History of Philosophy Quarterly; and the higher ranking of Erkenntnis with respect to the Canadian Journal of Philosophy may well have to do with a certain nostalgia — and why not?), so perhaps one should read the rankings with a grain of salt instead of engaging in petty nitpicking. This is not to say that there aren't some fundamental questions. 
The bias in favour of English-language publications is real and, I think, inevitable, given the trends in global academia. (As a German with a British Ph.D. teaching at an Asian university, I think I can attest to the inevitability bit…) It makes little sense to lament this fact since there really is no other lingua franca. (And what would be the alternative? Require all academics to publish in English, French, and German, or perhaps return to Latin? And what about those non-European traditions? Sanskrit, anyone?) At the same time it hardly seems fair to brush off genuine worries about linguistic diversity, let alone to ridicule those who (also) publish in French, German, Italian, etc. as "insular" or as publishing "only in their native languages in journals published by their local friends and colleagues" (the latter – quite outrageous – claim is from a different blog). A slightly more cosmopolitan attitude might be in order here. Regarding the HPS vs. Philosophy issue, I don't see any inconsistency here. HPS does not form a subdiscipline within Philosophy, so it should not be surprising to find some journal garnering an A on the philosophy list but only a B on the HPS list. I am not in a position to assess the merits of 'Biology and Philosophy', but I would assume they do not publish much history of biology, so their impact in the HPS community might be limited. If, within the HPS rankings, there indeed turns out to be a bias in favour of general science and/or physics-related titles, leading to lower-than-justified rankings of biology-related titles, then this issue should be raised; again, the feedback procedure on the ERIH/ESF website would be a good starting point. 
Finally, I agree with Brian that the ERIH/ESF deserves credit for coming up with a surprisingly reasonable ranking, and I think one should applaud them for defining broad categories (I don't see any problem with the huge 'grab bag' of category B) and not giving in to the temptation of defining ever more fine-grained subcategories (which would only increase the error margin).

  13. Eric Schliesser

    It is worth noting that these rankings were changed after consultations. The result was general grade inflation: the earlier list of rankings had far fewer As and Bs. In my view, the original list had uncontroversial top journals. But it's this change that produced some of the oddities that Roberta Millstein and Lisa Shapiro have noticed, the most important one being the differences between the philosophy and the philosophy of science listings. I also agree with claims about the surprisingly low status of biology journals; the weirdly low ranking for Archiv fur Geschichte der Philosophie; the Kluwer factor; the bias against French journals, etc. Note another oddity: Kant Studien gets an A and Hume Studies a B. More important, the rankings have a clear bias toward journals with a formal fetish. (This might account for the low status of biology journals.) Economics & Philosophy (a journal I referee for) gets an A, the journal Philosophy of the Social Sciences a B, while the Journal of Economic Methodology is not even listed.

  14. At the very least this listing is a significant improvement over the initial proposal that was released a while ago. Certainly there are some bizarre results, but it is not generally ridiculous.

    I think one of the main things to keep in mind is that, as Brit notes, there is a large concern for international influence, rather than quality, in determining the rankings. Thus, both the Deutsche Zeitschrift fur Philosophie and the Zeitschrift fur Philosophische Forschung get a C, even though both are excellent journals. But, both are also in German, and to quote a former German law professor of mine "that does nobody any good" (since very, very few people bother learning German). This is also a problem, although a lesser one, with French journals.

    What does strike me as a real problem, though, is that this approach to a large extent undermines the hope that Brian raises that the list can be useful in getting continental countries out of the "publishing friends and acquaintances" approach that often still dominates. After all, German readers know that these journals are good, so aren't likely to be impressed by the fact that they get a C rating. Instead, they will simply regard this as a reason to reject the list as useful for evaluating the quality of journals publishing work in German – something they'll be quite right to do.

    There is also the obvious issue that the list of people putting the ranking together, while European, is obviously rather heavily aimed at the analytic – something that will hardly go unnoticed on the Continent, where traditional continental philosophy is still very popular (particularly via Husserl), and is often worked on within the same department as analytic philosophy (unlike the stronger division that still usually characterizes English-language departments), and indeed not rarely by the same person. Again, that just provides a reason to ignore the list.

    By comparison they can point to the fact that the Bulletin of Symbolic Logic, the Journal of Philosophical Logic, the Journal of Symbolic Logic, the Notre Dame Journal of Formal Logic and Studia Logica all get an A. Undoubtedly all these journals contain some important logic work, but can we plausibly say that they all "make the discipline", unless we narrowly define the discipline as "logic" (and ignore the question of the proportion of the journal's papers that reach that standard, since logic journals, for obvious reasons, tend to publish a huge number of articles)?

    Basically, while I applaud the goal behind the list, and its execution certainly isn't ridiculous, the execution is so heavily weighted toward the analytic and those journals already internationally famous that it is likely to be ignored in those areas where such a list could potentially do the most good – and if it isn't ignored it will result in remarkably unjust results, where people are basically punished for not writing in English, and/or for not doing analytic philosophy (even if this is a/the dominant approach where they work, and even if they do it well).

  15. I haven't looked more carefully at the assessment criteria given by Brit above, but do these not strike others as strange?

    On at least some readings of "making the discipline", this phrase (the mark of an 'A' journal) implies some evaluative comment. But that's not necessarily true of the criteria for categories B and C. Being international doesn't necessarily make a journal better than a parochial equivalent, and nor does having an international readership (especially if a journal happens to be written in Polish).

    Given that, these criteria seem to slide between the evaluative and the descriptive, and that can hardly be satisfactory.

    Moreover, the criterion of "making a discipline" also looks flawed. Radical Philosophy is the first point of reference for people working on the issues that it covers, just as the Journal of Consciousness Studies is a very important touchstone for people working on issues related to embodiment in the philosophy of mind. In this case, they certainly seem to "make their disciplines" – but the C and B rankings given here don't reflect that.

    Presumably what this implies is that their disciplines are supposed by the evaluators to be less well made than the disciplines covered in the Philosophical Review etc. But if this is the rationale behind their scores, then further criteria are being used over and above the criterion for scoring an A listed by Brit. That can't be satisfactory either.

    Overall, my feeling looking at these rankings is that I don't know what they're supposed to be telling me.

  16. Laurence B McCullough

    Unless I read the statement on methods wrongly, no information about citations was used by the expert panel. This is odd, to say the least, since citation information is available and provides a non-subjective criterion for assessing the quality of a journal. The resulting rankings are reputational, i.e., reflective of what the philosophy expert panel thought of the journals. Given the number of journals on the list, the small expert panel should not be expected to be familiar with all of the journals. As Leibniz would put it (correctly), the extrinsic denominations (what is said of the journals, i.e., the ranking of each) may be very poorly founded in many, perhaps most, cases. Philosophy needs to catch up to the standards of science, such as the ISI Web of Knowledge rankings by impact factor, immediacy factor, and other objective criteria. The rankings from our colleagues in Europe will have little or no plausibility among our colleagues in the sciences, quite reasonably so. This probably will create problems at the level of university-wide rank-and-tenure committee meetings when a dossier invokes these new rankings of philosophy journals.

    Larry McCullough
    Center for Medical Ethics and Health Policy
    Baylor College of Medicine

  17. Roughly, if the name of the journal is in English or Latin, it gets an A or a B, otherwise a C.

  18. I'm a little surprised to see neither Philo nor Philosophia Christi, both important enough journals in philosophy of religion that they should have been listed.

  19. Axel Gelfert wrote:

    “Yes, it does seem that some (in fact, a very small number of) European journals are ranked higher than comparable non-European titles. But, then, why shouldn't it be that way? The European Science Foundation's aim in preparing the rankings is to raise the "visibility of European Humanities research" (quote from their website). In other words, the rankings are part of broader policy to promote European research in the humanities, and it seems entirely acceptable that this policy should also extend to promoting European publications.”

    One way in which a ranking of journals could be part of a broader policy of promoting European Humanities research would be by making public a fairly accurate, objective ranking in terms of quality, thereby rewarding those journals that rate more highly and providing an incentive to those journals that rate less highly to try to move up (supposing that the ranking will be ongoing). There certainly is nothing objectionable about this. But of course if “European journals are ranked higher than comparable non-European titles”, the objectivity and accuracy of the ranking comes into question.

    Another way a ranking of journals could be part of a broader policy of promoting European Humanities research would be by inaccurately giving a high rating to European publications in the hope that the ranking will have some influence on opinion because of factors other than the accuracy of the ranking, e.g., the prestige of the board doing the ranking or the body sponsoring it. (As, e.g., when an advertiser uses a celebrity to tell us that some product rates highly.) Can’t we assume this sort of thing would be objectionable? And if the ESF ranking is systematically biased towards European journals, even if only slightly, in order to promote European publications, isn’t this what is going on?

    Axel Gelfert also wrote:

    “The bias in favour of English-language publications is real and, I think, inevitable, given the trends in global academia. (As a German with a British Ph.D. teaching at an Asian university, I think I can attest to the inevitability bit…) It makes little sense to lament this fact since there really is no other lingua franca. (And what would be the alternative? Require all academics to publish in English, French, and German, or perhaps return to Latin? And what about those non-European traditions? Sanskrit, anyone?)”

    The first thing that struck me when I looked at the European languages listed here was the omission of Spanish. Don't an awful lot of people speak Spanish? More than German or French, certainly. For that matter, do more people speak English than Spanish? I thought I had best look into the matter, if only briefly on the Web—particularly given that the "Empirical Philosophy" T-shirts have just gone on sale!

    Here’s what I got for a top 10 list of number of people who can speak a language:

    http://www.soyouwanna.com/site/toptens/languages/languagesfull.html

    No surprise that Mandarin is the most widely spoken. But English indeed is number 2. Number 3: Hindustani. The next two are Continental European languages, but they are not French and German: it's Spanish (4) and Russian (5). The only other European languages on the list are Portuguese at 8 (remember all those Brazilians who do have a lovely, active philosophical community) and French at 10.

    What follows? I don’t even hazard a guess.

  20. Andrei Buckareff

    Jeremy,

    That Philo and Philosophia Christi are not on the list is not as problematic as the lack of Religious Studies on the list. The two journals you mention are not as influential in philosophy of religion circles as Faith & Philosophy, Religious Studies, and International Journal for Philosophy of Religion. Granted, P and PC have not been around as long as F&P, RS, and IJPR. My guess is that P and PC would each get a C given that F&P and IJPR each get a B (which seems fair if we are interested in the influence of these journals in philosophy broadly). I'm more dismayed by the lack of Sophia–a longer running philosophy of religion journal that was a major player for some time and has been around longer than either F&P or IJPR.

  21. Another clarification, since nobody seems to bother to actually look at the ESF website. Of course Philosophia Christi, Religious Studies, Faith & Philosophy and others are all included in the ERIH index, precisely where they belong: in the list of Theology & Religious Studies journals.

  22. Michael Shaffer

    Social Epistemology is on neither the philosophy nor the HPS list. It seems like it should be on both.

  23. Gualtiero Piccinini

    Two clarifications:

    1. Re: "to ridicule those who (also) publish in French, German, Italian, etc. as "insular" or as publishing "only in their native languages in journals published by their local friends and colleagues" (the latter – quite outrageous – claim is from a different blog)." I suppose this is a reference to my blog. But I never claimed that ALL those who publish in their native languages … I was referring to those that do (to some degree or another). And I was not ridiculing them. As Simone Gozzano pointed out in his comment to my post, publishing in their native language in their local journals is often an "academic need". I was, however, expressing hope that their academic communities will move towards more transparent and less parochial methods, which will probably lead to more internationally noticeable publications.

    2. Re: why didn't they use citation indexes and the like? ERIH was intentionally done without looking at citation patterns. Those are already available and, according to my source in the committee, are believed by the ESF to have a pro-U.S. bias. (I don't know why they felt that way.)

    And one question:

    Re: descriptive vs. normative evaluation. Having an international audience, authorship, reputation, and standards (as opposed to local ones) is surely descriptive, but isn't it also a good thing for a journal to have, other things being equal?

  24. Margaret Atherton

    One worry that always comes up in my mind is how these rankings are going to be used. I have no problem with ranking BJHP higher than HPQ–although both publish excellent articles, BJHP does represent the standards of the subdiscipline, while HPQ has a particular mission, to publish papers that relate historical figures to contemporary philosophical issues, which does not reflect the way the subdiscipline is moving. But I would really worry if tenure committees decided to take less seriously papers published in HPQ. (And another tiny voice of complaint: Why is Oxford Studies in Ancient Philosophy included but not Oxford Studies in Early Modern Philosophy?)

  25. In my own judgement, I believe that these lists are potentially quite damaging in several ways. The first list that was released was to my mind horrendous, with an extended "excluded" list of journals which counted for nothing, including Journal of Social Philosophy, Harvard Law Review and several other law journals, and so on. That list has certainly been improved to some extent since (and I agree that there has been some grade inflation along the way), although I continue to be shocked that the Journal of Social Philosophy is missing. I am frightened to think what the list would look like in any incarnation without the help of Jo Wolff in keeping us up to date.

    I do think the treatment of journals in moral, political, and legal philosophy is shocking; I note several cases on my blog linked by Brian, so I won't rehash them here. There clearly need to be some major changes here.

    However, and perhaps Axel Gelfert has also read the lists carefully, some may notice that some journals on some lists are ranked "A" in one place and "B" on another. This raises some issues:

    (a) The lists need to become uniform: if a journal is to be considered A, B, or C, then it should be consistent across lists.

    (b) If journals are noted on more than one list (which seems to me quite reasonable as many journals are interdisciplinary), then why some and not others? I can't see any problem with adding some journals in the philosophy of science or philosophy of religion/theology to our beloved philosophy list. Why some and not others?

    (c) Moreover, it strikes me that interdisciplinary journals really should be noted (when appropriate) on more than one list. Our friends in philosophy of science may publish in journals of great reputation for scientists and merit inclusion in these lists. However, their journals are no less "philosophy" than any other area: why not include them and avoid confusion? Otherwise, those of us considering the merits of different journals in, say, philosophy will not satisfy our curiosity by looking at the philosophy list alone but, well, all of them.

    Finally, I note that a few publishers I spoke with knew absolutely nothing about these rankings. (I'm currently working with the UK's Political Studies Association in our approach to the political science list the ESF will draw up soon.) This may be a good thing: publishers will surely go to battle to help their work, thinking of profit over quality. However, these lists could be very damaging in harming a journal's status in its publisher's eyes and leading to budget considerations, potentially closing off an important area of research.

    My real concern is for journals less than A. Of course, it is true that the list notes other factors beyond the journal ranking should be considered. It's just a pity it nowhere notes that the quality is in the particular pieces of work. It is useful to know the relative status of journals–and I've always accepted lists provided here. The lists provided by the ESF are generally ok, but there are major flaws and having such a document whose purpose is to be a factor in major funding grant applications for academics in Europe, such as myself and many other readers, is a source of great worry and concern. We all deserve better.

  26. W. V. Quine (Theories and Things):

    The mass of professional journals is so indigestible and so little worth digesting that the good papers, though more numerous than ever, are increasingly in danger of being overlooked. We cope with the problem partly by ignoring the worst journals and partly by scanning tables of contents for respected names. Since the stratification of the journals from good to bad is imperfect, this procedure will miss an occasional good paper by an unknown author.

  27. Roberta Millstein

    To respond to Axel Gelfert's remarks about Biology and Philosophy:

    If it were the case that B&P didn't publish much history of biology, then, yes, it would make sense that it had a lower ranking in HPS than in Philosophy. But that is not the case, as a scan of any issue of the journal would show. I think it is fair to say that most of us who read and publish in the journal consider ourselves to be in the field of HPS, and so make up a large portion of the HPS community by that fact alone, aside from any impact the journal has to the broader community. Moreover, many of us believe strongly that the best phil sci is phil sci that is informed by hist sci, so the idea that a journal could have more impact in philosophy than in HPS is a bit puzzling. And as others have pointed out, it is strange that there would be no 'A' journals for a particular subdiscipline, and strange to think that B&P, and other journals such as Studies in History and Philosophy of Biological and Biomedical Sciences, are not journals that "make the discipline" of the philosophy of biology.

  28. As others have commented, the fate of applied philosophy and ethics seems poor under this ranking scheme.
    But then a fair amount of arbitrariness appears to dominate these lists: who, for example, in the bioethics field considers the Hastings Center Report an A while the Journal of Medical Ethics and Bioethics are only a B? Worse still, the American Journal of Bioethics doesn't even get a look in.

    I share Thom's concern with the impact on those journals that get a B, but what about all of those journals that are left out?

    According to the lists I've come up with myself, just in terms of English-language journals there are at least 40-odd bioethics journals and another 30-40 applied ethics journals. I'd be surprised if there weren't similar numbers in other areas of philosophy. It seems rather unfair not to rate these journals at all, especially since some of them do on occasion have excellent papers in them.

    It is even more baffling when some of the omitted journals are European-based, such as the Critical Review of Social and Political Philosophy.

  29. I would quickly take issue with one comment several people have made, namely that there is something odd about a journal being, for example, ranked A in Philosophy, but B in HPS. That seems entirely appropriate to me. After all, just because a journal is a top general philosophy journal doesn't mean it is also a top HPS journal. It may publish enough HPS articles to warrant inclusion on the HPS list, but still be less important a journal in that field than in philosophy generally.

  30. I couldn't agree more with Thom that such lists are potentially damaging, both to individual researchers and to the profession as a whole. It all comes down to how such lists are going to be used; as I think I made clear in my earlier comment, rankings of any sort are best used with caution and as only one indicator amongst many. However, just as there is potential for damage, there are also potentially beneficial uses. All I am saying is that the current lists — which, to repeat, are only "initial" lists, by the ESF's own admission — are better than most other informal and subjective rankings that are available, for example on the internet, and simply dismissing them as "absurd" does not do justice to the significant effort that seems to have gone into preparing them (and this includes, as Thom rightly points out, the effort that has gone into criticising and refining earlier versions). Where there are omissions or biases, there need to be interventions from those who are knowledgeable in the relevant areas of specialization, and it seems — at least judging from their website — that the ESF has a genuine interest in receiving such feedback. So, hopefully, in the long run the lists will provide some useful guidance and transparency. (On a different point raised by Thom, I still do not see why consistency requires that a journal that appears in different subject lists should have the same score on each of them. Surely it is conceivable that a journal that mainly publishes papers on subject X, but occasionally carries articles on subject Y, can rank differently when compared with other X- or Y-journals, respectively. It seems to me that rankings relative to a (broad) subject area (hence the division into Philosophy, HPS, Religious Studies/Theology) are more realistic than absolute rankings that strive to compare, say, the Journal of Economic Methodology with Faith & Philosophy. But I don't have any strong views either way on this particular issue.)

  31. To respond to Tony, I take it that what puzzles Roberta is the simple fact (which any historian or philosopher of biology will confirm) that Biology & Philosophy is without question the best journal in philosophy of biology and one of the two or three best journals in history and philosophy of biology more generally. So unless there is some evidence that history and philosophy of biology is somehow less well respected among historians and philosophers of science than other areas of HPS, the B&P rankings make no sense.

  32. I think it's important to see what the lists are. They aren't just lists of good journals in philosophy, etc. They're lists "useful in determining the international standing of the research activity carried out in a particular field in a particular country." This implies several things:

    1) A corresponding list based on citations would be easy to develop. This would measure something different and would be more useful, for example, for schools to use in tenure and promotion decisions or in allocating resources between universities.

    2) If you want to move up from C to B, the strategy seems easy: internationalize your editorial board, advertise more overseas, and (maybe) think about the centrality to the discipline of the subjects you publish on, or the languages that papers are written in. If the list keeps getting updated and becomes influential, then I think in 10-20 years it would be reasonable to conclude that journals still getting a C want to remain local.

    3) The category of A journals is limited to 10-25% of the total. This seems odd. If the goal is to find the most influential journals internationally, it may be better to limit the As to, let's say, a half dozen journals per field. After all, how many really influential journals can there be on the international scene? That would be enough to fight grade inflation. The best journals could fight for As, and everyone else would decide how international they want to be.

    4) One result might be to encourage journals to publish in multiple languages. A journal that has articles that are 60% in English, 20% German, and 20% French might end up being more widely read than one that's 100% anything. Wouldn't that be good for European (and American) scholarship?

    5) A lot of journals set the standard in a small subfield but aren't widely read by philosophers generally. Those journals shouldn't even be considered for an A rating, unless they have outstanding influence outside their subfield.

  33. "Unless I read the statement on methods wrongly, no information about citations was used by the expert panel. This is odd, to say the least, since citation information is available and provides a non-subjective criterion for assessing the quality of a journal. The resulting rankings are reputational, i.e., reflective of what the philosophy expert panel thought of the journals."

    Larry, three things: first, citation information is not objective. Second, if the rankings were based on citation information, history journals and narrow analytic journals would have a clear disadvantage compared to science journals, language journals, etc. Third, it is not clear why the ESF should rank journals based on citation information; citation information is not typically taken into account when philosophers, Ph.D. programs, philosophical projects, and so on, are ranked or evaluated.

  34. A follow-up on Thom's 9:57 post. It seems to me that whether journals should appear on more than one list, and whether a journal should receive the same ranking on every list on which it appears, are not independent issues. If it receives the same ranking on every list, there would be little need for it to appear on more than one list: one could just assume that it would receive the same ranking on any other list. On the other hand, if it makes sense to give a journal different rankings on different lists, then each journal should appear on every list relevant to it.

  35. The panel should be congratulated for such extraordinary thoroughness and scope. Given the length of the list, some oddities are to be expected. Here, however, are two:

    The criteria explicitly say that unsolicited submissions should be possible in a high-quality journal. Yet Proceedings of the Aristotelian Society got an A. (And there are other by-invitation-only journals on the list.)

    Of the journals with high Google Scholar Hirsch numbers (see http://leiterreports.typepad.com/blog/2007/04/using_google_sc.html), only the Canadian Journal of Philosophy got a B. Yet it does well on all the criteria, including international submissions, an active editorial board, etc.

  36. A benefit of the new lists that, I think, has not been mentioned yet is that they provide (at least in principle) a more transparent way of ranking journals.

    At the moment, one of the 'rankings' that's being used (at least by the funding agency I depend on) distinguishes between journals that are indexed by ISI/Web of Science and those that are not, the former being regarded as the only ones that really matter.

    Yet this strikes me as much more arbitrary. For instance, some journals ranked A in the new list have been trying to get into ISI, but do not succeed. For all I know, ISI does not seem to give reasons for its decisions; it hardly even communicates…

  37. Colin Farrelly

    The sheer volume of journals in the field is simply amazing (I started counting the number on the list but gave up after I reached 36 under the letter ‘A’ alone!).

    To be honest, I wonder how any one philosopher would feel competent to make even the modest ranking discriminations of A/B/C on so many journals in all different areas. To make such discriminations for just 10 journals in my own area of specialisation would be tricky enough. So how anyone could feel they could make informed comparisons among so many journals (without relying on citation info, etc.) in all areas of the discipline is baffling to me.

    Cheers,
    Colin

  38. I'm unclear about the point of ranking philosophy journals. Seriously, what's the thinking behind producing such a ranking? It's often claimed, for example, that part of the usefulness of the PGR is that it provides useful information to budding graduate students that would otherwise be very difficult to obtain. That seems reasonable. In the case of philosophy journals, however, I would think that nearly every professional philosopher could produce, at least roughly, a list of journals by quality that would track the results of the ranking under discussion; or rather, nearly anyone who'd be interested in this ranking could produce such a list. Would this kind of ranking be useful, say, for tenure and promotion committees?

  39. Again, this list by the ESF (European Science Foundation) is for their own use in awarding research grants. Moreover, funding agencies, such as the UK's Arts and Humanities Research Council, have talked about the introduction of metrics, and one fear is that they may use the ESF's list. This issue, then, is of some importance to those of us in the UK. It is true that the list notes it shouldn't be the only factor, but its primary purpose, and I'm happy to be corrected, is to offer some quantitative measure of output quality with which to assess research grant applications to the ESF. My own view is that quality should be determined by the published work itself. To be frank, we have all found poor articles every now and then in even the very best journals, and important work in lesser-ranked journals (and edited books). Whether or not something possesses high quality is in the thing itself, not its branding. Nor do I have much faith in citations: every Hegel scholar will be unsurprised to find that Popper's The Open Society and Its Enemies scores well, even though it is one of the very worst books written on Hegel's political philosophy. I simply do not approve of these shortcuts to actually reading outputs. Moreover, as many projects culminate in books, I wonder whether a book measure will be produced as well. This may be even more problematic.

    As for the number of journals, the ESF notes only a small fraction of the journals that are out there. The most comprehensive lists are the American and International Directories of Philosophers published by our good friends at the Philosophy Documentation Center. I simply cannot recommend these two publications enough. Everything one wants to know about journals in the US, UK, and elsewhere is in their pages.

  40. I noticed Religious Studies, but I thought their reason may have been that Religious Studies is a broader journal than just philosophy of religion. Sophia is a philosophy journal, however, as are Philosophia Christi and Philo. Philosophia Christi is produced by an organization for philosophers, and its companion theology association has its own journal. Characterizing it as a theology journal is tantamount to calling a philosophy of mind journal a psychology journal or calling a philosophy of physics journal a physics journal. We begin to lose all sense that you can have a first-order subject matter along with a second-order philosophical examination of that subject. It strikes me as rather strange also to say that these journals are not philosophy of religion, while Faith and Philosophy is. Philosophia Christi and Faith and Philosophy do not have the same standards for acceptance, but they do have the same kinds of content.

  41. The question was raised as to why the BJPS got an "A" ranking. That's because it is one of the very best journals in the philosophy of science. Indeed, the last time I looked at the ranking of "impact factors" of journals (for those of interest to researchers in the sciences and the social sciences), the BJPS had a slightly higher numerical score than did Philosophy of Science. The impact-factor analysis, I am given to understand, is carried out with attention to citation rates and other measurables. Perhaps it is time for philosophers to create a similar ranking of their own journals, rather than letting outsiders do it for them?

  42. Greg Frost-Arnold

    Question: Is there a real citation index for philosophy journals rivalling ISI Thomson or Scopus? I realize that some philosophy journals show up in these indices, but only because some scientists cite them occasionally — as a result, the scores given to philosophy journals in such indices do not even approximate the actual reputations of the journals. (For example, the 2006 ISI Web of Knowledge ranking of journals in "History and Philosophy of Science" is way off: Agricultural History has the #1 impact factor and Agriculture and Human Values the #2, while BJPS is #8 and Philosophy of Science is #26.)

    Is there something out there that I am missing that gives a reasonable ranking of the impact factor of philosophy or HPS journals based on citation rates?

  43. Greg Frost-Arnold

    My apologies — The particulars of my previous post are completely wrong, though my general point still more-or-less holds: the journals with the highest impact factors are (in order) Social Studies of Science, Biology & Philosophy, Public Understanding of Science, and the Journal of the History of Medicine. BJPS is #5, and Philosophy of Science does not appear until #16 (out of 28).

  44. Readers may be interested in a new twist in the ESF journal ranking story, reported on my blog here: http://the-brooks-blog.blogspot.com/2007/07/breaking-news-on-journal-rankings.html#links

    The short version is as follows, after having several communications with various people at the ESF on this topic over the last few weeks:

    The ESF *is* producing journal ranking lists for all subjects in the humanities, including Classics and Philosophy (already posted on their site). However, the surprising thing is that the ESF is *not* producing journal ranking lists in social science, including Law and Political Science.

    This is surprising as many of us have been led to believe that all subjects covered by the ESF would have journal ranking lists (reserved for use in assessing grant proposals to the ESF—and feared to become a relevant factor in new metrics arrangements assessing research quality in the UK and elsewhere in Europe). This will not be the case and different areas of the ESF (such as humanities and social science) will assess items differently, with the former using journal rankings and the latter not using them.

    The bottom line: editors of journals in the areas of legal and political philosophy need not worry about lists in Law and Political Science in addition to the current Philosophy list. This is itself interesting (and curious) news.
