Comments on Lemmings: Rankings
Brit Brogaard (feed last updated 2023-11-02)

Anonymous (2007-02-21 22:53):
But I am still wondering why the fact that Penn State was recently in receivership should indicate that it is not now a good place to receive an education and training in at least some areas of philosophy. Something is missing. Maybe it is the consequences that having been in receivership has had on the program, or some continuity between the reasons the department went into receivership and what is going on in the program now?

I'm also skeptical about the strength of the correlation between a high ranking and the quality of education and training one receives. I think mentoring is a very important aspect of graduate education, and none of these lists speaks to it in the slightest.

Anonymous (2007-02-21 22:08):
<I>Anonymous, I don't see how. The ranking purports to tell us about faculty scholarly productivity by looking to specific publication venues. The accuracy of the list should be judged according to whether it accurately does that, no?</I>

Of course, ranking a department which was recently in receivership in the top 10 doesn't in itself undermine the accuracy of the ranking <I>qua</I> ranking of faculty scholarly productivity. But I take it that one of the chief reasons one might care about such a ranking is that one takes faculty scholarly productivity to correlate reasonably well with program quality.
Including a department which was recently in receivership in the top 10 seems to me to cast serious doubt on either (i) this correlation or (ii) the accuracy of the ranking <I>qua</I> ranking of faculty scholarly productivity. The more one likes the correlation (and it does seem there's something to it), the more likely one will be to conclude that the ranking isn't really accurate. I guess I was too brief in my previous post. Apologies.

Anonymous (2007-02-21 09:25):
Anonymous, I don't see how. The ranking purports to tell us about faculty scholarly productivity by looking to specific publication venues. The accuracy of the list should be judged according to whether it accurately does that, no?

Anonymous (2007-02-20 14:24):
Didn't the Penn State department only recently come out of receivership? Ranking such a program in the top 10 would seem to be <I>prima facie</I> evidence against the accuracy of the list.

Brit Brogaard (2007-02-19 22:40):
Yes, the expert advice is indeed limited. But LOTS of experts are consulted, and so only very few areas are under-represented. Of course, the ideal would be to expand the FSPI to take account of mainstream journals, book chapters, etc. We would then have two comparable lists.
If they were still very different, we could create a super-list based on Leiter and the expanded version of the FSPI.

Anonymous (2007-02-19 22:04):
On the other hand, as has frequently been mentioned as a criticism of the PGR, expert advice is limited to the expertise of the experts. If few of the people consulted work on Husserl or even Deleuze, then the PGR is simply not capable of giving any expert advice whatsoever on the quality of departments where a good deal of work on those figures takes place. While quantity measures seem like a bad idea, the "quality" measure is just as questionable.

Brit Brogaard (2007-02-19 11:30):
The fact that very few philosophy journals are tracked explains a lot. The fact that book chapters are not tracked explains less, as that should affect everyone equally. When I said "the FSPI clearly cannot measure originality or quality of articles as well as the PGR", I wasn't aware that they tracked very few good journal articles. In fact, I was assuming that they had taken the mainstream journals into account. What I meant was this: the PGR is based on the opinions of a large number of experts in the field, and expert advice is usually more accurate than random stats.

Aidan (2007-02-19 10:55):
A quick follow-up on Richard's comment.
It doesn't look like any provision has been made to take into account invited book chapters; it's mentioned that book publications in that period are recorded via Amazon and peer-reviewed journal articles via Scopus, but I couldn't see any suggestion that book chapters are tracked. If I'm right, that would presumably also help to explain why some departments seem so strangely unprolific.

khadimir (2007-02-19 09:57):
Would you please give a justification for the statement "The FSPI clearly cannot measure originality or quality of articles as well as the PGR"? I find that such an assertion needs more discussion in this context.

Richard Zach (2007-02-19 01:36):
That weird ranking is easily explained: the database they use for article publications includes only Open Access journals. Hence, it includes very few good philosophy journals.
The list of journals classified as Arts and Humanities/Philosophy is:

Journal of Mundane Behavior
Philosophy of the Social Sciences
Ethik in der Medizin
Journal of Consciousness Studies
Ethique et Sante
Library Philosophy and Practice
Journal of the History of Ideas
Social Philosophy and Policy
Studies in East European Thought
Review of Metaphysics
Kennedy Institute of Ethics Journal
Ethics
Philosophical Magazine
Journal for General Philosophy of Science
Journal of Value Inquiry
Pastoral Psychology

There are a couple of philosophy journals classified in other areas; e.g., for some reason, the Southern Journal of Philosophy is filed under Medicine.