A guide to graduate study in economics: ranking economics departments by fields of expertise.
Therese C. Grijalva and Clifford Nowell
Southern Economic Journal, April 1, 2008. ISSN 0038-4038.
1. Introduction
Each year, thousands of undergraduates apply for admission to graduate schools in economics intending to obtain a Ph.D. Many of these students have little idea how to choose a graduate program, and many go to an undergraduate adviser looking for advice. Prospective graduate students and their advisers have little published research to help them in the process of choosing which schools best match the undergraduate's skills and interests.
This study highlights many of the characteristics of departments that offer doctoral degrees in economics and provides information on both overall productivity and productivity by subject field. This research is significant for those looking to obtain a Ph.D. in economics because the choice of where to attend graduate school has been shown to be important in both academic and nonacademic job markets. Research into the careers of Ph.D. economists (Barbezat 1992; McMillen and Singell 1994; Stock and Alston 2000; Siegfried and Stock 2004) consistently indicates that graduates from top-rated schools fare better in academic and nonacademic job markets than their peers from lower-ranked programs.
Given the finding that the quality of the school influences outcomes in the job market, the best advice for those applying to graduate school in economics may simply be to apply to the best schools to which you are likely to be admitted. Yet this advice is of little value for those who are unlikely to be admitted into a top program yet have a strong interest in one of the many subject fields of economics and a strong desire to pursue a particular field. This group of students is left getting advice from an undergraduate adviser who cannot be expected to know the strengths of economics departments across the country or to search the Web pages of all the programs that offer a Ph.D. looking for clues as to which school is the best match.
In this article, we provide information to undergraduate students and their advisers on the research strengths of the 129 economics departments that offer Ph.D. degrees in the United States and identify schools that are ranked highly in the many different subject fields of economics. This article should also provide guidance to departments hiring new Ph.D. candidates within a specific field and to job candidates looking for information on potential academic employers.
This article differs from the many papers ranking the quality of economics departments by identifying the relative strength of all Ph.D. programs and by specifically providing information on all the major subject fields in economics. Although Tschirhart (1989) ranks departments in fields of expertise, only a limited set of fields is identified, and departments are ranked using data that are now over 20 years old. U.S. News and World Report (1) also provides a ranking of economics departments by field. Their ranking is based on survey responses of department chairs who were asked to rank all departments on a five-point scale. Department rankings by field can also be found on the EconPhd.net website (http://www.econphd.net). This site ranks departments by field, using publications in 63 highly ranked economics journals during the 1993-2003 period. The data we used as the basis for this article are more comprehensive and cover a longer time frame: we used all journals in which economists at the Ph.D.-granting institutions in the United States had published during a 20-year period. Our data set consists of publications in 254 journals over the 20-year period 1985-2004. This analysis provides by far the most detailed, complete ranking of departments by field in the literature.
In addition to simply identifying the top 20 schools in each field, we provide other information, not found elsewhere, on the relative importance of the field at the school and on how the scholarly output is distributed across the department's faculty. To measure the concentration of research output in a field, we calculate a Herfindahl-Hirschman Index (HHI). The HHI is particularly important for an undergraduate to consider: planning to obtain a Ph.D. from a school in hopes of studying with a single person is a risky undertaking, not only because the faculty member may move but also because any single faculty member can mentor only a limited number of students.
We recognize that ranking departments is fraught with danger. Thursby (2000) has pointed out that single measures of department productivity suggest differences between many departments that are meaningless, a finding we reiterate when only aggregate measures of performance are used. However, by providing detailed information on departments by field and by identifying the publication patterns of the faculty within each field, we are able to highlight some differences that aggregate measures gloss over.
2. Methods
Similar to Tschirhart (1989), the data-gathering stage consists of four basic steps: (i) identifying all Ph.D.-granting institutions in economics as of the 2004 spring semester, (2) (ii) identifying all tenure-track or tenured faculty as of the 2004 spring semester, (iii) acquiring a list of faculty publications, and (iv) determining the quality of each publication.
To identify the universities offering doctoral degrees in economics, we used the website maintained by the University at Albany. (3) This site contained a list of all economics departments with Ph.D. programs at American and Canadian universities and was verified against Peterson's Guide to Graduate Schools. (4) On this basis, we identified 129 programs located in the United States that offered doctoral degrees in economics as of the spring of 2004.
The second step, identifying all tenure-track or tenured faculty for each university, was accomplished by accessing economics department websites. A slight shortcoming of this approach is that faculty lists are only as accurate as a department's maintenance and updating of them. Removing faculty members without any publications resulted in over 2600 faculty names. In the few cases where faculty appeared on multiple department websites, we included the faculty member in the department where he or she had a permanent and current affiliation. We recognize that some faculty are members of a department other than economics (e.g., the Department of Managerial Economics and Decision Sciences at Northwestern University) yet contribute to the education of graduate students and are productive in the field of economics. The difficulty of determining who these faculty are and the extent to which they are involved in the economics department made it impractical to include them in the analysis.
The third step focused on acquiring journal publications for each faculty member listed in the Journal of Economic Literature database, Econlit. The database was queried for the publications of the tenure-track faculty identified at the 129 departments. Faculty were dropped from the analysis if Econlit indicated that they had no published articles. This study focused on articles published between 1985 and 2004. Over this time period, Econlit cataloged over 38,000 publications of faculty who were employed in Ph.D. economics programs as of the spring of 2004. (5) Further, Econlit provided four essential pieces of information needed for the analysis: (i) article source, (ii) page numbers, (iii) number of authors, and (iv) Journal of Economic Literature subject codes. The article source was needed in order to assess the quality of the article. The credit each author received for a publication was weighted by the number of authors and the page length: the greater the number of coauthors, the less credit assigned to each coauthor, and the greater the length of the article, the greater the credit assigned to each coauthor. (6) The subject codes were needed to sort articles by field of expertise.
The final step was assigning a quality index, $Q_j$, to each journal. We used both the impact factors published in the 2004 Social Science Citation Index (SSCI scores) and rankings based on "citations per character in 1990" for articles published between 1985 and 1989 (JEL scores) proposed by Laband and Piette (1994). (7) Many journals had an SSCI score, a JEL score, or both: 107 journals had both an SSCI and a JEL score, an additional 131 had only an SSCI score, and an additional 16 had only a JEL score. Thus, the total number of journals indexed in the SSCI that we used in our analysis was 238, and the total number of journals indexed in the JEL that we used was 123. Publications that had neither an SSCI nor a JEL score were dropped from the analysis. It should be noted that although the SSCI indexes 172 journals in the economics discipline, we use all publications identified by Econlit and indexed in the SSCI, even those outside the economics discipline, in calculating productivity.
Following Tschirhart (1989), articles were adjusted by number of authors and page length. The first step consisted of dividing the number of pages of article $i$, $pages_i$, by the number of authors, $n_i$, thus ensuring that each author received $1/n_i$ credit for the pages. The second step consisted of dividing this value ($pages_i/n_i$) by the average length of all articles in the same journal $j$, $\bar{p}_j$. The weight that each coauthor of article $i$ in journal $j$ receives, $W_{ij}$, is given by
$$W_{ij} = \frac{pages_i / n_i}{\bar{p}_j}.$$
The quality, $Q_j$, of the journal was then multiplied by $W_{ij}$, yielding a productivity value, $P_{ij}$, indicating the weighted quality credited to the author for the article. These weighted productivity values were summed by individual and then by school. The results presented in this study are based primarily on the SSCI scores because of the broader coverage of the SSCI and because the SSCI includes many of the newer journals that began publication after 1985.
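To make the weighting concrete, here is a minimal Python sketch of the credit calculation. The record layout, journal names, page counts, and quality scores are all invented for illustration; the authors' actual Econlit data format is not reproduced here.

```python
# Sketch of P_ij = Q_j * W_ij, with W_ij = (pages_i / n_i) / p-bar_j.
# All values below are hypothetical.
from collections import defaultdict

articles = [
    # (author, journal, pages, n_authors)
    ("smith", "AER", 20, 2),
    ("smith", "JPE", 30, 1),
    ("jones", "AER", 10, 4),
]
quality = {"AER": 2.5, "JPE": 2.1}  # Q_j: quality index per journal

# Average article length per journal (p-bar_j), computed from the sample itself here.
totals, counts = defaultdict(float), defaultdict(int)
for _author, journal, pages, _n in articles:
    totals[journal] += pages
    counts[journal] += 1
avg_len = {j: totals[j] / counts[j] for j in totals}

# Weighted productivity, summed by author (then by school in the full analysis).
productivity = defaultdict(float)
for author, journal, pages, n in articles:
    n = min(n, 4)                        # 4+ or "et al." coauthors treated as four (footnote 6)
    w = (pages / n) / avg_len[journal]   # W_ij
    productivity[author] += quality[journal] * w

print(dict(productivity))
```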
To rank schools by subject field, the JEL classification system was used. (8) The JEL classification system consists of 18 different subject fields. We eliminated one subject field, M (business administration and business economics, marketing, and accounting). The remaining 17 subject fields are listed in Table 1. The subject field with the greatest number of faculty publications was JEL code D, microeconomics, and the field with the smallest number of faculty publications was JEL code B, methodology and history of economic thought.
3. Results
After gathering and cleaning the data and making the previously mentioned calculations, rankings were computed. The results are presented in Tables 2 and 3.
The second column of Table 2 provides the overall productivity rank of all 129 departments. This ranking was computed by summing $P_{ij}$ for each university, with the top university having the greatest overall productivity sum. Although it is similar to rankings found in Graves, Marchand, and Thompson (1982) and Dusansky and Vernon (1998), some differences are apparent. These differences can be attributed to the difference in time periods analyzed, the inclusion of all articles listed in Econlit rather than a subset, and the use of the SSCI for the quality index.
The third column of Table 2, "Z-Score," indicates the number of standard deviations the school's productivity is above or below the mean productivity. Only 44 of the 129 schools have a positive Z-score, indicating that the distribution of overall productivity is skewed to the right. A noticeable feature of this skewness is that the distinction between schools diminishes as rank declines. For example, the top-ranked school, Harvard, has a Z-score of 5.08, and the fifth-ranked school, Yale, has a Z-score of 2.18, a substantial difference. However, as we move lower in the rankings, the 70th-ranked school, the University of Massachusetts, has a Z-score of -0.43, and the 80th-ranked school, the University of Delaware, has a Z-score of -0.50, a very small difference. The ordinal rankings presented in much of the literature that ranks economics departments miss the fact that, below a relatively small group of top programs, the differences in aggregate productivity become fairly small.
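As an illustration of the Z-score column, the sketch below standardizes a set of made-up productivity sums; with a right-skewed distribution of this kind, only a minority of departments land above the mean.

```python
# Standardizing department productivity sums into Z-scores.
# The productivity values are invented; the paper's actual figures are in Table 2.
import statistics

productivity = {"Dept A": 950.0, "Dept B": 410.0, "Dept C": 120.0, "Dept D": 95.0}

mean = statistics.mean(productivity.values())
sd = statistics.stdev(productivity.values())  # sample standard deviation

z_scores = {dept: (p - mean) / sd for dept, p in productivity.items()}
print(z_scores)  # a long right tail yields few positive, many small negative Z-scores
```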
The fourth column of Table 2, "Per Faculty Rank," shows how each school ranks when its total productivity sum is divided by the number of publishing faculty within the department; it represents the average productivity of publishing faculty in a department and may be the best indicator of the quality of the faculty for potential graduate students. For example, the California Institute of Technology has an overall rank of 38 and an average rank of 7, suggesting that the lower overall rank of the department is greatly influenced by the smaller size of the department and not by the productivity of each publishing faculty member. A student attending this institution would likely obtain an education from "top 10" faculty even though the relatively small department size dampens the overall productivity ranking. The fifth column of Table 2 indicates the overall productivity ranking of departments based on the journal rankings of Laband and Piette (1994) that appeared in the Journal of Economic Literature. Notice that the rankings using the SSCI and those calculated with the Laband and Piette (1994) scores identify the same top 10 schools, and there is only one difference in the top 20 schools.
The sixth column of Table 2, "Top Field," indicates each department's best subject field. Top field was determined by summing each department's productivity within each JEL category, using the first JEL code identified by the author as a guide, and then choosing the subject field with the highest sum. The seventh column of Table 2 shows the HHI for each school. The HHI is typically used to measure the degree of market concentration in a particular industry. In this study, the HHI provides information on how concentrated the research is among the faculty publishing in the department. The HHI is found by squaring each faculty member's share of the department's total productivity and then summing the results:
$$\mathrm{HHI} = \sum_{i=1}^{n} s_i^2,$$
where $s_i$ represents the productivity share of the $i$th faculty member. Values for the index range from $1/n$ to 1, depending on the distribution of publication patterns across the $n$ publishing faculty at the school. A value of 1.0 indicates that all the publications result from a single individual, and the minimum value of $1/n$, which approaches 0 in large departments, occurs when the publications are spread equally among the faculty in the area. (9)
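A minimal sketch of the HHI computation, using invented productivity values, shows how concentration maps to the two extremes of the index:

```python
# HHI = sum of squared productivity shares across a department's publishing faculty.
# The productivity values are hypothetical.

def hhi(faculty_productivity):
    """Herfindahl-Hirschman Index of faculty productivity shares."""
    total = sum(faculty_productivity)
    shares = [p / total for p in faculty_productivity]
    return sum(s * s for s in shares)

print(hhi([5.0, 5.0, 5.0, 5.0]))  # 0.25 (= 1/n): output spread equally across four faculty
print(hhi([20.0]))                # 1.0: all output attributable to one person
print(hhi([18.0, 1.0, 1.0]))      # ~0.815: dominated by a single faculty member
```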
The eighth column of Table 2, "Field Strength Index," shows how well each department does in its top field relative to the top-ranked department in that particular field. For example, Harvard's top field is financial economics, and it is the top-ranked department in financial economics; hence, Harvard has a field strength index of 1.0. Princeton University's top field is microeconomics (JEL code D), but its field strength index in microeconomics is 0.88, indicating that it produces 88% of the research output of the top-ranked school in the microeconomics category. (10) It is important to note that some universities may not formally offer a graduate field in their top field (see footnotes for Table 2). Finally, the last column of Table 2, "Average Ph.D. Graduates (2002-2007)," provides information on the size of each program and is included to give potential applicants additional information. (11) A significant portion of graduate education is obtained from one's classmates, so this figure indicates the activity level of graduate education within a department: a department may have many productive scholars yet not be as actively engaged in graduate education.
Table 3 identifies the field rankings for each of the 129 departments, using the first JEL code identified by the author. All articles were assigned to a field on the assumption that the first JEL code listed represents the primary subject field of the article. Once an article was categorized, the productivity value for each article, $P_{ij}$, was summed by subject and university, yielding a total productivity score within a particular field for a particular department. While this information is useful to potential graduate students and others, it should be noted that not all fields are offered at each university. Thus, potential graduate students should confirm that a field of interest is available at a particular university before applying.
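The field-assignment step amounts to a group-and-sum over (school, first JEL code) pairs. A small sketch with hypothetical records:

```python
# Articles are binned by the first JEL code listed, and their productivity
# values P_ij are summed per (school, field). Records below are invented.
from collections import defaultdict

records = [
    # (school, first JEL code letter, P_ij)
    ("Princeton", "D", 4.2),
    ("Princeton", "F", 1.3),
    ("Harvard", "G", 5.1),
    ("Harvard", "D", 2.2),
]

field_score = defaultdict(float)
for school, jel, p in records:
    field_score[(school, jel)] += p

# Each school's "top field" is the JEL category with the largest sum.
schools = {s for s, _ in field_score}
for s in sorted(schools):
    top = max((f for sc, f in field_score if sc == s),
              key=lambda f: field_score[(s, f)])
    print(s, "top field:", top)
```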
Table 4 identifies the top 20 schools in each field. This table also identifies the number of faculty in each school who publish in the field, regardless of where they publish or whether the journal is listed in the SSCI. Table 4 also shows the HHI for each of the top 20 schools in the field. For example, referring to Carnegie Mellon University, the value of the HHI in general economics and teaching (JEL subject code A) is 0.18, whereas for Cornell University the HHI is 1.0. At Carnegie Mellon, publication in this field is spread out among the eight members of the faculty who publish in this area. At Cornell, however, all the publications listed in the SSCI are attributed to a single faculty member. (Although at Cornell three people have published in this area, only one person has published in journals listed in the SSCI.) As another example, for JEL subject code I (health, education, and welfare), Stanford University has nine faculty members who have published in this area and an HHI of 0.63. Michigan State University is ranked slightly lower than Stanford and has 10 faculty publishing in the area with an HHI of 0.16. If a student wishes to pursue a graduate degree in economics at Stanford University with an emphasis in health, education, and welfare, he or she should realize that the scholarly activity in this area at Stanford is concentrated in a few of the nine people who publish in the area, while at Michigan State University the publications are more evenly distributed across the faculty in this area.
The fifth column in Table 4, "Importance Index," shows the importance of a particular field to a department relative to the department's overall productivity. The importance index simply divides a department's productivity score for a particular field by the department's overall productivity score. Consider Princeton University, which ranks as the top department in JEL subject codes B and F, methodology and history of economic thought and international economics, respectively. For methodology and history of economic thought, Princeton has an importance index of 2%, and for international economics, Princeton has an importance index of 12%. This indicates that methodology and history of economic thought is more likely a spillover category and not the primary focus of the department's overall research agenda.
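Both ratio indices are simple divisions. The sketch below uses invented numbers chosen to mirror the Princeton example above (an importance index of 12% and a field strength of roughly 0.88); none of the values are from the paper's data.

```python
# Importance index: a department's field score divided by its overall score.
# Field strength index: a department's field score divided by the score of
# the top-ranked department in that field. All values are hypothetical.

dept_field_score = 12.0      # productivity in one field
dept_overall_score = 100.0   # overall productivity
top_dept_field_score = 13.6  # field score of that field's #1 department

importance = dept_field_score / dept_overall_score        # 0.12 -> 12% of output
field_strength = dept_field_score / top_dept_field_score  # ~0.88 of the leader

print(f"importance index: {importance:.0%}")
print(f"field strength index: {field_strength:.2f}")
```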
4. Conclusion
The primary objective of this article is to provide information to undergraduate students and their advisers on the research strengths of 129 economics doctoral programs in the United States. We provide both total and average (per capita) research productivity measures for publishing faculty and identify schools that are highly ranked in the many different subject fields of economics.
A noticeable feature of our total productivity rankings is that the distinction between schools diminishes as their rank declines. The data demonstrate that per capita and total productivity measures result in different quality rankings, since total productivity is influenced by both the number of publishing faculty and the productivity of each faculty member. Students searching for graduate schools may benefit from considering both the average quality of the faculty and the total quality of the department.
For students who have a strong interest in a specific subject field of economics, we identify the schools that may best fit the student's desires. As a cautionary note, we provide HHI measures that alert students to the possibility that some departments may have a top reputation in a subject field because of a single, very influential faculty member.
Although this information should be helpful to students applying to graduate school, applicants should be mindful of several things. First, one should apply to many different Ph.D. programs: the loss from a redundant application is much smaller than the loss from not applying to a place that could become one's best offer (or that might help in getting a better deal elsewhere). Second, although a student will benefit from attending a university ranked highly in his or her preferred field, a major consideration should still be the overall quality of the department. There are several benefits to attending a highly ranked school: (i) a student often learns a great deal from his or her classmates, who may be better students; (ii) students may change their preferences during their studies, and our study shows that departments ranked highly overall are strong in many fields; and (iii) students may be more successful in their job search if they graduate from a department that is highly ranked overall. While this article can be a useful starting point, when actually choosing between competing offers, prospective students should check department websites and relevant curricula vitae themselves. (12)
Finally, our work shows that many programs ranked at the top on total productivity measures are able to provide an education that is broad in nature and that gives access to many of the subject fields of economics. For students who are interested in a specific subject field, attending a traditionally top-ranked program will likely not limit the student's ability to conduct future research in an applied discipline. At the same time, students who will not attend a school ranked at the top on total productivity measures will likely still attend a program with actively publishing faculty, and if they choose their programs carefully, it will still be possible to obtain a top-ranked education in one of the subfields of economics.
Assistance in data gathering was provided by Adrienne Strong.
Received January 2007; accepted August 2007.
References
Barbezat, Debra A. 1992. The market for new Ph.D. economists. Journal of Economic Education 23:262-76.
Dusansky, Richard, and Clayton J. Vernon. 1998. Rankings of U.S. economics departments. Journal of Economic Perspectives 12:157-70.
Graves, Philip E., James R. Marchand, and Randel Thompson. 1982. Economics departmental rankings: Research incentives, constraints, and efficiency. American Economic Review 72:1131-41.
Journal of Economic Literature. 1991. Classification system: Old and new categories. Journal of Economic Literature 29:xviii-xxviii.
Laband, David N., and Michael J. Piette. 1994. The relative impacts of economics journals: 1970-1990. Journal of Economic Literature 32:640-66.
McMillen, Daniel P., and Larry D. Singell, Jr. 1994. Gender differences in first jobs for economists. Southern Economic Journal 60:701-14.
Siegfried, John J., and Wendy Stock. 2004. The labor market for new Ph.D. economists in 2002. American Economic Review Papers and Proceedings 94:272-85.
Stock, Wendy, and Richard M. Alston. 2000. The effect of graduate program rank on success in the job market. Journal of Economic Education 31:389-401.
Thursby, Jerry G. 2000. What do we say about ourselves and what does it mean? Yet another look at economics department research. Journal of Economic Literature 38:383-404.
Tschirhart, John. 1989. Ranking economics departments in areas of expertise. Journal of Economic Education 20:199-222.
Therese C. Grijalva * and Clifford Nowell (†)
* Department of Economics, Weber State University, Ogden, UT 84408-3807, USA; E-mail tgrijalva@weber.edu; corresponding author.
(†) Department of Economics, Weber State University, Ogden, UT 84408-3807, USA; E-mail cnowell@weber.edu.
(1) Available at http://www.usnews.com/usnews/edu ... dhumindex_brief.php (July 2007).
(2) Departments offering doctorates in agricultural economics were not included in the analysis.
(3) Available at http://www.albany.edu/econ/eco_phds.html (July 2007).
(4) Available at http://www.petersons.com/graduate_home.asp?path=gr.home (July 2007).
(5) Coauthors listed as "et al." rather than by name in Econlit are not identified specifically by Econlit.
(6) Articles with four or more authors, or articles whose coauthors are not specifically identified (i.e., et al.), are treated as having four authors.
(7) An alternative to using impact factors is to use total citations per journal per year. We chose to use impact factors to be consistent with past research (e.g., see Tschirhart 1989).
(8) In 1991, the JEL modified its classification system. We followed the JEL recommendations in mapping pre-1991 subject codes to post-1991 subject codes (Journal of Economic Literature 1991).
(9) It should be noted that in the case of an HHI of 1.0, more than one faculty member may publish in the area, yet because the other faculty members' publications may not be indexed in the SSCI, they are not recognized in our data as contributing to the department's research productivity.
(10) The field strength index measures only the department's relative productivity in its top field. It is possible that a department has a higher field strength rating in a field other than its top field.
(11) These data were acquired by calling and e-mailing the graduate advisers or the department administrators at each university. In some cases, multiple attempts were made to contact the department and acquire this information.
(12) We thank an anonymous referee for pointing out these cautionary notes.