Taking a Snapshot: Four Bibliometric Indicators to Track Engineering Education Research Evolution

In recent years, as engineering education research (EER) has evolved as an area of study, questions of its disciplinary status, global reach and diffusion of educational innovation have been raised. Bibliometric analysis, mainly employing author affiliation data and citation analysis, has been shown to be effective in gathering relevant data on these issues. In this paper, the authors broaden the scope of previous analyses by introducing reference discipline as an indicator. The study presents data on 169 articles published in seven EER journals in 2011 based on the use of four indicators: citation analysis, reference discipline, author geographical location and author disciplinary field data. In addition, demonstrating the value of this approach in establishing benchmarks for longitudinal research, the citation analysis data for the 7 journals in 2011 are compared with a similar sample from the same journals in 2009. The portrait that emerges is a rich and complex one, and it shows the existence of disciplinary silos and, to a lesser extent, geographical silos.


I. INTRODUCTION
This work uses bibliometric analysis of publications to provide data to track cross-fertilization of engineering education research (EER) between discipline areas [1] and across international borders [2,3], and it contributes to the study of indicators for the maturity of EER as a field of research [4,5]. Accordingly, 2011 data for four indicators are provided: citations, reference discipline, author geographical location and author disciplinary field. Citation analysis provides an indication of which EER journals the published authors have consulted, geographical location helps build a picture of global reach for EER as a discipline, while reference discipline and author discipline data provide information on disciplines and scholars outside the immediate domain of EER that have informed engineering education researchers. The data from 169 articles published in 7 engineering education journals in 2011 are presented and discussed. In addition, the value of this approach for longitudinal research is illustrated by comparing the 2011 citation analysis values with those for the same 7 journals in 2009.

A. Previous work using bibliometric analysis
A wide-ranging study by Jesiek and colleagues [3] based on analysis of over 800 articles presenting empirical data on a large number of US, European and Australian EER publications between 2005 and 2008 identified 38 categories of research. A contrasting approach was taken by De Graaff and Kolmos [6] who used 8 categories of research papers to classify the papers in two volumes of the European Journal of Engineering Education (EJEE). A similar but more detailed approach applied to publications in computing education used a multi-dimensional taxonomy to classify research papers. Malmi and colleagues [7] used 7 dimensions whereas Simon [8] in an earlier paper had used 4.
Whereas all three approaches mentioned above imply a degree of subjective categorization and hence involve a process of cross-checking or inter-rater standardization, citation analysis has the advantage of being more objective. Wankat [9,10] used this method to analyze the citations in all of the 2009 papers in 9 US engineering education journals and proceedings (1,721 papers in all). He noted that the narrow range of sources cited in papers published in the disciplinary engineering journals suggested a silo effect (i.e. groups of researchers who do not talk to or read papers by researchers outside their group) that probably limits cross-fertilization within engineering education research, and could help explain the slow rate of diffusion of proven engineering education innovations. A comparison of two leading EER journals over a 40-year period also showed indications of a silo effect [11].
In addition to the human-curated studies mentioned above, recent developments in big-data analysis have led to a growing number of machine-curated bibliometric studies [12,13] which demonstrate the potential of computational analysis in the field of EER. The big data analysis confirmed the presence of silos [13]. The authors see these two approaches as being complementary in that the two forms of data curation can bring out different perspectives on the evolution of the field.

B. Research question
The proposed research question, "How can bibliometric analysis assist in tracking the evolution of EER?", falls into the category Shavelson and Towne [14] classify as "description: what is happening?" To answer this question, the authors propose a set of four indicators, illustrate their use across a range of 7 journals in 2011 to provide a snapshot of four dimensions of EER journal publications in that year, and draw preliminary conclusions based on the data. The potential of these indicators for future longitudinal research is shown by comparing citation analysis data from comparable samples in 2009 and 2011.

C. Theoretical framework
Whereas citations and author analysis have been used in a number of previous studies, reference discipline data has, to our knowledge, not previously been used in EER. It has, however, been frequently employed over the past two decades in other emergent disciplinary fields such as Information Systems and Enterprise Engineering research [15,16,17].

Figure 1. Evolution of a discipline from a reference discipline perspective [15]

Reference discipline analysis is the study of the disciplines referenced and cited in research papers to track the developing maturity of a field of research. It assumes that an emerging research area will depend largely on existing disciplines in its early stages, will gradually generate more internal theoretical concepts, frameworks and models, and will finally, in a mature stage, serve as a reference discipline for other fields. As the assumptions underlying Figure 1 have not been definitively confirmed by empirical data [15], in this paper the reference discipline approach is used simply as an indicator of interdisciplinarity which could provide data on the evolution of EER if collected longitudinally.

II. METHODOLOGY
A total of 169 articles published in 7 engineering education journals in 2011 (listed in Table 1) were analyzed for reference discipline and the authors' country affiliation. The journals were selected to obtain a mix of US-based (JEE and AEE) and non-US general engineering education journals (AJEE, EJEE and IJEE), and included two disciplinary engineering education journals, CEE and IEEE Trans Educ. Given the variation in the number of articles per issue in each journal, and to get a broad snapshot of each journal's content in 2011, the authors took broadly similar numbers of papers for each. As the number of citations per paper was lower in CEE, a larger sample of articles was chosen for that journal. For the reference discipline analysis, the percentage of reference disciplines per paper was calculated.
For the reference discipline analysis, each journal article was classified independently by two authors and the results were later discussed until the classifications were consensual. The classification system required the articles' authors to have explicitly shown that the discipline underpinned their research. Such references were normally found in the background or methodology sections of the papers and were frequently mentioned as keywords. Disciplines that appeared only in the bibliographic reference list were not considered reference disciplines. For the citation, author affiliation and disciplinary field analyses, objective data were available in the journals, so inter-rater consistency is not an issue.
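The per-journal metric described above (mean number of reference disciplines per paper, expressed as a percentage) can be sketched as follows. This is a minimal illustration only; the journal names and counts below are invented placeholders, not the study's actual dataset.

```python
from collections import defaultdict

# Each record: (journal, number of reference disciplines identified in one article).
# Illustrative data only -- not the study's actual classifications.
articles = [
    ("JEE", 2), ("JEE", 1), ("JEE", 0),
    ("EJEE", 1), ("EJEE", 1), ("EJEE", 0),
    ("CEE", 0), ("CEE", 0),
]

def reference_disciplines_per_paper(records):
    """Return, per journal, the mean number of reference disciplines per
    paper expressed as a percentage (100% = one discipline per paper)."""
    by_journal = defaultdict(list)
    for journal, n in records:
        by_journal[journal].append(n)
    return {j: 100.0 * sum(ns) / len(ns) for j, ns in by_journal.items()}

print(reference_disciplines_per_paper(articles))
```

On this reading of the metric, a journal scoring 104% (as JEE does in the study) simply averages slightly more than one identified reference discipline per paper.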

III. RESULTS

A. Reference Discipline Analysis
47 reference disciplines were identified in the 169 engineering education articles analyzed (Table 2). The two disciplinary journals had very few examples of reference disciplines (CEE 3% reference disciplines/paper, IEEE ToE none); in three of the general EER journals approximately a quarter of the papers analyzed included reference disciplines (IJEE 25%, AEE 27% and AJEE 29% respectively), while the value for EJEE was 70%. JEE stands out at 104%, i.e. a little more than one reference discipline per paper on average. As might be expected, the existing fields of Learning Theory and Psychology were the most drawn upon (Table 3). Table 4, which gives a breakdown of the subfields of the reference disciplines summarized in Table 3 with respect to each journal, shows that the research published in these samples draws upon a wide range of interdisciplinary sources.

B. Author Disciplinary Field Analysis
Given that a number of articles referred to reference disciplines somewhat removed from the locus of engineering education, the authors were interested in the strategies EER scholars adopted to integrate these other disciplines.
The affiliation information in the journals allowed determination of the number of articles written by EER scholars collaborating with colleagues from other fields; this is shown in Table 5. JEE is particularly notable in that, for the issues sampled, 41% of its authors appeared to be from other disciplinary areas based on their affiliations.

C. Geographical Analysis
The geographical locations of the authors are summarized in Table 6. Of a total of 493 authors, 50.5% are from the US, 16.4% from Australia, 10.3% from Spain and 22.8% from 29 other countries. In 2011, AEE, AJEE and JEE were mainly local in terms of authors' geographical location, whereas EJEE and IEEE ToE had quite broad ranges of participation. IJEE and CEE fell between these two extremes.

D. Citation Analysis
The 169 papers studied contained a total of 5,695 citations, with the AEE papers having an average of 36 references per paper while the CEE papers had 16. First, the authors checked the number of times the 9 sources listed by Wankat [9,10] were cited in the sampled papers. In addition, a number of further indicators were included to help capture more details of the prior studies used by authors in the journals (Table 7). Thus, additional categories were created to count cited references whose titles contained some form of the following terms: Chemical Education; Physics & Computing Education; Math Education; Psychology. For example, the first category includes books or papers with "chemistry" or "chemical" in their title.
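The title-keyword categories described above can be sketched as a simple substring classifier. The category names follow the text, but the exact keyword lists are assumptions for illustration; the study's actual matching rules may differ.

```python
# Hypothetical keyword buckets mirroring the categories described in the text.
# Keyword lists are illustrative assumptions, not the study's actual rules.
CATEGORIES = {
    "Chemical Education": ("chemical", "chemistry"),
    "Physics & Computing Education": ("physics", "computing", "computer science"),
    "Math Education": ("math", "mathematics"),
    "Psychology": ("psycholog",),  # stem matches psychology/psychological
}

def categorize_citation(title):
    """Return the first category whose keywords appear in the cited
    reference's title (case-insensitive), or None if nothing matches."""
    lower = title.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in lower for keyword in keywords):
            return category
    return None

print(categorize_citation("Journal of Chemical Education"))
```

A naive substring match like this can over-match (e.g. "math" inside "aftermath"), which is one reason the study's human-curated counts remain valuable as a check.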

IV. INTERPRETATION OF RESULTS
What do these data mean? The picture of engineering education research that emerges from the citation data and bibliometric analysis is consistent with, but does not prove, the existence of silos: disciplinary silos (e.g., chemical engineering education, computer education, engineering education research) and, to a lesser extent, geographical silos (e.g., US, Europe, Australia). Differing missions among the journals studied may explain part of the lack of cross-citation. The presence of geographical silos suggests that previous assessments of EER globalization [2,3] may have been over-optimistic.
A 2004 CEE readership survey and a 2012 survey of chemical engineering educators [19] found that most of these chemical engineering educators read CEE, PRISM, and JEE (in that order) fairly regularly, but that few read any other engineering education journal on a regular basis. Since one would expect engineering educators who read a paper that significantly affected their work to cite that paper, the citation data tend to confirm the conclusion that few engineering educators regularly read engineering education journals or proceedings outside their discipline. If true, this lack of reading EER papers from other engineering disciplines helps to explain the slow rate of diffusion, dissemination and propagation of educational innovations.
The reference discipline and author disciplinary field data suggest a considerable range in scholarly involvement in interdisciplinary research. Journals like JEE and EJEE reflect significant collaboration between EER scholars and those of other disciplines, allied with research practices that frequently draw upon insights from outside EER. The research of authors in the more discipline-oriented journals, on the other hand, appears less informed by non-EER disciplines, which again supports the inference that a silo effect may be playing a part in research design and the dissemination of findings.
Finally, it is suggested that the snapshot presented here shows the value of using these four indicators to capture the evolution of EER as a field of research, and that longitudinal application to larger samples would prove valuable as a way of monitoring the persistence of disciplinary and geographical silos and of tracking the development of EER as an interdisciplinary field of scholarly activity.