Evaluation and Program Planning 32 (2009) 168–177


Relative performance of academic departments using DEA with sensitivity analysis

Preeti Tyagi a, Shiv Prasad Yadav a, S.P. Singh b

a Department of Mathematics, IIT Roorkee - 247667, India
b Department of Humanities and Social Sciences, IIT Roorkee - 247667, India

Article history: Received 25 January 2008; Received in revised form 30 September 2008; Accepted 11 October 2008

Abstract

The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of these opportunities and to counter the challenges. One such challenge is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper evaluates the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, the UK and Australia, but, to the best of our knowledge, this is the first time it has been applied in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.

Keywords: DEA; Performance assessment; Efficiency; India

1. Introduction

India's development through education and research in the scientific and technical disciplines has achieved global reach and stature. Among the best technical institutes in India, the first name that comes to mind is the group of institutions called the Indian Institutes of Technology. From Kharagpur in 1950 to Roorkee in 2001, there are now seven potentially world-class institutions in this family, and they enjoy international recognition. One member of this group is the Indian Institute of Technology Roorkee, the successor of the University of Roorkee, chartered in 1949, and of the Roorkee College founded in 1847, which was later renamed the Thomason College of Civil Engineering. It is worth mentioning that the Roorkee College was the first engineering college established in the British Empire. The institute has a glorious past of over 160 years and has been acclaimed for its excellence in education, research and training. The institute runs three types of academic programs.


These are Undergraduate (UG) Programs (Bachelor of Technology degrees in different disciplines), Postgraduate (PG) Programs (Master of Technology degrees in different disciplines and a Master of Business Administration degree) and Doctoral Programs (Ph.D.). Nineteen academic departments offer these programs.

The purpose of this paper is to assess the relative performance of these departments based on multiple criteria. The second objective is to measure how efficiently the departments work within the institute and to identify efficient and inefficient performers. Thirdly, we apply sensitivity analysis to test the robustness of the results and to assess the performance of the departments in different activities, such as research and teaching.

In the literature, several approaches are applied for measuring efficiency, such as performance indicators, parametric methods (e.g., the ordinary least squares method and the stochastic frontier method) and non-parametric methods (e.g., DEA and Free Disposal Hull). Each method has its strengths and weaknesses. Ratio-style performance indicators work well only when a single input and a single output are involved; in a multi-input, multi-output context they cannot draw the right inferences. Parametric methods require an explicit functional form for the technology as well as a distribution for the inefficiency term, whereas non-parametric methods do not require any functional form and work well with multiple inputs and outputs. This paper applies the data envelopment analysis (DEA) methodology, as it is the most suitable methodology for measuring the performance of non-profit organizations such as academic departments.


DEA is particularly appropriate when the researcher is interested in investigating the efficiency of entities that convert multiple inputs into multiple outputs; here, such an entity is an academic department.

The paper is organized as follows. Section 2 comprises the methodology and a brief survey of the literature on DEA in the education sector. Section 3 describes the application procedure of DEA, including a description of the inputs, outputs and models used in the paper. Section 4 gives information about data collection and computation. The overall performance of the departments is discussed in Section 5. Section 6 explains the teaching and research performance of the departments and the robustness of the estimated efficiency scores using sensitivity analysis; efficiency measurement of the engineering departments, estimated separately, is also given in this section. Conclusions are given in the last section.

2. Methodology

DEA was developed by Charnes, Cooper, and Rhodes (1978) and extended by Banker, Charnes, and Cooper (1984). It is used for measuring efficiency in cases where multiple input and output factors are observed and it is not possible to turn these into one aggregate input or output factor. The DEA methodology is especially suitable for evaluating the efficiency of non-profit entities that operate outside the market, since for them performance indicators such as income and profitability do not work satisfactorily.

DEA provides the comparative efficiency of the units being evaluated. The units analyzed are called Decision Making Units (DMUs). The performance of DMUs is assessed using the concept of efficiency, or productivity, which is the ratio of total outputs to total inputs. Efficiencies are estimated relative to the best performing DMU (or DMUs). The best-performing DMU is assigned an efficiency score of unity, and the performance of the other DMUs varies between 0 and 1 relative to this best performance. More detailed theoretical introductions to DEA may be found in Coelli, Prasada Rao, and Battese (1998), Cooper, Seiford, and Zhu (2004), Ramanathan (2003) and Thanassoulis (2001). The mathematical formulation of DEA is given in Appendix A.

DEA provides some useful information to policy makers that may help to improve the performance of a DMU. We can find input slacks (quantity of excess resource used) and/or output slacks (shortfall in output produced). DEA also identifies the reference set, also known as the peer group. A peer group contains two or more efficient DMUs for an inefficient DMU; thus, an efficient DMU may be a peer for one or more inefficient DMUs. A DMU that appears frequently as a peer for inefficient DMUs, i.e., has a high peer count, is considered an example of good performance.

2.1. Literature survey

In recent years, several studies have been undertaken to analyze the efficiency of academic departments in universities. Each study differs in its scope and meaning. Among them, some important studies are briefly reviewed below.

Bessent, Bessent, Charnes, Cooper, and Thorogood (1983) used DEA to measure the relative efficiency of education programs in a community college in the USA. Educational programs (DMUs) were assessed on outputs such as revenue from the state government, the number of students completing the program, and employer satisfaction with the training of students; these outputs represented significant planning objectives. Inputs included student contact hours, the number of full-time equivalent instructors, the square feet of facilities for each program, and direct instructional expenditure.


The authors demonstrated how DEA could be used in improving programs, terminating programs, initiating new programs or discontinuing inefficient ones. Tomkins and Green (1988) studied the overall efficiency of UK university accounting departments. They ran a series of six efficiency models of varying complexity, where staff numbers were an input and student numbers an output. The results indicated that different configurations of multiple inputs and outputs produced substantially stable efficiency scores. Beasley (1990) studied the productive efficiency of Chemistry and Physics departments in the UK, where financial variables such as research income and expenditure were treated as inputs; outputs consisted of the numbers of undergraduate and postgraduate students as well as research ratings. In a follow-up study, Beasley (1995) analyzed the same data set in an effort to determine the research and teaching efficiencies jointly, using weight restrictions. Johnes and Johnes (1993, 1995) explored various models for measuring the technical efficiency of economics departments in the UK in terms of research output; they discussed the potential problems in choosing inputs and outputs and also provided a good guide to interpreting efficiency scores. Stern, Mehrez, and Barboy (1994) examined the relative efficiency of 21 academic departments at Ben-Gurion University, Israel. Operating costs and salaries were taken as inputs, while grants, publications, graduate students and contact hours were used as outputs; the analysis suggested that the operating cost should be reduced in 10 departments. Nunamaker (1985) investigated the effect of changing the variable mix on DEA scores. Arcelus and Coleman (1997) investigated how a particular fixed budget formula affected the input/output structure of academic units of the University of New Brunswick, Canada.

3. Research design

3.1. Selection of DMUs

The first step in the research design is to decide which DMUs are to be compared. Two factors influence the selection of DMUs for a study: homogeneity and the number of DMUs. The DMUs must be homogeneous units: they should perform the same tasks and have similar objectives, and the inputs and outputs characterizing their performance should be identical, except for differences in intensity or magnitude. We select the departments of IIT Roorkee for our study. They are homogeneous because they perform the same tasks and have similar objectives; that is, all the departments use academic staff, non-academic staff and operating costs for teaching and research purposes.

The number of DMUs to be compared depends upon the objective of the study and on the number of homogeneous units whose performance in practice has to be compared. However, some considerations have been specified for the selection of the number of DMUs: the number of DMUs is expected to be larger than the product of the number of inputs and the number of outputs in order to discriminate effectively between efficient and inefficient DMUs (Avkiran, 2001; Darrat, Topuz, & Yousef, 2002). We take three inputs and three outputs with 19 academic departments in our study, so this condition is satisfied (19 > 3 × 3 = 9). In the literature, many authors (e.g., Arcelus & Coleman, 1997; Beasley, 1990; Johnes & Johnes, 1993, 1995; Stern et al., 1994; Tomkins & Green, 1988) have evaluated performance taking academic departments as DMUs.

3.2. Selection of inputs and outputs

The next step in the research design is to determine the inputs and outputs for the DMUs to be compared.



The criteria for selecting these inputs and outputs are quite subjective; there is no specific rule for determining them. Normally, inputs are defined as resources utilized by the DMUs, or conditions affecting their performance, while outputs are the outcomes, produced goods and services, or benefits generated as a result of the operation of the DMUs. Input–output sets for measuring performance in the education sector have often been criticized for being inadequate for analyzing efficiency, and opinions may vary from person to person about a particular input–output set. Clearly, a universally acceptable set for educational programs does not exist; the set can therefore be adapted according to the requirements of the study. In this paper, we aim to measure the overall performance of the departments, so we try to include each activity of the departments in the choice of inputs and outputs. Performance on teaching and research activities is also measured separately; however, no new parameters are used for these assessments, as the parameters are recombined from the same set according to the requirements of the study. We choose the following inputs and outputs to assess the performance of the departments.

3.2.1. Inputs

3.2.1.1. Academic staff. This is the main human resource used by all departments for teaching and research activities. The clear argument is that the institute employs people to produce enrolments and generate research outputs, so selecting it as an input is relevant for the study. The number of academic staff is also considered as an input in Arcelus and Coleman (1997), Avkiran (2001), Johnes and Johnes (1993, 1995) and Tomkins and Green (1988).

3.2.1.2. Non-academic staff. This input is also a human resource. There are 19 academic departments in the institute, each with its own building, office, library and laboratories; each department therefore has its own non-academic staff who work for the academic staff and students. This is also used as an input in Arcelus and Coleman (1997) and Avkiran (2001).

3.2.1.3. Departmental operating cost. Each department has a certain amount of funds intended for the development of its activities. This financial resource, along with the two inputs stated above, is considered in our study. The allocation of operating cost to each department is based on the number of students enrolled in each program and the research papers published in conference proceedings and journals, so it is related to the output activities of the department. This input is also used as a variable in Arcelus and Coleman (1997), Abbott and Doucouliagos (2003), Beasley (1990) and Bessent et al. (1983).

3.2.2. Outputs

3.2.2.1. Total enrolled students. The primary function of each department is teaching. It is always useful to identify parameters that provide information about the quality and quantity of the activities of a department. "Total Enrolled Students" is such an output; it represents the quantity and quality of teaching of the departments and is directly related to the staff. In each department of the institute, three programs (UG, PG and Ph.D.) are running, so we select all students at the UG, PG and Ph.D. levels after making them equivalent on the basis of educational background, the number of years required for completion of the degree and the credits completed for the degree. Using these criteria, each UG-level student is counted as 1, each PG-level student as 1.3 and each Ph.D.-level student as 2. Some departments also teach the students of other departments, so we include these students in the total enrolments as well. By this, we aim to measure the whole teaching contribution of the department, which comes out as the output variable "Total Enrolled Students":

Total enrolled students = Number of UG students + 1.3 (Number of PG students) + 2 (Number of Ph.D. students)

The number of students also appears as an output in Arcelus and Coleman (1997), Abbott and Doucouliagos (2003), Avkiran (2001) and Beasley (1990).

3.2.2.2. Progress. There are two types of academic departments: Science and Engineering departments. Science departments have PG and Ph.D. students only, while Engineering departments have UG, PG and Ph.D. students. Combining the two activities (teaching and research), we get two parameters for each department: (a) the number of students placed in different jobs; and (b) the number of Ph.D. degrees awarded. Combining these two parameters, we form a new output variable, "Progress". Abbott and Doucouliagos (2003) used a similar type of output variable, "EFTS", which contains the number of postgraduate and graduate degrees conferred.

3.2.2.3. Research index. Research is one of the principal activities of a department. When measuring research output, we have to consider several parameters together, because separately each indicator has its shortcomings. The selected parameters comprise the most significant aspects of the research activity of a department. This index includes the number of publications in journals and conference proceedings, the number of research projects undertaken by the department, and the number of conferences organized and attended by the faculty members of the department. Here, we aim to summarize each research-related activity that contributes to the performance of the departments. Opinions differ on the quality of research work: some may value the publication of a paper in a journal, while others may give more weight to a research project. Based on the general opinion of the institute's administrators and academic staff, we form the "Research Index" as

Research index = Number of papers in journals of the department + 0.5 (Number of papers in conferences for the department) + 1.2 (Number of research projects taken by the department) + 0.7 (Number of conferences organized by the department) + 0.3 (Number of conferences attended by faculty members of the department)
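To make the weighting concrete, the following is a minimal sketch (our own illustration with made-up counts, not data from the paper) of how the two composite output indicators defined above are computed:

```python
# Composite output indicators with the weights from Sections 3.2.2.1 and
# 3.2.2.3. The example counts below are illustrative only.
def total_enrolled_students(ug, pg, phd):
    return ug + 1.3 * pg + 2.0 * phd

def research_index(journal_papers, conference_papers, projects,
                   conferences_organized, conferences_attended):
    return (journal_papers
            + 0.5 * conference_papers
            + 1.2 * projects
            + 0.7 * conferences_organized
            + 0.3 * conferences_attended)

print(total_enrolled_students(ug=120, pg=60, phd=25))  # 120 + 78 + 50 = 248.0
print(research_index(40, 30, 10, 2, 15))               # 40 + 15 + 12 + 1.4 + 4.5 = 72.9
```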

Correlation coefficients among these inputs and outputs are given in Appendix B.

3.3. Choice of DEA model

A variety of DEA models have been discussed in the DEA literature (Coelli, Prasada Rao, & Battese, 1998; Ramanathan, 2003; Thanassoulis, 2001). DEA models can generally take two forms: input oriented or output oriented. Input orientation (also known as input minimization or contraction) aims to minimize inputs while satisfying at least the given output levels.




On the other hand, output orientation (also known as output maximization or expansion) attempts to maximize outputs without requiring more of any of the observed input values. In applications that involve inflexible inputs (inputs not fully under the control of the DMU), the output-based formulation is more appropriate. However, in applications where outputs are decided by the goals of the management rather than by extracting the best possible performance from the DMUs, the input-based formulation may be appropriate (Avkiran, 2001; Ramanathan, 2003). In our study, there are three inputs: "Academic Staff", "Non-academic Staff" and "Departmental Operating Cost". All of these are inflexible for a given year. At the same time, the output variables "Progress" and "Research Index" cannot be decided in advance; only the output variable "Total Enrolled Students" can be fixed, as the number of students admitted to a department is fixed. So the output-oriented model is appropriate for our analysis.

Another analysis option in DEA is the choice between constant returns to scale (CRS) and variable returns to scale (VRS). CRS assumes that there is no significant relation between the scale of operations and efficiency; that is, large departments are just as efficient as small ones in converting inputs to outputs. VRS, on the other hand, means that a rise in inputs is expected to result in a disproportionate rise in outputs; it is preferred when a significant correlation between DMU size and efficiency is found in a large sample. The CRS efficiency score represents technical efficiency (TE), which measures inefficiencies due to the input/output configuration as well as the size of operations. The VRS efficiency score represents pure technical efficiency (PTE), which is a measure of efficiency net of scale effects. It is thus possible to decompose TE into PTE and scale efficiency (SE), where SE is obtained by dividing TE by PTE. The number of academic and non-academic staff can represent the size of a department. In our study, we use both the CRS and VRS approaches only for the overall performance assessment model, while for all the remaining models we use the VRS option.
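As a small worked example of this decomposition (our own illustration, using the Civil Engineering scores reported later in Table 1):

\[
\mathrm{TE} = \mathrm{PTE} \times \mathrm{SE}
\quad\Longrightarrow\quad
\mathrm{SE} = \frac{\mathrm{TE}}{\mathrm{PTE}} = \frac{0.7439}{1.0000} = 0.7439 .
\]

A department can therefore be fully efficient in the pure technical sense (PTE = 1) and still have a low TE score, with the whole shortfall attributed to scale.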

4. Data and computation

Data for this study are collected from the Annual Report of the Institute for the year 2003–2004. Some information is collected from the Dean of Faculty Affairs and the Planning Office of the Institute. DEA Solver software (Cooper, Tone, & Seiford, 1999) is used for every computation related to the DEA technique.

5. Overall performance assessment of the departments

For the overall assessment of the departments, the three inputs and three outputs described in Sections 3.2.1 and 3.2.2 have been taken. The analysis has been carried out using both the CRS and VRS assumptions with output orientation. The results are shown in Table 1. The investigation shows that out of 19 departments, only 7, namely, Chemistry, Earth Sciences, Humanities and Social Sciences (HSS), Management Studies, Mathematics, Paper Technology and Physics, are technically efficient. All the remaining departments are relatively less technically efficient, as they have CRS scores less than one. The average efficiency score under the CRS assumption is 0.8682. Seven departments, Architecture and Planning (ARP), Biotechnology, Chemical Engineering, Civil Engineering, Hydrology, Metallurgical and Materials Engineering (MME) and the Water Resources Development Training Centre (WRDTC), have scored lower than this average. The lowest efficiency score (0.3893) is calculated for ARP, so the overall performance of the ARP department is very poor. The Chemistry department appears as a peer for the largest number of departments, so it is the most technically efficient department.

The VRS efficiency score is one for all departments except ARP, Biotechnology, Chemical Engineering, Hydrology, MME and WRDTC, so these six departments are inefficient. Although 13 departments form the VRS efficient frontier, some of them, viz. Electrical Engineering, Earth Sciences and Mechanical & Industrial Engineering (MID), cannot be role models for the inefficient departments, as they do not appear in the reference set of any inefficient department. It is interesting to note that several departments that attain relatively low CRS scores have VRS scores equal to one. For example, the Civil Engineering department has a relatively low CRS score (0.7439) but obtains a unit VRS score. This shows that the department converts its inputs into outputs efficiently, and that its technical efficiency is low mainly due to its disadvantageous size. Again, the Chemistry department has the highest peer count; therefore it is the most pure technically efficient department.

Table 1
Efficiency scores based on overall performance model.

Dept. code | Dept. name | CCR model: TE (CRS score) | CCR model: peer depts. | CCR model: peer count | BCC model: PTE (VRS score) | BCC model: peer depts. | BCC model: peer count | S.E.
D1 | AHEC | 0.8827 | D5, D11, D18 | – | 1 | – | 1 | 0.8827
D2 | ARP | 0.3893 | D5, D13, D14 | – | 0.3936 | D5, D11, D13, D14 | – | 0.9891
D3 | Biotechnology | 0.6148 | D5, D11, D13, D14 | – | 0.6699 | D5, D11, D14 | – | 0.9177
D4 | Chemical | 0.8445 | D5, D13, D17 | – | 0.8452 | D5, D9, D13, D17 | – | 0.9992
D5 | Chemistry | 1 | – | 10 | 1 | – | 5 | 1
D6 | Civil | 0.7439 | D5, D7, D13 | – | 1 | – | 0 | 0.7439
D7 | Earth Science | 1 | – | 5 | 1 | – | 0 | 1
D8 | Earthquake | 0.9003 | D11, D18 | – | 1 | – | 0 | 0.9003
D9 | Electrical | 0.9890 | D5, D13, D14, D17 | – | 1 | – | 0 | 0.9890
D10 | ECE | 0.9299 | D13, D14 | – | 1 | – | 0 | 0.9299
D11 | HSS | 1 | – | 3 | 1 | – | 2 | 1
D12 | Hydrology | 0.8412 | D5, D7, D13 | – | 0.9845 | D1, D5, D11, D13 | – | 0.8544
D13 | Management | 1 | – | 9 | 1 | – | 3 | 1
D14 | Mathematics | 1 | – | 4 | 1 | – | 1 | 1
D15 | MID | 0.9319 | D5, D7, D13 | – | 1 | – | 0 | 0.9319
D16 | MME | 0.8331 | D5, D7, D13 | – | 0.8655 | D5, D10, D15 | – | 0.9626
D17 | Paper Tech | 1 | – | 2 | 1 | – | 0 | 1
D18 | Physics | 1 | – | 3 | 1 | – | 0 | 1
D19 | WRDTC | 0.5950 | D5, D7, D18 | – | 0.6397 | D1, D5, D13 | – | 0.9301
Average | | 0.8682 | | | 0.9157 | | | 0.9489
S.D. | | 0.1718 | | | 0.1644 | | | 0.6645



Seven departments, viz., Chemistry, Earth Sciences, HSS, Management Studies, Mathematics, Paper Technology and Physics, are scale efficient, as their SE scores are equal to one. They are operating at the optimal scale, and there is no adverse impact of scale size on their performance. The remaining 12 departments are scale inefficient; these departments are either too big or too small relative to their optimal size. The lowest SE is calculated for the Civil Engineering department.

Up to this point, we have assessed only the overall performance of the departments. As the departments are involved in both research and teaching activities, it is interesting to observe their performance in these activities separately. In addition, a key feature of DEA is that the efficient frontier is formed by the best performing units, in contrast to techniques such as regression analysis. This feature can be a source of problems in DEA because there is no direct way of assessing whether a DMU's deviation from the frontier is statistically significant or not. Hence, it is necessary to check the robustness of the DEA results. To test the stability of the DEA results and to assess the performance of the departments against different criteria (such as which department is better for "Placements", for "Ph.D. degrees awarded", or for "research projects"), we use sensitivity analysis.

6. Performance assessment of departments with sensitivity analysis

Under the DEA technique, it is possible for a department to be rated efficient if it achieves exceptionally good results in terms of one output but performs below average in terms of the other outputs. However, such an unusual department will not be a peer for many inefficient departments. Thus, if a department is initially identified as efficient by DEA, a supplementary sensitivity analysis can be conducted by checking the number of inefficient departments for which it is a peer. If this number is high, the department is genuinely efficient; if it is low, the efficiency of the department should be viewed with caution, and other evidence is needed to establish the superiority of its performance. Another way of testing the robustness of DEA results is to conduct the analysis omitting an input or output and then study the results (Ramanathan, 2003, p. 176). We use both of these testing methods in our study.
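The peer-count check described above can be tallied mechanically; the sketch below (our own illustration, with reference sets written in the spirit of Table 1 rather than copied from it) shows the idea:

```python
# Peer-count tally: an efficient department that appears in many inefficient
# departments' reference sets is treated as genuinely efficient.
# The reference sets below are illustrative, not the paper's exact data.
from collections import Counter

reference_sets = {  # inefficient department -> its efficient peers
    "ARP": ["Chemistry", "HSS", "Management", "Mathematics"],
    "Biotechnology": ["Chemistry", "HSS", "Mathematics"],
    "Hydrology": ["AHEC", "Chemistry", "HSS", "Management"],
}
peer_counts = Counter(p for peers in reference_sets.values() for p in peers)
print(peer_counts.most_common())  # high counts suggest robust efficiency
```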

6.1. Assessment of research performance

In this section, we assess the performance of all the departments based on research outputs (the number of Ph.D. degrees awarded, the number of research projects and the "Research Index"). To describe the more specific research-related performance of the departments, we split the output variable "Research Index" into its components. Five models are developed for this analysis, and the results are given in Table 2.

6.1.1. Model-1

For this model, three inputs, "Academic Staff", "Non-academic Staff" and "Departmental Operating Cost", and two outputs, "PhD" and "Research Index", have been considered. "PhD" includes the number of students enrolled for research and the number of Ph.D. degrees awarded in the year; "Research Index" is explained in Section 3.2.2. DEA has been carried out using the VRS output-oriented model. The analysis shows that, out of the 19 departments, 11 are efficient, as they obtain a VRS score equal to one. Among these 11 efficient departments, 9 form the peer group. These peers and their corresponding frequencies as peers for other departments are AHEC (1), Biotechnology (1), Chemistry (7), Civil (1), Earth Sciences (2), Hydrology (2), MID (1) and Management Studies (3). Three departments, namely, Earthquake Engineering, HSS and Physics, which are included in the formation of the VRS frontier, do not appear in the peer group of any inefficient department. A higher peer count for a department represents the extent of its robustness compared with the other efficient departments. We find that the Chemistry department has the highest peer count, followed by Management Studies, Earth Sciences and Hydrology. These departments are therefore good examples for the inefficient departments to follow in order to improve their performance on the outputs "PhD" and "Research Index".

DEA also suggests the sources of inefficiency and the changes in inputs and outputs needed for inefficient departments to become efficient.

Table 2
Comparison of VRS efficiency scores of research performance models.

Dept. code | Dept. name | Model-1 | Model-2 | Model-3a | Model-3b | Model-4
D1 | AHEC | 1 (1)a | 1 | 1 (1)a | 1 | 1 (0)a
D2 | ARP | 0.3901 | 0.3750 | 0.3750 | 0.3750 | 0.3750
D3 | Biotechnology | 1 (1)a | 1 (4)a | 1 (2)a | 1 (0)a | 1 (3)a
D4 | Chemical | 0.7316 | 0.7273 | 0.7273 | 0.7273 | 0.7273
D5 | Chemistry | 1 (7)a | 1 (11)a | 1 (8)a | 1 (9)a | 1 (10)a
D6 | Civil | 1 (1)a | 1 (4)a | 1 (1)a | 1 (2)a | 1 (2)a
D7 | Earth Science | 1 (2)a | 0.6329 | 1 (0)a | 1 (1)a | 0.8536
D8 | Earthquake | 1 | 0.5523 | 1 (1)a | 0.8636 | 0.7303
D9 | Electrical | 0.6767 | 0.6322 | 0.6322 | 0.6322 | 0.6322
D10 | ECE | 0.3997 | 0.3685 | 0.3688 | 0.3688 | 0.3685
D11 | HSS | 1 (0)a | 1 | 1 (0)a | 1 (0)a | 1 (2)a
D12 | Hydrology | 1 (2)a | 1 (2)a | 1 (3)a | 1 (2)a | 1 (2)a
D13 | Management | 1 (3)a | 1 | 1 | 1 | 1
D14 | Mathematics | 0.8667 | 0.8667 | 0.8667 | 0.8667 | 0.8667
D15 | MID | 1 (1)a | 0.5578 | 1 (0)a | 1 (0)a | 1 (0)a
D16 | MME | 0.8290 | 0.5955 | 0.7723 | 0.7626 | 0.7723
D17 | Paper Tech | 0.6162 | 0.4917 | 0.6103 | 0.4000 | 0.5471
D18 | Physics | 1 | 0.5546 | 0.9613 | 0.8000 | 0.8571
D19 | WRDTC | 0.6408 | 0.5365 | 0.5978 | 0.5393 | 0.5978
Average | | 0.8500 | 0.7311 | 0.8375 | 0.8071 | 0.8067
S.D. | | 0.2058 | 0.2304 | 0.2154 | 0.2290 | 0.2103

a Peer counts for departments.


For Model-1, the inefficient departments and their corresponding percentage improvements required for the outputs "PhD" and "Research Index" are ARP (156.34, 156.34), Chemical Engineering (36.69, 36.69), Electronics & Computer Engineering (150.15, 150.15), Mathematics (15.38, 41.54), MME (21.59, 20.62), WRDTC (56.06, 56.06), Paper Technology (62.28, 62.28), and Electrical Engineering (47.76, 47.76). This result indicates that these departments should expand the number of students enrolled for and awarded the Ph.D. degree, and should also enhance their "Research Index", in order to become efficient. Percentage reductions in "Non-academic Staff" are needed for ARP (16.99), Chemical Engineering (60.50), MME (48.74), Paper Technology (88.50) and WRDTC (18.55) to become efficient. Two departments, viz., Mathematics and Paper Technology, should reduce their "Academic Staff" by 29.47% and 21.76%, respectively, to become relatively efficient for Model-1. This shows that the academic staff in these departments are not working as efficiently on research as their counterparts in the other departments.

6.1.2. Model-2

In this model, we drop the output "Research Index" and take only one output, "PhD". By doing this, we aim to see the effect of this change on efficiencies and to assess the ability of the departments with respect to doctoral programs. The analysis shows that only seven departments, viz. AHEC, Biotechnology, Chemistry, Civil Engineering, HSS, Hydrology and Management Studies, have efficiency scores equal to one. However, AHEC, HSS and Management Studies are not peers for any inefficient department, so they are not relatively efficient for doctoral programs. All inefficient departments, except Mathematics, have attained efficiency scores lower than the average efficiency score of 0.7311. The lowest efficiency score (0.3685) is calculated for the Department of Electronics and Computer Engineering (ECE); the performance of this department is therefore very poor for the doctoral program. It has to reduce its academic staff by 1.37% and its departmental operating cost by 23.52%, and increase its "PhD" output by 171.36%, to reach the efficient frontier. The other inefficient departments, with the percentage increase required in their "PhD" output, are ARP (166.67), Chemical Engineering (37.50), Electrical Engineering (58.17), Earthquake (81.07), Earth Sciences (58), Mathematics (15.38), MID (79.26), MME (67.91), Physics (80.32), Paper Technology (103.38) and WRDTC (86.37).

For Model-2, it is observed that every department has scored lower than its efficiency score for Model-1. Only the Mathematics department has the same score (0.8667) for both models. This shows that the Mathematics department performs consistently across all research activities, but no other department is as good for doctoral programs as it is for the other research activities.

6.1.3. Model-3a

In this model, we separate two parameters, "Publications" and "Research Projects", out of the output "Research Index".


"Research Projects" includes the number of projects undertaken by the department, and "Publications" contains the number of publications in journals and conference proceedings. By doing this, we aim to determine which research activity (Ph.D., Publications or Research Projects) requires changes for a department to become efficient. For this model, 10 departments, AHEC, Biotechnology, Chemistry, Civil Engineering, Earthquake Engineering, Earth Sciences, HSS, Hydrology, MID and Management Studies, have scored one. The peer departments and their corresponding peer counts are AHEC (1), Biotechnology (2), Chemistry (8), Civil Engineering (1), Earthquake Engineering (1), Earth Sciences (0), HSS (0), Hydrology (3) and MID (0); the Management Studies department is not present in the peer group. This peer analysis points out that Earth Sciences, HSS, MID and Management Studies cannot be treated as good performers for this model. Most of the departments have scored lower than in Model-1 but higher than in Model-2. This shows that the departments perform better on the activities "Publications" and "Research Projects" than on the doctoral programs. The suggested improvements in outputs and reductions in inputs for the inefficient departments are given in Table 3.

6.1.4. Model-3b

Here we omit the input "Departmental Operating Cost" from Model-3a. By doing this, our aim is simply to check the robustness of the results. Comparing the results of Model-3a and Model-3b, it is found that, among the efficient departments, only Earthquake has lost its efficiency, its score falling from 1 to 0.8636. Some inefficient departments (MME, Physics, Paper Technology and WRDTC) have also scored lower than in Model-3a.

6.1.5. Model-4

In this model, we omit the output "Research Projects" from Model-3a to find out which departments are sensitive to this output. From the results of this analysis, it is observed that the Earthquake Engineering and Earth Sciences departments perform better on the output "Research Projects" than on "Publications", as they scored one for Model-3a. The Physics, Paper Technology and WRDTC departments have scored lower efficiencies for Model-3a than for Model-4, which indicates that these departments are better performers for "Publications" than for "Research Projects". Eight departments constitute the efficient frontier for this model, but out of these eight, only five, viz. Biotechnology, Chemistry, Civil, HSS and Hydrology, are genuinely efficient.

To test the robustness and stability of these five models, Karl Pearson and Spearman correlation coefficients are calculated. These coefficients range from 0.704 to 0.997, which indicates that the results are robust across these models (see Table 4).
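The correlation-based robustness check summarized in Table 4 (and later in Table 6) can be reproduced along the following lines; this is a minimal sketch using short illustrative score vectors rather than the paper's full 19-department data:

```python
# Pearson and Spearman correlations between the efficiency scores of two
# model specifications; values close to 1 indicate that the two models rank
# the departments similarly. The score vectors below are illustrative only.
from scipy.stats import pearsonr, spearmanr

model_a = [1.00, 0.39, 1.00, 0.73, 1.00, 0.87, 0.62]
model_b = [1.00, 0.38, 1.00, 0.73, 1.00, 0.60, 0.49]

r, _ = pearsonr(model_a, model_b)
rho, _ = spearmanr(model_a, model_b)
print(round(r, 3), round(rho, 3))
```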

Table 3
Input reductions and output augmentations required for inefficient departments in Model-3a of the research performance model.

Dept. code | Dept. name | Input reduction (%): Academic staff | Input reduction (%): Non-academic staff | Input reduction (%): D.O.P.a | Output augmentation (%): Ph.D. | Output augmentation (%): Publications | Output augmentation (%): Research projects
D2 | ARP | 0 | 23.68 | 4.05 | 166.67 | 209.62 | 262.50
D4 | Chemical | 0 | 60.00 | 23.82 | 37.50 | 44.49 | 91.43
D9 | Electrical | 0 | 1.31 | 3.85 | 58.17 | 61.90 | 74.36
D10 | ECE | 1.34 | 0 | 23.50 | 171.12 | 198.94 | 171.12
D14 | Mathematics | 29.47 | 0 | 4.65 | 15.38 | 37.89 | 266.67
D16 | MME | 3.63 | 9.66 | 0 | 54.15 | 29.48 | 74.09
D17 | Paper Technology | 11.85 | 74.36 | 0 | 63.85 | 63.85 | 63.85
D18 | Physics | 7.60 | 0 | 0 | 45.85 | 4.03 | 4.03
D19 | WRDTC | 0 | 30.54 | 0 | 67.26 | 67.26 | 110.92

a D.O.P., departmental operating cost.
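As a rough consistency check (our own arithmetic, not a statement from the paper), the radial part of these augmentations follows directly from the VRS output-oriented score. For ARP, whose Model-3a score is 0.3750,

\[
\frac{1}{0.3750} \approx 2.667 \quad\Longrightarrow\quad \text{required output augmentation} \approx 166.7\% ,
\]

which matches the Ph.D. entry for ARP in Table 3; the larger percentages for Publications and Research Projects reflect additional output slacks beyond the radial expansion.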



Table 4
Karl Pearson (r) and Spearman rank (ρ) correlation coefficients among research performance models.

 | Model-1 (r, ρ) | Model-2 (r, ρ) | Model-3a (r, ρ) | Model-3b (r, ρ) | Model-4 (r, ρ)
Model-1 | 1, 1 | | | |
Model-2 | 0.704, 0.705 | 1, 1 | | |
Model-3a | 0.997, 0.967 | 0.721, 0.756 | 1, 1 | |
Model-3b | 0.956, 0.923 | 0.794, 0.846 | 0.962, 0.961 | 1, 1 |
Model-4 | 0.944, 0.881 | 0.843, 0.872 | 0.947, 0.895 | 0.967, 0.954 | 1, 1

6.2. Assessment of teaching performance

For the assessment of teaching performance, we develop three models: Model-5, Model-6 and Model-7. Here we give attention only to the teaching outputs. The results are presented in Table 5.

6.2.1. Model-5

Here "Academic Staff" and "Non-academic Staff" are taken as inputs, and "STUDE", "STUDO" and "STUDP" are taken as outputs. "STUDE" represents the number of students enrolled in a department, "STUDO" the number of students from other departments taught by its academic staff, and "STUDP" the number of students placed in jobs. The clear rationale for this combination is that staff contribute to producing placements and to attracting students to their departments for study. The average efficiency score for Model-5 is 0.5942. Three departments, namely, ECE, Mathematics and Management, turn out to be efficient; the remaining 16 departments are inefficient. Of these, six departments have scored above the average efficiency score: Civil (0.9737), Electrical Engineering (0.8813), Chemical Engineering (0.8448), MID (0.7807), HSS (0.7430) and ARP (0.6436). The performance of these departments may be considered satisfactory, but they are not included in the group of efficient departments. The Earthquake department has the lowest efficiency score (0.1679).

6.2.2. Model-6

For this model, we omit the output "STUDP" from Model-5. By doing this, we assess the performance of each department only for enrolments in that department and for teaching activities in other departments. The efficiency scores of 16 of the 19 departments are the same for both models. Three departments, Electrical Engineering, MID and Paper Technology, have scored lower for Model-6 than for Model-5, which shows that the omission of the output "STUDP" affects these departments. From this, we conclude that these three departments are better for the output "STUDP" than for the other two outputs, "STUDE" and "STUDO".

Every department has its own specific character. For example, the Mathematics department has a comparatively higher teaching load than other departments, because Mathematics as a subject is offered to students of many disciplines in different departments. At the same time, WRDTC has no students from other departments and no individual contribution to placements, because this department emphasizes research and training activities. So it is not possible to allocate an equal distribution of students to each department; this may be the reason why only a few departments have scored one in Model-5 and Model-6.

Table 5
Comparison of VRS efficiency scores of the teaching performance models.

Dept. code | Dept. name | Model-5 | Model-6 | Model-7
D1 | AHEC | 0.3165 | 0.3165 | 0.0043
D2 | ARP | 0.6436 | 0.6436 | 0.1267
D3 | Biotechnology | 0.3327 | 0.3327 | 0
D4 | Chemical | 0.8448 | 0.8448 | 0.4744
D5 | Chemistry | 0.3951 | 0.3402 | 0.3380
D6 | Civil | 0.9737 | 0.9737 | 0.7017
D7 | Earth Sciences | 0.4821 | 0.4821 | 0.1865
D8 | Earthquake | 0.1679 | 0.1679 | 0.0038
D9 | Electrical | 0.8813 | 0.8544 | 0.8421
D10 | ECE | 1 (14)a | 1 (14)a | 1 (16)a
D11 | HSS | 0.7430 | 0.7430 | 0
D12 | Hydrology | 0.1681 | 0.1681 | 0
D13 | Management | 1 (12)a | 1 (12)a | 1 (13)a
D14 | Mathematics | 1 (5)a | 1 (5)a | 0.5719
D15 | MID | 0.7807 | 0.7692 | 0.7807
D16 | MME | 0.5368 | 0.5368 | 0.4034
D17 | Paper Tech. | 0.3630 | 0.3279 | 0.3630
D18 | Physics | 0.4065 | 0.4605 | 0.1297
D19 | WRDTC | 0.2531 | 0.2531 | 0
Average | | 0.5942 | 0.5874 | 0.3684
S.D. | | 0.2913 | 0.2933 | 0.3452

a Peer counts.

6.2.3. Model-7

The main purpose of each department is to produce the maximum number of placements every year. Therefore, in this model we take only "STUDP" as the output, omitting the other two outputs used in Model-5 as part of the sensitivity analysis. The results show that only two departments, namely, ECE and Management Studies, are doing well for placements; all other departments are not good at the placement activity. Six departments, viz. Electrical Engineering, MID, Civil Engineering, Mathematics, Chemical Engineering and MME, have scored above the average efficiency score of 0.3684. Some environmental factors may be responsible for this result: for example, during the last few years, the demand for engineers from ECE and for postgraduate students of Management Studies has increased faster than that for other disciplines, especially the basic sciences. Leaving aside these external factors, the results seem accurate according to the data.

To test the robustness and stability of these three models, the Karl Pearson coefficients of correlation are calculated. These lie in the range 0.810–0.998; the high values indicate that the results are robust across these models (see Table 6).

Table 6
Karl Pearson (r) and Spearman rank (ρ) correlation coefficients among teaching performance models.

 | Model-5 (r, ρ) | Model-6 (r, ρ) | Model-7 (r, ρ)
Model-5 | 1, 1 | |
Model-6 | 0.998, 0.998 | 1, 1 |
Model-7 | 0.821, 0.824 | 0.810, 0.806 | 1, 1

6.3. Performance assessment of the engineering departments

The institute has great recognition in the field of engineering, so we now take only the engineering departments into consideration.



Table 7
Comparison of VRS efficiency scores of engineering departments for UG and PG levels.

Dept. code | Dept. name | Model-8 | Model-9 | Model-10
D2 | ARP | 1 | 1 | 1
D4 | Chemical | 1 (1)a | 1 (3)a | 1 (1)a
D6 | Civil | 0.7011 | 1 (2)a | 0.9737
D9 | Electrical | 0.8615 | 0.8109 | 0.8138
D10 | ECE | 1 (4)a | 1 (3)a | 1 (4)a
D15 | MID | 0.7902 | 0.6633 | 0.7629
D16 | MME | 0.7178 | 0.4530 | 0.5591
D17 | Paper Tech | 0.5714 | 0.2788 | 0.4114
Average | | 0.8303 | 0.7758 | 0.8155
S.D. | | 0.1522 | 0.2660 | 0.2121

a Peer count.

Table 8
Suggested input reduction and output augmentation for engineering departments for UG level (Model-8).

Dept. code | Dept. name | Input reduction (%): Academic staff | Input reduction (%): Non-academic staff | Output improvement (%): STEUG | Output improvement (%): STPUG
D6 | Civil | 46.67 | 67.52 | 46.62 | 47.73
D9 | Electrical | 11.11 | 25.59 | 21.47 | 16.07
D15 | MID | 11.11 | 71.43 | 26.55 | 32.65
D16 | MME | 0 | 0 | 66.07 | 39.32
D17 | Paper Tech | 0 | 18.99 | 162.90 | 75

Three models, Model-8, Model-9 and Model-10, are developed, and the results are compared in Table 7.

6.3.1. Model-8

In this model, the assessment is done for the Undergraduate (UG) programs only. Two inputs, "Academic Staff" and "Non-academic Staff", and two outputs, "STEUG" and "STPUG", are taken. "STEUG" and "STPUG" denote the number of students enrolled and the number of students placed in jobs, respectively, at the UG level. The analysis shows that three departments, viz. ARP, Chemical Engineering and ECE, have scored 1. The ARP department is not a peer for any inefficient department; therefore, only Chemical Engineering and ECE are the role models at the UG level. Among the inefficient departments, the Electrical department has scored above the average efficiency score, with 0.8615.

6.3.2. Model-9

In this model, we assess the performance of the Postgraduate (PG) programs only. "STEPG" and "STPPG" are taken as outputs; they represent the students enrolled and placed, respectively, in the PG programs of the department. The results show that ARP, Chemical Engineering, Civil Engineering and ECE have scored one. However, ARP is not a peer for any inefficient department; therefore ECE, Civil Engineering and Chemical Engineering lead at the PG level. Comparing the efficiency scores for Models 8 and 9, it is noticed that all engineering departments (except Civil Engineering) score lower for Model-9 than for Model-8. This points out that the engineering departments perform better in their UG programs than in their PG programs, the only exception being the Civil Engineering department, which performs better for PG than for UG programs.

6.3.3. Model-10

Now we investigate both the UG and PG programs together. "STEUP" and "STPUP" are taken as outputs. They are defined as

STEUP = Total number of students in UG programs + 1.3 (Total number of students in PG programs)

STPUP = Total number of students placed from UG programs + 1.3 (Total number of students placed from PG programs)

The results show that ARP, Chemical Engineering and ECE have scored one; ARP is not a peer for any inefficient department. Comparing the results for Models 8 and 10, we find that the departments have scored lower for Model-10 than for Model-8. The inclusion of the PG programs therefore results in a reduction of the efficiency scores, which confirms again that all engineering departments should pay attention to their PG programs. The reductions in inputs and improvements in outputs suggested for the inefficient departments are reported in Table 8.

7. Conclusion

This paper has evaluated the performance of the academic departments of IIT Roorkee through DEA models using different combinations of input and output variables. The principal objective is an activity-wise performance assessment of the departments; that is, we want to evaluate which department is good at which specific activity (such as teaching, placements and research). For this purpose, we carried out four assessments, namely, overall performance assessment, research performance assessment, teaching performance assessment and an assessment of the engineering departments, using 10 models, and performed sensitivity analysis across these 10 models by changing the inputs and outputs.

Among all the models, the highest mean (0.9157) and the lowest standard deviation (0.1644) in efficiency are reported for the VRS overall performance assessment model; the overall performance is therefore satisfactory for all the departments. The lowest mean (0.3684) and the highest standard deviation (0.3452) are calculated for Model-7 of the teaching performance assessment, which confirms that improvements are needed in the field of placements. The models related to the research assessment (Models 1–4) show that Model-2 has a lower mean score (0.7311) than the other models; it is therefore advisable to focus on the number of Ph.D. degrees awarded and students enrolled to improve the performance of the inefficient departments.

For the overall performance assessment, four departments, namely, Chemistry, HSS, Management Studies and Mathematics, are good examples for the inefficient departments to follow in order to monitor and improve their performance. Among the research performance assessments, the highest mean (0.8500) and the lowest standard deviation are calculated for Model-1, which indicates that the research performance is satisfactory when all activities related to research are combined into the single output "Research Index". Four departments, viz. Biotechnology, Chemistry, Civil Engineering and Hydrology, are good performers for the doctoral programs as well as for all other activities such as publications and research projects. The Mathematics department has scored 0.8667, which is above the average efficiency score for Models 1–4; therefore the Mathematics department, though below the efficient frontier, performs consistently across all research activities. ECE and Management Studies are the efficient departments in the field of placements as well as teaching activities.



Mathematics, meanwhile, performs well only for enrolled students and for students taught from other departments. ECE and Chemical Engineering are the best performers for both the UG and PG programs among all the engineering departments, while the Civil Engineering department performs well only for the PG programs. Thus our study provides information about every activity of the departments, and policy makers can use the suggested improvements and reductions to improve performance in different areas. Finally, we can give some concluding remarks for the departments.

- For every department, it is essential to focus on the number of Ph.D. degrees awarded and students enrolled in order to improve its performance.
- Only ECE and Management Studies are doing well for placements; all other departments should pay attention to the placement activity.
- The ECE and Chemical Engineering departments are the best for both UG and PG programs among all engineering departments. The other engineering departments are advised to adjust their inputs and outputs to become efficient.
- Some departments are not utilizing their staff (both academic and non-academic) effectively for some specific activities related to research and teaching.
- Overall performance is good for all the science departments; the other departments need improvements in their activities.
- Only the Biotechnology, Chemistry, Civil Engineering and Hydrology departments are efficient in every area of research; all other departments should pay more attention to research work.

Acknowledgement

The authors are highly thankful to the reviewers for their fruitful comments and suggestions, which helped us to improve earlier versions of the paper.

Appendix A. Mathematical form of DEA

Let there be N DMUs whose efficiencies have to be compared. Take one DMU, say the m-th DMU, and maximize its efficiency subject to the constraint that the efficiency of no DMU can exceed 1 with the same weights as for DMU m:

\[
\max \; E_m = \frac{\sum_{j=1}^{J} v_{jm}\, y_{jm}}{\sum_{i=1}^{I} u_{im}\, x_{im}}
\]

subject to

\[
0 \le \frac{\sum_{j=1}^{J} v_{jm}\, y_{jn}}{\sum_{i=1}^{I} u_{im}\, x_{in}} \le 1, \qquad n = 1, 2, \ldots, N,
\]
\[
v_{jm}, \, u_{im} \ge 0, \qquad i = 1, 2, \ldots, I; \; j = 1, 2, \ldots, J,
\]

where E_m is the efficiency of the m-th DMU, y_{jm} is the j-th output of the m-th DMU, v_{jm} is the weight of that output, x_{im} is the i-th input of the m-th DMU, and u_{im} is the weight of that input. Solving the above mathematical program gives the values of the weights u and v. If the efficiency is unity, the DMU is said to be efficient and lies on the frontier; otherwise, the DMU is said to be relatively inefficient.

The general form of the CCR (Charnes, Cooper, Rhodes) DEA model can be written as

\[
\max \; z = \sum_{j=1}^{J} v_{jm}\, y_{jm}
\]

subject to

\[
\sum_{i=1}^{I} u_{im}\, x_{im} = 1,
\]
\[
\sum_{j=1}^{J} v_{jm}\, y_{jn} - \sum_{i=1}^{I} u_{im}\, x_{in} \le 0, \qquad n = 1, 2, \ldots, N,
\]
\[
v_{jm}, \, u_{im} \ge \varepsilon, \qquad i = 1, 2, \ldots, I; \; j = 1, 2, \ldots, J.
\]

The general form of the BCC model can be written as

\[
\max \; z = \sum_{j=1}^{J} v_{jm}\, y_{jm} + u_{0m}
\]

subject to

\[
\sum_{i=1}^{I} u_{im}\, x_{im} = 1,
\]
\[
\sum_{j=1}^{J} v_{jm}\, y_{jn} - \sum_{i=1}^{I} u_{im}\, x_{in} + u_{0m} \le 0, \qquad n = 1, 2, \ldots, N,
\]
\[
v_{jm}, \, u_{im} \ge \varepsilon, \qquad i = 1, 2, \ldots, I; \; j = 1, 2, \ldots, J,
\]

where u_{0m} is unrestricted in sign.
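For readers who wish to experiment with the formulation above, the following is a minimal sketch (not from the paper, and not the software used by the authors) of the linearized CCR multiplier model solved with scipy.optimize.linprog. The data matrices are small made-up examples rather than the department data; adding the unrestricted term u_{0m}, as in the BCC model above, would yield the VRS scores.

```python
# Sketch of the CCR multiplier model from Appendix A (illustrative data only).
# Decision variables: output weights v (J of them) followed by input weights u.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, m, eps=1e-6):
    """CRS efficiency of DMU m. X: (N, I) inputs, Y: (N, J) outputs."""
    N, I = X.shape
    _, J = Y.shape
    c = np.concatenate([-Y[m], np.zeros(I)])                    # maximize sum_j v_j y_jm
    A_eq = np.concatenate([np.zeros(J), X[m]]).reshape(1, -1)   # sum_i u_i x_im = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                                   # sum_j v_j y_jn - sum_i u_i x_in <= 0
    b_ub = np.zeros(N)
    bounds = [(eps, None)] * (J + I)                            # v_jm, u_im >= eps
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return -res.fun

# Illustrative data: 5 DMUs, 2 inputs, 2 outputs.
X = np.array([[20.0, 150.0], [30.0, 200.0], [25.0, 160.0], [40.0, 300.0], [15.0, 100.0]])
Y = np.array([[100.0, 40.0], [120.0, 50.0], [140.0, 45.0], [160.0, 70.0], [80.0, 30.0]])
print([round(ccr_efficiency(X, Y, m), 4) for m in range(len(X))])
```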

Appendix B. Correlation coefficients between input and output variables

 | Academic staff | Non-academic staff | Departmental operating cost | Total enrolled students | Progress | Research index
Academic staff | 1 | | | | |
Non-academic staff | 0.6959 | 1 | | | |
Departmental operating cost | 0.8769 | 0.6698 | 1 | | |
Total enrolled students | 0.3466 | 0.0835 | 0.1898 | 1 | |
Progress | 0.6082 | 0.5101 | 0.8206 | 0.3238 | 1 |
Research index | 0.8433 | 0.6454 | 0.7867 | 0.0547 | 0.4177 | 1


References

Abbott, M., & Doucouliagos, C. (2003). The efficiency of Australian universities through data envelopment analysis. Economics of Education Review, 22, 89–97.
Arcelus, F. J., & Coleman, D. F. (1997). An efficiency review of university departments. Journal of System Sciences, 28(7), 721–729.
Avkiran, N. K. (2001). Investigating technical and scale efficiencies of Australian universities through data envelopment analysis. Socio-Economic Planning Sciences, 35, 57–80.
Banker, R. D., Charnes, A., & Cooper, W. W. (1984). Some models for estimating technical and scale inefficiencies in data envelopment analysis. Management Science, 30(9), 1078–1092.
Beasley, J. E. (1990). Comparing university departments. Omega International Journal of Management Science, 18(2), 171–183.
Beasley, J. E. (1995). Determining teaching and research efficiencies. Journal of the Operational Research Society, 46(4), 441–452.
Bessent, A. M., Bessent, E. W., Charnes, A., Cooper, W. W., & Thorogood, N. C. (1983). Evaluation of educational program proposals by means of DEA. Educational Administration Quarterly, 19(2), 82–107.
Charnes, A., Cooper, W. W., & Rhodes, E. (1978). Measuring the efficiency of decision making units. European Journal of Operational Research, 2(6), 429–444.
Coelli, T., Prasada Rao, D. S., & Battese, G. E. (1998). An introduction to efficiency and productivity analysis. Norwell, MA: Kluwer.
Cooper, W. W., Tone, K., & Seiford, L. M. (1999). Data envelopment analysis: A comprehensive text with models, applications, references and DEA solver software. Norwell, MA: Kluwer.
Cooper, W. W., Seiford, L. M., & Zhu, J. (2004). Handbook on data envelopment analysis. Norwell, MA: Kluwer.
Darrat, A. F., Topuz, C., & Yousef, T. (2002). Assessing cost and technical efficiency of banks in Kuwait. Paper presented to ERF's 8th annual conference in Cairo.
Johnes, G., & Johnes, J. (1993). Measuring the research performance of UK economics departments: An application of data envelopment analysis. Oxford Economic Papers, 45, 332–347.


Johnes, G., & Johnes, J. (1995). Research funding and performance in U.K. university departments of economics: A frontier analysis. Economics of Education Review, 14(3), 301–314.
Nunamaker, T. R. (1985). Using data envelopment analysis to measure the efficiency of non-profit organizations: A critical evaluation. Managerial Decision Economics, 6(1), 50–58.
Ramanathan, R. (2003). An introduction to data envelopment analysis: A tool for performance measurement. New Delhi: Sage.
Stern, Z. S., Mehrez, A., & Barboy, A. (1994). Academic departments efficiency via DEA. Computers and Operations Research, 21(5), 543–556.
Thanassoulis, E. (2001). Introduction to theory and application of data envelopment analysis: A foundation text with integrated software. Norwell, MA: Kluwer.
Tomkins, C., & Green, R. (1988). An experiment in the use of data envelopment analysis for evaluating the efficiency of UK university departments of accounting. Financial Accountability and Management, 4(2), 145–165.

Preeti Tyagi received her M.Phil. in Mathematics in 2002 from CCS University, Meerut (India). She is pursuing a Ph.D. in the Department of Mathematics, Indian Institute of Technology Roorkee (India).

Shiv Prasad Yadav received his Ph.D. in Mathematics from the Institute of Technology, B.H.U., Varanasi (India). He is a Professor in the Department of Mathematics, Indian Institute of Technology Roorkee (India). His research interests are in the areas of DEA, fuzzy systems and reliability, and optimal control theory.

S.P. Singh received his Ph.D. in Economics in 1995 from CCS University, Meerut (India). He is a Professor in the Department of Humanities and Social Sciences, Indian Institute of Technology Roorkee (India). His research interests are in the areas of DEA-based efficiency and productivity analysis, rural development, agricultural economics, labour economics, and irrigation and water resource economics.