Matches in SemOpenAlex for { <https://semopenalex.org/work/W416069486> ?p ?o ?g. }
Showing items 1 to 62 of 62, with 100 items per page.
- W416069486 abstract "Quality measures are often derived from weighted sums of diverse indicators, including ordinal Likert items. This procedure can be dangerously misleading because it takes no account of correlations among indicators. It also takes no account of whether indicators are input measures, e.g. prior achievement of incoming students, or outcome measures, e.g. the proportion getting a good degree or student satisfaction. UK Higher Education data for England and Wales 2004-05 were analyzed taking these issues into account. Multiple regression showed, unsurprisingly, that ‘bright’ students with high prior achievement did well on all outcome indicators. A good research rating was associated with good degree performance and high completion rates, but not with destination (in work or further training) or student satisfaction. Vice chancellor salaries and academic staff pay were positively associated with good destination outcomes. Worryingly, higher vice chancellor pay was associated with lower student satisfaction. The implications for evaluating university quality are discussed.

Misleading ‘quality’ measures in Higher Education: problems from combining diverse indicators that include subjective ratings and academic performance and costs

The ‘quality’ of Higher Education Institutes (HEIs) is a major concern: for actual and potential students, for government, for industry and for the staff of the HEIs themselves. So how should ‘quality’ be measured? Obviously, HEIs are complex, have many goals, and so have many potential quality dimensions. The simplest solution is to decide on some indicator for each dimension and then take a weighted sum of the indicators. This is the approach taken by newspapers when constructing league tables, national or international. There are several problems with this approach.

1. Some indicators are metric, e.g. school achievement, proportion getting a ‘good’ degree, or money spent on libraries. Other metrics are derived from ordinal (Likert) items by inappropriate averaging. Thus the National Student Survey (NSS) of final-year undergraduates asks students to give the extent of their agreement with 22 statements on a scale from 1 to 5 (5 Definitely agree, 4 Mostly agree, 3 Neither agree nor disagree, 2 Mostly disagree, 1 Definitely disagree, N/A Not applicable). The NSS score is then a Likert scale formed by averaging the scores from the 22 items. This procedure makes the unstated, and almost certainly wrong, assumption that the difference between ‘definitely agree’ and ‘mostly agree’ is the same as that between ‘neither agree nor disagree’ and ‘mostly disagree’.

2. The final metric is arbitrary, so the consequences of a difference of 10, or even 100, points out of a maximum possible score of 500, in terms of a student’s chance of getting a good job or a good degree, are unfathomable.

3. The weightings are arbitrary, with different weightings giving different rank orders. Thus the Guardian, Telegraph and Times give different rank orders for UK universities.

4. The indicators, unsurprisingly, tend to be correlated, so HEIs with students with good prior achievement also have high rates of good degrees. HEIs that take more able students thus receive extra points for good degrees, even though those same students might have done just as well at other universities.

5. League tables at the institution level take no account of discipline. Since most indicators are discipline-specific, the final results are likely to be biased by discipline mix. HEIs with many courses in popular disciplines (high average school achievement), or easy disciplines (low drop-out rates), will achieve high scores. Bad news for physics and statistics! For this reason, many newspaper league tables do indeed provide discipline-specific ratings. However, problems 1 to 4 apply at the discipline level as well as the institution level.

HEI quality indicators are potentially of use to many different groups.
Intending students and their parents and sponsors (e.g. governments, charities etc.) want to choose the ‘best’ university. Governments, other sponsors and the HEIs themselves want to know whether improvement is taking place, both absolute and relative.

This paper takes a deeper look at what might constitute quality for a university in terms of benefits to undergraduates. The data comes from the Higher Education Statistics Agency (HESA) for Academic Year 05-06, available via The Times Higher Education Supplement and the Sunday Times. The first step was to separate input measures, such as school achievement (A & AS level points in the UK system), from output measures, such as percent getting a good degree (1st or 2.1 in the UK system). Then two approaches are taken. The first is unashamedly exploratory, namely principal components analysis: what indicators of quality ‘go together’? Perhaps surprisingly, this question has rarely been asked. The second approach models which input indicators predict particular output indicators. From the point of view of the intending students, this enables choice of the best HEI available, given their own prior achievement. From the point of view of HEIs or government, it enables assessment of performance given both indicators within their control (spending of various sorts) and outside their control (prior achievement of incoming students). A separate investigation explores the extent to which different weighting systems can affect the rank order of different HEIs. This preliminary study looks only at the institution level in order to explore the approaches. Any serious evaluation of HEI quality should be at the discipline level, as this is where decision making takes place, be it for students, for the HEI or the government.
The following resources discuss the use of performance indicators and benchmarking to enhance Higher Education performance (Bekhradnia & Thompson, 2003; Bruneau & Savage, 2002; DfES, 2003; HEFCE, 2006; HESA, 2006b; Magd & Curry, 2003; Pursglove & Simpson, 2004). In addition, the problems faced in evaluating HEI quality are very similar to those faced in other complex situations, such as the ‘level’ of economic development of countries. In this situation also, composites are made by weighted sums of disparate indicators, with little attention to input as opposed to output indicators. The role of natural resources in economic development might be argued to be somewhat analogous to discipline in higher education. The first task is to describe the HEI data set. Then the results of new analyses are described. Finally, the general implications for constructing quality measures are discussed.

Data Set for UK Universities 2005-2006

The data analyzed comes from the Higher Education Statistics Agency (HESA, 2006a), which collects mandatory statistics from all UK Higher Education Institutions. HESA then sells CDs/books with summary details on students, staff, finance and destination. Cost is typically £50 per CD per year. HESA makes some data available for free, but only at the institution or country level, not the discipline level. HESA Performance Indicators available for free include drop-out rates by institution (but not by discipline). In addition, the HOLIS page (http://www.hesa.ac.uk/acuk/maninfo/compareintro.htm) allows comparison of one chosen HEI, by discipline and type of student, for indicators such as drop-out rate and class of degree, against the average of selected other institutions. This is extraordinarily frustrating, as one cannot get a useful list of performance for all institutions in a given discipline, although one could get the information by making N/2 pairwise comparisons.
Table 1 shows the indicators used in this study, obtained from Times Higher Education Supplement statistics summaries (University Performance, 2007), which are in turn obtained from HESA.

Table 1. Indicators of Higher Education Quality from Times Higher Education (THES) Statistics
(Columns: Indicator, Code, Type, Mean, Min, Max. The table body did not survive extraction; only fragments remain: “APoint : mean A…”; “…assessment and feedback; academic support; organisation and management; and learning resources”.)
Note b: Universities not included, as no data: Cambridge, City, E. London, Oxford, South Bank, Warwick; all Scottish.
Note c: Universities excluded because average academic pay was so low as to suggest a mis-recording: Lancaster (£21,686), Napier (£25,486, down 26% on previous year).
Radical Statistics Issue 94" @default.
- W416069486 created "2016-06-24" @default.
- W416069486 creator A5027305138 @default.
- W416069486 date "2007-01-01" @default.
- W416069486 modified "2023-09-26" @default.
- W416069486 title "Misleading ‘quality’ measures in Higher Education : problems from combining diverse indicators that include subjective ratings and academic performance and costs" @default.
- W416069486 cites W1506686600 @default.
- W416069486 cites W2012972094 @default.
- W416069486 cites W2029988085 @default.
- W416069486 hasPublicationYear "2007" @default.
- W416069486 type Work @default.
- W416069486 sameAs 416069486 @default.
- W416069486 citedByCount "0" @default.
- W416069486 crossrefType "journal-article" @default.
- W416069486 hasAuthorship W416069486A5027305138 @default.
- W416069486 hasConcept C111472728 @default.
- W416069486 hasConcept C120912362 @default.
- W416069486 hasConcept C138885662 @default.
- W416069486 hasConcept C149782125 @default.
- W416069486 hasConcept C15744967 @default.
- W416069486 hasConcept C162118730 @default.
- W416069486 hasConcept C162324750 @default.
- W416069486 hasConcept C2779530757 @default.
- W416069486 hasConcept C41008148 @default.
- W416069486 hasConcept C50522688 @default.
- W416069486 hasConceptScore W416069486C111472728 @default.
- W416069486 hasConceptScore W416069486C120912362 @default.
- W416069486 hasConceptScore W416069486C138885662 @default.
- W416069486 hasConceptScore W416069486C149782125 @default.
- W416069486 hasConceptScore W416069486C15744967 @default.
- W416069486 hasConceptScore W416069486C162118730 @default.
- W416069486 hasConceptScore W416069486C162324750 @default.
- W416069486 hasConceptScore W416069486C2779530757 @default.
- W416069486 hasConceptScore W416069486C41008148 @default.
- W416069486 hasConceptScore W416069486C50522688 @default.
- W416069486 hasLocation W4160694861 @default.
- W416069486 hasOpenAccess W416069486 @default.
- W416069486 hasPrimaryLocation W4160694861 @default.
- W416069486 hasRelatedWork W1516768957 @default.
- W416069486 hasRelatedWork W1527117961 @default.
- W416069486 hasRelatedWork W1605179121 @default.
- W416069486 hasRelatedWork W1757920582 @default.
- W416069486 hasRelatedWork W2020008740 @default.
- W416069486 hasRelatedWork W2048357334 @default.
- W416069486 hasRelatedWork W2054921339 @default.
- W416069486 hasRelatedWork W2070630433 @default.
- W416069486 hasRelatedWork W2077300280 @default.
- W416069486 hasRelatedWork W2124274938 @default.
- W416069486 hasRelatedWork W2160948195 @default.
- W416069486 hasRelatedWork W2178567184 @default.
- W416069486 hasRelatedWork W2187214369 @default.
- W416069486 hasRelatedWork W2188360330 @default.
- W416069486 hasRelatedWork W2290499885 @default.
- W416069486 hasRelatedWork W2912068120 @default.
- W416069486 hasRelatedWork W3124669413 @default.
- W416069486 hasRelatedWork W3158065566 @default.
- W416069486 hasRelatedWork W41185726 @default.
- W416069486 hasRelatedWork W421943743 @default.
- W416069486 isParatext "false" @default.
- W416069486 isRetracted "false" @default.
- W416069486 magId "416069486" @default.
- W416069486 workType "article" @default.
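The listing above is the result set of the triple pattern shown in the header. As a minimal sketch, the same predicate/object pairs could be fetched programmatically; the endpoint URL https://semopenalex.org/sparql and its support for SPARQL JSON results over HTTP GET are assumptions, not confirmed by this page:

```python
# Sketch: fetch all predicate/object pairs for W416069486 from SemOpenAlex.
# The endpoint URL and its GET/JSON behaviour are assumptions.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public SPARQL endpoint
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W416069486> ?p ?o .
}
"""

def parse_bindings(result: dict) -> list:
    """Turn a SPARQL 1.1 JSON result into (predicate, object) string pairs."""
    return [(b["p"]["value"], b["o"]["value"])
            for b in result["results"]["bindings"]]

def fetch_triples(endpoint: str, query: str) -> list:
    """Run a SELECT query over HTTP GET and return parsed bindings."""
    url = endpoint + "?" + urllib.parse.urlencode({"query": query})
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"})
    with urllib.request.urlopen(req) as resp:
        return parse_bindings(json.load(resp))

if __name__ == "__main__":
    for pred, obj in fetch_triples(ENDPOINT, QUERY):
        print(pred, obj[:80])
```

The `results`/`bindings` keys follow the standard SPARQL 1.1 JSON results format, so `parse_bindings` should work against any conformant endpoint.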
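The abstract's point that arbitrary weightings give different rank orders (problem 3 in the text above) can be shown with a toy sketch; the institutions, indicator scores and weights below are invented for illustration and do not come from the paper's data:

```python
# Toy illustration (all numbers invented): two weighting schemes
# produce different rank orders over the same three hypothetical HEIs.
INDICATORS = {  # hypothetical standardized indicator scores
    "HEI_A": {"entry": 0.9, "satisfaction": 0.2, "completion": 0.8},
    "HEI_B": {"entry": 0.5, "satisfaction": 0.9, "completion": 0.5},
    "HEI_C": {"entry": 0.4, "satisfaction": 0.6, "completion": 0.9},
}

def rank(weights: dict) -> list:
    """Rank institutions by weighted sum of indicators, best first."""
    scores = {
        hei: sum(weights[k] * v for k, v in ind.items())
        for hei, ind in INDICATORS.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Scheme 1 emphasizes entry grades; scheme 2 emphasizes satisfaction.
w1 = {"entry": 0.6, "satisfaction": 0.2, "completion": 0.2}
w2 = {"entry": 0.2, "satisfaction": 0.6, "completion": 0.2}

if __name__ == "__main__":
    print(rank(w1))  # entry-heavy weighting puts HEI_A first
    print(rank(w2))  # satisfaction-heavy weighting puts HEI_B first
```

Because the same raw indicators feed both schemes, the reordering is driven entirely by the weights, which is exactly the arbitrariness the abstract criticizes in newspaper league tables.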