HERI is managed by Dr. Juan-Claude Lemmens. The unit, consisting of four researchers, focuses particularly on institution-wide research into teaching, learning and student success.
A predictive analytics project, undertaken in collaboration with the Predictive Analytics Reporting (PAR) Framework (a division of Hobsons Inc.) and forming part of the current Siyaphumelela Project at the University of Pretoria, commenced in September 2016. The objectives of the project are to enhance UP's ability to make evidence-based decisions to increase student access, throughput and diversity; to build high-level data analytics capacity with advanced statistical analysis of individual student data; and to identify trends in student academic readiness, needs and success indicators. The information from the PAR Framework informed a new initiative in 2018: a nudging campaign directed at all first-time entering first-year students enrolled for three- and four-year programmes. A nudge is a short, targeted message sent to students at a strategically appropriate time with a single call to action. The nudging campaign at UP focuses on programme credits. Students are nudged to evaluate their credit load during the registration period and prompted to consult Faculty Administration about the optimal credit load for the first year. The ratio of credits failed will also be evaluated after the first semester, and similar nudges will be sent to students.
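Conceptually, the registration-period nudge is a simple rule applied to registration data. The sketch below illustrates the idea in Python with pandas; the column names ('student_id', 'credits_registered') and the recommended credit range are assumptions for illustration, not UP's actual data model or business rules.

```python
import pandas as pd

# Assumed recommended first-year credit range; real thresholds would come from
# Faculty Administration and differ per programme.
RECOMMENDED_MIN, RECOMMENDED_MAX = 120, 144

def credit_load_nudges(registrations: pd.DataFrame) -> pd.DataFrame:
    """Return one short, single-call-to-action message per over- or under-loaded student.

    Expects hypothetical columns 'student_id' and 'credits_registered'.
    """
    out = registrations.copy()
    out["nudge"] = pd.NA
    over = out["credits_registered"] > RECOMMENDED_MAX
    under = out["credits_registered"] < RECOMMENDED_MIN
    out.loc[over, "nudge"] = (
        "Your registered credit load looks high. Please consult Faculty "
        "Administration this week about the optimal credit load for your first year."
    )
    out.loc[under, "nudge"] = (
        "Your registered credit load looks low. Please consult Faculty "
        "Administration this week about the optimal credit load for your first year."
    )
    return out.dropna(subset=["nudge"])[["student_id", "credits_registered", "nudge"]]
```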
HERI currently uses many sources to determine the factors associated with the retention or attrition of undergraduate students. Various structured questionnaires have been developed to assess this phenomenon, for example the Student Academic Readiness Survey (STARS); exit interviews with first-year students who opted to discontinue their studies are also conducted. Institutional information (for instance, on high-risk modules) from BIRAP is used to determine ‘risk’ at individual, module and programme level.
The Career App.tizer was developed in 2015 and became operational in 2016. The Career App.tizer is a career exploration tool aimed at high school learners. As the name suggests, it aims to get high school learners to start exploring the different courses and careers offered by the University of Pretoria. The Career App.tizer introduces learners to their career interests with an online career interest survey, maps their interests to courses offered by UP, and links those courses to careers. The app was developed by staff from the Department for Education Innovation together with students from the Department of Informatics at UP, with funding from The Kresge Foundation's Siyaphumelela programme. The career guidance software has a web-based interface and an Android app on the Google Play store, available at www.careerapptizer.co.za.
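In outline, the tool chains two lookups: survey interests to UP-style courses, and courses to careers. The sketch below illustrates that chaining; the interest categories, courses and careers shown are invented examples, not the app's actual data or code.

```python
from typing import Dict, List

# Invented example data to illustrate the interest -> course -> career chain.
INTEREST_TO_COURSES: Dict[str, List[str]] = {
    "investigative": ["BSc Computer Science", "BSc Actuarial Science"],
    "social": ["BEd Foundation Phase", "BSocSci Psychology"],
}
COURSE_TO_CAREERS: Dict[str, List[str]] = {
    "BSc Computer Science": ["Software developer", "Data analyst"],
    "BSc Actuarial Science": ["Actuary", "Risk analyst"],
    "BEd Foundation Phase": ["Foundation-phase teacher"],
    "BSocSci Psychology": ["Counsellor", "HR practitioner"],
}

def explore(top_interest: str) -> Dict[str, List[str]]:
    """Map a learner's strongest survey interest to courses, and each course to careers."""
    courses = INTEREST_TO_COURSES.get(top_interest, [])
    return {course: COURSE_TO_CAREERS.get(course, []) for course in courses}

print(explore("investigative"))
```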
The first-year experience from an academic perspective is measured using three different sources of research information: STARS, the drop-out evaluation and the cluster analysis. The information from the three sources is primarily used to identify students who could benefit from additional academic development, as provided by our Faculty Student Advisors. The first of these instruments is the Student Academic Readiness Survey (STARS), a baseline test that identifies students' needs for academic support, social support (for integration into the university environment) and financial support. Most new students complete the STARS during orientation week.
As part of the second-semester intervention programme, a cluster analysis is performed to identify students who had poor academic performance in the first semester. Cluster analysis is a statistical method for finding relatively homogeneous clusters of cases based on measured characteristics. The K-means clustering algorithm was used to analyse the data. The variables used were the ratio of credits registered versus credits failed, the average mark for the first semester, and the ratio of poor. The data set was also split into sciences and non-sciences groups, and the K-means cluster analysis was performed on these two groups separately. The at-risk students were required to consult the Faculty Student Advisor and join extra tutoring during the second semester, especially if they were enrolled for high-impact modules (HIMs).
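A minimal sketch of how such a split K-means analysis could be run in Python with scikit-learn follows. The column names, the number of clusters and the grouping field are assumptions for illustration, and only two of the variables mentioned above are included.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cluster_first_semester(marks: pd.DataFrame, k: int = 3) -> pd.DataFrame:
    """K-means clustering of first-semester performance, run separately per group.

    Expects hypothetical columns 'failed_to_registered_ratio',
    'first_semester_average' and 'science_group' ('sciences'/'non-sciences').
    """
    features = ["failed_to_registered_ratio", "first_semester_average"]
    out = marks.copy()
    out["cluster"] = -1
    for _, group in out.groupby("science_group"):
        X = StandardScaler().fit_transform(group[features])  # common scale for both variables
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        out.loc[group.index, "cluster"] = labels
    # Clusters with high failure ratios and low averages would flag at-risk students.
    return out
```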
The aim of the cross-sectional trend analysis of student drop-out is to identify and prioritise the reasons and predisposing factors affecting student withdrawals at first-year level over time, as volunteered by students themselves. The primary reason for withdrawal has consistently been ‘wrong study or career choice’ since 2008 and remained the main reason in 2015, although that study only investigated drop-out trends in the first semester of 2015. This trend necessitates a proactive focus on career exploration and/or career guidance before students apply to university. Career exploration and the need for career guidance are purposefully included in the STARS so that students can be identified at an early stage and then referred to Faculty Student Advisors.
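One simple way to build such a cross-sectional trend table from exit-interview records is sketched below in Python with pandas; the field names are hypothetical and the ranking step is only illustrative of how a persistent top reason would surface.

```python
import pandas as pd

def withdrawal_reason_trend(exit_interviews: pd.DataFrame) -> pd.DataFrame:
    """Share of first-year withdrawals attributed to each self-reported reason, per year.

    Expects hypothetical columns 'reason' and 'year'.
    """
    counts = pd.crosstab(exit_interviews["reason"], exit_interviews["year"])
    shares = counts / counts.sum()            # column-wise proportions within each year
    latest_year = shares.columns.max()
    # A persistent top reason (e.g. 'wrong study or career choice') stays in the
    # leading rows across years.
    return shares.sort_values(by=latest_year, ascending=False)
```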
Student engagement research was popularised by the launch of the National Survey of Student Engagement (NSSE) in the US in 2000, which resulted in the formulation of the National Benchmarks of Effective Educational Practice (Kuh, 2001). It is said that the driver for the development of the student engagement concept was to shift conversations about higher education quality towards students and their learning, and away from an exclusive focus on the resources and reputation of higher education institutions on which rankings had concentrated.
Both the South African Survey of Student Engagement (SASSE) and the Lecturer Survey of Student Engagement (LSSE) have completed their second phase of implementation at the University of Pretoria. In the first phase, in 2014, we established the baseline and the areas to which our engagement efforts should be directed. During 2015 the results were shared at the Senate sub-Committee for Teaching and Learning, and a roadshow to the teaching and learning committees of all nine faculties saw the results disaggregated at faculty level. The presentations focused on Effective Teaching Practices, which produced insightful evidence for making practical improvements to student engagement in 2016. The SASSE and LSSE were administered during August 2016 with the expectation that the response rate would be higher due to the expanded marketing campaign. Unfortunately, the participation rate remained the same as in 2014: 6% of all undergraduate students and 15% of all lecturers teaching undergraduate modules. The feedback workshop was facilitated by UFS in 2017, after receipt of the institutional reports. The data were disaggregated by faculty, presented at our Tshebi data analytics committee and shared with Deputy Deans.
In response to the uncertainty around the NSC qualification, Umalusi commissioned a pilot study in 2014 to investigate the ability of NSC results to act as predictors of academic success at higher education institutions. In particular, it investigated whether results in three NSC matriculation subjects – namely English, Mathematics and Physical Science, which are commonly used by higher education institutions in their admission processes – could predict the academic success of students admitted to their chosen programmes, and whether this relationship has changed since the introduction of the NSC in 2008. A further investigation was commissioned at eight South African institutions in 2016. The aim of the 2016 study is likewise to investigate the possible relationship between NSC results in three core school subjects (English, Mathematics and Physical Sciences) and academic success at higher education institutions, as well as to understand changes in this relationship across time. In order to follow a comparable approach, researchers from the eight participating institutions met in initial workshops in 2016 and agreed on a common research approach with regard to the sample of students to be used in the study, the definition of academic success, and the analysis methodology (in this case, ordinal correlation). An Excel tool was also developed to help representatives from the various institutions run the standardised analytics required for the project, with related visualisations of the results and a performance indicator dashboard. The 2016 study focuses on the academic performance of full-time, first-time entering South African students registered from 2013 to 2016 at each of the participating higher education institutions. Each faculty was analysed separately, because the NSC school subjects are likely to differ in their ability to predict academic success for faculties that also differ in the nature of their research disciplines and techniques. The results from UP indicate a relative level of consistency in the correlation of the NSC subjects with GPA over time in most of the faculties. These subjects may therefore be useful covariates for understanding the readiness and preparedness of first-time entering first-year university students, taking into account that a large set of unknown variables is not accounted for in our analyses.
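The per-faculty, per-year analysis can be reproduced in outline with a rank-based (ordinal) correlation. The sketch below uses Spearman's rank correlation in Python as one such ordinal measure; the study's exact method, the column names ('faculty', 'intake_year', 'gpa' and the subject marks) and the cohort filtering are not specified here and are assumptions.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical column names for the three NSC subject marks.
SUBJECTS = ["english_mark", "mathematics_mark", "physical_science_mark"]

def nsc_gpa_correlations(cohort: pd.DataFrame) -> pd.DataFrame:
    """Rank correlation of each NSC subject with GPA, per faculty and intake year."""
    rows = []
    for (faculty, year), grp in cohort.groupby(["faculty", "intake_year"]):
        for subject in SUBJECTS:
            rho, p_value = spearmanr(grp[subject], grp["gpa"], nan_policy="omit")
            rows.append({"faculty": faculty, "intake_year": year,
                         "subject": subject, "spearman_rho": rho, "p_value": p_value})
    return pd.DataFrame(rows)
```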
The orientation programme is evaluated annually by HERI with a questionnaire. The aim of the questionnaire is to determine whether the programme attained the expected outcomes, to obtain information about how students experienced the orientation programme, and to collect suggestions about how the programme can be improved. The questionnaire is administered electronically on Qualtrics and/or as a paper-based survey, depending on the needs of the coordinator of the orientation programme.
The peer mentorship programme is managed by the Department of Student Affairs. The purpose of the programme is to facilitate the transition from school to university so that students are fully integrated into the university environment within the first quarter of the first year. At the end of the programme, HERI is responsible for evaluating it. As part of the evaluation process, mentees are asked to complete a survey to rate their mentor and the programme. Mentors also evaluate the coordinator, and the coordinator evaluates each mentor’s performance. The triangulated data provide information on the effectiveness of the mentor as well as the mentee’s satisfaction with their mentor. Focus group discussions are also held with a number of mentees, mentors and coordinators to evaluate the effectiveness of the programme as a whole.