Research Article: 2025 Vol: 28 Issue: 1S
Mythili Kolluru, College of Banking and Financial Studies, Oman
K.V.Ch. Madhu Sudhana Rao, Tavant Technologies, CA, USA
Denis Hyams-Ssekasi, Institute of Management, University of Bolton, UK
Shobhna Gupta, College of Banking and Financial Studies, Muscat, Oman
Citation Information: Kolluru, M., Rao, K. V. Ch. M. S., Hyams-Ssekasi, D., & Gupta, S. (2025). Student perceptions on timed online examinations: A study of a university in northwest UK. Journal of Management Information and Decision Sciences, 28(S1), 1-14.
Purpose: This study investigates undergraduate students' perceptions of timed online exams during the COVID-19 pandemic at a university in Northwest England. It delves into the exam environment and the challenges students face during online exams.
Theoretical framework: The study is grounded in the unprecedented disruption caused by the COVID-19 pandemic, which necessitated a transition from classroom to online pedagogy. It examines undergraduate students' perceptions of online assessment and compares them to traditional paper-based examinations.
Design/methodology/approach: A questionnaire was distributed to 59 students across three levels of the undergraduate program. The collected data were analyzed using frequency analysis, percentages, and cross-tabulation to examine relationships within the data. Chi-Square analysis was employed to test the hypotheses.
Findings: The findings indicate that despite lacking prior experience with online learning, 70% of students prefer online assessments and perceive them as less stressful than paper-based examinations. However, challenges related to exam duration and the submission process were identified. It was also observed that computer competency is not gender-based and that the pandemic has increased students' adaptability.
Research, Practical & Social Implications: The study provides valuable insights for institutions and educators developing online assessment practices that account for students' perceptions and challenges. The practical implications can inform the design and implementation of online assessment strategies that consider student preferences and stress levels. The social implications highlight the importance of maintaining academic integrity and credibility in online assessments.
Originality/value: This study contributes to the existing literature by examining undergraduate students' perceptions of online assessment during the COVID-19 pandemic in a specific region. It adds to the understanding of student preferences and challenges in online evaluations, providing insights that can inform future research and the development of effective online assessment strategies.
Keywords: Online assessments, Undergraduates, Perceptions, Challenges.
The World Health Organization (WHO) officially declared Coronavirus a pandemic on March 11, 2020 (Cucinotta & Vanelli, 2020). The pandemic, caused by the highly contagious Coronavirus, has profoundly impacted many aspects of human life, from disruptions in daily routines to significant modifications in education, and its effects have been felt worldwide. In response, most countries enacted stringent containment measures, such as social distancing and restricted movement policies, closures of non-essential businesses, and suspension of in-person educational activities from primary schools through universities (Esposito & Principi, 2020; Arora et al., 2021; Joshi et al., 2021; Gamage et al., 2020). Educational institutions were forced to adjust quickly so learning could continue uninterrupted, and this transition led to a significant increase in the use of digital technologies for educational purposes.
One of the most notable changes in education has been the adoption of online education. Traditional classrooms and exams were no longer viable; educational institutions worldwide therefore turned to online instruction (Stowell & Bennett, 2010; Appiah & Van Tonder, 2018). One of the most significant challenges during this transition was adapting assessment methods to an online format (Stowell & Bennett, 2010). Due to health and logistical considerations, traditional in-person exams became infeasible, leading educational institutions worldwide to turn to virtual assessments as a way of maintaining academic continuity (Al-Hakeem & Abdulrahman, 2017; Ayo et al., 2007; De Haes et al., 2019; Crawford et al., 2020; Arora et al., 2021). Online assessments offer advantages such as compatibility with social distancing and lower costs than traditional paper-based exams, and they allow students to receive instantaneous feedback (Stowell & Bennett, 2010; Mansell, 2009). Studies have also demonstrated that online assessments can enhance students' motivation, engagement, and performance. However, even as online learning rapidly evolves, much remains unknown about its effects on student experiences and perceptions (Stack, 2015).
Online assessments have become integral to education systems in crises such as COVID-19 (Khan & Khan, 2019). Online assessments are essential in gathering data on learning outcomes, evaluating curriculum effectiveness, and measuring overall educational quality (Amigud et al., 2018). Transitioning from on-paper assessments to online assessments has been challenging. Concerns about academic integrity, identity verification, and maintaining institutional standards have arisen (Elsalem et al., 2021; Amigud et al., 2018). Technical challenges, including internet connectivity issues, access to suitable devices, and password management, have also posed obstacles, especially in regions with limited technological infrastructure (Dhawan, 2020; Hollister & Berenson, 2009).
This paper explores the evolving landscape of online assessments and the challenges and opportunities they present in the United Kingdom's higher education institutions. Drawing on a series of research studies conducted in different contexts, we delve into students' perceptions and experiences regarding online assessments, shedding light on the transition from traditional examination methods to the digital realm during the COVID-19 pandemic. The studies examined in this paper encompass a wide range of perspectives, including students' perceptions of online timed take-home examinations (Tam, 2022), the challenges and opportunities of online and alternative examination systems (Khadka et al., 2020), the use of Google Forms for online examinations (Solihah & Guritno, 2017), the comparison between computer-based and traditional tests (Escudier et al., 2011), the privacy and security concerns of online proctoring services (Balash et al., 2021), the adoption of online proctored examinations (Raman et al., 2021), the perception of electronic examinations (Umar & Wilson, 2019), and students' perceptions of online learning during the pandemic (Azhari & Kurniawati, 2021). Furthermore, we assess the effects of COVID-19 on year-end exams in chemical engineering at Imperial College London (Bhute et al., 2020).
Navigating the complex waters of digital education, it is critical to comprehend how students perceive and adapt to changes, including challenges they may encounter and any repercussions for the future of education. This paper seeks to offer insightful perspectives into the diverse world of online assessments while informing educators, institutions, and policymakers as they shape education in our rapidly developing society.
The New Norm: Online Assessment
With the advent of online education came an increased need for assessment methods suited to the online format; traditional in-person exams were no longer viable, and educational institutions quickly turned to virtual exams (Stowell & Bennett, 2010). Virtual assessments offered many advantages over their in-person counterparts, such as compatibility with social distancing, savings on printing and distribution costs, and instantaneous feedback for students (Stowell & Bennett, 2010; Al-Hakeem & Abdulrahman, 2017; Ayo et al., 2007; De Haes et al., 2019; Crawford et al., 2020; Arora et al., 2021). Studies have demonstrated that online exams can increase students' motivation, engagement, and performance, although much remains to be discovered regarding the effects of rapidly evolving online learning on students' experiences (Stack, 2015). Instructors possess an invaluable tool in online assessments: they can shape student responses, behavior, and performance through how assessments are designed and administered (Stack, 2015). Online assessments have become an essential element of the education system, particularly during times of pandemic. They provide vital data regarding learning outcomes, curriculum effectiveness, and overall quality (Amigud et al., 2018), and they significantly shape the education landscape during pandemic outbreaks (Khan & Khan, 2019).
Challenges and Opportunities
While online assessment offers flexibility and efficiency, it poses several unique challenges. Academic dishonesty, identity verification, and maintaining institutional integrity have all been raised as concerns with online exams (Elsalem et al., 2021; Amigud et al., 2018). Furthermore, issues related to electricity provision, broadband internet connectivity, and reliable service access pose challenges, particularly in less developed nations (Doukas & Andreatos, 2007; Al-Mashaqbeh & Al Hamad, 2010).
The UK Perspective
Higher Education Institutions (HEIs) in the UK adhere to guidelines provided by the Quality Assurance Agency (QAA), which emphasize fair, transparent, and reliable assessment processes (Greensted & Slack, 1998; Dermo, 2009). Before the pandemic, HEIs used various assessment methods, such as traditional exams, presentations, portfolios, and lab work. When the pandemic hit, however, institutions had to explore new forms of assessment, including online exams, as on-campus capacity was restricted.
Understanding Students' Perceptions
To gain insights into the impact of online exams on students, a university in Northwest England conducted research addressing three dimensions at length: learner perceptions of online exams, the learning environment, and the psychological effects of the COVID-19 pandemic. The aim was to shed light on students' experiences of transitioning from traditional assessments to online examinations during the pandemic.
The Evolution of Online Examinations
Online exams have evolved, with institutions adopting varied approaches. While some offer designated IT rooms or laboratories for online exams, others allow students to take exams remotely (Thomas et al., 2002). Exams may take asynchronous or synchronous forms, with remote online tests increasingly popular during pandemic outbreaks that threaten the safety of students and staff (Thomas et al., 2002).
Psychological Impact
A key concern surrounding the shift to online exams is its psychological ramifications for students. Studies have revealed that online exams may increase stress and anxiety compared to traditional in-person exams (Elsalem et al., 2020; Altuwairesh, 2021). Factors such as exam duration, question difficulty, and the mode of navigation between questions can all increase stress (Elsalem et al., 2020). Furthermore, moving away from physical invigilation makes academic dishonesty easier (Hylton et al., 2016; McGee, 2013).
Technical Challenges
Technical issues surrounding online exams have also become a significant source of difficulty. Issues related to internet connectivity, access to devices for testing purposes, and password management may impede the smooth conduct of exams (Dhawan, 2020; Hollister & Berenson, 2009). These difficulties are especially evident in regions with limited technology or internet services.
Online Examination Environment
Students' academic performance can be significantly affected by taking exams online. Research by Hollister and Berenson (2009) has demonstrated that students taking online exams in unproctored environments show more pronounced variance in their performance, often including increased incidents of cheating. Security poses the most significant difficulty in an online exam environment, as students must rely on their own circumstances, such as access to Wi-Fi and device speed, when taking tests online. Unfamiliar test environments can also impede student performance (Murray, 2021), and controlled online settings may raise concerns related to cheating (Đurić & Mahmutović, 2021). However, Butler et al. (2020) indicate that online environments may enhance scores when another individual takes an exam alongside the student.

Gender and age play an essential role in computer skill competency. Studies have revealed how gender can influence how individuals assess their computer competence (Cai et al., 2017; Margolis & Fisher, 2002; Chongo et al., 2020; Van Deursen & Van Dijk, 2015). Research has also explored differences in self-evaluation of technological skills, computational thinking abilities, and Information and Communication Technology (ICT) skills: girls tend to rate their ICT competence lower than boys do (Chongo et al., 2020), although recent research suggests that gender differences are diminishing (Punter et al., 2017). Age can influence computer skill acquisition, with older individuals typically demonstrating lower technology skills than younger ones (Reed et al., 2005; Nielsen, 2016). This pattern has been observed across age groups; for example, higher education teachers aged 50 or over tend to demonstrate less digital competence (Kerzic et al., 2021). Understanding how gender and age intersect can give educators valuable insight into how technology usage fits into education practices.
Purpose of Study
COVID-19 presented an unprecedented challenge that forced educational institutions to adapt quickly. One notable change was the abrupt switch from traditional in-person examinations to online assessments, which spurred further research on their effects on undergraduate students. To provide a theoretical basis for our hypotheses, we investigated key areas within online examinations: gender, age, computer skill competency, psychological effects, security, malpractices, and comfort.
Gender and Computer Skill Competence
Hypothesis 1A (H1A) suggests that gender may influence computer skill competency among undergraduate students. This hypothesis fits existing research indicating that cultural influences and norms may affect individuals' access to and attitudes toward technology. Gender disparities in technology-related abilities have historically been observed, with males typically reporting more access and higher levels of computer competency (Organisation for Economic Cooperation and Development, 2018). Recent trends, however, suggest that gender differences in computer skill competency may be narrowing due to increased access and shifting social attitudes.
Age and Computer Skill Competence
Hypothesis 1B (H1B) asserts that age can influence computer skill competency among undergraduate students. This hypothesis rests on the idea that younger generations, having grown up in the digital age, may have had greater exposure to technology and thus possess higher levels of computer competency than their older counterparts. Age-related disparities in technology adoption and proficiency have long been studied, with older individuals often reported as possessing lower technology-related skills (Olson et al., 2011). However, the relationship between age and computer skill competency may vary with educational opportunities and individual motivation.
Determine the impact of age and gender on computer skill competency.
• Null Hypothesis (H0): Gender has no impact on computer skill competency
• Alternate Hypothesis (H1): Gender has an impact on computer skill competency
• Null Hypothesis (H0): Age has no impact on computer skill competency
• Alternate Hypothesis (H1): Age has an impact on computer skill competency
Study the psychological impact of online exams on learners.
• Null Hypothesis (H0): Online exams are not reliable and do not have any impact on learners psychologically
• Alternate Hypothesis (H1): Online exams are reliable and show an effect on learners psychologically
• Null Hypothesis (H0): Online exams are not secure and do not affect the learners psychologically
• Alternate Hypothesis (H1): Online exams are secure and affect the learners psychologically
• Null Hypothesis (H0): No relationship between malpractices in online exams and psychological impact on learners
• Alternate Hypothesis (H1): A relationship exists between malpractices in online exams and the psychological impact on learners
• Null Hypothesis (H0): Online exams are not comfortable and do not affect the learners psychologically
• Alternate Hypothesis (H1): Online exams are comfortable and affect the learners psychologically
Hypothesis 2A (H2A) contends that online exams are reliable and can have psychological effects on learners: dependable assessment methods, delivered in a digital format, can evoke psychological responses such as stress, motivation, or anxiety.
Hypothesis 2B (H2B) asserts that online exams are secure and psychologically impactful for learners. It recognizes that exam systems designed for online testing exist to maintain assessment integrity by verifying student identities and minimizing cheating while at the same time recognizing that taking exams in digital environments may elicit emotional responses such as stress, comfort, or confidence in learners.
Hypothesis 2C (H2C) suggests an association between malpractices in online exams and their impact on learners' psychological well-being. It asserts that academic dishonesty during online exams could result in adverse psychological repercussions for both those engaged in dishonesty and those not. This hypothesis encapsulates ethical considerations in online assessments and their potential effects on students' psychological well-being.
Hypothesis 2D (H2D) asserts that online exams provide learners with a comfortable experience, which may have psychological ramifications. It assumes that their convenience and flexibility have positive psychological effects on learners, possibly increasing motivation or decreasing anxiety. This hypothesis therefore underscores the significance of comfort level when considering the psychological effects of online exams on learners.
Understand Learner Perceptions of Online Exams
Students at a Northwest UK university were chosen as respondents. These students pursue bachelor's degrees at the university; most respondents are full-time, and very few are employed. The respondents are studying their first, second, or third year of a bachelor's degree. The questionnaire was shared as a Google Form with students, and 59 responses were recorded. The questionnaire consists of both open- and closed-ended questions and is divided into four sections. Section 1 covers demographic details; Section 2 asks about the online study environment; Section 3 deals with the psychological impact of timed online exams on students; Section 4 looks at learners' perceptions of timed online examinations. Responses were analyzed using Chi-Square analysis, frequency analysis, and percentages. Cross-tabulation was used to examine relationships within the data, and frequency percentages were calculated using pivot tables (Table 1).
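To make this analysis workflow concrete, the sketch below shows how such frequency percentages and cross-tabulations can be produced in pandas. It is a minimal illustration rather than the authors' actual script; the file and column names (survey_responses.csv, gender, computer_skill) are hypothetical placeholders.

```python
# Minimal sketch of the frequency and cross-tabulation analysis described
# above. File and column names are hypothetical placeholders.
import pandas as pd

# Responses exported from the Google Form questionnaire (hypothetical file).
df = pd.read_csv("survey_responses.csv")

# Frequency counts and percentages for one item (pivot-table style).
freq = df["computer_skill"].value_counts()
pct = (freq / len(df) * 100).round(1)
print(pd.DataFrame({"Frequency": freq, "Percentage (%)": pct}))

# Cross-tabulate gender against self-reported computer skill; this is the
# kind of contingency table later fed into the Chi-Square test.
print(pd.crosstab(df["gender"], df["computer_skill"]))
```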
Table 1: Demographic Analysis
Category | Percentage (%)
First-year degree students | 71
Full-time students | 92
Age 18-30 | 61
Female | 53
The demographic analysis indicates that 71% of the respondents are studying for their first-year degree, 92% are full-time students, 61% are aged 18-30, and 53% are female (Table 2).
Table 2: Gender-Based Computer Skill Competency Grid
Competency level | Computer skill (Male) | Computer skill (Female) | Typing speed (Male) | Typing speed (Female)
Basic | 17% | 10% | 14% | 10%
Intermediate | 14% | 17% | 15% | 24%
Moderate | 5% | 8% | 10% | 12%
Advanced | 10% | 15% | 7% | 7%
Expert | 2% | 2% | 2% | 0%
The above table shows that computer skill level and typing speed remain almost identical regardless of gender. Thus, the inference is that computer skill competency is not gender-based (Table 3).
Table 3: Student Exam Environment Characteristics
Environment characteristic | Yes | No
Quiet space for the exam | 97% | 3%
Access to a suitable device | 95% | 5%
Borrowing a device | 15% | 85%
Satisfied with exam software | 92% | 8%
Facing internet connectivity issues | 31% | 69%
The results show that 97% of the respondents have access to a comfortable space for the exam, 95% have suitable devices, and a majority of 92% are well-versed with the exam software. 31% of the respondents face internet connectivity issues because they reside in remote locations. The inference is that the pandemic has increased students' adaptability to laptop-based study. The university provided students with user-friendly software and trained them in its usage (Table 4).
Table 4: The Psychological Impact of Timed Exams on Online Learners
Impact | Frequency | Percentage
No stress is experienced during exams | 6 | 10%
Paper-based and timed online exams are equally stressful | 21 | 36%
Timed online exams are less stressful than paper-based exams | 21 | 36%
Timed online exams are more stressful than paper-based exams | 11 | 19%
72% of respondents report that timed online exams are no more stressful than paper-based exams (36% find them equally stressful and 36% find them less stressful). The two years of online learning and assessment during the pandemic have made students well-versed and comfortable with online exams (Table 5).
Table 5: Degree of Stress Experienced
Factor | SA (%) | A (%) | N (%) | D (%) | SD (%)
Internet connection problem | 15 | 17 | 31 | 27 | 10
Exam structure | 19 | 24 | 39 | 14 | 5
Exam preparedness | 14 | 31 | 47 | 7 | 2
*Exam duration | 24 | 31 | 31 | 14 | 2
*Exam submission process | 24 | 25 | 29 | 17 | 5
Exam question difficulty | 8 | 31 | 44 | 12 | 5
The exam environment is appropriate | 20 | 24 | 37 | 17 | 2
*Teaching methods have properly covered exam material | 32 | 27 | 22 | 14 | 5
Note: SA = Strongly Agree, A = Agree, N = Neutral, D = Disagree, SD = Strongly Disagree; * marks the factors discussed as significant concerns in the text.
The above table indicates that students are experiencing various challenges, of which exam duration (55%), submission process (49%), and syllabus completion (59%) stand out as significant student concerns (Table 6).
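Before turning to Table 6, a quick worked check: the concern figures just quoted appear to be the combined Strongly Agree and Agree shares from Table 5 (our inference; the paper does not state the aggregation rule explicitly).

```python
# Worked check: the quoted concern percentages equal the sum of the
# "Strongly Agree" and "Agree" shares in Table 5 (our inference; the
# aggregation rule is not stated explicitly in the paper).
table5_sa_a = {
    "Exam duration": (24, 31),                      # starred in Table 5
    "Exam submission process": (24, 25),            # starred in Table 5
    "Teaching methods covered exam material": (32, 27),
}
for factor, (sa, a) in table5_sa_a.items():
    print(f"{factor}: {sa + a}%")  # -> 55%, 49%, 59%
```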
Table 6: Computer Skills by Gender
Gender | Advanced | Basic | Expert | Intermediate | Moderate
Female | 9 | 6 | 1 | 10 | 5
Male | 6 | 10 | 1 | 8 | 3
Total | 15 | 16 | 2 | 18 | 8
Hypothesis 1
Null Hypothesis (H10): Gender has no impact on computer skill competency
Alternate Hypothesis (H11): Gender has an impact on computer skill competency
Conclusion: As the p-value (0.70) is greater than 0.05, we fail to reject the null hypothesis: the data provide no evidence that gender impacts computer skill competency (Table 7).
Table 7: Computer Skills by Age Group
Age group | Advanced | Basic | Expert | Intermediate | Moderate
18-25 | 5 | 12 | 1 | 5 | 4
26-30 | 1 | 3 | 0 | 4 | 1
31-35 | 5 | 1 | 1 | 4 | 1
36+ | 4 | 0 | 0 | 5 | 2
Hypothesis 2
Null Hypothesis (H20): Age has no impact on computer skill competency
Alternate Hypothesis (H21): Age has an impact on computer skill competency
Conclusion: As the p-value (0.22) is greater than 0.05, we fail to reject the null hypothesis: the data provide no evidence that age impacts computer skill competency.
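For reproducibility, the sketch below re-runs both Chi-Square tests of independence on the published contingency tables (Tables 6 and 7). This is our reconstruction, since the paper does not name its statistical software; with scipy's default settings the p-values come out close to the reported 0.70 and 0.22. It also surfaces a standard caveat: several expected cell counts, notably in the Expert column and the smaller age groups, fall below 5, where Chi-Square results become unreliable.

```python
# Re-run of the paper's Chi-Square tests on the published contingency
# tables (our reconstruction; the authors' software is not stated).
from scipy.stats import chi2_contingency

# Table 6 - computer skills by gender.
# Columns: Advanced, Basic, Expert, Intermediate, Moderate.
gender = [
    [9, 6, 1, 10, 5],   # Female (n = 31)
    [6, 10, 1, 8, 3],   # Male   (n = 28)
]
chi2, p, dof, _ = chi2_contingency(gender)
print(f"Gender: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2f}")  # p close to 0.70

# Table 7 - computer skills by age group (same column order).
age = [
    [5, 12, 1, 5, 4],   # 18-25
    [1, 3, 0, 4, 1],    # 26-30
    [5, 1, 1, 4, 1],    # 31-35
    [4, 0, 0, 5, 2],    # 36+
]
chi2, p, dof, expected = chi2_contingency(age)
print(f"Age: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2f}")     # p close to 0.22

# Validity caveat: Chi-Square assumes expected cell counts of roughly 5+;
# the minimum expected count here is well below that threshold.
print(f"Min expected count (age table): {expected.min():.2f}")
```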
This study aimed to gain comprehensive insights into the perceptions of undergraduate students in Northwest England regarding timed online exams, the associated examination environment, and the psychological impact of such assessments. The results in Table 6 provide a valuable starting point, revealing that most students (58%) report only basic or intermediate computer skills. This critical finding has significant implications, potentially affecting their performance in timed online exams, particularly when faced with open-ended questions. Remarkably, this observation aligns closely with previous research by Özden (2004), who explored the dynamics of computer literacy and its consequences in the context of online assessments and showed that individuals lacking prior experience with online assessments may experience apprehension and uncertainty, affecting their performance.
A noteworthy revelation emerges from the results presented in Table 4. Only 19% of the students surveyed (n=59) indicated that timed online exams are more stressful than traditional paper-based exams. Such a finding might appear counterintuitive at first glance, considering the widely held perception that online exams can be inherently stressful due to factors like time constraints and the unfamiliarity of the format. However, it is essential to contextualize this result within the broader narrative of the COVID-19 pandemic, which has significantly shaped the educational landscape in recent times. Given that the pandemic has persisted for nearly two years, it is plausible that students have had ample time to adapt to the unique stressors associated with online exams. This could explain the discrepancy between these findings and those of Elsalem et al. (2020), who identified a significant association between online exams and student anxiety (Stowell & Bennett, 2010; Farzin, 2017).
Contrary to the prevailing notion that online examinations exacerbate stress, our study provides a novel perspective by demonstrating that only a small fraction (19%) of students found timed online exams more stressful than traditional paper-based exams. This finding challenges the widespread belief about the inherent stress of online examinations, suggesting a possible shift in student adaptability and resilience in the face of the COVID-19 pandemic. This aspect of our research introduces a fresh viewpoint, illustrating how students' perceptions have evolved during this unprecedented period.
The perceived ease of cheating in online exams, as revealed by the learners' perception data (Table 5), adds another layer of complexity to the discussion. Surprisingly, only 44% of the students surveyed (n=59) believe cheating is easier in online exams. This perception diverges from the findings of Chirumamilla et al. (2020), who asserted that online exams conducted at home increased the likelihood of dishonest student behaviors. It is imperative to consider the evolving landscape of online exam proctoring and security measures, which have seen significant advancements in recent years. The availability of sophisticated software tools designed to prevent and detect cheating may contribute to the changing perception among students (Petrişor et al., 2011). Our research contributes a novel insight into the evolving perceptions of cheating in online exams. Unlike previous studies that suggested a higher propensity for dishonesty in online settings, we found that only 44% of students believe cheating is easier in online exams. This divergence underscores a potential change in student attitudes toward academic integrity in the digital age, indicating a need to reassess and update our understanding of academic misconduct in online assessments.
Exploring the challenges associated with timed online exams, as detailed in Table 5, highlights a particularly salient issue: exam duration. This factor emerged as a significant challenge for students, and it is worth probing the reasons behind this observation. Some students, particularly those with only basic computer skills (as identified in Table 6), may struggle to navigate and respond within the allotted time in the online exam environment; typing speed and familiarity with digital interfaces could contribute to this challenge. Notably, these findings resonate with those of Elsalem et al. (2020) and Timmis et al. (2016), who identified exam duration as one of the critical factors causing stress. To address this challenge effectively, universities may consider offering computer competency training to students and adjusting exam duration to provide a more comfortable experience.
Furthermore, the results derived from Table 3 underscore the importance of the examination environment in online assessments. The critical characteristics of a conducive exam environment, including a quiet space for examination, access to suitable devices, and satisfaction with exam software, were identified as essential factors influencing students' performance (Wibowo et al., 2016). These findings align closely with the work of Hollister and Berenson (2009), who emphasized the pivotal role of the exam environment in shaping students' performance outcomes. Therefore, universities and educational institutions should prioritize creating conducive online exam environments to ensure students can perform to the best of their abilities (Williamson, 2018).

An innovative aspect of our study is identifying exam duration as a significant challenge, particularly for students with basic computer skills. This highlights an often-overlooked dimension of online assessments: the interplay between technical proficiency and exam performance. Our findings suggest that the content and format of online exams must be tailored to students' diverse technological competencies. This contribution is crucial for informing more inclusive and equitable online examination strategies.
Shifting the focus to gender differences in computer skill competency, the results presented in Table 6 offer a striking finding. Contrary to the expectations set by prior studies (Cai et al., 2017; Margolis & Fisher, 2002; Chongo et al., 2020; Van Deursen & Van Dijk, 2015), the data indicate no significant difference in computer skill competency between male (n=28) and female (n=31) students in the surveyed group (n=59). This finding challenges conventional wisdom, which often points to gender-based disparities in computer skill competency. The observed variation may be attributed to evolving factors that have shaped the educational landscape over time, including progressive educational initiatives, rapid technological advancements, and shifting societal attitudes toward gender roles and technology-related fields (Dreher et al., 2011; Butler-Henderson & Crawford, 2020). This challenges longstanding stereotypes and suggests a narrowing of the digital divide across gender and age. Our findings provide a contemporary understanding of computer literacy, which is vital for developing equitable digital education strategies.
Turning our attention to the impact of age on computer skill competency, the results from Table 7 provide valuable insights. The data suggest that age does not significantly impact computer skill competency among the surveyed students. This finding contradicts previous research by Nielsen (2016) and Kerzic et al. (2021), which indicated age-related differences in computer proficiency. One potential explanation for this discrepancy lies in the extraordinary circumstances created by the COVID-19 pandemic: the rapid and widespread shift to online learning compelled students of all ages to adapt swiftly to digital tools and platforms. This unprecedented digital transformation may have reduced disparities in computer skill competency among age groups, hinting at the potential for technology to bridge generational divides in computer literacy (Rowan & Murray, 2021; Fields & Johnson, 2006).

A novel aspect of our research is the emphasis on the importance of the examination environment in online assessments. We identified key characteristics of a conducive exam environment and their significant influence on students' performance. This insight is crucial for educational institutions as they develop and refine their online assessment strategies, and it contributes to a more nuanced understanding of the environmental factors that affect the efficacy of online exams.

This study investigated three critical dimensions of online exams among undergraduate students in Northwest England. The findings, supported by statistical analyses, provide insights into computer skill competency, the psychological impact of online exams, the examination environment, and learner perceptions. While many of these results align closely with prior research, it is essential to acknowledge the study's inherent limitations. The research focuses on undergraduate students within a specific geographic region, which could limit the generalizability of the findings. To enhance the scope and applicability of future research, scholars may consider expanding the survey to a more diverse and extensive student population across various educational levels. Additionally, there is a valuable opportunity to correlate students' performance with their online exam perceptions, providing educators and policymakers with more robust and actionable insights to improve the online learning experience. This study contributes to the evolving field of online education by shedding light on the multifaceted dynamics that shape students' experiences and outcomes in the digital age.
Al-Hakeem, M. S., & Abdulrahman, M. S. (2017). Developing a new e-exam platform to enhance the university academic examinations: The case of Lebanese French University. International Journal of Modern Education and Computer Science, 9(5), 9. https://www.mecs-press.org/ijmecs/ijmecs-v9-n5/IJMECS-V9-N5-2.pdf.
Al-Mashaqbeh, I. F., & Al Hamad, A. (2010). Student's perception of an online exam within the decision support system course at Al al Bayt University. Second International Conference on Computer Research and Development, 131–135. https://doi.org/10.1109/ICCRD.2010.15.
Altuwairesh, N. (2021). Female Saudi university students' perceptions of online education amid COVID-19 pandemic. Arab World English Journal (AWEJ) Special Issue on Covid-19.
Amigud, A., Arnedo-Moreno, J., Daradoumis, T., & Guerrero-Roldan, A. E. (2018). An integrative review of security and integrity strategies in an academic environment: Current understanding and emerging perspectives. Computers & Security, 76, 50–70.
Appiah, M., & Van Tonder, F. (2018). E-assessment in Higher Education: A review. International Journal of Business Management & Economic Research, 9(6), 1454–1460.
Arora, S., Chaudhary, P., & Singh, R. K. (2021). Impact of Coronavirus and online exam anxiety on self-efficacy: The moderating role of coping strategy. Interactive Technology and Smart Education, 18(3), 475–492.
Ayo, C. K., Akinyemi, I. O., Adebiyi, A. A., & Ekong, U. O. (2007). The prospects of e-examination implementation in Nigeria. Turkish Online Journal of Distance Education, 8(4), 125–134.
Azhari, T., & Kurniawati, S. P. (2021). Analysis of students' perception of online learning (A study of three universities in Aceh). Jurnal Scientia, 10(1), 55–61.
Balash, D.G., Kim, D., Shaibekova, D., Fainchtein, R.A., Sherr, M., & Aviv, A.J. (2021). Examining the examiners: Students' privacy and security perceptions of online proctoring services. In Seventeenth symposium on usable privacy and security (SOUPS 2021), 633–652.
Bhute, V.J., Campbell, J., Kogelbauer, A., Shah, U.V., & Brechtelsbauer, C. (2020). Moving to timed remote assessments: the impact of COVID-19 on year-end exams in chemical engineering at Imperial College London. Journal of Chemical Education, 97(9), 2760–2767.
Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education, 159, 104024.
Cai, Z., Fan, X., & Du, J. (2017). Gender and attitudes toward technology use: A meta-analysis. Computers & Education, 105, 1–13.
Chirumamilla, A., Sindre, G., & Nguyen-Duc, A. (2020). Cheating in e-exams and paper exams: The perceptions of engineering students and teachers in Norway. Assessment & Evaluation in Higher Education, 45(7), 940–957.
Crawford, J., Butler-Henderson, K., Rudolph, J., Malkawi, B., Glowatz, M., Burton, R., et al. (2020). COVID-19: 20 countries’ higher Education intra-period digital pedagogy response. Journal of Applied Learning and Teaching, 3(1), 1–20.
Cucinotta, D., & Vanelli, M. (2020). WHO declares COVID-19 a pandemic. Acta bio medica: Atenei parmensis, 91(1), 157.
De Haes, S., Huygh, T., Joshi, A., & Caluwe, L. (2019). National corporate governance codes and IT governance transparency in annual reports. Journal of Global Information Management, 27(4), 91–118.
Dermo, J. (2009). e-Assessment and the student learning experience: A survey of student perceptions of e-assessment. British Journal of Educational Technology, 40(2), 203–214.
Doukas, N., & Andreatos, A. (2007). Advancing electronic assessment. International Journal of Computers Communications & Control, 2(1), 56–65.
Dreher, C., Reiners, T., & Dreher, H. (2011). Investigating factors affecting the uptake of automated assessment technology. Journal of Information Technology Education: Research, 10(1), 161–181.
Đurić, J., & Mahmutović, A. (2021). Software for writing online exams with video and audio surveillance - Cheatless. In 2021 44th International Convention on Information, Communication and Electronic Technology (pp. 654–659). IEEE.
Elsalem, L., Al-Azzam, N., Jum'ah, A. A., Obeidat, N., Sindiani, A. M., & Kheirallah, K. A. (2020). Stress and behavioral changes with remote E-exams during the Covid-19 pandemic: A cross-sectional study among undergraduates of medical sciences. Annals of Medicine and Surgery, 60, 271-279.
Elsalem, L., Al-Azzam, N., Jum’ah, A. A., & Obeidat, N. (2021). Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Annals of Medicine and Surgery, 62, 326–333.
Escudier, M.P., Newton, T.J., Cox, M.J., Reynolds, P.A., & Odell, E.W. (2011). University students' attainment and perceptions of computer-delivered assessment; a comparison between computer‐based and traditional tests in a 'high‐stakes' examination. Journal of Computer Assisted Learning, 27(5), 440–447.
Esposito, S., & Principi, N. (2020). School closure during the coronavirus disease 2019 (COVID-19) pandemic: An effective intervention at the global level? JAMA pediatrics, 174(10), 921–922.
Farzin, S. (2017). Attitude of students towards e-examination system: An application of e-learning. Science Journal of Education, 4(6), 222–227.
Fields, P. J., & Johnson, E. P. (2006). An example of individualized learning and assessment through computerized testing. In Presented at The International Conferences on Teaching Statistics, ICOTS-7.
Gamage, K. A., Silva, E. K. D., & Gunawardhana, N. (2020). Online delivery and assessment during COVID-19: Safeguarding academic integrity. Education Sciences, 10(11), 301.
Greensted, C., & Slack, J. (1998). Response to the QAA consultation paper on the quality assurance and standards framework for UK Higher Education. Quality Assurance in Education, 6(3), 141-144.
Hollister, K. K., & Berenson, M. L. (2009). Proctored versus unproctored online exams: Studying the impact of exam environment on student performance. Decision Sciences Journal of Innovative Education, 7(1), 271–294.
Hylton, K., Levy, Y., & Dringus, L. P. (2016). Utilizing webcam-based proctoring to deter misconduct in online exams. Computers & Education, 92, 53–63.
Joshi, A., Vinay, M. & Bhaskar, P. (2021). Impact of coronavirus pandemic on the Indian education sector: Teachers' Perspectives on online teaching and assessments. Interactive Technology and Smart Education, 18(2), 205–226.
Khadka, B. K., Rokaya, B. B., Roka, J., & Bhatta, P. D. (2020). Perceptions, issues, and challenges towards online and alternative examinations system: A case of mid-western university. International Journal of Innovative Science and Research Technology, 5(11), 105–114.
Khan, S., & Khan, R. A. (2019). Online assessments: Exploring perspectives of university students. Education and Information Technologies, 24(1), 661–677.
Mansell, W. (2009). Why has the e-assessment yet to arrive more quickly? The Guardian. http://www.guardian.co.uk/education/2009/jul/21/online-exams-schools.
Margolis, J., & Fisher, A. (2002). Unlocking the clubhouse: Women in computing. MIT Press.
McGee, P. (2013). Supporting academic honesty in online courses. Journal of Educators Online, 10(1), 1–31.
Murray, K. E. (2021). Take note: Teaching law students to be responsible stewards of technology. Catholic University Law Review, 70(2), 201.
Nielsen, J. (2016). The distribution of users' computer skills: Worse than you think. Nielsen Norman Group.
Chongo, S., Osman, K., & Nayan, N. A. (2020). Level of computational thinking skills among secondary science students: Variation across gender and mathematics achievement. Science Education International, 31(2), 159–163.
Olson, K. E., O’Brien, M. A., Rogers, W. A., & Charness, N. (2011). Diffusion of technology: frequency of use for younger and older adults. Aging International, 36(1), 123–145.
Organization for Economic Cooperation and Development (OECD). (2018). Bridging the digital gender divide: Include, upskill, innovate. OECD.
Özden, M. Y. (2004). Students' perceptions of online assessment: A case study. International Journal of E-Learning & Distance Education / Revue internationale du e-learning et la formation à distance, 19(2), 77–92.
Petrişor, M., Măruşteri, M., Ghiga, D., & Şchiopu, A. (2011). Online assessment system. Applied Medical Informatics, 28(1), 23–28.
Punter, R. A., Meelissen, M. R., & Glas, C. A. (2017). Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013. European Educational Research Journal, 16(6), 762–780.
Raman, R., Vachharajani, H., & Nedungadi, P. (2021). University students' adoption of online proctored examinations during COVID-19: Innovation diffusion study. Education and Information Technologies, 26(6), 7339–7358.
Rowan, L., & Murray, F. (2021). Online learning should change the way that exams work. https://www.universityworldnews.com/post.php?story=20210623135318737.
Kerzic, D., Danko, M., Zorko, V., & Decman, M. (2021). The effect of age on higher education teachers' ICT use. Knowledge Management & E-Learning, 13(2), 182–193.
Reed, K., Doty, D. H., & May, D. R. (2005). The impact of aging on self-efficacy and computer skill acquisition. Journal of Managerial Issues, 17(2), 212–228.
Solihah, M. A., & Guritno, A. (2017). University students' perception of online examination using Google Form. Britania Journal of English Teaching.
Stack, S. (2015). The impact of exam environments on student test scores in online courses. Journal of Criminal Justice Education, 26(3), 273–282.
Stowell, J. R., & Bennett, D. (2010). Effects of online exams on student exam performance and test anxiety. Journal of Educational Computing Research, 42(2), 161–171.
Tam, A. C. F. (2022). Students' perceptions of and learning practices in online timed take-home examinations during COVID-19. Assessment & Evaluation in Higher Education, 47(3), 477–492.
Thomas, P., Price, B., Paine, C., & Richards, M. (2002). Remote electronic examinations: Student experiences. British Journal of Educational Technology, 33(5), 537–549.
Timmis, S., Broadfoot, P., Sutherland, R., & Oldfield, A. (2016). Rethinking assessment in a digital age: Opportunities, challenges, and risks. British Educational Research Journal, 42(3), 454–476.
Umar, M. A., & Wilson, F. (2019). Perception of electronic examination among undergraduate students of the University of Maiduguri. International Journal of Humanities and Education Development, 1(5), 211–221.
Van Deursen, A. J., & Van Dijk, J. A. (2015). Toward a multifaceted model of internet access for understanding digital divides: An empirical investigation. The Information Society, 31(5), 379–391.
Wibowo, S., Grandhi, S., Chugh, R., & Sawir, E. (2016). A pilot study of an electronic exam system at an Australian university. Journal of Educational Technology Systems, 45(1), 5–33.
Williamson, M. H. (2018). Online exams: The need for best practices and overcoming challenges. The Journal of Public and Professional Sociology, 10(1), 2.
Received: 23-Dec-2024 Manuscript No. JMIDS-24-15568; Editor assigned: 24-Dec-2024 Pre QC No. JMIDS-24-15568(PQ); Reviewed: 27-Dec-2024 QC No. JMIDS-24-15568; Revised: 31-Dec-2024 Manuscript No. JMIDS-24-15568(R); Published: 07-Jan-2025