Exploring Academics’ Acceptance of Technology in Statistics Education: Evidence from Confirmatory Factor Analysis
Abstract:
The aim of this study is to evaluate the performance of a proposed model utilizing the Technology Acceptance Model (TAM) to forecast student perceptions of statistics education with advanced technology. A total of 379 undergraduate students from Malaysia’s East Coast region were recruited using a simple random sampling technique. This study incorporates six main constructs that are tested simultaneously, namely social influence, self-efficacy, perceived usefulness, perceived ease of use, attitude toward using, and behavioural intention. The Pooled Confirmatory Factor Analysis (PCFA) was employed to assess the factor loadings and fitness of the model being tested. Moreover, the Composite Reliability (CR) and Average Variance Extracted (AVE) were established to assess their reliability and validity. The results of the Confirmatory Factor Analysis (CFA) demonstrated that all six constructs achieved satisfactory levels of model fit, reliability, and validity. These findings confirm that the measurement model is statistically robust and that each construct is well-defined and appropriate for further analysis. Given their strong psychometric properties, these constructs provide a solid foundation for future research and should be considered for further investigation by examining the structural relationships among them, particularly in the context of technology adoption in statistics education.
1. Introduction
The integration of educational technology (ET) has become a fundamental element across all levels of education, from primary and secondary schools to higher education institutions. Widely acknowledged for its role in enhancing teaching and learning, ET has had a transformative impact on educational systems globally. Universities, in particular, are under growing pressure to embed ET into their curriculum to better prepare students with the necessary skills for success in a highly competitive job market. Consequently, there is an increasing expectation for universities to equip undergraduates with relevant workforce competencies, especially in computer literacy and the use of advanced technologies that can distinguish them in their future careers.
In the field of statistics education - especially for students in the social sciences - grasping the subject matter presents unique challenges. The ability to analyse data, perform statistical calculations, and draw accurate conclusions often requires foundational knowledge that many students may not possess. Statistics remains a pivotal discipline in driving scientific discoveries, informing decision-making, and supporting accurate forecasting. For beginners, statistics courses are structured to explain fundamental concepts while simultaneously demonstrating their real-world applications across various fields. However, students who lack basic proficiency in statistics and mathematics may struggle to grasp these concepts. The transition to online learning during the COVID-19 pandemic further intensified these difficulties, requiring students to adapt to digital platforms, conduct analyses, and use statistical software in remote environments.
According to Tessema & Nicola-Gavrilă (2023), assessing the experiences of higher education institutions with online learning is essential, especially given the recent shift to digital education. These evaluations help identify how effective current technologies are in teaching, highlight key challenges, and suggest areas for improvement. In many developing countries, the future of education depends on how quickly and effectively national systems can adapt to technological change, requiring updates in both teaching methods and educational content.
A substantial body of research has examined academic staff perceptions toward the adoption of technology in statistics education. The Technology Acceptance Model (TAM) has been extensively utilized in educational settings to predict and explain user behaviour in relation to technology use (Davis, Granic, & Marangunic, 2023; Al-Qaysi, Nordin & Emran, 2020). A systematic review by Ulfert-Blank & Schmidt (2022) identified several critical factors that influence the adoption of instructional technology, including self-efficacy, subjective norms, perceived enjoyment, supporting conditions, anxiety, system accessibility, and perceived complexity. Moreover, research applying an extended TAM within Yemeni institutions has demonstrated how variables such as perceived usefulness, ease of use, attitude, and self-efficacy significantly affect users’ behavioural intentions to adopt technology in fields like accounting education (Lin & Roberts, 2020; Ho et al., 2020).
Within the scope of this research, several technologies commonly employed in statistics education were examined through the Technology Acceptance Model (TAM) framework. Across all courses, Moodle is the main learning management system (LMS) in most public universities in Malaysia, including those in the East Coast area. Lecturers can post a wide variety of materials on Moodle, including lecture slides, supplementary files, notes, recorded teaching sessions, quizzes, tests, final assessments, and assignment submissions. Platforms such as Zoom, Webex, Google Meet, and Microsoft Teams are commonly used to provide real-time interaction between lecturers and students in synchronous online courses. Some lecturers have also invested in personal teaching equipment, such as digital whiteboards, to improve online instruction in technical courses like statistics, where detailed explanations of formulas and problem-solving are needed. Particularly when difficult statistical ideas are being taught, these technologies allow for a more dynamic and visually supportive learning environment. By examining students' acceptance of these tools, the study offers pertinent insights into how digital platforms and teaching innovations affect the learning of statistics in a post-pandemic academic context.
Previous research has shown that the Technology Acceptance Model (TAM) is a useful framework for capturing academics' perceptions of technology use in statistics education, incorporating factors such as self-efficacy, subjective norms, enjoyment, supportive conditions, and system accessibility (Marikyan, Papagiannidis & Stewart, 2023). Although TAM is well established in digital education research, its application in statistics education, especially on the East Coast of Malaysia, remains underexplored. This paper addresses this gap by applying TAM to a particular regional setting where digital learning only gained momentum following the COVID-19 pandemic. Using Confirmatory Factor Analysis (CFA), the study provides a more thorough and validated assessment technique than earlier efforts that relied mostly on exploratory techniques or simple quantitative measures. Given the complexity of formulas and calculations, evaluating digital adoption in statistics education offers further understanding of how well technology supports cognitively demanding learning. To the authors' knowledge, no previous research has specifically examined this phenomenon in the East Coast region, making this study a significant and original contribution to the field. The following sections present the literature review, research methodology, key findings, and overall conclusions.
2. Literature Review
This study adapts the Technology Acceptance Model (TAM) to the pedagogical setting of undergraduate statistics instruction, since it provides a robust, theoretically sound framework for examining students' behavioural intentions regarding technology use. As digital learning tools increasingly permeate classroom settings, especially for demanding courses such as statistics that require comprehension of formulas and calculations, it is crucial to evaluate students' acceptance of, and readiness to use, these digital platforms. TAM, with its emphasis on perceived usefulness and perceived ease of use, is particularly pertinent for analysing students' perceptions of digital tools in enhancing their learning experience. Furthermore, few other models capture the intricacies of technology acceptance in educational contexts as thoroughly as TAM. Prior research consistently endorses TAM as a reliable framework for such inquiries, validating its application in this context.
This study utilizes the Technology Acceptance Model (TAM) to conform to recognized research methodologies while simultaneously addressing a notable gap in the literature on digital adoption in statistics education, particularly within the distinct post-pandemic educational environment of the East Coast region of Malaysia (Natasia et al., 2022). Within the educational context, TAM has been used to identify factors that predict the acceptance or rejection of technology in learning environments (Musa et al., 2024; Alfadda & Mahdi, 2021). As one of the most established frameworks for understanding technology usage in education, TAM focuses on the learner’s acceptance of technological tools (Mailizar, Almanthari, & Maulina, 2021). It also considers essential elements such as behavioural intention, attitude toward use, perceived usefulness, and perceived ease of use, which are critical to understanding how individuals adopt educational technologies (Tahar et al., 2020). Over time, researchers have extended TAM by integrating additional variables such as perceived enjoyment, self-efficacy, subjective norms, and facilitating conditions to gain a more comprehensive view of technology adoption, particularly in fields like statistics education (Li, Zhao, & Pu, 2020).
Perceived ease of use refers to the degree to which an individual believes that a technology is simple and effortless to use (Prastiawan et al., 2021). As a core component of TAM, it is often assessed alongside perceived usefulness to evaluate users’ attitudes toward adopting new technologies. Research suggests that ease of use may serve as a causal antecedent of perceived usefulness, meaning users’ evaluation of a technology’s simplicity may directly influence their perception of its value (Chen & Aklikokou, 2020). A technology perceived as easy to use can also enhance users’ self-efficacy - their confidence in performing tasks using the technology (Filieri et al., 2021). In combination, perceived ease of use and usefulness shape users' overall judgments and acceptance of a technology (Caffaro et al., 2020). Ultimately, ease of use plays a vital role in technology adoption, with its relevance often assessed using multi-item measurement scales.
Perceived usefulness is defined as the belief that using a particular technology or system will enhance performance or productivity (Pitafi, Kanwal, & Khan, 2020; Singh & Sinha, 2020). It is a central construct in TAM and has consistently been shown to significantly influence behavioural intention across various contexts. In the domain of statistics education, perceived usefulness is strongly associated with students’ willingness to adopt e-learning tools (Daragmeh, Lentner, & Sagi, 2021). This highlights the importance of students perceiving technology as beneficial to their learning outcomes. Therefore, enhancing users’ beliefs about the advantages of educational technology is essential in motivating its adoption. From an institutional perspective, improving these perceptions can lead to favourable individual and organizational outcomes, reinforcing the importance of prioritizing perceived usefulness in the design and implementation of educational technologies.
Attitude toward using technology significantly impacts both the acceptance and intended use of educational tools. Research indicates that attitudes are shaped by perceptions of usefulness, ease of use, and overall user experience (Sinaga & Pustika, 2021; Wang et al., 2021). A user’s emotional response and evaluative judgment about the benefits and simplicity of a technology are strong indicators of their attitude (Ismaili, 2021). Moreover, ease of use contributes to a positive attitude by meeting user expectations and improving system usability (Atabek, 2020). A favourable attitude toward technology often correlates with stronger behavioural intentions to use it (Conolly et al., 2020). Ultimately, perceived usefulness, ease of use, and previous experience play a critical role in shaping individuals’ attitudes and influencing their decisions to adopt educational technologies (Al-Rahmi et al., 2021).
Self-efficacy - the belief in one’s ability to perform tasks using technology - is a crucial factor influencing its adoption in statistics education. Studies have shown that individuals with high self-efficacy are more likely to accept and utilize technological tools (Natasia, Wiranti, & Parastika, 2022). In this context, technological self-efficacy refers to students’ confidence in using digital resources for learning statistics (Fauzi et al., 2021). Learners with higher self-efficacy are more likely to perceive these tools as useful and easy to use, leading to greater motivation and a more positive attitude toward technology-driven learning (Unal & Uzun, 2021). Furthermore, self-efficacy can mediate the relationship between learning motivation and technology acceptance. Students who perceive technology as effective and user-friendly are more motivated to engage in self-directed learning (Al-Rahmi et al., 2021). Strengthening self-efficacy through training, technical support, and positive learning experiences can significantly enhance students' readiness to adopt technology in statistics education (Murillo et al., 2021).
Social influence plays a vital role in shaping individuals' decisions to adopt educational technology, particularly in statistics education. It encompasses various social elements, including social identity, norms, and referents, which can all affect behaviour (Kurdi et al., 2020). Social identity theory suggests that individuals’ technology use is influenced by the norms and behaviours of the groups they identify with (Unal & Uzun, 2021). Additionally, perceived social norms, or expectations from peers and instructors, can further influence the adoption of new technologies (Natasia, Wiranti, & Parastika, 2022).
In the context of statistics education, students are more likely to adopt digital tools if they observe their peers and educators using them effectively (Pitafi, Kanwal, & Khan, 2020). Social encouragement and support from family, friends, or online communities also contribute to positive technology adoption behaviours (Sinaga & Pustika, 2021). Recognizing the impact of social influence is essential for educators and decision-makers aiming to promote technology use in education.
Figure 1 presents the proposed conceptual model, illustrating the relationships among key factors influencing behavioural intention. These include Perceived Usefulness, Perceived Ease of Use, Attitude Toward Using, Self-Efficacy, and Social Influence. The model suggests that perceived usefulness and ease of use affect behavioural intention both directly and indirectly through attitude, while self-efficacy and social influence contribute to shaping these perceptions.
Figure 1. Proposed conceptual model of factors influencing behavioural intention
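To make the hypothesized relationships in Figure 1 explicit, the sketch below restates them in lavaan-style model syntax as accepted by the open-source Python package semopy. The construct abbreviations (SI, SE, PU, PEOU, ATT, BI) and the indicator names are illustrative labels chosen here, not taken from the study's materials.

```python
# Hypothetical lavaan-style specification of the conceptual model in Figure 1
# (syntax accepted by semopy); all variable names are illustrative placeholders.
conceptual_model = """
# Measurement part: six reflective constructs, five indicators each
SI   =~ si1 + si2 + si3 + si4 + si5
SE   =~ se1 + se2 + se3 + se4 + se5
PU   =~ pu1 + pu2 + pu3 + pu4 + pu5
PEOU =~ peou1 + peou2 + peou3 + peou4 + peou5
ATT  =~ att1 + att2 + att3 + att4 + att5
BI   =~ bi1 + bi2 + bi3 + bi4 + bi5

# Structural part: self-efficacy and social influence shape the perception
# constructs, which affect behavioural intention directly and via attitude
PU   ~ SE + SI
PEOU ~ SE + SI
ATT  ~ PU + PEOU
BI   ~ PU + PEOU + ATT
"""
```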
3. Research Methodology

Sample Size
This study employed a probability-based sampling technique known as stratified sampling. The minimum required sample size was determined using the calculation method recommended by Hair & Alamer (2022) and Hair et al. (2021), which suggest a minimum of 10 respondents per measurement indicator. Given that the instrument comprised 30 Likert-scale items, the minimum sample size was calculated as 300 (30 × 10) respondents. To account for potential non-responses or incomplete data, approximately 420 questionnaires were prepared and distributed.
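Expressed as a simple check, the sampling arithmetic described above is:

```latex
n_{\min} = 10 \times 30 = 300, \qquad n_{\text{distributed}} = 420 \geq n_{\min}
```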
The study was conducted in the east coast region of Peninsular Malaysia, which includes the states of Terengganu, Pahang, and Kelantan. This region, in contrast to the western and southern zones, is characterized by a higher concentration of rural areas, where internet connectivity and access to technology remain relatively limited. Although universities in the region are generally equipped with technological infrastructure, such access is largely confined to the campus environment. Consequently, students residing outside the university premises may face challenges in accessing online learning tools. Furthermore, the majority of undergraduate students enrolled in these universities are originally from the same states, reinforcing the regional specificity and relevance of this study. Based on these considerations, the sample size obtained from this region was deemed appropriate for achieving the study's objectives.
Following data collection, a comprehensive data cleaning process was undertaken to ensure data quality and to meet the assumptions of normal distribution. A total of 420 responses were initially gathered via an online survey platform. The dataset was then imported into IBM SPSS Version 25.0, where several quality checks were performed. These included reviewing item responses, identifying and handling missing values, correcting case alignment issues, and detecting outliers. After this process, 41 responses were excluded due to issues such as incompleteness or inconsistencies, resulting in a final sample of 379 valid responses used for subsequent inferential analyses.
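The screening described above was performed in IBM SPSS. Purely as an illustration of the same steps, a scripted version might resemble the sketch below, where the file name, the `respondent_id` column, and the `q`-prefixed item names are hypothetical.

```python
import pandas as pd

# Load the raw export from the online survey platform (hypothetical file name).
raw = pd.read_csv("survey_export.csv")          # 420 collected responses

likert_items = [c for c in raw.columns if c.startswith("q")]  # 30 Likert items

# 1. Drop duplicate submissions and responses with missing item values.
clean = raw.drop_duplicates(subset="respondent_id")
clean = clean.dropna(subset=likert_items)

# 2. Flag univariate outliers on each item using z-scores (|z| > 3).
z = (clean[likert_items] - clean[likert_items].mean()) / clean[likert_items].std()
clean = clean[(z.abs() <= 3).all(axis=1)]

print(len(raw), "collected;", len(clean), "retained for analysis")
# The study reports 41 exclusions, leaving 379 valid responses.
```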
This study adopts a quantitative research design utilizing a closed-ended, online-administered questionnaire. In Malaysia, aside from numerous private universities and community colleges, there are five key public universities known for their strong emphasis on statistics education: Universiti Sultan Zainal Abidin (UniSZA), Universiti Malaysia Terengganu (UMT), Universiti Malaysia Pahang (UMP), Universiti Teknologi MARA (UiTM), and Universiti Sains Malaysia (USM) (Al-Hattami, 2021). These institutions have been the focus of increasing interest due to their active engagement in statistics education initiatives.
Due to the unavailability of comprehensive data on the academic population, the total population was not specified. Therefore, this study focused on gathering insights from academic staff with expertise in statistics, as their perspectives are essential in understanding the adoption and integration of technological innovations in education (Mwendwa, 2017).
Data collection was conducted through a random sampling technique, where the online questionnaire was distributed via Google Forms to a targeted group of lecturers in business and management faculties. The use of an online platform ensured not only convenience and wider reach but also reduced health risks associated with physical interaction, in line with ongoing social distancing measures.
This study employs Confirmatory Factor Analysis (CFA), which forms part of the Covariance-Based Structural Equation Modelling (CB-SEM) framework. CFA is a widely accepted technique in fields such as business, social sciences, information systems, and education (Al-Hattami et al., 2021; Abdelrahman, 2020). One of its primary strengths lies in its ability to assess the validity and fit of each construct within a proposed measurement model.
To evaluate construct validity, the analysis incorporates model fit indices, including absolute fit, incremental fit, and parsimonious fit, which reflect the degree of consistency between the observed data and the hypothesized model. These indices are specific to CB-SEM and are instrumental in verifying the adequacy of the measurement model, thereby reinforcing the credibility of survey-based data (Rasheed et al., 2023).
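For reference, two of the commonly reported indices in these fit categories are defined as follows (standard textbook formulations, not reproduced from the study):

```latex
\mathrm{RMSEA} = \sqrt{\max\!\left(\frac{\chi^{2}_{M} - df_{M}}{df_{M}\,(N-1)},\; 0\right)},
\qquad
\mathrm{CFI} = 1 - \frac{\max\!\left(\chi^{2}_{M} - df_{M},\; 0\right)}{\max\!\left(\chi^{2}_{0} - df_{0},\; 0\right)}
```

where the subscripts M and 0 denote the hypothesized and baseline (independence) models, respectively, and N is the sample size.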
In addition, factor loading (also known as indicator loading) is used to assess how well each observed item represents its respective latent construct. Higher loadings suggest stronger item-construct alignment (Mahmood et al., 2022). Furthermore, higher factor loadings contribute to increased values of Composite Reliability (CR) and Average Variance Extracted (AVE), both of which are critical indicators of a model’s internal consistency and convergent validity. As such, CFA serves as a crucial step in validating the measurement model before advancing to hypothesis testing to explore the structural relationships among the constructs. Thus, the methodology enhances the robustness of the research findings and supports the theoretical adaptation of TAM within a new educational setting.
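Both indices are conventionally computed from the standardized loadings of a construct's k indicators:

```latex
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_{i}\right)^{2}}
{\left(\sum_{i=1}^{k}\lambda_{i}\right)^{2} + \sum_{i=1}^{k}\left(1-\lambda_{i}^{2}\right)},
\qquad
\mathrm{AVE} = \frac{\sum_{i=1}^{k}\lambda_{i}^{2}}{k}
```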
4. Findings
This study employed Confirmatory Factor Analysis (CFA) within the Covariance-Based Structural Equation Modelling (CB-SEM) framework via IBM SPSS AMOS software. The CFA was used to assess the quality of each item within its construct by examining factor loadings, evaluating the fit of the measurement model, checking construct correlations, and assessing construct reliability and validity.
The data collected from the online survey encompassed a sample of 379 respondents, drawn from five universities situated in the East Coast region: UniSZA, UMT, UiTM, UMPSA, and USM. These universities were chosen strategically due to their potential for attracting participants, particularly given the availability of the Business Statistics course in the current semester. The profile of respondents was diverse, with a slight majority of females (58.31%) compared to males (41.68%). Regarding age distribution, the largest group of respondents fell within the 21-23 years bracket (52.5%), followed by those aged 18-20 years (25.59%).
In terms of educational background, as presented in Table 1, bachelor's programme students constituted the largest proportion (56.72%) of the respondents, totalling 215 participants. UniSZA emerged as the primary contributor to the survey responses (37.46%), followed by UMT (21.37%), likely due to the survey's availability during the short semester at UniSZA. Notably, a substantial preference for online classes over in-person sessions was observed among respondents, with nearly a 30% disparity between the two groups. Additionally, perceptions of the difficulty level of statistics varied, with the majority of respondents (49.86%) perceiving it as moderate, while 36.14% found it difficult. These findings shed light on the demographic composition and preferences of the surveyed population, offering valuable insights for further analysis and interpretation of the survey results.
Table 1. Demographic profile of respondents (N = 379)

| Variable | Frequency | Percentage (%) |
|---|---|---|
| Gender | | |
| Male | 158 | 41.68 |
| Female | 221 | 58.31 |
| Age | | |
| 18-20 years | 97 | 25.59 |
| 21-23 years | 199 | 52.5 |
| 24-26 years | 64 | 16.88 |
| More than 26 years | 19 | 5.01 |
| Education Level | | |
| Diploma | 81 | 21.37 |
| Bachelor | 215 | 56.72 |
| Master | 83 | 21.89 |
| University | | |
| UniSZA | 142 | 37.46 |
| UMT | 81 | 21.37 |
| UiTM | 75 | 19.78 |
| UMPSA | 53 | 13.98 |
| USM | 28 | 7.38 |
| Class Preference | | |
| Online | 243 | 64.11 |
| Offline | 136 | 35.88 |
| Statistics Difficulty | | |
| Easy | 53 | 13.98 |
| Moderate | 189 | 49.86 |
| Difficult | 137 | 36.14 |
For the inferential analysis, this study first conducts Confirmatory Factor Analysis (CFA). The study examines six main constructs, all of which are modelled as reflective measurement models. Each construct - Social Influence, Self-Efficacy, Perceived Usefulness, Perceived Ease of Use, Attitude Toward Using, and Behavioural Intention - is assessed using five indicators. The CFA is performed using IBM SPSS AMOS 26, and correlations among all constructs are estimated, as illustrated in Figure 2.
Figure 2. Pooled CFA measurement model with construct correlations
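The measurement model was estimated in IBM SPSS AMOS 26. As an open-source illustration of the same pooled-CFA specification (not the software used in the study), the sketch below fits six correlated reflective constructs with the Python package semopy; the data file and indicator names are hypothetical placeholders for the items listed in Table 2.

```python
import pandas as pd
from semopy import Model, calc_stats

# Pooled CFA: six reflective constructs, five indicators each (cf. Table 2).
# All latent constructs are exogenous, so their correlations are estimated,
# mirroring the correlational structure shown in Figure 2.
cfa_spec = """
SI   =~ si1 + si2 + si3 + si4 + si5
SE   =~ se1 + se2 + se3 + se4 + se5
PU   =~ pu1 + pu2 + pu3 + pu4 + pu5
PEOU =~ peou1 + peou2 + peou3 + peou4 + peou5
ATT  =~ att1 + att2 + att3 + att4 + att5
BI   =~ bi1 + bi2 + bi3 + bi4 + bi5
"""

data = pd.read_csv("responses_379.csv")     # hypothetical cleaned data set

model = Model(cfa_spec)
model.fit(data)                             # maximum-likelihood estimation

print(model.inspect(std_est=True))          # standardized loadings and covariances
print(calc_stats(model).T)                  # chi-square/df, CFI, TLI, RMSEA, etc.
```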
Table 2. Factor loadings, effect sizes (Necessary Condition Analysis), composite reliability, and average variance extracted

| Items | Factor Loading | Effect Size (Necessary Condition Analysis) | Composite Reliability (CR) | Average Variance Extracted (AVE) |
|---|---|---|---|---|
| Social Influence (Kurdi et al., 2020; Unal & Uzun, 2021; Sinaga & Pustika, 2021) | | | | |
| Online learning communities influence my adoption of educational technology. | .753 | 0.18 | 0.869 | 0.570 |
| My family supports my use of technology for educational purposes. | .816 | 0.32 | | |
| I use technology for learning statistics because my classmates do. | .735 | 0.40 | | |
| My instructors expect me to use technology in my statistics coursework. | .734 | 0.24 | | |
| My peers encourage me to use technology for learning statistics. | .735 | 0.34 | | |
| Self-Efficacy (Natasia, Wiranti, & Parastika, 2022; Fauzi et al., 2021; Murillo et al., 2021) | | | | |
| I believe I can successfully complete statistical tasks using technology. | .701 | 0.22 | 0.862 | 0.556 |
| I feel competent in applying technology to solve statistical problems. | .776 | 0.25 | | |
| I can troubleshoot minor technical issues while using educational technology. | .739 | 0.31 | | |
| I can effectively use statistical software and online tools without help. | .763 | 0.27 | | |
| I am confident in my ability to use technology for statistics learning. | .748 | 0.34 | | |
| Perceived Usefulness (Pitafi, Kanwal, & Khan, 2020; Singh & Sinha, 2020; Daragmeh, Lentner, & Sagi, 2021) | | | | |
| Using technology enhances my learning experience in statistics. | .647 | 0.33 | 0.838 | 0.516 |
| Educational technology improves my understanding of statistical concepts. | .660 | 0.29 | | |
| Technology makes completing statistics-related tasks more efficient. | .674 | 0.27 | | |
| Learning statistics through technology increases my productivity. | .622 | 0.33 | | |
| I find technology useful in performing my statistical coursework. | .940 | 0.41 | | |
| Perceived Ease of Use (Prastiawan et al., 2021; Chen & Aklikokou, 2020; Filieri et al., 2021) | | | | |
| I can use educational technology without assistance. | .766 | 0.35 | 0.856 | 0.543 |
| It is easy for me to become skilful at using technology for learning statistics. | .753 | 0.35 | | |
| I find educational technology to be flexible and easy to use. | .721 | 0.37 | | |
| Interacting with technology for learning statistics is simple and understandable. | .742 | 0.32 | | |
| I find it easy to learn how to use technology for statistics education. | .702 | 0.33 | | |
| Attitude Toward Using | | | | |
| I enjoy using technology for learning statistics. | .744 | 0.25 | 0.872 | 0.578 |
| I believe that using technology makes learning statistics more engaging. | .739 | 0.36 | | |
| Using educational technology is a good idea for statistics learning. | .789 | 0.31 | | |
| I have a positive attitude towards integrating technology into my studies. | .773 | 0.33 | | |
| I feel comfortable using technology for statistics coursework. | .755 | 0.26 | | |
| Behavioural Intention | | | | |
| I intend to use technology-based tools (e.g., statistical software, online learning platforms) in my statistics courses regularly. | .741 | 0.31 | 0.861 | 0.554 |
| I plan to continue using technology for learning statistics even after completing my current course. | .748 | 0.32 | | |
| I am willing to recommend the use of technology to my peers for learning statistical concepts. | .737 | 0.37 | | |
| I expect to rely on technology (e.g., e-learning, statistical software) to improve my understanding of statistics in the future. | .754 | 0.26 | | |
| If I have access to technology-based learning tools, I will actively use them to enhance my statistical skills. | .740 | 0.23 | | |
The findings presented in Table 2 demonstrate that most questionnaire items exhibited strong factor loadings, affirming the reliability and validity of the indicators in representing their respective constructs. According to Hair et al. (2019), a factor loading of 0.60 or above is considered acceptable, indicating that an item makes a meaningful contribution to the latent variable it intends to measure.
For the Social Influence construct, factor loadings ranged from 0.73 to 0.82, suggesting that external social factors - such as peer and instructor support - play an important role in students' technology adoption within the context of statistics education. This implies that students are influenced by their academic environment and social encouragement in their willingness to engage with digital tools.
The Self-Efficacy construct displayed loadings between 0.70 and 0.78, highlighting students’ confidence in their ability to use technology effectively for learning statistics. The consistent loadings within this range underscore the construct's strength in capturing students’ perceived competence in navigating educational technologies.
For Perceived Usefulness, factor loadings varied more widely, from 0.57 to 0.94. Although one item fell slightly below the recommended threshold, the majority exceeded 0.60, with some achieving very high values. This suggests that students generally perceive technology as beneficial, although the variation in loadings may indicate differences in how specific aspects of usefulness are interpreted in the learning context.
The Perceived Ease of Use construct showed factor loadings between 0.70 and 0.77, confirming that students view the technology as accessible and user-friendly. This finding aligns with the theoretical expectations of the Technology Acceptance Model (TAM), which posits ease of use as a critical factor in user acceptance.
Attitude toward Using Technology exhibited loadings ranging from 0.74 to 0.79, reflecting a consistently positive perception among students. These values support the construct's ability to capture students’ favourable disposition toward integrating technology into their learning experiences. Lastly, the Behavioural Intention construct had a narrow but strong loading range of 0.74 to 0.75, indicating a reliable and stable measurement of students’ willingness to adopt educational technology for statistics learning.
The overall model fit was evaluated using several established indices, including RMSEA, Chi-square/df, CFI, IFI, and TLI. The model demonstrated acceptable fit, with RMSEA falling below the recommended cut-off of 0.08, suggesting a good approximation of the population model. Additionally, the Chi-square/df ratio was below 3.0, confirming the model’s suitability. The comparative fit indices - CFI, IFI, and TLI - were all above 0.90, indicating a well-fitting model according to standard evaluation criteria (Hair et al., 2019).
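The evaluation logic applied here can be summarized as a simple threshold check. In the sketch below, the index values are placeholders, since the exact estimates are not reproduced in the text; only the cut-offs come from the cited criteria.

```python
# Illustrative fit-index screening against the cut-offs cited in the text
# (RMSEA < 0.08, chi-square/df < 3.0, CFI/IFI/TLI > 0.90).
# The numeric values below are placeholders, not the study's actual estimates.
fit_indices = {"RMSEA": 0.05, "ChiSq/df": 2.1, "CFI": 0.93, "IFI": 0.93, "TLI": 0.92}

thresholds = {
    "RMSEA":    lambda v: v < 0.08,
    "ChiSq/df": lambda v: v < 3.0,
    "CFI":      lambda v: v > 0.90,
    "IFI":      lambda v: v > 0.90,
    "TLI":      lambda v: v > 0.90,
}

for index, value in fit_indices.items():
    status = "acceptable" if thresholds[index](value) else "not acceptable"
    print(f"{index} = {value}: {status}")
```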
Furthermore, all construct correlations were below 0.85, eliminating concerns of multicollinearity. This confirms that each construct maintains its theoretical distinctiveness, supporting the validity of the model in capturing diverse dimensions of technology acceptance within the context of statistics education.
Reliability was assessed using Composite Reliability (CR), which evaluates the internal consistency of each construct. The CR values ranged from 0.838 to 0.872, surpassing the commonly accepted threshold of 0.70 (Fornell & Larcker, 1981). These results confirm that the constructs demonstrate a high level of reliability and are consistently measured across the sample.
Validity was evaluated through both Average Variance Extracted (AVE) and an analysis of discriminant validity. The AVE values ranged between 0.516 and 0.578, exceeding the minimum benchmark of 0.50, thereby confirming that each construct accounts for more than half of the variance in its indicators - evidence of strong convergent validity (Hair et al., 2019).
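As a worked verification of these figures, the following snippet recomputes CR and AVE for the Social Influence construct from its standardized loadings in Table 2, using the conventional formulas given in the methodology section; it reproduces the tabled values of 0.869 and 0.570.

```python
# Recompute CR and AVE for Social Influence from its standardized loadings (Table 2).
loadings = [0.753, 0.816, 0.735, 0.734, 0.735]

sum_l = sum(loadings)
sum_l2 = sum(l**2 for l in loadings)
error_var = sum(1 - l**2 for l in loadings)      # indicator error variances

cr = sum_l**2 / (sum_l**2 + error_var)           # composite reliability
ave = sum_l2 / len(loadings)                     # average variance extracted

print(f"CR = {cr:.3f}, AVE = {ave:.3f}")         # CR = 0.869, AVE = 0.570
```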
To assess discriminant validity, the study applied the Fornell & Larcker (1981) criterion, which involves comparing the square root of each construct’s AVE with its correlations with other constructs. As shown in Table 3, the results indicate that all constructs were empirically distinct, confirming that the measurement model successfully captures unique and non-overlapping aspects of students' perceptions regarding technology adoption in statistics education. This distinction is essential to ensure the theoretical integrity and validity of the model.
Table 3. Discriminant validity of the constructs (diagonal values are the square roots of the AVE)

| | Attitude Toward Using | Social Influence | Self-Efficacy | Perceived Usefulness | Perceived Ease of Use | Behavioural Intention |
|---|---|---|---|---|---|---|
| Attitude Toward Using | 0.760 | | | | | |
| Social Influence | 0.474 | 0.755 | | | | |
| Self-Efficacy | 0.521 | 0.500 | 0.746 | | | |
| Perceived Usefulness | 0.450 | 0.382 | 0.426 | 0.701 | | |
| Perceived Ease of Use | 0.479 | 0.496 | 0.508 | 0.406 | 0.737 | |
| Behavioural Intention | 0.471 | 0.566 | 0.515 | 0.341 | 0.521 | 0.744 |
A comparison between each construct’s correlations with the other constructs and the square root of its Average Variance Extracted (AVE) was the method used to assess discriminant validity. To establish discriminant validity, Fornell & Larcker (1981) suggested that the square root of the AVE should be greater than the correlation coefficients between the constructs. As shown in Table 3, this criterion is satisfied for all constructs.
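This comparison can be reproduced directly from Table 3, as in the sketch below, which contrasts each diagonal entry (the square root of the construct's AVE) with the largest correlation involving that construct.

```python
import numpy as np

constructs = ["ATT", "SI", "SE", "PU", "PEOU", "BI"]

# Lower-triangular matrix from Table 3; diagonal = sqrt(AVE) of each construct.
m = np.array([
    [0.760, 0,     0,     0,     0,     0    ],
    [0.474, 0.755, 0,     0,     0,     0    ],
    [0.521, 0.500, 0.746, 0,     0,     0    ],
    [0.450, 0.382, 0.426, 0.701, 0,     0    ],
    [0.479, 0.496, 0.508, 0.406, 0.737, 0    ],
    [0.471, 0.566, 0.515, 0.341, 0.521, 0.744],
])

corr = m + m.T - np.diag(np.diag(m))             # full symmetric matrix

for i, name in enumerate(constructs):
    sqrt_ave = m[i, i]
    others = np.delete(corr[i], i)               # correlations with other constructs
    print(f"{name}: sqrt(AVE) = {sqrt_ave:.3f}, "
          f"max correlation = {others.max():.3f}, "
          f"distinct = {sqrt_ave > others.max()}")
```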
5. Discussion
The quantitative data gathered in this study offered valuable insights into the efficacy of the platform in improving students' experiences with statistics learning, as well as into their opinions and acceptance of the platform. The main goal was to investigate the platform's acceptability and its effect on learning outcomes. Rigorous statistical techniques were used to guarantee the strength of the research model. Specifically, CFA was used to assess the validity and reliability of all latent constructs within the model (Sukhov, Olsson, & Friman, 2022).
The CFA findings showed notable inter-construct correlations, good model fit, and meaningful relationships among the observed variables, all of which confirm the integrity of the measurement model. These results highlight CFA as a valuable tool for applied researchers, providing thorough validation of measurement frameworks and supporting further investigation of the intricate interactions among components (Afthanorhan, Awang, & Aimran, 2020). Still, it must be recognized that the interpretation of such outcomes is governed by essential assumptions, including the handling of uncertainty and the suitability of the estimation and hypothesis-testing methods used.
Apart from model validation, the results of this study also have practical implications for the design and development of future technological aids in statistics education. Students lacking basic mathematical knowledge may find undergraduate statistics courses quite difficult. These obstacles are even more pronounced in digital learning contexts, where students may struggle with abstract ideas, especially when important formulas and data interpretations are presented without adequate interactive or explanatory support.
There is now an increasing demand for educational platforms that are not only functionally accessible but also pedagogically flexible. Including dynamic representations, real-time feedback systems, and contextualized examples, for instance, can help convert abstract statistical ideas into more accessible forms. Smart learning systems that tailor instructional paths to student performance and offer timely scaffolding can also improve understanding and motivation. When grounded in empirical data such as that presented in this study, these technical developments have the potential to greatly enhance learning outcomes and promote wider acceptance of digital platforms in statistics teaching.
Ultimately, the use of robust statistical methods such as CFA made it possible to confirm the study model and to gain a more in-depth understanding of students' views and acceptance of the technological platform in statistics teaching. The discussion, meanwhile, emphasizes the need to use these results as a basis to guide the development of more focused, learner-centred digital tools. Future instructional technologies will be better able to assist diverse students in mastering statistical ideas through intentional design that addresses typical learning barriers (Liu, Yu, & Damberg, 2022; Farmaki et al., 2022).
6. Conclusion
This study emphasizes the value of employing multi-method validation to enhance both the theoretical foundations and practical implications within the social sciences. The use of Confirmatory Factor Analysis (CFA) marks a notable step forward in methodological rigor, enabling researchers to verify the measurement structure and strengthen the reliability of research models. Although the findings offer meaningful contributions, future studies are encouraged to further advance validation techniques, incorporate more diverse samples, and adopt complementary statistical approaches to ensure the continued robustness and relevance of research outcomes in a dynamic and evolving academic environment.
While the integration of Confirmatory Factor Analysis (CFA) adds methodological strength to this study, several limitations should be acknowledged. A primary limitation concerns the generalizability of the findings, as the research focuses specifically on students enrolled in statistics courses. To enhance external validity, future studies should consider larger and more diverse samples, incorporating participants from various educational backgrounds, disciplines, and professional sectors. Another limitation lies in the methodological scope of the study. Although CFA offers robust construct validation, incorporating additional analytical techniques - such as machine learning algorithms, Bayesian statistical methods, or mixed-method approaches - could further enhance predictive accuracy and provide deeper insights into complex interrelationships (Kim et al., 2024; Miladinia et al., 2024). Future research is encouraged to adopt multi-method validation strategies to reduce potential biases and improve the overall reliability of results (Russo et al., 2024).
Although digital platforms in statistics education are being more widely used, many issues remain that could impede students' learning experiences. Many undergraduates lack basic mathematical knowledge, which hinders their understanding of complicated statistical ideas, especially in online settings. The lack of real-time engagement, limited individual feedback, and the linearity of material delivery typically aggravate these difficulties and can cause cognitive overload and reduced motivation. Recent developments in educational technology have brought artificial intelligence (AI) into the spotlight as a possible answer to these problems. Offering real-time feedback and scaffolded support, AI-powered technologies such as intelligent tutoring systems can dynamically adjust material depending on students' learning progress. While conversational agents such as chatbots let students engage in interactive problem-solving outside the classroom, natural language processing can improve learning by offering context-specific explanations. By making digital platforms more responsive, intuitive, and aligned with individual learner needs, these AI-driven strategies have the potential to transform statistics teaching and thereby promote better understanding and academic performance.
Lastly, contextual factors such as educational settings, cultural norms, and access to technology may influence the interpretation and applicability of the results. Since this study primarily focuses on students within a structured academic environment, the findings may not fully reflect the experiences of learners in different institutional or professional contexts. To address this, future research should be extended across various educational institutions, disciplines, and geographic locations to enhance the transferability and relevance of the conclusions drawn.
Conceptualization: A. A.; methodology: A. A.; validation: S. A. F. S. A. T.; formal analysis: A. A.; investigation: N. Z. M.; resources: S. A. F. S. A. T.; data presentation: A. N. A.; data curation: N. Z. M.; data analysis: S. A. F. S. A. T.; literature review: A. N. A.; composition of original draft: A. N. A.; composition of the initial draft: N. Z. M; writing—review and editing: S. A. F. S. A. T.; visualization: S. A. F. S. A. T.; supervision: A. A.; project management: N. H. F. All authors have read and agreed to the published version of the manuscript.
The data used to support the research findings are available from the corresponding author on request.
We also gratefully acknowledge the financial support provided by Universiti Sultan Zainal Abidin (UniSZA) through two research grants: the Internal University Grant (DPU 2.0) [Project Code: UniSZA/2023/DPU 2.0/33] and the International Collaborative Research Grant [Project Code: UniSZA/2024/PSU-TDP/07]. The support from these funding sources has been instrumental in the successful completion of this study.
The authors declare that there were no commercial or financial relationships that could be construed as a potential conflict of interest.
