Abstract
Purpose
Since the onset of the COVID-19 pandemic in China, student engagement in online learning has been a critical issue for all educational institutions. The university student engagement inventory (USEI) is the most widely used scale for assessing the construct of student engagement. The present study aimed to evaluate the psychometric properties of the USEI among 1504 Chinese university students who completed a survey through an online platform between December 2020 and January 2021.
Design/methodology/approach
In this cross-sectional study, content validity, construct validity and reliability of the scale were assessed.
Findings
The results supported the three-factor model with acceptable goodness of fit (χ2 (71) = 369.717, p < 0.001, χ2/df = 5.207, comparative fit index (CFI) = 0.967, normed fit index (NFI) = 0.960, Tucker–Lewis index (TLI) = 0.958, standardized root mean square residual (SRMR) = 0.030, root mean square error of approximation (RMSEA) (90% CI) = 0.053 [0.049, 0.057]), good internal consistency and construct reliability (Cronbach's alpha and omega coefficient >0.70) and strong convergent validity. Measurement invariance across gender was also confirmed.
Originality/value
This study showed that the three-factor structure of the USEI with Chinese university students had good construct validity, internal consistency and reliability. It could help measure student engagement in online learning in China.
Citation
She, L., Khoshnavay Fomani, F., Marôco, J., Allen, K.-A., Sharif Nia, H. and Rahmatpour, P. (2023), "Psychometric properties of the university student engagement inventory among Chinese students", Asian Association of Open Universities Journal, Vol. 18 No. 1, pp. 46-60. https://doi.org/10.1108/AAOUJ-08-2022-0111
Publisher
Emerald Publishing Limited
Copyright © 2023, Long She, Fatemeh Khoshnavay Fomani, João Marôco, Kelly-Ann Allen, Hamid Sharif Nia and Pardis Rahmatpour
License
Published in the Asian Association of Open Universities Journal. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) license. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this license may be seen at http://creativecommons.org/licences/by/4.0/legalcode
Introduction
An increasing number of studies support the importance of students' learning experiences. As a result, researchers, practitioners and policymakers have focused on student engagement, including its measures (Bond et al., 2020). In this context, several influential theories, such as the theory of involvement (Astin, 1999), sociocultural theories of engagement (Kahu, 2013; Kahu and Nelson, 2018a, b) as well as the conceptualization of student engagement (Fredricks et al., 2004) have developed clarification around the widely used construct.
Student engagement, which can be distinguished from other related concepts, such as belonging, involvement or participation (Baron and Corbin, 2012), is dynamically influenced by the positive behavioral, cognitive and affective components exhibited during deep learning (Jean, 2015). A positive, fulfilling state of mind about one's work is a crucial component of engagement (Antoinette Bargagliotti, 2012). Consequently, many studies agree that the three dimensions of behavioral (BE), emotional (EE) and cognitive engagement (CE) best explain the dynamic and multifaceted aspects of student engagement (Allen and Boyle, 2023; Fredricks et al., 2016). In the learning process, BE is the explicit behavior that reflects students' participation in learning. CE involves students' thoughts, mainly their desire and effort to understand and master complex content and skills. In contrast, EE involves a student's positive and negative reactions (e.g. happiness, sadness, anxiety) to learning opportunities and assignments (Hu and Li, 2017; Maroco et al., 2016; Sinval et al., 2021). The emotional component has sometimes been conceptualized as a student's sense of belonging to an academic institution; however, there is a growing understanding that student engagement (sometimes referred to as academic engagement) is a distinct construct (see Allen and Boyle, 2023 for a review; Allen et al., 2021b; Furlong et al., 2003). Thus, having reliable and valid measurement tools allows researchers to be precise in their conceptualization and operationalization of constructs and to avoid imprecise terminology (Allen et al., 2021a; Allen and Kern, 2017, 2019).
According to the United States (US) “National Survey of Student Engagement” (NSSE, 2014), the amount of time and effort that a student invests in studying and undertaking other meaningful educational activities can be encompassed within the concept of student engagement (Whitton and Moseley, 2014). Other student engagement outcomes in educational systems are indicated by a range of activities that enhance learning and professional development (Peters et al., 2019), performance as well as an institution's reputation (Trowler, 2010). At an individual level, in addition to personal development, satisfaction and feelings of well-being are also recognized outcomes (Jean, 2015; Rahmatpour et al., 2019b). Furthermore, higher student engagement in academic activities is associated with higher academic achievement (Dunn and Kennedy, 2019; Lei et al., 2018), resilience (Ahmed et al., 2018; Skinner et al., 2016) as well as a lowered risk of burnout (Paloș et al., 2019; Rahmatpour et al., 2019a). It is also reportedly associated with an improved sense of school belonging (Wong et al., 2019) as well as a higher retention rate (O'Keeffe, 2013) and lower rates of drop-out (Klem and Connell, 2004).
The COVID-19 pandemic forced global shutdowns of educational institutions (e.g. universities), causing educational systems to adopt drastic changes, such as remote learning, in which traditional face-to-face instruction pivoted to delivery through online platforms. Education systems in China followed a policy entitled Suspending Classes without Stopping Learning, which shifted learning to online formats (She et al., 2021a; Zhang et al., 2020). Online learning refers to using the Internet and related technologies to prepare and present materials for educational, teaching and program management purposes (Fry, 2001). Despite the challenges of the COVID-19 pandemic, many students were able to maintain their commitment to learning through online platforms and remain affectively involved in developing new knowledge and skills (Borup et al., 2020), while others reported a decline in engagement (Huang, 2020; Unger and Meiran, 2020). Emerging studies have suggested that the lack of interaction between university students and instructors during online classes made it challenging to maintain student interest and engagement (Huang, 2020; Unger and Meiran, 2020). The reported decline in student engagement has led some researchers to call for student engagement to be made a top priority in higher education institutions (Nickerson and Shea, 2020; Zhang et al., 2020). Positive academic emotions and student adaptability (Zhang et al., 2020), both critical factors of student engagement, are essential for students during times of uncertainty.
Assessing student engagement helps educational systems investigate student success in achieving academic goals (Caruth, 2018; Kahu and Nelson, 2018a, b). In a comprehensive review, Jiang et al. (2021) found that assessing student engagement can improve the quality and effectiveness of online learning. Among the existing self-report questionnaires for evaluating student engagement, fewer instruments target engagement among university students and very few consider institutional factors. Given the well-established importance of student engagement for university retention (Tight, 2020) and academic outcomes (Soffer and Cohen, 2019; Xerri et al., 2018), as well as the influence of university-related factors (Maroco et al., 2016), a valid tool that can reliably measure university student engagement is needed.
The student engagement questionnaire (SEQ) was developed to evaluate both student engagement and university teaching-learning processes (Kember and Leung, 2009) and has been validated among a Spanish university student population (Gargallo et al., 2018). Two other measures, the National Survey of Student Engagement (NSSE) and the Beginning College Survey of Student Engagement (BCSSE), were developed to study engagement among first-year college students (Chambers and Chiang, 2012) by capturing study habits and experiences (Wefald and Downey, 2009); however, both have been criticized for not measuring all aspects of student engagement, especially those related to systemic considerations. As a result, the University Student Engagement Inventory (USEI) was developed to respond to the limitations of existing student learning engagement measures and to address criticisms of their psychometric properties (Maroco et al., 2016). The USEI assesses the engagement of university students with the multidimensional nature of engagement, as well as educational and organizational perspectives, as core considerations (Maroco et al., 2016). The measure has demonstrated strong dimensionality, internal reliability and invariance across various countries, making it a suitable and robust tool to adapt to diverse settings such as China (Costa and Marôco, 2017; Sharif Nia et al., 2022; Sinval et al., 2021).
The current study
Given the importance of assessing student engagement for the quality and effectiveness of online learning (Jiang et al., 2021), the growth of online learning, reported declines in student engagement, concerns with the psychometric properties of existing student engagement measures and the lack of validated measures in mainland China specifically, valid and reliable measures are urgently needed. The present study therefore investigates the psychometric properties of the USEI among Chinese university students.
Methods
Design
A cross-sectional, questionnaire-based design was used to evaluate the psychometric properties of the Chinese version of the USEI in an online learning context. An online survey was created using the online questionnaire platform Sojump, and data were collected from December 2020 to January 2021 by sending the survey to university students in five major cities in China (Urumchi, Lanzhou, Zhengzhou, Qingdao and Shijiazhuang). To be included, respondents had to be Chinese university students who (1) had only experienced online learning modes during the COVID-19 pandemic and (2) were willing to take part in the study. Sample selection was based on convenience sampling.
Measures
The online questionnaire consisted of two parts. The first part asked respondents for basic demographic characteristics (i.e. age, gender, current educational level and year of study), as well as the number of online classes they took per week during the pandemic. The second part included the 15 items of the USEI across its three subfactors, namely BE (e.g. “I usually do my homework on time”), EE (e.g. “I feel excited about the schoolwork”) and CE (e.g. “when I read a book, I question myself to make sure I understand the subject I'm reading about”). A 5-point Likert scale from one (never) to five (always) was used to score each USEI item; with five items per subscale, each subscale score ranges from 5 to 25, and higher scores indicate greater student engagement (Maroco et al., 2016). Item 6 of the scale was reverse-coded (“I don't feel very accomplished at this school”), and certain items were modified to reflect the online learning context, with, for example, “classroom” replaced by “online class”.
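As an illustration of the scoring rules just described, the sketch below (not the authors' code) reverse-codes item 6 and sums each subscale. The item-to-subscale mapping (BE: items 1-5, EE: items 6-10, CE: items 11-15) is our assumption, inferred from the items quoted in this article.

```python
def score_usei(responses):
    """responses: dict mapping item number (1-15) to a rating in 1..5.
    Returns the three subscale sums (each ranging from 5 to 25)."""
    r = dict(responses)
    r[6] = 6 - r[6]  # reverse-code item 6 ("I don't feel very accomplished...")
    subscales = {
        "BE": range(1, 6),    # assumed mapping: behavioral engagement
        "EE": range(6, 11),   # assumed mapping: emotional engagement
        "CE": range(11, 16),  # assumed mapping: cognitive engagement
    }
    return {name: sum(r[i] for i in items) for name, items in subscales.items()}

scores = score_usei({i: 4 for i in range(1, 16)})
# item 6 becomes 6 - 4 = 2, so EE = 2 + 4 + 4 + 4 + 4 = 18; BE = CE = 20
```

Higher subscale scores indicate greater engagement on that dimension, which is why the reverse-worded item must be recoded before summing.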
Procedure
The developer of the USEI, Dr João Marôco, was contacted and gave written permission to use the measure in the present study. Following the forward-backward translation technique, the English version of the USEI was first given to two English-Chinese translators, who translated it into Chinese independently. The two Chinese versions were then integrated into one and given to a Chinese-English translator to translate back into English. Finally, an expert in the field reviewed the back-translated English version to confirm the fidelity and accuracy of the translated measure.
Content validity
To review the content validity of the Chinese version of the USEI, the content validity ratio (CVR) and the modified kappa coefficient (K) were applied to ensure that the questionnaire accurately measured the true meaning of the construct. Ten experts in psychology, education and business were asked to comment on the 15-item USEI, especially its wording, the allocation of items and item scaling. They were then asked to rate the essentiality of each USEI item. The CVR value was computed with the formula (ne − (N/2))/(N/2), where “ne” is the number of experts who rated an item as “Essential” and N is the total number of experts (Cook and Beckman, 2006). Following the recommendation of Lawshe (1975), the minimum acceptable CVR is 0.62 when there are ten experts. Finally, each item's modified kappa coefficient (K) was obtained, with a minimum value of 0.60 required to establish the item's content validity (Wynd et al., 2003).
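The CVR formula above can be sketched directly; the helper name is ours:

```python
def cvr(n_essential, n_experts):
    """Lawshe's content validity ratio: (ne - N/2) / (N/2)."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

# With 10 experts, 9 of whom rate an item "Essential":
print(cvr(9, 10))   # 0.8, above Lawshe's 0.62 cut-off for ten experts
print(cvr(8, 10))   # 0.6, below the cut-off, so the item would be reviewed
```

The ratio runs from −1 (no expert rates the item essential) through 0 (exactly half do) to +1 (all do), which is why the cut-off tightens as the panel shrinks.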
Data analysis
Descriptive statistics and item sensitivity
The R project for statistical computing was used to analyze the data. The skimr and psych packages were first used to obtain descriptive statistics, including each item's minimum (Min), maximum (Max), mean (M), skewness and kurtosis values. In addition, histograms were created to present the results for each item. Absolute skewness values of less than 3 and absolute kurtosis values of less than 7 indicated normality of the data and satisfied item psychometric sensitivity (Assunção et al., 2020; Finney and DiStefano, 2006).
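For illustration, the screening step can be sketched in pure Python using plain moment formulas (the psych package in R may use a slightly different estimator; this sketch is ours, not the authors' code):

```python
from statistics import fmean

def skew_kurt(xs):
    """Moment-based skewness and excess kurtosis of a sample
    (population formulas, for illustration only)."""
    m = fmean(xs)
    m2 = fmean([(x - m) ** 2 for x in xs])  # variance
    m3 = fmean([(x - m) ** 3 for x in xs])  # third central moment
    m4 = fmean([(x - m) ** 4 for x in xs])  # fourth central moment
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3

sk, ku = skew_kurt([1, 2, 3, 4, 5])
# symmetric ratings give skewness 0; both values fall well inside the
# |skewness| < 3 and |kurtosis| < 7 bounds used for item sensitivity
```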
Exploratory factor analysis (EFA)
To assess the validity and reliability of the construct, this study's dataset (n = 1504) was randomly divided into two subsamples using Excel's RAND function, which assigns each participant a random number between 0 and 1; the number was rounded to the nearest integer, and participants with a value of zero (n = 752) formed the first subsample (test sample) while participants with a value of one (n = 752) formed the second subsample (validation sample). This study conducted maximum likelihood EFA using the psych package in R to extract the factor structure based on the first subsample (n = 752). The Kaiser–Meyer–Olkin (KMO) test, along with Bartlett's test of sphericity, was applied to determine sampling adequacy and the appropriateness of the data for factor analysis: a KMO value of at least 0.5 indicates the data are suitable for factor analysis (Hair et al., 2010), while a p-value of less than 0.05 for Bartlett's test of sphericity indicates sampling adequacy. To extract the factorial structure, this study followed the criteria of (1) eigenvalues greater than 1, (2) communalities greater than 0.3 and (3) inspection of the scree plot (Field, 2013; She et al., 2021c). Also, following the suggestions of previous studies (Kamadi et al., 2016; She et al., 2021c), items with a factor loading of less than 0.4 were removed.
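The random split-half procedure can be sketched as follows. The helper name and seed are hypothetical, and an unconstrained random rounding yields two halves only approximately equal in size, so an exact 752/752 split reflects the particular draw obtained:

```python
import random

def split_half(participant_ids, seed=42):
    """Give each participant a Uniform(0, 1) draw, round it to 0 or 1,
    and use the result to assign test vs validation membership
    (sketch of the procedure described above, not the authors' script)."""
    rng = random.Random(seed)
    test, validation = [], []
    for pid in participant_ids:
        (test if round(rng.random()) == 0 else validation).append(pid)
    return test, validation

test_half, validation_half = split_half(range(1504))
# the two halves partition the sample; each is close to n = 752
```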
Confirmatory factor analysis (CFA)
To confirm and validate the factor structure obtained from the EFA, a CFA was performed on the second subsample (n = 752) using R's lavaan package with the weighted least squares means and variance adjusted (WLSMV) estimator, both to assess the psychometric properties of the Chinese version of the USEI and to ensure that the first-order and second-order factor structures fit the data well. Model fit was evaluated using the Chi-square (χ2) test, comparative fit index (CFI) > 0.90, normed fit index (NFI) > 0.90, Tucker–Lewis index (TLI) > 0.90, standardized root mean square residual (SRMR) < 0.09 and root mean square error of approximation (RMSEA) < 0.08 (Marsh and Hocevar, 1985; Pahlevan Sharif et al., 2019; She et al., 2021b).
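The cut-offs listed above can be collected into a small checking helper (hypothetical; the study evaluated these indices by inspection of lavaan output):

```python
def acceptable_fit(fit):
    """Check a dict of fit indices against the cut-offs listed above."""
    checks = {
        "CFI": fit["CFI"] > 0.90,
        "NFI": fit["NFI"] > 0.90,
        "TLI": fit["TLI"] > 0.90,
        "SRMR": fit["SRMR"] < 0.09,
        "RMSEA": fit["RMSEA"] < 0.08,
    }
    return all(checks.values()), checks

ok, detail = acceptable_fit(
    {"CFI": 0.967, "NFI": 0.960, "TLI": 0.958, "SRMR": 0.030, "RMSEA": 0.053}
)
# ok is True for the fit values reported in the Results section
```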
Validity assessment
To assess construct validity, the semTools package was used to evaluate the instrument's convergent and discriminant validity based on the average variance extracted (AVE) and the heterotrait-monotrait ratio of correlations (HTMT), respectively. For convergent validity, each construct should have an AVE value greater than 0.5 (Pahlevan Sharif et al., 2019), while for discriminant validity, according to Fornell and Larcker (1981), the AVE of a construct needs to be greater than its squared correlation with other constructs. In addition, values in the HTMT matrix had to be less than 0.85 (Henseler et al., 2015).
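A sketch of the AVE and Fornell-Larcker computations (the study used semTools in R; the loadings in the example below are hypothetical, while 0.522, 0.83 and 0.93 echo values reported later in the Results):

```python
def ave(loadings):
    """Average variance extracted: mean of squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def fornell_larcker_ok(ave_value, correlations_with_other_factors):
    """Discriminant validity holds if the factor's AVE exceeds its
    squared correlation with every other factor."""
    return all(ave_value > r * r for r in correlations_with_other_factors)

print(round(ave([0.88, 0.85, 0.86, 0.87, 0.86]), 3))  # 0.747

# An EE-CE correlation of 0.93 squared (~0.865) exceeds an EE AVE of 0.522,
# mirroring the discriminant-validity problem reported in the Results:
print(fornell_larcker_ok(0.522, [0.83, 0.93]))  # False
```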
Reliability assessment
Internal consistency was evaluated using the semTools package to estimate Cronbach's alpha (α) and the omega coefficient (ω), with α and ω > 0.7 indicating acceptable reliability (Maroco et al., 2014; Mayers, 2013). The reliability of the second-order construct was assessed through three reliability estimates (Sinval et al., 2021): the proportion of the total score explained by the second-order factor (ωL1), the variance of the first-order factors explained by the second-order factor (ωL2) and the proportion of variance explained by the second-order factor after controlling for the uniqueness of the first-order factors (ωpartialL1).
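For readers unfamiliar with the alpha formula, a pure-Python sketch follows; it mirrors the standard definition rather than the semTools implementation, and the toy data are hypothetical:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: list of per-item score lists (same respondents in each).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(item_scores)
    totals = [sum(col) for col in zip(*item_scores)]
    return k / (k - 1) * (
        1 - sum(pvariance(x) for x in item_scores) / pvariance(totals)
    )

a = cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [2, 2, 3, 4, 4]])
# highly consistent toy items give alpha ~ 0.97, above the 0.7 benchmark
```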
Gender invariance
To assess whether the Chinese version of the USEI could be used to assess both male and female student engagement, a multigroup confirmatory factor analysis was performed using the lavaan package with robust maximum likelihood estimation. Four nested models were defined to test configural invariance (no constraints), metric invariance (factor loadings constrained across genders), scalar invariance (loadings and intercepts constrained) and structural invariance (second-order factor loadings constrained). Strict invariance was not tested, since this type of invariance is not required for group comparisons. Invariance was assumed for a nonsignificant Δχ2 statistic, absolute ΔCFI < 0.01 and absolute ΔRMSEA < 0.02 between two nested models (Cheung and Rensvold, 2002).
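The ΔCFI/ΔRMSEA part of the decision rule reduces to a one-line check (hypothetical helper, shown only to make the criterion concrete):

```python
def invariant(delta_cfi, delta_rmsea):
    """Cheung and Rensvold (2002)-style criterion: invariance is retained
    between two nested models when |dCFI| < 0.01 and |dRMSEA| < 0.02."""
    return abs(delta_cfi) < 0.01 and abs(delta_rmsea) < 0.02

print(invariant(-0.002, -0.001))  # True: e.g. the scalar step reported later
```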
Results
Participants
In total, 1504 university students completed the survey through the online questionnaire platform, Sojump. In this sample, for which the mean age was 19.89 years (SD = 1.93), 70.3% of the participants (1058) were females, while 29.7% (446) were males. Most students were undergraduate (97.7%) and had a minimum of six online classes every week (61.4%). Concerning the study year, 43.3% of students were in year 1 and 39.9% were in year 2.
Content validity
The ratings given by the ten experts confirmed that the CVR of all 15 items exceeded the minimum threshold proposed by Lawshe (1975), i.e. 0.62. Furthermore, since all 15 items of the Chinese version of the USEI had modified kappa coefficient (K) values greater than 0.6, they were all considered appropriate and were, therefore, retained.
Item's distribution properties
Table 1, which shows the descriptive statistics, skewness and kurtosis for all items, indicated that none deviated strongly from normality. Moreover, all items showed appropriate psychometric sensitivity.
Factorial validity evidence
The maximum likelihood EFA, with Promax rotation, extracted three factors with eigenvalues greater than 1. Item 6 was removed because of a weak factor loading (below 0.4), yielding a 14-item, three-factor solution.
Construct validity
The results showed that the AVE of BE (0.748), EE (0.522) and CE (0.755) was greater than 0.50, demonstrating acceptable convergent validity for all three factors. The discriminant validity assessment using the Fornell and Larcker (1981) criterion and the HTMT ratio (Table 2) indicated that the discriminant validity of EE and CE was not established. However, we detected strong correlations among the three first-order latent constructs (BE–CE: 0.84; BE–EE: 0.83; CE–EE: 0.93). These results suggested that a second-order latent construct could underlie these factors; therefore, we performed a second-order assessment to confirm the Chinese version of the USEI.
Second-order construct
The second-order latent construct assessment showed good model fit (χ2(71) = 369.717, p < 0.001, χ2/df = 5.207, CFI = 0.967, NFI = 0.960, TLI = 0.958, SRMR = 0.030, RMSEA (90% CI) = 0.053 [0.049, 0.057]). As shown in Figure 2, the factor loading of each item on its first-order construct was greater than 0.4 and statistically significant. Moreover, the results showed high structural weights (γ) for the student online learning engagement second-order construct: BE (γ = 0.84; p < 0.001), EE (γ = 0.95; p < 0.001) and CE (γ = 0.97; p < 0.001).
Construct reliability
Good internal consistency was observed for all three first-order constructs, with Cronbach's alpha (α) and omega coefficient (ω) values greater than 0.7 for all three subconstructs [BE (α = 0.902, ω = 0.889); EE (α = 0.829, ω = 0.835); CE (α = 0.918, ω = 0.895)]. Thus, the results demonstrated good construct reliability. Regarding the second-order construct of student online learning engagement, the second-order construct explained 89% of the total score (ωL1) and 96% of the variance of the first-order constructs (ωL2); after controlling for the uniqueness of the first-order constructs (ωpartialL1), 95% of the variance was explained by the second-order construct. Hence, the second-order construct was both internally consistent and reliable.
Gender invariance
The finding for invariance of the Chinese version of the USEI demonstrated weak (metric) invariance using both invariance criteria (Δχ2metric (11) = 13.681, p = 0.251, ΔCFI = 0.000; ΔRMSEA = −0.002). However, strong (scalar) invariance was only confirmed using the |ΔCFI| and |ΔRMSEA| criteria (Δχ2scalar (11) = 39.929, p < 0.001, ΔCFI = −0.002; ΔRMSEA = −0.001). Structural invariance was also observed with the same criteria (Δχ2structural (3) = 25.977, p < 0.001, ΔCFI = −0.002; ΔRMSEA = 0.001) (Table 3).
Discussion
This study aimed to validate the USEI among Chinese university students who experienced online learning during the COVID-19 pandemic. The EFA yielded a 14-item instrument with a three-factor structure: behavioral, emotional and cognitive engagement. The CFA results confirmed that the three-factor Chinese version of the USEI showed good model fit, internal consistency, and construct validity and reliability. In addition, tests of gender invariance indicated no differences between male and female groups. These findings, drawn from the factorial validity objectives of the study, are consistent with the original study by Maroco et al. (2016).
The results of internal consistency and construct reliability for the Chinese version of the USEI are also consistent with previous studies in other contexts. For example, a study assessing the USEI among 4,479 university students from ten countries demonstrated the reliability and validity of the USEI scale for measuring academic engagement among university students (Assunção et al., 2020). Other studies have confirmed the scale's validity and reliability in different contexts as well (Esposito et al., 2022; Gün et al., 2019). Gender invariance was also observed for the USEI in the present study, which permits regression modeling comparisons between genders and mean comparisons of student engagement levels.
Like the original version of the USEI, student engagement was not only explained by the student's behavior but also by their emotions and cognitions. The results showed that CE was the most predictive subscale of online learning engagement. The first subscale of the USEI, BE, is an observable feature of student engagement (Hu and Li, 2017). This scale has five items that reflect students' participation in the online learning process. Based on the second-order CFA model of the Chinese version of the USEI, the highest regression coefficient of the BE subscale was related to item 5, “I usually participate actively in group assignments” (γ = 0.88).
The second subscale was EE, with four items. Students' positive emotions, such as motivation and interest during academic courses, increase their desire to invest greater effort, thus leading to greater engagement in learning (Lee et al., 2019). In the present study, the EE subscale had one item fewer than the original USEI, as item 6 was removed due to weak loading. This might be because item 6 is reverse-worded, while Sinval et al. (2021) suggested developing all items in the same direction. Therefore, this study suggests that future studies exclude item 6 or reformulate it so that all items are worded in the same direction. In this subscale, item 9, “I am interested in [online classes'] work”, had the highest regression coefficient (γ = 0.87).
The last subscale was CE, with five items. Cognitively engaged students use learning resources effectively to increase their engagement (Zhoc et al., 2020). As a result, students' CE significantly increases their willingness to participate in the learning process alongside the behavioral and emotional components. The highest regression coefficient was for item 11, “When I read a book, I question myself to make sure I understand the subject I'm reading about” (γ = 0.86).
All subscales showed good convergent validity. In other words, all items of each subscale had a high correlation. After assessing the scale's discriminant validity, the results suggested specifying the scale as a second-order construct. The results of the CFA further confirmed that all subscales measure one latent construct of student online learning engagement.
Conclusion
In sum, this study showed that the three-factor structure of the USEI with Chinese university students had good construct validity, internal consistency and reliability. The findings suggest that the Chinese USEI has utility in a range of higher education settings and contexts (e.g. university settings, colleges, private coaching and therapeutic contexts) for Chinese students, and it may help measure student engagement in online learning in China. Institutions concerned with student success may find measuring student engagement useful for devising strategies and priorities for student retention and academic outcomes. In a therapeutic sense, such a measure may be used to set therapeutic or institutional goals around learning outcomes.
While the current results supported the 14-item Chinese USEI, with its three subscales of behavioral, cognitive and emotional engagement, as a reliable and valid tool in an online learning context, this study was not without limitations. First, the instrument relies on self-report and, as such, it is not immune to potential exaggeration and social desirability bias. Second, the use of a convenience sampling method limits the generalizability of the findings; future studies should include more representative samples to cross-validate these results. Last, future studies are needed to assess whether students' academic discipline background (e.g. social science vs science) affects their engagement in online learning.
Table 1. Distribution properties of USEI's items
Table 2. Convergent and discriminant validity assessment of Chinese USEI

| Factor | BE | EE | CE |
|---|---|---|---|
| *AVE (main diagonal) and squared correlations between factors (lower triangle)* | | | |
| BE | 0.748 | | |
| EE | 0.691 | 0.522 | |
| CE | 0.704 | 0.858 | 0.755 |
| *Heterotrait-monotrait ratio of correlations (HTMT)* | | | |
| EE | 0.771 | | |
| CE | 0.786 | 0.903 | |
Source(s): Table by authors
Table 3. Gender invariance analysis of Chinese USEI

| Model invariance | χ2 (df) | Δχ2 (df) | p | CFI | ΔCFI | RMSEA | ΔRMSEA |
|---|---|---|---|---|---|---|---|
| Configural | 681.736 (142) | – | – | 0.966 | – | 0.071 | – |
| Metric | 697.835 (153) | 13.681 (11) | 0.251 | 0.966 | 0.000 | 0.069 | −0.002 |
| Scalar | 737.774 (164) | 39.929 (11) | <0.001 | 0.964 | −0.002 | 0.068 | −0.001 |
| Structural | 763.751 (167) | 25.977 (3) | <0.001 | 0.963 | −0.002 | 0.069 | 0.001 |
Source(s): Table by authors
Funding: No funds, grants or other support was received.
Availability of data and material: The data that support the findings of this study are available from the corresponding author upon reasonable request.
Ethics approval: This paper is a part of a main study. The protocol of study was approved by the Mazandaran University of Medical Sciences Research Ethics Committee (IR.MAZUMS.REC.1399.089).
Authors' contributions: P.R., H.SH. and L.SH. contributed to the study conception and design. Material preparation and data collection were performed by L.SH. J.M. and H.SH. performed data analysis. The first draft of the manuscript was written by all authors. All authors commented on previous versions of the manuscript and read and approved the final manuscript.
Conflict of Interest: The authors have no relevant financial or non-financial interests to disclose.
References
Ahmed, U., Umrani, W.A., Qureshi, M.A. and Samad, A. (2018), “Examining the links between teachers support, academic efficacy, academic resilience, and student engagement in Bahrain”, International Journal of Advanced and Applied Sciences, Vol. 5 No. 9, pp. 39-46.
Allen, K.A. and Boyle, C. (2023), “School belonging and student engagement: the critical overlaps, similarities, and implications for student outcomes”, in Christenson, S., Reschly, A.L. and Wylie, C. (Eds), Handbook of Research on Student Engagement, 2nd ed., Springer.
Allen, K.-A. and Kern, M.L. (2017), School Belonging in Adolescents: Theory, Research and Practice, Springer, Singapore.
Allen, K.-A., Gray, D.L., Arslan, G., Riley, K., Vella-Brodrick, D. and Waters, L. (2021a), 19 School Belonging Policy. Building Better Schools with Evidence-Based Policy, Routledge, Melbourne, p. 139.
Allen, K.-A. and Kern, P. (2019), Boosting School Belonging: Practical Strategies to Help Adolescents Feel like they Belong at School, Routledge, Melbourne.
Allen, K.-A., Slaten, C.D., Arslan, G., Roffey, S., Craig, H. and Vella-Brodrick, D.A. (2021b), “School belonging: the importance of student and teacher relationships”, The Palgrave Handbook of Positive Education, Springer, pp. 525-550.
Antoinette Bargagliotti, L. (2012), “Work engagement in nursing: a concept analysis”, Journal of Advanced Nursing, Vol. 68 No. 6, pp. 1414-1428, doi: 10.1111/j.1365-2648.2011.05859.x.
Assunção, H., Lin, S.-W., Sit, P.-S., Cheung, K.-C., Harju-Luukkainen, H., Smith, T., Maloa, B., Campos, J.Á.D.B., Ilic, I.S., Esposito, G., Francesca, F.M. and Marôco, J. (2020), “University student engagement inventory (USEI): transcultural validity evidence across four continents”, Frontiers in Psychology, Vol. 10, p. 2796.
Astin, A.W. (1999), “Student involvement: a developmental theory for higher education”, Journal of College Student Development, Vol. 40 No. 5, pp. 518-529.
Baron, P. and Corbin, L. (2012), “Student engagement: rhetoric and reality”, Higher Education Research and Development, Vol. 31 No. 6, pp. 759-772, doi: 10.1080/07294360.2012.655711.
Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O. and Kerres, M. (2020), “Mapping research in student engagement and educational technology in higher education: a systematic evidence map”, International Journal of Educational Technology in Higher Education, Vol. 17 No. 1, p. 2.
Borup, J., Jensen, M., Archambault, L., Short, C.R. and Graham, C.R. (2020), “Supporting students during COVID-19: developing and leveraging academic communities of engagement in a time of crisis”, Journal of Technology and Teacher Education, Vol. 28 No. 2, pp. 161-169.
Caruth, G.D. (2018), “Student engagement, retention, and motivation: assessing academic success in today's college students”, Participatory Educational Research, Vol. 5, pp. 17-30.
Chambers, T. and Chiang, C.-H. (2012), “Understanding undergraduate students' experience: a content analysis using NSSE open-ended comments as an example”, Quality and Quantity, Vol. 46 No. 4, pp. 1113-1123.
Cheung, G.W. and Rensvold, R.B. (2002), “Evaluating goodness-of-fit indexes for testing measurement invariance”, Structural Equation Modeling: A Multidisciplinary Journal, Vol. 9, pp. 233-255, doi: 10.1207/S15328007SEM0902_5.
Cook, D.A. and Beckman, T.J. (2006), “Current concepts in validity and reliability for psychometric instruments: theory and application”, The American Journal of Medicine, Vol. 119 No. 2, pp. 166.e167-166.e116, doi: 10.1016/j.amjmed.2005.10.036.
Costa, A. and Marôco, J. (2017), “Inventário do Envolvimento Académico dos Estudantes do Ensino Superior - USEI”, in Almeida, L.S., Simões, M.R. and Gonçalves, M.M. (Eds), Adaptação, Desenvolvimento e Sucesso Académico dos Estudantes do Ensino Superior: Instrumentos de avaliação, Associação para o Desenvolvimento da Investigação em Psicologia da Educação, Braga, pp. 33-44.
Dunn, T.J. and Kennedy, M. (2019), “Technology Enhanced Learning in higher education; motivations, engagement and academic achievement”, Computers and Education, Vol. 137, pp. 104-113.
Esposito, G., Marôco, J., Passeggia, R., Pepicelli, G. and Freda, M.F. (2022), “The Italian validation of the university student engagement inventory”, European Journal of Higher Education, Vol. 12 No. 1, pp. 1-21, doi: 10.1080/21568235.2021.1875018.
Field, A. (2013), Discovering Statistics Using IBM SPSS Statistics, SAGE Publications, London.
Finney, S.J. and DiStefano, C. (2006), “Non-normal and categorical data in structural equation modeling”, in Hancock, G.R. and Mueller, R.O. (Eds), Structural Equation Modeling: A Second Course, Information Age Publishing, Greenwich, CT, pp. 269-314.
Fornell, C. and Larcker, D.F. (1981), “Evaluating structural equation models with unobservable variables and measurement error”, Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50, doi: 10.2307/3151312.
Fredricks, J.A., Blumenfeld, P.C. and Paris, A.H. (2004), “School engagement: potential of the concept, state of the evidence”, Review of Educational Research, Vol. 74 No. 1, pp. 59-109, doi: 10.3102/00346543074001059.
Fry, K. (2001), “E‐learning markets and providers: some issues and prospects”, Education + Training, Vol. 43 Nos 4/5, pp. 233-239, doi: 10.1108/EUM0000000005484.
Furlong, M.J., Whipple, A.D., St. Jean, G., Simental, J., Soliz, A. and Punthuna, S. (2003), “Multiple contexts of school engagement: moving toward a unifying framework for educational research and practice”, The California School Psychologist, Vol. 8 No. 1, pp. 99-113, doi: 10.1007/BF03340899.
Gargallo, B., Suárez-Rodríguez, J.M., Almerich, G., Verde, I. and Iranzo, M.À.C. (2018), “The dimensional validation of the Student Engagement Questionnaire (SEQ) with a Spanish university population. Students' capabilities and the teaching-learning environment”, Anales de Psicologia, Vol. 34 No. 3, p. 519.
Gün, F., Turabik, T., Arastaman, G. and Akbaşlı, S. (2019), “Adaptation of university student engagement inventory to Turkish culture: validity and reliability study”, Inonu University Journal of the Faculty of Education, Vol. 20 No. 2, pp. 507-520, doi: 10.17679/inuefd.457919.
Hair, J.F., Black, W.C., Babin, B.J. and Anderson, R.E. (2010), Multivariate Data Analysis, Pearson Education, Andover.
Henseler, J., Ringle, C.M. and Sarstedt, M. (2015), “A new criterion for assessing discriminant validity in variance-based structural equation modeling”, Journal of the Academy of Marketing Science, Vol. 43 No. 1, pp. 115-135, doi: 10.1007/s11747-014-0403-8.
Hu, M. and Li, H. (2017), “Student engagement in online learning: a review”, 2017 International Symposium on Educational Technology (ISET), IEEE.
Huang, J. (2020), “Successes and challenges: online teaching and learning of chemistry in higher education in China in the time of COVID-19”, Journal of Chemical Education, Vol. 97 No. 9, pp. 2810-2814.
Jean, S.B. (2015), “Student engagement: a principle-based concept analysis”, International Journal of Nursing Education Scholarship, Vol. 12 No. 1, pp. 111-121, doi: 10.1515/ijnes-2014-0058.
Jiang, Z., Wu, H., Cheng, H., Wang, W., Xie, A.N. and Fitzgerald, S.R. (2021), “Twelve tips for teaching medical students online under COVID-19”, Medical Education Online, Vol. 26 No. 1, 1854066, doi: 10.1080/10872981.2020.1854066.
Kahu, E.R. (2013), “Framing student engagement in higher education”, Studies in Higher Education, Vol. 38 No. 5, pp. 758-773.
Kahu, E.R. and Nelson, K. (2018a), “Student engagement in the educational interface: understanding the mechanisms of student success”, Higher Education Research and Development, Vol. 37, pp. 58-71.
Kahu, E.R. and Nelson, K. (2018b), “Student engagement in the educational interface: understanding the mechanisms of student success”, Higher Education Research and Development, Vol. 37 No. 1, pp. 58-71.
Kamadi, V.S.R.P.V., Allam, A.R., Thummala, S.M. and Nageswara Rao, P.V. (2016), “A computational intelligence technique for the effective diagnosis of diabetic patients using principal component analysis (PCA) and modified fuzzy SLIQ decision tree approach”, Applied Soft Computing, Vol. 49, pp. 137-145, doi: 10.1016/j.asoc.2016.05.010.
Kember, D. and Leung, D.Y. (2009), “Development of a questionnaire for assessing students' perceptions of the teaching and learning environment and its use in quality assurance”, Learning Environments Research, Vol. 12 No. 1, pp. 15-29.
Klem, A.M. and Connell, J.P. (2004), “Relationships matter: linking teacher support to student engagement and achievement”, Journal of School Health, Vol. 74, pp. 262-273.
Lawshe, C.H. (1975), “A quantitative approach to content validity”, Personnel Psychology, Vol. 28 No. 4, pp. 563-575, doi: 10.1111/j.1744-6570.1975.tb01393.x.
Lee, J., Song, H.-D. and Hong, A.J. (2019), “Exploring factors, and indicators for measuring students' sustainable engagement in e-learning”, Sustainability, Vol. 11 No. 4, p. 985.
Lei, H., Cui, Y. and Zhou, W. (2018), “Relationships between student engagement and academic achievement: a meta-analysis”, Social Behavior and Personality: An International Journal, Vol. 46 No. 3, pp. 517-528.
Marôco, J., Marôco, A.L. and Campos, J.A.D.B. (2014), “Student’s academic efficacy or inefficacy? An example on how to evaluate the psychometric properties of a measuring instrument and evaluate the effects of item wording”, Open Journal of Statistics, Vol. 4 No. 6, pp. 484-493, doi: 10.4236/ojs.2014.46046.
Marôco, J., Marôco, A.L., Campos, J.A.D.B. and Fredricks, J.A. (2016), “University student's engagement: development of the university student engagement inventory (USEI)”, Psicologia: Reflexão e Crítica, Vol. 29 No. 1, p. 21.
Marsh, H.W. and Hocevar, D. (1985), “Application of confirmatory factor analysis to the study of self-concept: first-and higher order factor models and their invariance across groups”, Psychological Bulletin, Vol. 97 No. 3, p. 562.
Mayers, A. (2013), Introduction to Statistics and SPSS in Psychology, Pearson Higher Ed, Harlow.
Nickerson, L.A. and Shea, K.M. (2020), “First-semester organic chemistry during COVID-19: prioritizing group work, flexibility, and student engagement”, Journal of Chemical Education, Vol. 97 No. 9, pp. 3201-3205.
O'Keeffe, P. (2013), “A sense of belonging: improving student retention”, College Student Journal, Vol. 47 No. 4, pp. 605-613.
Pahlevan Sharif, S., Mostafiz, I. and Guptan, V. (2019), “A systematic review of structural equation modelling in nursing research”, Nurse Researcher, Vol. 26 No. 2, pp. 28-31, doi: 10.7748/nr.2018.e1577.
Paloș, R., Maricuţoiu, L.P. and Costea, I. (2019), “Relations between academic performance, student engagement and student burnout: a cross-lagged analysis of a two-wave study”, Studies in Educational Evaluation, Vol. 60, pp. 199-204.
Peters, H., Zdravkovic, M., João Costa, M., Celenza, A., Ghias, K., Klamen, D., Mossop, L., Rieder, M., Devi Nadarajah, V. and Wangsaturaka, D. (2019), “Twelve tips for enhancing student engagement”, Medical Teacher, Vol. 41 No. 6, pp. 632-637.
Rahmatpour, P., Chehrzad, M., Ghanbari, A. and Sadat-Ebrahimi, S.-R. (2019a), “Academic burnout as an educational complication and promotion barrier among undergraduate students: a cross-sectional study”, Journal of Education and Health Promotion, Vol. 8, doi: 10.4103/jehp.jehp_165_19.
Rahmatpour, P., Sharif Nia, H. and Peyrovi, H. (2019b), “Evaluation of psychometric properties of scales measuring student academic satisfaction: a systematic review”, Journal of Education and Health Promotion, Vol. 8, pp. 1-11.
Sharif Nia, H., Azad Moghddam, H., Marôco, J., Rahmatpour, P., Allen, K.-A., Kaur, H., Kaveh, O., Gorgulu, O. and Pahlevan Sharif, S. (2022), “A psychometric lens for E-learning: examining the validity and reliability of the Persian version of university students' engagement inventory (P-USEI)”, The Asia-Pacific Education Researcher, doi: 10.1007/s40299-022-00677-y.
She, L., Ma, L., Jan, A., Sharif Nia, H. and Rahmatpour, P. (2021a), “Online learning satisfaction during COVID-19 pandemic among Chinese university students: the serial mediation model [original research]”, Frontiers in Psychology, Vol. 12, 743936, doi: 10.3389/fpsyg.2021.743936.
She, L., Ma, L. and Khoshnavay Fomani, F. (2021b), “The consideration of future consequences scale among Malaysian young adults: a psychometric evaluation”, Frontiers in Psychology, Vol. 12, 770609, doi: 10.3389/fpsyg.2021.770609.
She, L., Pahlevan Sharif, S. and Sharif Nia, H. (2021c), “Psychometric evaluation of the Chinese version of the modified online compulsive buying scale among Chinese young consumers”, Journal of Asia-Pacific Business, Vol. 22 No. 2, pp. 121-133, doi: 10.1080/10599231.2021.1905493.
Sinval, J., Casanova, J.R., Marôco, J. and Almeida, L.S. (2021), “University student engagement inventory (USEI): psychometric properties”, Current Psychology, Vol. 40 No. 4, pp. 1608-1620, doi: 10.1007/s12144-018-0082-6.
Skinner, E.A., Pitzer, J.R. and Steele, J.S. (2016), “Can student engagement serve as a motivational resource for academic coping, persistence, and learning during late elementary and early middle school?”, Developmental Psychology, Vol. 52 No. 12, p. 2099.
Soffer, T. and Cohen, A. (2019), “Students' engagement characteristics predict success and completion of online courses”, Journal of Computer Assisted Learning, Vol. 35, pp. 378-389.
Tight, M. (2020), “Student retention and engagement in higher education”, Journal of Further and Higher Education, Vol. 44, pp. 689-704.
Trowler, V. (2010), “Student engagement literature review”, The Higher Education Academy, Vol. 11 No. 1, pp. 1-15.
Unger, S. and Meiran, W.R. (2020), “Student attitudes towards online education during the COVID-19 viral outbreak of 2020: distance learning in a time of social distance”, International Journal of Technology in Education and Science (IJTES), Vol. 4 No. 4, pp. 256-266.
Wefald, A.J. and Downey, R.G. (2009), “Construct dimensionality of engagement and its relation with satisfaction”, The Journal of Psychology, Vol. 143 No. 1, pp. 91-112.
Whitton, N. and Moseley, A. (2014), “Deconstructing engagement: rethinking involvement in learning”, Simulation and Gaming, Vol. 45 Nos 4-5, pp. 433-449.
Wong, T.K., Parent, A.-M. and Konishi, C. (2019), “Feeling connected: the roles of student-teacher relationships and sense of school belonging on future orientation”, International Journal of Educational Research, Vol. 94, pp. 150-157.
Wynd, C.A., Schmidt, B. and Schaefer, M.A. (2003), “Two quantitative approaches for estimating content validity”, Western Journal of Nursing Research, Vol. 25 No. 5, pp. 508-518, doi: 10.1177/0193945903252998.
Xerri, M.J., Radford, K. and Shacklock, K. (2018), “Student engagement in academic activities: a social support perspective”, Higher Education, Vol. 75, pp. 589-605.
Zhang, K., Wu, S., Xu, Y., Cao, W., Goetz, T. and Parks-Stamm, E.J. (2020), “Adaptability promotes student engagement under COVID-19: the multiple mediating effects of academic emotion”, Frontiers in Psychology, Vol. 11, 633265.
Zhoc, K.C.H., King, R.B., Chung, T.S.H. and Chen, J. (2020), “Emotionally intelligent students are more engaged and successful: examining the role of emotional intelligence in higher education”, European Journal of Psychology of Education, Vol. 35, pp. 839-863, doi: 10.1007/s10212-019-00458-0.
Acknowledgements
The authors thank all the participants who took part in the study.