
International Journal of Electrical and Computer Engineering (IJECE)
Vol. 13, No. 5, October 2023, pp. 5342-5353
ISSN: 2088-8708, DOI: 10.11591/ijece.v13i5.pp5342-5353

Factors affecting students’ continuance intention to use teaching performance assessment application from technology continuance theory
Nurdin Nurdin1, Sagaf S. Pettalongi2, Muhammad Nur Ahsan3, Vindy Febrianti4

1Department of Islamic Banking, Faculty of Islamic Economics and Business, Universitas Islam Negeri Datokarama Palu, Palu, Indonesia
2Department of Islamic Education, Faculty of Tarbiyah and Teacher Training, Universitas Islam Negeri Datokarama Palu, Palu, Indonesia
3Department of Islamic Communication, Faculty of Philosophy, Humanities, and Communication, Universitas Islam Negeri Datokarama Palu, Palu, Indonesia
4Department of Islamic Education, Postgraduate School, Universitas Islam Negeri Datokarama Palu, Palu, Indonesia

Article Info

Article history: Received Dec 16, 2022; Revised Dec 26, 2022; Accepted Feb 4, 2023

Keywords: Android-based application; Teaching performance; Teaching assessment; Technology continuance theory; Usage intention

ABSTRACT

This study aims to determine university students’ continuance intention to use an android-based teaching performance assessment (TPA) application. An online structured questionnaire was employed as the data gathering instrument. Two hundred and forty students from four faculties were selected and given a five-point Likert-scale survey. All completed questionnaires were analyzed using analysis of moment structure (AMOS). The findings show that the productivity, performance, relevancy, quality, and mobility of the android-based TPA significantly influence students’ continuance intention to use the application. The results highlight that when an android-based system is developed according to these criteria, long-term use of the TPA application can be consistently maintained to improve universities’ teaching quality assessment. A limitation of our study is that students may evaluate teaching staff who do not teach a subject in their class, because all teaching staff appear in the application database. Further research therefore needs to limit each lecturer to a specific course to be assessed by a particular student class.

This is an open access article under the CC BY-SA license.

Corresponding Author:
Nurdin Nurdin
Faculty of Islamic Economics and Business, Universitas Islam Negeri Datokarama Palu
No. 2 Diponegoro street, Palu, Indonesia
Email: [email protected]

1. INTRODUCTION
Universities have intensively applied service assessment policies to improve their education quality. One aspect that is regularly evaluated is teaching quality. Teaching performance assessment (TPA) is a strategy for determining teaching quality, which is crucial for improving meaningful and sustainable learning in a highly competitive education industry. The outcome of teaching performance evaluation is usually used by university decision-makers, such as rectors, deans, and the quality assurance (QA) department, for teaching staff development and accreditation purposes.
Mobile apps for education quality evaluation have been implemented in some universities [1], [2]. The apps are used to evaluate aspects of education quality, such as teaching performance. Teaching performance evaluation is an educational institution’s activity of assessing its teaching staff’s performance for improvement [3]. In higher education institutions, the objective of TPA is to improve the quality of teaching
and education as demanded by stakeholders. As such, lecturers are required to improve their teaching performance in response to external critique. In this regard, the critique comes from their students, who receive the teaching materials and activities.
The evaluation of the teaching process can be carried out qualitatively [4], [5] or quantitatively [6], [7]. The results of teaching staff performance evaluation have long served as a reference for universities around the globe to take further action toward teaching staff improvement [8]. Such actions might include providing training, arranging peer teaching critique, and supporting further education.
In Indonesia, universities are required to conduct teaching performance evaluation when they apply for accreditation, based on the Directorate for Higher Education regulation Number 12 of 2012. The results of the teaching quality assessment reflect the satisfaction of stakeholders such as students, parents, and graduate users, and they also serve as an indicator of the quality of teaching in the university. This indicator can be used for accreditation and university quality improvement. Indonesian government regulation Number 14 of 2005 mandates that all teaching staff be certified for professionalism before they are entitled to certification allowances. Their professionalism is usually measured by their teaching capability.
In response to these professionalism requirements, universities implement various TPA mechanisms. For example, some universities use conventional TPA methods, distributing hard copies of questionnaires to students (Dandalt and Brutus [9]; Buchanan et al. [10]), while others use a mobile-based application [11]. However, despite the widespread implementation of mobile apps to evaluate teaching performance, little research has been conducted to determine the factors influencing students’ continuance intention to use mobile apps for teaching performance evaluation. Therefore, the objectives of this study are to identify the factors that affect students’ motivation to continue using an android-based TPA application and to determine how strongly each factor influences their continuance motivation in using the system. The results may shed light for academia and practitioners on the factors that affect continuance intention to use an android-based TPA application and help improve mobile evaluation systems.

2. LITERATURE REVIEW
2.1. Previous studies
Several researchers have studied teaching staff performance within various educational contexts. For example, some researchers have studied teaching performance evaluation in higher education institutions [12], [13] and in middle school contexts [14]. Most research on teaching performance evaluation was conducted using paper-based questionnaires, which might hinder evaluators’ full involvement. For example, a paper-based teaching performance evaluation by Guolla [13], who distributed printed questionnaires to 184 undergraduate and postgraduate students, found that few students were willing to take part in the study. Even with limited participation, the study identified the strengths and weaknesses of teachers’ teaching performance. Its weakness was that most instruments were not returned, which shows that manual teacher performance assessment restricts broader student participation.
Teaching performance evaluation using information technology has been applied intensively since various cutting-edge mobile technologies emerged in the education sector. Early studies of information technology-based teaching performance evaluation used desktop-based and web-based information systems. Hussein [15], for example, applied a teaching management information system called 'JUSUR' to assess educators’ teaching quality at King Saud University. The study recruited all students as participants, who were asked to complete a 5-point Likert scale questionnaire on desktop-based applications within the campus computer network. The results provide information regarding the quality of teaching across all the university’s departments [15]. The weakness of the system is that students must use campus computers connected to the campus network. In addition, many students had difficulty accessing the computer network.
Meanwhile, other research by Sher [16] and Felton et al. [17] used a web-based system to evaluate teaching quality based on the criteria of quality, easiness, and sexiness. Students from all faculties were selected purposively and asked to access the university website. However, many students were reluctant to access the websites built for the evaluation process. In addition, slow internet connections limited access to the web-based evaluation system. These connection problems reflect the weakness of a web-based teaching quality evaluation system.
A new approach to assessing teaching staff has been developed. Noguera et al. [18] developed a 3D mobile application to evaluate the teaching quality delivered to physiotherapy students. The application supports practice assessment where personal computers cannot be used. Syaifudin et al. [19] developed a web-based application to help students evaluate learning activities online. The application was accessed and used smoothly by the students. However, neither study conducted further research to understand how students could continue using applications for teaching performance evaluation. A lack of understanding of users’ continuance usage of teaching evaluation applications might cause unsustainable use.

2.2. Theoretical constructs and hypotheses


Previous research has concentrated on the early adoption phase of android applications, while limited studies have examined long-term use of information systems [20]. The theories used to understand the early adoption stage have been adapted from the social and psychological fields. However, such theories are unable to explain the continuance of information systems use. This research used the technology continuance theory (TCT) adapted from Liao et al. [21] to understand students’ motivation to continue using an android-based teaching assessment application. TCT argues that users’ continuance intention toward an application arises because they perceive confirmation and benefits and find the application easy to use. Perceived confirmation, benefits, and ease of use are in turn encouraged by factors such as information system quality, relevancy, productivity, performance, and mobility [22]. Therefore, we adopted the core concepts of the theory, but some variables were developed and integrated into it to fit our study context of continuance intention to use the android-based TPA. The variables were identified and developed as the constructs of our study, as depicted in Table 1.

Table 1. The criteria for android-based TPA


No Criteria Authors
1 System productivity [23]–[25]
2 System performance [26]–[28]
3 System relevancy [29]–[31]
4 System quality [32]–[35]
5 System mobility [36], [37]
6 Continuance usage [38]–[40]

2.2.1. System productivity


The concept of “productivity” refers to the extent to which an application can increase the effectiveness of users in carrying out their daily routine tasks [41]. Within the educational context, productivity has become the main concern in carrying out teaching tasks such as class management, lesson planning, teaching evaluation, student performance evaluation, and other related teaching tasks [42]. From students’ perspectives, an application is considered highly productive when it supports them in evaluating their teachers’ performance effectively and efficiently. In other words, a quality teaching assessment application helps students handle their teachers’ evaluations as efficiently as possible.
At the same time, management and instructors can obtain their TPA results quickly [43]. Using such an application, students can evaluate their teachers’ teaching performance as often as required. As such, the willingness to use the application can be maintained whenever teaching performance evaluation is required to produce detailed teaching performance reports [44]. An android-based teaching performance evaluation can help students assess several of their teachers’ performances in a relatively short time [45]. Therefore, when an android-based application reduces the resources and time required for students to perform their teachers’ teaching assessment, their intention to use the application is high [46]. Our hypothesis is:
H1: The productivity of android-based teaching performance evaluation (TPE) positively influences users’ continuance usage intention

2.2.2. System performance (SP)


Information system performance is understood as the ability of the system to perform given tasks [47]. Excellent information system performance results in user satisfaction, defined as the “sum of one’s feelings regarding an information system” [48]. The performance of an information system is determined by its efficiency, for example, whether the system improves the communication process, decision-making, and organizational operations. Information system performance also drives users’ confirmation of expectations, perceived usefulness, and perceived ease of use of the information system, leading to users’ continuance intention to use it. Bhattacherjee [38] posits that information system performance expectations are an important factor in users’ continuance intention to use an information system. Users build positive behavior toward an information system when its performance fulfills their expectations. The high performance of an information system indicates a high level of task-technology fit and satisfaction with the information system [49]. As such, we hypothesize:


H2: The high performance of android-based teaching performance evaluation (TPE) positively influences users’ continuance usage intention

2.2.3. System relevancy


The information system literature views relevance as an essential criterion of satisfaction [50]. Several studies view relevance as a contextual and situational concept tied to how users resolve their information needs. In this study, we argue that information system relevance is critical to students' intention to use an information system [51], [52]. The dimension “relevant” describes the extent to which an application benefits teaching staff in all classes and disciplines. A teaching quality assessment application must be able to support all teaching staff in getting information about their teaching abilities. When assessing the relevance of the application, the evaluators (students) need to consider the relevance of the tasks that the lecturer must complete in the classroom for the courses taught, namely preparing, delivering, assessing, and analyzing the teaching material given to students [44]. The facilities in the application must also be relevant to the aspects of lecturers' teaching activities that will be assessed. If irrelevant aspects are assessed, the results that appear in the application cannot be used for decision-making by their superiors. In other words, the more relevant an information system, the more likely users are to use it [53]. Relevancy is often associated with the compatibility and usefulness of the information system, which are key determinants of continuance intention to use it [54]. The hypothesis is:
H3: The relevancy of android-based teaching performance evaluation (TPE) positively influences users’ continuance usage intention

2.2.4. System quality


The quality of an information system is achieved as the result of the IS development and maintenance process [55]. A quality information system has features that support ease of use and smooth operation, so users can operate it conveniently. DeLone and McLean [25] argue that a high-quality information system is reliable, has excellent response time, and is easy to use, useful, easy to access, and flexible. The information produced by the system is reliable and trusted for decision-making and problem-solving. Myers et al. [24] consider a high-quality information system to be one with a quick response time that also increases efficiency in the tasks performed [35]. Such systems can respond reasonably to users’ needs when data is required to support decision-making.
A high-quality application supports users in interacting smoothly and efficiently to complete their tasks [44]. Therefore, users who have a positive experience when using an application tend to use it fully and to continue using it. Users who find that all of the application’s features support its functionality also have a higher intention to use it. Such functionality includes dimensions that support collaboration, communication, and the ability to save work in progress [44]. In other words, an application has good functionality when it is designed to align with task requirements [56]. Therefore, we hypothesize:
H4: The quality of android-based teaching performance evaluation (TPE) positively influences users’ continuance usage intention

2.2.5. Perceived mobility


One essential feature that distinguishes android-based TPA from traditional teaching evaluation systems is mobility. Perceived information system mobility is understood as “the extent of user awareness of the mobility value of mobile services and systems” [57]. Users who perceive the value of application mobility appreciate the ubiquity of mobile teaching performance evaluation and have a strong intention to use the application. Previous studies show that perceived application mobility significantly affects the perceived ability to use the application anytime and anywhere and enhances users’ acceptance of mobile services within teaching assessment contexts [36].
In the android-based TPA literature, a mobile application tends to refer to individuals’ ability to make evaluations independent of time and place [58]. Students may be physically located in different places; for example, they can evaluate their instructors’ quality of teaching from home or while traveling. Students can also carry out all teaching assessments through android-based applications [59], [60]. A mobile information system is considered productive when it can reduce production time, control products, improve responsiveness, support change management, and improve product quality [61]. Therefore, we hypothesize:
H5: The mobility of android-based teaching performance assessment positively influences users' continuance usage intention


2.3. TPA continuance usage


Information system continuance usage intention is mostly influenced by users’ expectations and perceived performance of the information system [38]. Expectations of information systems (IS) performance are related to factors such as the quality and functionality of the IS [59], [60], the relevancy of the IS to task fulfillment [61], [62], information system productivity [24], [63], and information system mobility [36], [64]. These factors enhance users’ satisfaction, resulting in the intention to use the systems over the long term.
The continuance intention to use a mobile application developed to assess teaching performance has also been found to be affected by the factors discussed above [65], [66]. Most users view a mobile application as a system with characteristics that reinforce users’ intention to continue using it [39]. When a user perceives the usefulness of a mobile application’s characteristics to the degree that his or her expectations about the application are confirmed, the intention to continue using the application becomes a habit [21]. Such habits, in turn, make users more likely to continue using the mobile application [67]. The concept posits that continuance intention is influenced by users’ habits, satisfaction, and post-acceptance perceptions of the mobile application’s ease of use and usefulness [40].

3. METHOD
This study used a quantitative method and was conducted at the State Islamic University of Datokarama Palu. Data were collected using an online structured questionnaire. A total of 240 respondents (132 men and 108 women) were recruited from four faculties using a quota random sampling approach. The structured questionnaires were distributed to the respondents and were available online for one month. The questionnaires covered six variables with twenty-eight questions relating to the respondents' perception of the continuance use of an android-based TPA. The respondents' characteristics are depicted in Table 2.
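As a rough illustration of the quota sampling described above, the sketch below draws an equal quota of 60 students at random from each of the four faculties. The roster sizes, student identifiers, and use of Python's random module are hypothetical and only meant to show the idea, not the procedure actually used in the study.

```python
# Hypothetical sketch of quota sampling: 60 students per faculty, 240 in total.
import random

roster = {
    "Islamic Teacher Training": [f"TT{i:03d}" for i in range(400)],
    "Islamic Economics and Business": [f"EB{i:03d}" for i in range(350)],
    "Islamic Law": [f"LA{i:03d}" for i in range(300)],
    "Islamic Philosophy and Communication": [f"PC{i:03d}" for i in range(250)],
}

random.seed(0)
quota = 60
sample = {faculty: random.sample(students, quota) for faculty, students in roster.items()}

total = sum(len(s) for s in sample.values())
print(f"Total respondents: {total}")   # 240, i.e., 60 per faculty
```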

Table 2. Demographic data of respondents


                                                       Total   Percentage
Gender
  Men                                                   132       55
  Women                                                 108       45
  Total                                                 240      100%
Faculties
  a. Faculty of Islamic Teacher Training                 60       25
  b. Faculty of Islamic Economics and Business           60       25
  c. Faculty of Islamic Law                              60       25
  d. Faculty of Islamic Philosophy and Communication     60       25
  Total                                                 240      100%
Year of start of education
  2018                                                   42       17.5
  2019                                                   59       24.5
  2020                                                   77       32
  2021                                                   62       26
  Total                                                 240      100%

Our study used an online structured survey as the data gathering instrument. Perception measures in the form of statements were used to measure each factor on a five-point Likert scale ranging from strongly agree (5), agree (4), neutral (3), disagree (2), to strongly disagree (1). Before distributing the questionnaire, we conducted a pre-test with relevant research experts and prospective respondents, followed by a pilot test with 20 university students. The pilot test results showed that the Cronbach's alpha value for all statements in the survey was higher than 0.7, which means the survey was reliable. The data were analyzed using analysis of moment structure (AMOS) version 22. First, we conducted an exploratory factor analysis (EFA) with principal components extraction to explore all constructs used in this study; the constructs comprised five independent (X) variables and one dependent (Y) variable. We then conducted a confirmatory factor analysis (CFA) to measure factor loadings, reliability, convergent validity, and discriminant validity. The flow of the TPA application process can be seen in Figure 1.
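The pilot-test reliability check mentioned above (Cronbach's alpha above 0.7) can be illustrated with a minimal sketch. The function below implements the standard Cronbach's alpha formula; the 20-by-4 matrix of Likert scores and all variable names are hypothetical, since neither the actual pilot data nor the AMOS workflow is reproduced here.

```python
# Sketch of a pilot-test reliability check on hypothetical Likert data.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 20 students x 4 items of one construct (scores 1-5).
rng = np.random.default_rng(0)
base = rng.integers(2, 6, size=(20, 1))
pilot = np.clip(base + rng.integers(-1, 2, size=(20, 4)), 1, 5)

alpha = cronbach_alpha(pilot)
print(f"Cronbach's alpha = {alpha:.3f}")  # values above 0.7 were treated as reliable
```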
The android-based TPA application was built using PHP (hypertext preprocessor). In addition, we used a MySQL database, Sublime Text 3, and Android Studio. The android-based TPA system was made available in the Google app store. We also adopted the framework from Alkhafaji and Sriram [68] in building the system data flow. Our android application for TPA provides the primary system and sub-systems for students and the university quality assurance unit (LPM), while the system administrator and the university management are sub-system users. The students use the application by entering their lecturers' teaching assessment data. The quality assurance unit then accesses the android application database to analyze the data and produce reports on the lecturers' teaching performance quality at the university. The android application administrator manages users' IDs and passwords and maintains the application via the internet. The university management only receives the lecturers' TPA reports for future decision-making in teaching staff management.
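To make the data flow described above concrete, the sketch below models the student and quality-assurance roles with an in-memory SQLite database: students insert per-lecturer assessment scores, and the quality assurance unit aggregates them into a per-lecturer report. The real system used PHP, MySQL, and Android Studio; the table layout, column names, and sample records here are purely illustrative assumptions, not the actual schema.

```python
# Hypothetical, minimal model of the TPA data flow (student submits, QA aggregates).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE assessment (
                   student_id  TEXT,
                   lecturer_id TEXT,
                   item        TEXT,     -- e.g. 'clarity', 'preparation'
                   score       INTEGER   -- 1-5 Likert score
               )""")

# Student role: submit assessment data through the mobile front end.
con.executemany(
    "INSERT INTO assessment VALUES (?, ?, ?, ?)",
    [("s001", "lec01", "clarity", 4),
     ("s002", "lec01", "clarity", 5),
     ("s001", "lec02", "clarity", 3)],
)

# Quality assurance (LPM) role: aggregate scores into a per-lecturer report.
report = con.execute(
    "SELECT lecturer_id, item, ROUND(AVG(score), 2), COUNT(*) "
    "FROM assessment GROUP BY lecturer_id, item"
).fetchall()
for lecturer, item, avg_score, n in report:
    print(f"{lecturer} | {item}: mean={avg_score} (n={n})")
```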

Figure 1. The TPA application structure adapted from Alkhafaji and Sriram [68]

4. RESULTS AND DISCUSSION


4.1. Variables influencing continuance intention to use TPA
The constructs were used to determine the factors that influenced the students’ continuance usage intention toward the TPA. Table 3 shows the results of the measurement. The results show that all items fit their variables. All item loadings, ranging from 0.585 to 0.875, are higher than the 0.50 threshold. The Cronbach’s alpha coefficients for all variables ranged from 0.721 to 0.847, above the 0.7 threshold. Furthermore, the composite reliability (CR) values were higher than 0.8 (ranging from 0.84 to 0.918), while the average variances extracted (AVE) were higher than the recommended 0.5 value, meaning that the relationship between all variables and the motivation to continue using the TPA shows internal consistency reliability, that is, the variables are consistent with the scale [69].
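For readers who want to see how these construct-level statistics are defined, the sketch below computes composite reliability and average variance extracted from a set of standardized loadings using the standard formulas. The four loadings are hypothetical values, not figures taken from the paper or from AMOS output.

```python
# Standard CR and AVE formulas applied to hypothetical standardized loadings.
import numpy as np

def composite_reliability(loadings):
    lam = np.asarray(loadings, dtype=float)
    error_var = 1.0 - lam**2                   # indicator error variances
    return lam.sum()**2 / (lam.sum()**2 + error_var.sum())

def average_variance_extracted(loadings):
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam**2))

loadings = [0.72, 0.68, 0.75, 0.81]            # hypothetical four-item construct
print(f"CR  = {composite_reliability(loadings):.3f}")       # should exceed 0.7-0.8
print(f"AVE = {average_variance_extracted(loadings):.3f}")  # should exceed 0.5
```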
Table 3 also indicates that the more the factor loadings differ among items within the same variable, the larger the gap between the CR value and Cronbach’s alpha. To examine discriminant validity, our study compared the square root of the average variance extracted for each variable with the correlations between variables [70]. Discriminant validity indicates whether the variables in the research model are empirically distinct from one another.
The square roots of the average variance extracted of all variables were found to be larger than their estimated correlations with the other variables. In other words, the measures exceed the thresholds for adequate reliability, convergent validity, and discriminant validity. The highlighted scores on the diagonal indicate the square root of the average variance extracted. Table 4 shows the correlation matrix and discriminant validity.
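The discriminant validity check described above follows the Fornell-Larcker logic: the square root of each construct's AVE should exceed its correlations with the other constructs. The sketch below applies this check to the AVE values from Table 3 and the correlations from Table 4; the recomputed square roots may differ slightly from the rounded diagonal values printed in Table 4, and the helper code itself is only illustrative.

```python
# Fornell-Larcker style check using the AVE (Table 3) and correlations (Table 4).
import numpy as np

constructs = ["SPr", "SP", "SR", "SQ", "SM", "ItoCU"]
ave = np.array([0.503, 0.538, 0.602, 0.502, 0.670, 0.617])

# Inter-construct correlations (lower triangle of Table 4, mirrored).
corr = np.array([
    [1.000, 0.450, 0.305, 0.391, 0.313, 0.549],
    [0.450, 1.000, 0.487, 0.503, 0.428, 0.674],
    [0.305, 0.487, 1.000, 0.459, 0.324, 0.492],
    [0.391, 0.503, 0.459, 1.000, 0.457, 0.495],
    [0.313, 0.428, 0.324, 0.457, 1.000, 0.567],
    [0.549, 0.674, 0.492, 0.495, 0.567, 1.000],
])

sqrt_ave = np.sqrt(ave)
for i, name in enumerate(constructs):
    others = np.delete(corr[i], i)             # correlations with the other constructs
    ok = sqrt_ave[i] > others.max()
    print(f"{name}: sqrt(AVE)={sqrt_ave[i]:.3f}, max corr={others.max():.3f}, ok={ok}")
```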
The data for our research hypothesis testing are depicted in Table 5. The model testing results include the standardized regression coefficient, the critical ratio (t-value), and the probability (P-value). The model tested in this study shows a significant influence of all five variables on the students’ continuance intention to use the android-based TPA. As shown in Table 5, all variables were found to significantly influence the students’ continuance intention to use the TPA at the 5% significance level (p<0.05).
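The structural paths in this study were estimated with AMOS. As a rough, hypothetical proxy only, the sketch below regresses a synthetic continuance-intention score on five synthetic predictor composites with ordinary least squares and prints coefficients, t-values, and p-values in the spirit of Table 5; it does not reproduce the SEM estimates or use the actual data.

```python
# Illustrative OLS proxy for the kind of path output reported in Table 5.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 240                                           # sample size used in the study

# Hypothetical composite (mean-item) scores for the five predictors.
X = rng.normal(3.5, 0.6, size=(n, 5))
beta_true = np.array([0.27, 0.37, 0.23, 0.26, 0.31])
y = 0.5 + X @ beta_true + rng.normal(0, 0.5, size=n)

model = sm.OLS(y, sm.add_constant(X)).fit()
names = ["const", "SPr", "SP", "SR", "SQ", "SM"]
for name, b, t, p in zip(names, model.params, model.tvalues, model.pvalues):
    print(f"{name:>5}: coef={b:.3f}, t={t:.2f}, p={p:.3f}")
```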

Table 3. The results of the calculation


(Cronbach’s alpha, composite reliability (CR), and average variance extracted (AVE) are reported per construct; standardized item loadings are shown in parentheses after each item.)

Systems productivity (SPr): Cronbach’s alpha = 0.788, CR = 0.843, AVE = 0.503
  SPr1: I find TPA help in accomplishing my lecturer teaching performance evaluation more quickly (0.721)
  SPr2: The TPA improves the evaluation process (0.584)
  SPr3: The TPA improves my evaluation productivity (0.620)
  SPr4: The TPA provides effectiveness for the evaluation process (0.875)
  SPr5: Using TPA improves the quality of teaching (0.710)
System performance (SP): Cronbach’s alpha = 0.815, CR = 0.909, AVE = 0.538
  SP1: I find that the use of the TPA is flexible and easy (0.627)
  SP2: I do not need to work hard in using TPA (0.795)
  SP3: I become more skillful when using TPA (0.745)
  SP4: I find the interaction with the TPA is clear and understandable (0.754)
Systems relevancy (SR): Cronbach’s alpha = 0.820, CR = 0.898, AVE = 0.602
  SR1: I find the TPA helps me to evaluate relevant lecturers (0.850)
  SR2: The TPA system provides a relevant aspect of teaching to evaluate (0.823)
  SR3: In general, the TPA system supports the evaluation process (0.640)
  SR4: I can choose a relevant aspect of teaching to evaluate (0.680)
System quality (SQ): Cronbach’s alpha = 0.721, CR = 0.840, AVE = 0.502
  SQ1: The TPA is more accurate, better quality, and real-time (0.790)
  SQ2: I find the TPA system increases the quality of teaching evaluation (0.600)
  SQ3: I find the services of TPA are secure and manageable (0.640)
  SQ4: The services of TPA must be accessible more quickly (0.710)
  SQ5: I find the TPA facilitates teaching improvement (0.700)
  SQ6: I prefer the TPA has good quality in navigating, browsing, and downloading (0.730)
System mobility (SM): Cronbach’s alpha = 0.847, CR = 0.918, AVE = 0.670
  SM1: I sometimes like to experience the TPA can be accessed anywhere (0.840)
  SM2: I can save time by using TPA because it is mobile (0.680)
  SM3: I do not need to think about a space when I want to use the TPA (0.780)
Intention to continuance use (ItoCU): Cronbach’s alpha = 0.835, CR = 0.892, AVE = 0.617
  ItoCU1: I will continue using TPA for teaching performance evaluation assessment (0.730)
  ItoCU2: I have got good knowledge and skills to continue the use of the TPA (0.680)
  ItoCU3: I prefer to use TPA than other applications for teaching performance assessments (0.750)
  ItoCU4: I will suggest my friends use the TPA application for teaching performance assessment (0.700)
  ItoCU5: I will keep using the TPA as frequently as possible (0.720)
  ItoCU6: I will enjoy using the TPA in the future (0.750)


In the research model proposed in the theoretical construct section, we hypothesized that all factors, system productivity, system performance, system relevancy, system quality, and system mobility, positively influence users’ continuance intention to use the teaching performance evaluation application. Based on our statistical calculations, the standardized regression coefficients indicate that all factors significantly affect users’ continuance intention to use the TPA application, ranging from 0.228 to 0.366. System performance has the strongest influence on users’ continuance intention to use the TPE application, while system relevancy has the weakest influence.

Table 4. Correlation matrix and discriminant validity


Factors   SPr     SP      SR      SQ      SM      ItoCU
SPr       0.712
SP        0.450   0.730
SR        0.305   0.487   0.766
SQ        0.391   0.503   0.459   0.704
SM        0.313   0.428   0.324   0.457   0.799
ItoCU     0.549   0.674   0.492   0.495   0.567   0.721
Note: values on the diagonal are the square roots of the AVE.

Table 5. The coefficient of relationship and the critical ratio of hypothesis


Relationship       Reg. coef   t-value   P-value   Significance
SPr➔ItoCU (H1)      0.268       2.19      0.02      Yes
SP➔ItoCU (H2)       0.366       2.29      0.01      Yes
SR➔ItoCU (H3)       0.228       1.87      0.03      Yes
SQ➔ItoCU (H4)       0.256       1.99      0.03      Yes
SM➔ItoCU (H5)       0.313       2.16      0.02      Yes

4.2. Discussion
Our study shows that the research model and hypotheses proposed earlier are significant and able to describe students’ continuance intention to use the android-based TPA. The tested constructs (factors), application productivity, performance, relevancy, quality, and mobility, were all important factors affecting the students’ continuance intention to use the TPA. The two modified constructs, system quality (β=0.228, P<0.04) and mobility (β=0.313, P<0.04), were important across all students’ responses.
Our results support earlier studies. For example, the variable SPr influenced users’ continuance motivation to use the TPA. This finding indicates that users expect a mobile application for assessing teaching performance to support them in assessing more lecturers within a given period and in a short time [71]. Such expectations reflect that, in the Indonesian university context, teaching activities are sometimes performed more than twice a day, which requires a teaching evaluation system that can support assessment productivity.
Our study also found that the performance of a mobile application significantly influences users’ continuance intention to use it [72], [73]. Our findings indicate that users with high performance expectancy tend to use the android-based teaching performance evaluation continually. Furthermore, users usually consider the performance of an information system when determining their satisfaction with it within an organization’s work practice, because the information system can support their learning effectiveness [74]. In this study, better TPA performance supported teaching quality improvement and satisfied the students, because the university lecturers received input from the students’ assessments.
The variable SR also significantly influenced the students’ continuance intention to use the TPA, in that the application features support all students in evaluating all aspects of lecturers’ teaching performance. Relevant information system features have been found to influence users’ continuance intention to use a system [75]. The android-based TPA system’s relevance to the students’ need to evaluate lecturers’ teaching skills has enhanced the students’ continuance intention to use the application in future teaching and learning activities.
The SQ of the TPA was found to significantly affect the students’ continuance intention to use the TPA. Earlier related research also showed that the quality of an information system influences users’ intention to use it [76], [77]. In this study, quality refers to being easy to access, quick to download, easy to select features in, and easy to find a lecturer’s name in. Furthermore, the students did not experience difficulties in using the application each time they used it. As such, those indicators positively affected the students’ continuance intention to use the android-based system, as found by Nikou and Economides [78] in their studies.

The variable SM was found to have an important effect on the students’ continuance intention to use the TPA. This finding suggests that an application that can be accessed and used anywhere can maintain users’ continuance intention to use it [79], [80]. An application with higher mobility enhances users’ continuance intentions toward mobile applications [81]. In this study, the students could access and use the TPA wherever they wanted; they did not have to access the application in the classroom, where a lecturer would notice them. As such, the flexibility to access and use the system can increase users’ comfort in using it, which might lead to continued usage.

5. CONCLUSION
Our study contributes to both theory and practice. From a theoretical perspective, it sheds light on the predictors of continuance intention to use an android-based TPA, and it may become a reference for both academia and practitioners in higher education institutions. For researchers, the model used in this study can be tested within a broader context involving larger populations from other universities. For education institutions, our study provides important insights into supporting teaching quality assessment to maintain the high quality of teaching as a university’s core business. The limitation of this study is that students may assess a lecturer who is not teaching in their class, due to the freedom to select any lecturer from the database. Further research needs to limit each lecturer to a specific course to be assessed by a particular student class.

REFERENCES
[1] G. K. Gitonga, “Mobile application for school quality evaluation system,” Ph.D. dissertation, Strathmore University, 2016.
[2] A. B. Shinde, V. L. Karade, and S. S. Sutar, “Android based student feedback system for improved teaching learning,” International
Journal of Computer Sciences and Engineering, vol. 7, no. 2, pp. 237–243, Feb. 2019, doi: 10.26438/ijcse/v7i2.237243.
[3] R. Noe, J. Hollenbeck, B. Gerhart, and P. Wright, Human resource management: Gaining a competitive advantage, 9 ed. McGraw
Hill, 2015.
[4] Q. He, M. Valcke, and A. Aelterman, “A qualitative study of in-service teachers’ evaluation beliefs,” Procedia-Social and
Behavioral Sciences, vol. 69, pp. 1076–1085, Dec. 2012, doi: 10.1016/j.sbspro.2012.12.035.
[5] C. S. Luis and I. Cañadas, “Qualitative and quantitative methods to assess the qualities of a lecturer: what qualities are demanded
by online and on-site students?,” Procedia-Social and Behavioral Sciences, vol. 143, pp. 106–111, Aug. 2014, doi:
10.1016/j.sbspro.2014.07.369.
[6] S. Cadez, V. Dimovski, and M. Z. Groff, “Research, teaching and performance evaluation in academia: the salience of quality,”
Studies in Higher Education, vol. 42, no. 8, pp. 1455–1473, Aug. 2017, doi: 10.1080/03075079.2015.1104659.
[7] J. E. Doty, “Teacher performance evaluation and professional growth in the era of 'Educator effectiveness',” Electronic Theses and
Dissertations, The University of Maine, 2018.
[8] P. Ramsden, “A performance indicator of teaching quality in higher education: The course experience questionnaire,” Studies in
Higher Education, vol. 16, no. 2, pp. 129–150, Jan. 1991, doi: 10.1080/03075079112331382944.
[9] E. Dandalt and S. Brutus, “Teacher performance appraisal regulation: A policy case analysis,” NASSP Bulletin, vol. 104, no. 1,
pp. 20–33, Mar. 2020, doi: 10.1177/0192636520911197.
[10] J. Buchanan, G. Harb, and T. Fitzgerald, “Implementing a teaching performance assessment: An Australian case study,” Australian
Journal of Teacher Education, vol. 45, no. 5, pp. 74–90, May 2020, doi: 10.14221/ajte.2020v45n5.5.
[11] G. W. Soad, N. F. Duarte Filho, and E. F. Barbosa, “Quality evaluation of mobile learning applications,” in 2016 IEEE Frontiers
in Education Conference (FIE), Oct. 2016, pp. 1–8. doi: 10.1109/FIE.2016.7757540.
[12] A. W. Bangert, “The development of an instrument for assessing online teaching effectiveness,” Journal of Educational Computing
Research, vol. 35, no. 3, pp. 227–244, Oct. 2006, doi: 10.2190/B3XP-5K61-7Q07-U443.
[13] M. Guolla, “Assessing the teaching quality to student satisfaction relationship: Applied customer satisfaction research in the
classroom,” Journal of Marketing Theory and Practice, vol. 7, no. 3, pp. 87–97, Jul. 1999,
doi: 10.1080/10696679.1999.11501843.
[14] H. Borko, V. Mayfield, S. Marion, R. Flexer, and K. Cumbo, “Teachers’ developing ideas and practices about mathematics
performance assessment: Successes, stumbling blocks, and implications for professional development,” Teaching and Teacher
Education, vol. 13, no. 3, pp. 259–278, Apr. 1997, doi: 10.1016/S0742-051X(96)00024-8.
[15] H. B. Hussein, “Assessing elearning teaching quality of faculty members in teachers’ college at King Saud
University: Students perspectives,” Procedia-Social and Behavioral Sciences, vol. 55, pp. 945–952, Oct. 2012,
doi: 10.1016/j.sbspro.2012.09.584.
[16] A. Sher, “Assessing the relationship of student-instructor and student-student interaction to student learning and satisfaction in web-
based online learning environment,” Journal of Interactive Online Learning, vol. 8, no. 2, pp. 102–120, 2009.
[17] J. Felton, J. Mitchell, and M. Stinson, “Web-based student evaluations of professors: the relations between perceived quality,
easiness and sexiness,” Assessment and Evaluation in Higher Education, vol. 29, no. 1, pp. 91–108, Feb. 2004, doi:
10.1080/0260293032000158180.
[18] J. M. Noguera, J. J. Jiménez, and M. C. Osuna-Pérez, “Development and evaluation of a 3D mobile application for learning
manual therapy in the physiotherapy laboratory,” Computers and Education, vol. 69, pp. 96–108, Nov. 2013, doi:
10.1016/j.compedu.2013.07.007.
[19] Y. W. Syaifudin et al., “Web application implementation of android programming learning assistance system and its
evaluations,” IOP Conference Series: Materials Science and Engineering, vol. 1073, no. 1, Feb. 2021,
doi: 10.1088/1757-899X/1073/1/012060.
[20] C. Cheung and M. Limayem, “The role of habit in information systems continuance: examining the evolving relationship between
intention and usage,” in ICIS 2005 Proceedings, 2005.


[21] C. Liao, P. Palvia, and J.-L. Chen, “Information technology adoption behavior life cycle: Toward a technology continuance theory
(TCT),” International Journal of Information Management, vol. 29, no. 4, pp. 309–320, Aug. 2009, doi:
10.1016/j.ijinfomgt.2009.03.004.
[22] J. Gebauer, M. J. Shaw, and M. L. Gribbins, “Task-technology fit for mobile information systems,” Journal of Information
Technology, vol. 25, no. 3, pp. 259–272, Sep. 2010, doi: 10.1057/jit.2010.10.
[23] V. Jain and S. Kanungo, “Beyond perceptions and usage: Impact of nature of information systems use on information system-
enabled productivity,” International Journal of Human-Computer Interaction, vol. 19, no. 1, pp. 113–136, Sep. 2005, doi:
10.1207/s15327590ijhc1901_8.
[24] B. L. Myers, L. A. Kappelman, and V. R. Prybutok, “A comprehensive model for assessing the quality and productivity of the
information systems function,” Information Resources Management Journal, vol. 10, no. 1, pp. 6–26, Jan. 1997, doi:
10.4018/irmj.1997010101.
[25] W. H. DeLone and E. R. McLean, “Information systems success: The quest for the dependent variable,” Information Systems
Research, vol. 3, no. 1, pp. 60–95, Mar. 1992, doi: 10.1287/isre.3.1.60.
[26] C. S. Chapman and L.-A. Kihn, “Information system integration, enabling control and performance,” Accounting, Organizations
and Society, vol. 34, no. 2, pp. 151–169, Feb. 2009, doi: 10.1016/j.aos.2008.07.003.
[27] H. S. Mahmassani and R. Jayakrishnan, “System performance and user response under real-time information
in a congested traffic corridor,” Transportation Research Part A: General, vol. 25, no. 5, pp. 293–307, Sep. 1991,
doi: 10.1016/0191-2607(91)90145-G.
[28] J. W. Anderson, “Information systems performance measurement and evaluation (session overview),” in Proceedings of the 1985
ACM thirteenth annual conference on Computer Science-CSC ’85, 1985. doi: 10.1145/320599.320675.
[29] S. P. Harter, “Psychological relevance and information science,” Journal of the American Society for Information Science, vol. 43,
no. 9, pp. 602–615, Oct. 1992, doi: 10.1002/(SICI)1097-4571(199210)43:9<602::AID-ASI3>3.0.CO;2-Q.
[30] Rosemann and Vessey, “Toward improving the relevance of information systems research to practice: The role of applicability
checks,” MIS Quarterly, vol. 32, no. 1, pp. 1–22, 2008, doi: 10.2307/25148826.
[31] T. Saracevic, “Relevance: A review of the literature and a framework for thinking on the notion in information science. Part III:
Behavior and effects of relevance,” Journal of the American Society for Information Science and Technology, vol. 58, no. 13,
pp. 2126–2144, Nov. 2007, doi: 10.1002/asi.20681.
[32] N. Gorla, T. M. Somers, and B. Wong, “Organizational impact of system quality, information quality, and service quality,” The
Journal of Strategic Information Systems, vol. 19, no. 3, pp. 207–228, Sep. 2010, doi: 10.1016/j.jsis.2010.05.001.
[33] R. R. Nelson, P. A. Todd, and B. H. Wixom, “Antecedents of information and system quality: An empirical examination within the
context of data warehousing,” Journal of Management Information Systems, vol. 21, no. 4, pp. 199–235, Apr. 2005, doi:
10.1080/07421222.2005.11045823.
[34] J. J. Jiang, G. Klein, and C. L. Carr, “Measuring information system service quality: SERVQUAL from the other side,” MIS
Quarterly, vol. 26, no. 2, pp. 145–166, Jun. 2002, doi: 10.2307/4132324.
[35] A. B. Kiros and P. U. Aray, “Tigrigna language spellchecker and correction system for mobile phone devices,” International
Journal of Electrical and Computer Engineering (IJECE), vol. 11, no. 3, pp. 2307–2314, Jun. 2021,
doi: 10.11591/ijece.v11i3.pp2307-2314.
[36] S. A. Nikou and A. A. Economides, “The effects of perceived mobility and satisfaction on the adoption of mobile-based
assessment,” in 2015 International Conference on Interactive Mobile Communication Technologies and Learning (IMCL), Nov.
2015, pp. 167–171. doi: 10.1109/IMCTL.2015.7359579.
[37] E. Park and K. J. Kim, “An integrated adoption model of mobile cloud services: Exploration of key determinants and extension of
technology acceptance model,” Telematics and Informatics, vol. 31, no. 3, pp. 376–385, Aug. 2014, doi: 10.1016/j.tele.2013.11.008.
[38] A. Bhattacherjee, “Understanding information systems continuance: An expectation-confirmation model,” MIS Quarterly, vol. 25,
no. 3, pp. 351–370, Sep. 2001, doi: 10.2307/3250921.
[39] M. Limayem, S. G. Hirt, and C. M. K. Cheung, “How habit limits the predictive power of intention: the case of information systems
continuance,” MIS Quarterly, vol. 31, no. 4, pp. 705–737, 2007, doi: 10.2307/25148817.
[40] V. Venkatesh, J. Y. L. Thong, F. K. Y. Chan, P. J.-H. Hu, and S. A. Brown, “Extending the two-stage information systems
continuance model: incorporating UTAUT predictors and the role of context,” Information Systems Journal, vol. 21, no. 6,
pp. 527–555, Nov. 2011, doi: 10.1111/j.1365-2575.2011.00373.x.
[41] M. Tarafdar, Q. Tu, and T. S. Ragu-Nathan, “Impact of technostress on end-user satisfaction and performance,” Journal of
Management Information Systems, vol. 27, no. 3, pp. 303–334, Dec. 2010, doi: 10.2753/MIS0742-1222270311.
[42] T. R. Sass, A. Semykina, and D. N. Harris, “Value-added models and the measurement of teacher productivity,” Economics of
Education Review, vol. 38, pp. 9–23, Feb. 2014, doi: 10.1016/j.econedurev.2013.10.003.
[43] C. K. Leong, Y. H. Lee, and W. K. Mak, “Mining sentiments in SMS texts for teaching evaluation,” Expert Systems with
Applications, vol. 39, no. 3, pp. 2584–2589, Feb. 2012, doi: 10.1016/j.eswa.2011.08.113.
[44] T. Cherner, C.-Y. Lee, A. Fegely, and L. Santaniello, “A detailed rubric for assessing the quality of teacher resource apps,” Journal
of Information Technology Education: Innovations in Practice, vol. 15, pp. 117–143, 2016.
[45] G. Mithula P. and A. P. Rajan R., “Analysis of students’ preferences for teachers based on performance attributes in higher
education,” TEM Journal, vol. 8, no. 2, pp. 630–635, 2019.
[46] D. R. Bacon, Y. (Eric) Zheng, K. A. Stewart, C. J. Johnson, and P. Paul, “Using conjoint analysis to evaluate and reward teaching
performance,” Marketing Education Review, vol. 26, no. 3, pp. 143–153, Sep. 2016, doi: 10.1080/10528008.2016.1192951.
[47] Y. (Susan) Wei and Q. Wang, “Making sense of a market information system for superior performance: The roles of organizational
responsiveness and innovation strategy,” Industrial Marketing Management, vol. 40, no. 2, pp. 267–277, Feb. 2011, doi:
10.1016/j.indmarman.2010.06.039.
[48] R. W. Zmud and A. C. Boynton, “Survey measures and instruments in MIS: Inventory and appraisal,” The information systems
research challenge: Survey research methods, vol. 3, pp. 149–180, 1991.
[49] J. D’Ambra and C. S. Wilson, “Explaining perceived performance of the World Wide Web: uncertainty and the task‐technology fit
model,” Internet Research, vol. 14, no. 4, pp. 294–310, Sep. 2004, doi: 10.1108/10662240410555315.
[50] M. Gluck, “Exploring the relationship between user satisfaction and relevance in information systems,” Information Processing &
Management, vol. 32, no. 1, pp. 89–104, Jan. 1996, doi: 10.1016/0306-4573(95)00031-B.
[51] J. C. Zimmer, R. E. Arsal, M. Al-Marzouq, and V. Grover, “Investigating online information disclosure: Effects
of information relevance, trust and risk,” Information and Management, vol. 47, no. 2, pp. 115–123, Mar. 2010,
doi: 10.1016/j.im.2009.12.003.
[52] M. K. Alsmadi, “The students’ acceptance of learning management systems in Saudi Arabian Universities,” International Journal
of Electrical and Computer Engineering (IJECE), vol. 10, no. 4, pp. 4155–4161, 2020, doi: 10.11591/ijece.v10i4.pp4155-4161.

[53] P. W. Handayani, A. N. Hidayanto, A. A. Pinem, P. I. Sandhyaduhita, and I. Budi, “Hospital information system user acceptance
factors: User group perspectives,” Informatics for Health and Social Care, vol. 43, no. 1, pp. 84–107, Jan. 2018, doi:
10.1080/17538157.2016.1269109.
[54] M. Hubert, M. Blut, C. Brock, R. W. Zhang, V. Koch, and R. Riedl, “The influence of acceptance and adoption drivers on smart
home usage,” European Journal of Marketing, vol. 53, no. 6, pp. 1073–1098, Jun. 2019, doi: 10.1108/EJM-12-2016-0794.
[55] T. T. P. Thi and M. Helfert, “A review of quality frameworks in information systems,” Prepr. arXiv.1706.03030,
Apr. 2017.
[56] van der Heijden, “User acceptance of hedonic information systems,” MIS Quarterly, vol. 28, no. 4, pp. 695–704, 2004, doi:
10.2307/25148660.
[57] E. Park and K. J. Kim, “User acceptance of long‐term evolution (LTE) services,” Program, vol. 47, no. 2, pp. 188–205, Apr. 2013,
doi: 10.1108/00330331311313762.
[58] T. Leinonen, A. Keune, M. Veermans, and T. Toikkanen, “Mobile apps for reflection in learning: A design research in K-12
education,” British Journal of Educational Technology, vol. 47, no. 1, pp. 184–202, Jan. 2016, doi: 10.1111/bjet.12224.
[59] J. C.-J. Chang and W. R. King, “Measuring the performance of information systems: A functional scorecard,” Journal of
Management Information Systems, vol. 22, no. 1, pp. 85–115, Apr. 2005, doi: 10.1080/07421222.2003.11045833.
[60] M. Saluvan and A. Ozonoff, “Functionality of hospital information systems: results from a survey of quality directors at Turkish
hospitals,” BMC Medical Informatics and Decision Making, vol. 18, no. 1, Dec. 2018, doi: 10.1186/s12911-018-0581-2.
[61] W. J. Tastle, B. A. White, A. Valfells, and P. Shackleton, “Information systems, offshore outsourcing, and relevancy in the
business school curriculum,” Journal of Information Technology Research, vol. 1, no. 2, pp. 61–77, Apr. 2008, doi:
10.4018/jitr.2008040105.
[62] M. Pearson, A. Pearson, and J. P. Shim, “The relevancy of information systems research: The practitioner’s view,” Information
Resources Management Journal, vol. 18, no. 3, pp. 50–67, Jul. 2005, doi: 10.4018/irmj.2005070104.
[63] S. N. Prattipati and M. O. Mensah, “Information systems variables and management productivity,” Information & Management,
vol. 33, no. 1, pp. 33–43, Nov. 1997, doi: 10.1016/S0378-7206(97)00036-0.
[64] Y.-S. Yen and F.-S. Wu, “Predicting the adoption of mobile financial services: The impacts of perceived mobility and personal
habit,” Computers in Human Behavior, vol. 65, pp. 31–42, Dec. 2016, doi: 10.1016/j.chb.2016.08.017.
[65] S. Kang, “Factors influencing intention of mobile application use,” International Journal of Mobile Communications, vol. 12,
no. 4, pp. 360–379, 2014, doi: 10.1504/IJMC.2014.063653.
[66] F. Muñoz-Leiva, S. Climent-Climent, and F. Liébana-Cabanillas, “Determinants of intention to use the mobile banking apps: An
extension of the classic TAM model,” Spanish Journal of Marketing-ESIC, vol. 21, no. 1, pp. 25–38, Feb. 2017, doi:
10.1016/j.sjme.2016.12.001.
[67] C.-H. Hsiao, J.-J. Chang, and K.-Y. Tang, “Exploring the influential factors in continuance usage of mobile social Apps:
Satisfaction, habit, and customer value perspectives,” Telematics and Informatics, vol. 33, no. 2, pp. 342–355, May 2016, doi:
10.1016/j.tele.2015.08.014.
[68] S. Alkhafaji and B. Sriram, “Instructor’s performance: A proposed model for online evaluation,” International Journal of
Information Engineering and Electronic Business, vol. 5, no. 4, pp. 34–40, Oct. 2013, doi: 10.5815/ijieeb.2013.04.05.
[69] A. Vahdat, A. Alizadeh, S. Quach, and N. Hamelin, “Would you like to shop via mobile app technology? The technology acceptance
model, social factors and purchase intention,” Australasian Marketing Journal, vol. 29, no. 2, pp. 187–197, May 2021, doi:
10.1016/j.ausmj.2020.01.002.
[70] K.-W. Fu, W. S. C. Chan, P. W. C. Wong, and P. S. F. Yip, “Internet addiction: prevalence, discriminant validity and correlates
among adolescents in Hong Kong,” British Journal of Psychiatry, vol. 196, no. 6, pp. 486–492, Jun. 2010, doi:
10.1192/bjp.bp.109.075002.
[71] M. M. Elaish, N. A. Ghani, L. Shuib, and A. Al-Haiqi, “Development of a mobile game application to boost students’ motivation
in learning english vocabulary,” IEEE Access, vol. 7, pp. 13326–13337, 2019, doi: 10.1109/ACCESS.2019.2891504.
[72] Y. Cheng, S. Sharma, P. Sharma, and K. Kulathunga, “Role of personalization in continuous use intention of mobile news apps in
India: Extending the UTAUT2 model,” Information, vol. 11, no. 1, Jan. 2020, doi: 10.3390/info11010033.
[73] C. Tam, D. Santos, and T. Oliveira, “Exploring the influential factors of continuance intention to use mobile apps: Extending the
expectation confirmation model,” Information Systems Frontiers, vol. 22, no. 1, pp. 243–257, Feb. 2020, doi: 10.1007/s10796-018-
9864-5.
[74] H. Kang, J. A. Turi, S. Bashir, M. N. Alam, and S. A. Shah, “Moderating role of information system and mobile technology with
learning and forgetting factors on organizational learning effectiveness,” Learning and Motivation, vol. 76, Nov. 2021, doi:
10.1016/j.lmot.2021.101757.
[75] R. S. Al-Maroof and S. A. Salloum, “An integrated model of continuous intention to use of google classroom,” in Recent Advances
in Intelligent Systems and Smart Applications, 2021, pp. 311–335. doi: 10.1007/978-3-030-47411-9_18.
[76] F. B. Franque, T. Oliveira, C. Tam, and F. de O. Santini, “A meta-analysis of the quantitative studies in continuance intention
to use an information system,” Internet Research, vol. 31, no. 1, pp. 123–158, Aug. 2020, doi: 10.1108/INTR-03-2019-
0103.
[77] S. A. Nikou and A. A. Economides, “Factors that influence behavioral intention to use mobile -based assessment: A STEM
teachers’ perspective,” British Journal of Educational Technology, vol. 50, no. 2, pp. 587–600, Mar. 2019, doi:
10.1111/bjet.12609.
[78] S. A. Nikou and A. A. Economides, “Mobile-based assessment: investigating the factors that influence behavioral intention to use,”
Computers & Education, vol. 109, pp. 56–73, Jun. 2017, doi: 10.1016/j.compedu.2017.02.005.
[79] J. Melcher, E. Camacho, S. Lagan, and J. Torous, “College student engagement with mental health apps: analysis of barriers to
sustained use,” Journal of American College Health, vol. 70, no. 6, pp. 1819–1825, Aug. 2022, doi:
10.1080/07448481.2020.1825225.
[80] X. C. Le, “Charting sustained usage toward mobile social media application: the criticality of expected benefits and emotional
motivations,” Asia Pacific Journal of Marketing and Logistics, vol. 34, no. 3, pp. 576–593, Feb. 2022, doi: 10.1108/APJML-11-
2020-0779.
[81] J. Lu, C. Liu, and J. Wei, “How important are enjoyment and mobility for mobile applications?,” Journal of Computer Information
Systems, vol. 57, no. 1, pp. 1–12, Jan. 2017, doi: 10.1080/08874417.2016.1181463.


BIOGRAPHIES OF AUTHORS

Nurdin Nurdin is a Professor in Information Systems at the Department of Economics and Business, Universitas Islam Negeri Datokarama Palu. He has been a faculty member since 1999. He earned his Master’s degree in Information Management at the Faculty of Economics and Commerce, University of Western Australia (UWA), and a Ph.D. degree in Information and Communication Technology from Swinburne University of Technology, Australia. He has been a member of the teaching staff at UIN Datokarama Palu since 1997. His research interests lie in the areas of e-government, information systems, social media, and knowledge management. He has published in a number of international journals and proceedings. He can be contacted at email: [email protected].

Sagaf S. Pettalongi is a Professor in Education Management at the Faculty of Tarbiyah and Teacher Training, Universitas Islam Negeri Datokarama Palu. He has been a faculty member since 1991. He obtained his doctoral degree in education management from the State University of Malang. His research interests lie in education management, education administration, education theory, higher education, and multiculturalism in education. He can be contacted at email: [email protected].

Muhammad Nur Ahsan is a doctoral student at the Faculty of Social Science, University of Gadjah Mada, Yogyakarta. He is also a lecturer at the Faculty of Philosophy and Islamic Communication, Universitas Islam Negeri Datokarama Palu. His research areas are social sciences, history, and philosophy. He can be contacted at: [email protected].

Vindy Febrianti is a postgraduate student at the Department of Islamic Education, Universitas Islam Negeri Datokarama Palu. Her research interests include education management, Islamic education, and education at the middle level. She can be contacted at email: [email protected].

