Predicting students’ attention in the classroom from Kinect facial and body features

Springer Science and Business Media LLC, Vol. 2017, No. 1 (2017)
Janez Zaletelj1, Andrej Košir1
1Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, Ljubljana, 1000, Slovenia

Abstract

Keywords


References

D Dinesh, A Narayanan, K Bijlani, in 2016 International Conference on Information Science (ICIS), Kochi, India. Student analytics for productive teaching/learning (Institute of Electrical and Electronics Engineers (IEEE), Piscataway, 2016), pp. 97–102.

NJ Butko, G Theocharous, M Philipose, JR Movellan, in Automatic Face & Gesture Recognition and Workshops (FG 2011), 2011 IEEE International Conference On. Automated facial affect analysis for one-on-one tutoring applications (Institute of Electrical and Electronics Engineers (IEEE), Piscataway, 2011), pp. 382–387.

J Whitehill, Z Serpell, Y-C Lin, A Foster, JR Movellan, The faces of engagement: Automatic recognition of student engagement from facial expressions. IEEE Trans. Affect. Comput. 5(1), 86–98 (2014).

RA Calvo, S D’Mello, Affect detection: An interdisciplinary review of models, methods, and their applications. IEEE Trans. Affect. Comput. 1(1), 18–37 (2010).

AS Won, JN Bailenson, JH Janssen, Automatic detection of nonverbal behavior predicts learning in dyadic interactions. IEEE Trans. Affect. Comput. 5(2), 112–125 (2014).

J Fredricks, W McColskey, J Meli, B Montrosse, J Mordica, K Mooney, Measuring student engagement in upper elementary through high school: A description of 21 instruments (Issues & Answers Report, REL 2011–No. 098). Technical report, U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast (2011).

JA Fredricks, PC Blumenfeld, AH Paris, School engagement: Potential of the concept and state of the evidence. Rev. Educ. Res. 74(1), 59–109 (2004).

R Martinez-Maldonado, A Clayphan, K Yacef, J Kay, MTFeedback: Providing notifications to enhance teacher awareness of small group work in the classroom. IEEE Trans. Learn. Technol. 8(2), 187–200 (2015).

CR Henrie, LR Halverson, CR Graham, Measuring student engagement in technology-mediated learning: A review. Comput. Educ. 90, 36–53 (2015).

MS Young, S Robinson, P Alberts, Students pay attention!: Combating the vigilance decrement to improve learning during lectures. Act. Learn. High. Educ. 10(1), 41–55 (2009).

EF Risko, N Anderson, A Sarwal, M Engelhardt, A Kingstone, Everyday attention: Variation in mind wandering and memory in a lecture. Appl. Cogn. Psychol. 26(2), 234–242 (2012).

C-M Chen, J-Y Wang, C-M Yu, Assessing the attention levels of students by using a novel attention aware system based on brainwave signals. Br. J. Educ. Technol. 48(2), 348–369 (2015).

C Yan, Y Zhang, J Xu, F Dai, L Li, Q Dai, F Wu, A highly parallel framework for HEVC coding unit partitioning tree decision on many-core processors. IEEE Signal Proc. Lett. 21(5), 573–576 (2014).

C Yan, Y Zhang, J Xu, F Dai, J Zhang, Q Dai, F Wu, Efficient parallel framework for HEVC motion estimation on many-core processors. IEEE Trans. Circ. Syst. Video Technol. 24(12), 2077–2089 (2014).

C Yan, Y Zhang, F Dai, X Wang, L Li, Q Dai, Parallel deblocking filter for HEVC on many-core processor. Electron. Lett. 50(5), 367–368 (2014).

C Yan, Y Zhang, F Dai, J Zhang, L Li, Q Dai, Efficient parallel HEVC intra-prediction on many-core processor. Electron. Lett. 50(11), 805–806 (2014).

H Monkaresi, N Bosch, RA Calvo, SK D’Mello, Automated detection of engagement using video-based estimation of facial expressions and heart rate. IEEE Trans. Affect. Comput. 8(1), 15–28 (2017).

N Alioua, A Amine, A Rogozan, A Bensrhair, M Rziza, Driver head pose estimation using efficient descriptor fusion. EURASIP J. Image Video Process. 2016(1), 1–14 (2016).

R Bixler, S D’Mello, Automatic gaze-based user-independent detection of mind wandering during computerized reading. User Model. User-Adap. Inter. 26(1), 33–68 (2016).

N-H Liu, C-Y Chiang, H-C Chu, Recognizing the degree of human attention using EEG signals from mobile sensors. Sensors. 13(8), 10273 (2013).

J Han, L Shao, D Xu, J Shotton, Enhanced computer vision with Microsoft Kinect sensor: A review. IEEE Trans. Cybern. 43(5), 1318–1334 (2013).

S Springer, GY Seligmann, Validity of the Kinect for gait assessment: A focused review. Sensors. 16(2), 194 (2016).

G Zhu, L Zhang, P Shen, J Song, An online continuous human action recognition algorithm based on the Kinect sensor. Sensors. 16(2), 161 (2016).

SS Mukherjee, NM Robertson, Deep head pose: Gaze-direction estimation in multimodal video. IEEE Trans. Multimed. 17(11), 2094–2107 (2015).

A Saeed, A Al-Hamadi, A Ghoneim, Head pose estimation on top of Haar-like face detection: A study using the Kinect sensor. Sensors. 15(9), 20945–20966 (2015).

L Paletta, K Santner, G Fritz, A Hofmann, G Lodron, G Thallinger, H Mayer, in ICVS’13 Proceedings of the 9th International Conference on Computer Vision Systems. Lecture Notes in Computer Science. FACTS—a computer vision system for 3D recovery and semantic mapping of human factors (Springer-Verlag, Berlin, 2013), pp. 62–72.

U Burnik, J Zaletelj, A Košir, Video-based learners’ observed attention estimates for lecture learning gain evaluation. Multimed. Tools Appl. (2017). https://doi.org/10.1007/s11042-017-5259-8.