
dc.contributor: Alexander, Samuel
dc.contributor: Dadgostar, Farhad
dc.contributor: Fan, Chao
dc.contributor: Bigdeli, Abbas
dc.contributor.author: Sarrafzadeh, Hossein
dc.date.accessioned: 2013-01-31T02:09:58Z
dc.date.available: 2013-01-31T02:09:58Z
dc.date.issued: 2007
dc.identifier.issn: 0747-5632
dc.identifier.uri: https://hdl.handle.net/10652/2040
dc.description.abstract: Many software systems would perform significantly better if they could adapt to the emotional state of the user: if intelligent tutoring systems (ITSs), ATMs, or ticketing machines could recognise when users were confused, frustrated, or angry, they could guide users back to remedial help systems, thereby improving the service. Many researchers now feel strongly that ITSs would be significantly enhanced if computers could adapt to the emotions of students. This idea has spawned the developing field of Affective Tutoring Systems (ATSs): ITSs that are able to adapt to the affective state of students. The term “affective tutoring system” can be traced back as far as Rosalind Picard’s 1997 book Affective Computing. This paper presents research leading to the development of Easy with Eve, an ATS for primary school mathematics. The system utilises a network of computer systems, mainly embedded devices, to detect student emotion and other significant bio-signals. It then adapts to students and displays emotion via a lifelike agent called Eve. Eve’s tutoring adaptations are guided by a case-based method for adapting to student states; this method uses data generated by an observational study of human tutors. This paper presents the observational study, the case-based method, the ATS itself and its implementation on a distributed computer system for real-time performance, and finally the implications of the findings for human-computer interaction in general and e-learning in particular. Web-based applications of the technology developed in this research are discussed throughout the paper.
dc.language.iso: en
dc.publisher: Pergamon
dc.subject: Affective tutoring systems
dc.subject: lifelike agents
dc.subject: emotion detection
dc.subject: facial expressions
dc.subject: human computer interaction
dc.subject: affective computing
dc.title: “How do you know that I don't understand?” A look at the future of intelligent tutoring systems
dc.type: Journal Article
dc.rights.holder: Pergamon
dc.identifier.doi: 10.1016/j.chb.2007.07.008
dc.subject.marsden: 080602 Computer-Human Interaction
dc.identifier.bibliographicCitation: Sarrafzadeh, A., Alexander, S., Dadgostar, F., Fan, C., & Bigdeli, A. (2008). “How do you know that I don’t understand?” A look at the future of intelligent tutoring systems. Computers in Human Behavior, 24(4), 1342-1363.
unitec.institution: Unitec Institute of Technology
unitec.publication.spage: 1342
unitec.publication.lpage: 1363
unitec.publication.volume: 24
unitec.publication.title: Computers in Human Behavior
unitec.peerreviewed: yes
dc.contributor.affiliation: Institute of Information and Mathematical Sciences, Massey University, Auckland, New Zealand
dc.contributor.affiliation: Safeguarding Australia Program, National ICT Australia (NICTA) Ltd, QLD Lab, Brisbane, Australia

