A Standards-Based Grading and Reporting Tool for Faculty: Design and Implications

Alaa Sadik
Assistant Professor of Instructional Technology, Sultan Qaboos University, Oman.
Periodicity: April-June 2011
DOI: https://doi.org/10.26634/jet.8.1.1470

Abstract

The use of standards-based assessment, grading, and reporting tools is essential to ensure that assessment meets acceptable levels of quality and standardization. This study reports the design, development, and evaluation of a standards-based assessment tool for instructors at Sultan Qaboos University, Sultanate of Oman. The Rapid Application Development (RAD) model was used to develop early versions of the assessment tool, called RealGrade. The Grading Tool Usability Questionnaire and a series of individual interviews were used to measure participants' reactions to RealGrade and determine the extent to which the prototype is usable. The results revealed that participants found RealGrade effective and efficient in facilitating standards-based assessment and communicating grades to students at the University. In addition, they favored RealGrade's design, flexibility, and ease of use. Mean differences among participants were also examined according to their computer experience and teaching experience.
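The abstract summarizes RealGrade without showing its internals, so the following is only a minimal, hypothetical sketch of the core idea behind standards-based grading: scores are recorded against individual learning standards and reported as per-standard attainment rather than as a single pooled course mark. All names (Standard, standard_grade, the IT-x.x codes) and the rubric cut-offs are illustrative assumptions, not taken from RealGrade.

    from dataclasses import dataclass, field
    from statistics import mean

    # Hypothetical sketch of standards-based grading: each score is recorded
    # against a learning standard instead of being pooled into one course mark,
    # and the report lists attainment per standard. The rubric scale and
    # cut-offs below are illustrative assumptions only.

    @dataclass
    class Standard:
        code: str                                          # e.g. "IT-1.1" (hypothetical)
        description: str
        scores: list[float] = field(default_factory=list)  # 0-4 rubric scores

    def standard_grade(avg: float) -> str:
        """Map a 0-4 rubric average to a proficiency label (cut-offs illustrative)."""
        if avg >= 3.5:
            return "Exceeds standard"
        if avg >= 2.5:
            return "Meets standard"
        if avg >= 1.5:
            return "Approaching standard"
        return "Beginning"

    def report(student: str, standards: list[Standard]) -> str:
        """Build a per-standard attainment report for one student."""
        lines = [f"Standards report for {student}"]
        for s in standards:
            avg = mean(s.scores) if s.scores else 0.0
            lines.append(f"  {s.code} {s.description}: {avg:.2f} -> {standard_grade(avg)}")
        return "\n".join(lines)

    if __name__ == "__main__":
        standards = [
            Standard("IT-1.1", "Uses spreadsheets for data analysis", [3, 4, 3]),
            Standard("IT-2.3", "Designs a simple relational database", [2, 2, 3]),
        ]
        print(report("Student A", standards))

Running the sketch prints one attainment line per standard, which is the kind of per-standard report a tool like RealGrade communicates to students; in practice, the rubric scale, cut-offs, and labels would be defined by the institution's own standards.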

Keywords

Standards-Based Assessment, Assessment Tool, Usability Evaluation.

How to Cite this Article?

Alaa M. Sadik (2011). A Standards-Based Grading and Reporting Tool for Faculty: Design and Implications. i-manager's Journal of Educational Technology, 8(1), 46-63. https://doi.org/10.26634/jet.8.1.1470
