The Effects of Student Response System and Single Student Questioning Technique on Graduate Students’ Recall and Application of Lecture Material

Sara Bicard*, David F. Bicard**, Laura Baylot Casey***, Clinton Smith****, Esther Plank*****, Cort Casey******
*, **, *** Assistant Professor, Special Education, University of Memphis.
****, ***** Doctor of Education students, University of Memphis.
Periodicity: April-June 2008
DOI: https://doi.org/10.26634/jet.5.1.559

Abstract

This study was an empirical investigation of active student responding (ASR) using a student response system (SRS) versus single student questioning (SSQ) and no student responding in a graduate-level special education class of 23 participants. During the SRS condition, every participant responded to questions using remotes/clickers. During the SSQ condition, the instructor randomly called upon individual participants to answer a question aloud. During the control condition, no questions were asked of participants. An alternating treatments design was used to test the effects of the three conditions on response accuracy on a short-answer quiz at the beginning of the next session and on the accuracy with which participants completed a task requiring them to apply the information presented during the lecture. There was a statistically significant difference in student performance on application tasks, but no statistically significant difference in quiz scores. The findings diverge from the results of other SRS studies and K-12 ASR studies, but support some college-level studies.
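The abstract does not report which statistical test produced the significance results above. As a minimal sketch, assuming per-participant accuracy scores under each of the three conditions, a nonparametric repeated-measures comparison such as a Friedman test could be run as shown below; the choice of test, the condition means, and all data are illustrative assumptions, not the authors' published method.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
n = 23  # participants in the study

# Hypothetical per-participant accuracy (%) on the application task
# under each condition; these values are invented for illustration.
srs = np.clip(rng.normal(85, 8, n), 0, 100)       # SRS (clicker) condition
ssq = np.clip(rng.normal(78, 10, n), 0, 100)      # single student questioning
control = np.clip(rng.normal(70, 12, n), 0, 100)  # no student responding

# Friedman test: a nonparametric repeated-measures comparison of the
# three conditions measured on the same participants.
stat, p = friedmanchisquare(srs, ssq, control)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
```

A significant Friedman result would indicate that accuracy differed across conditions for the same participants; with only an abstract available, this is one plausible analysis, not a reconstruction of the study's actual computations.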

Keywords

Student Response System, Active Student Responding, Postsecondary Education.

How to Cite this Article?

Sara Bicard, David F. Bicard, Laura Baylot Casey, Clinton Smith, Esther Plank and Cort Casey (2008). The Effects of Student Response System and Single Student Questioning Technique on Graduate Students’ Recall and Application of Lecture Material. i-manager’s Journal of Educational Technology, 5(1), 23-30. https://doi.org/10.26634/jet.5.1.559
