Open Access
SHS Web Conf.
Volume 75, 2020
The International Conference on History, Theory and Methodology of Learning (ICHTML 2020)
Article Number 04004
Number of page(s) 6
Section Methodology of Learning, Education and Training
Published online 26 March 2020
  1. I.P. Karpova, Sravnenie otkrytyh i vyborochnyh testov (Comparison of open and sample tests). Otkryitoe obrazovanie 3, 32–38 (2010)
  2. A. Orlov, T. Ponomareva, O. Chukajev, S. Pazuhina, Tekhnologii otsenivaniya rezultatov obrazovatelnogo protsessa v vuze v kontekste kompetentnostnogo podkhoda (Technologies for assessing the results of the educational process in a university in the context of a competency-based approach), 2nd edn. (Direkt-Media, Moscow, Berlin, 2017)
  3. Ch.A. Melovitz Vasan, D.O. DeFouw, B.K. Holland, N.S. Vasan, Analysis of testing with multiple choice versus open-ended questions: Outcome-based observations in an anatomy course. Anatomical Sciences Education 11(3), 254–261 (2018)
  4. D. Marengo, R. Miceli, M. Settanni, Test unidimensionality and item format: Do mixed item formats threaten test unidimensionality? Results from a standardized math achievement test. Testing, Psychometrics, Methodology in Applied Psychology 23(1), 25–36 (2016)
  5. F. Mozaffari, S. Mohammad Alavi, A. Rezaee, Investigating the Impact of Response Format on the Performance of Grammar Tests: Selected and Constructed. Journal of Teaching Language Skills 36(2), 103–128 (2017)
  6. S. Schladitz, J. Groß Ophoff, M. Wirtz, Effects of different response formats in measuring educational research literacy. Journal for Educational Research Online 9(2), 137–155 (2017)
  7. R.W. Lissitz, X. Hou, Sh. Slater, The Contribution of Constructed Response Items to Large Scale Assessment: Measuring and Understanding their Impact. Journal of Applied Testing Technology 13(3) (2012). Accessed 21 Jul 2019
  8. M. Kastner, B. Stangl, Multiple-Choice and Constructed Response Tests: Do Test Format and Scoring Matter? Procedia - Social and Behavioral Sciences 12, 263–273 (2011)
  9. L.F. Bachman, A.S. Palmer, Language Testing in Practice: Designing and Developing Useful Language Tests, 1st edn. (Oxford University Press, New York, 1996)
  10. V. Avanesov, Soderzhaniye testa i testovykh zadaniy (The content of the test and test items) (2007). Accessed 21 June 2019
  11. V. Avanesov, Problemy kachestva pedagogicheskikh izmereniy (The problem of the quality of pedagogical measurements) (2008). Accessed 6 July 2019
  12. K. Djatlova, T. Mikhaleva, Issledovaniye vliyaniya raznoobraziya form testovykh zadaniy na statisticheskiye kharakteristiki testov (Investigation of the influence of the variety of test task forms on the statistical characteristics of tests). Voprosyi testirovaniya v obrazovanii 4, 65–75 (2006)
  13. M.E. Martinez, Cognition and the question of test item format. Educational Psychologist 34(4), 207–218 (1999)
  14. G.R. Hancock, Cognitive Complexity and the Comparability of Multiple-Choice and Constructed-Response Test Formats. The Journal of Experimental Education 62(2), 143–157 (1994)
  15. J.C. Alderson, Assessing Reading (Cambridge University Press, Cambridge, 2000)
  16. B. Clay, Is This a Trick Question? A Short Guide to Writing Effective Test Questions (Kansas Curriculum Center, 2001)
  17. T. Hudson, Teaching Second Language Reading (Oxford University Press, Oxford, 2007)
  18. A. Kan, O. Bulut, D.C. Cormier, The Impact of Item Stem Format on the Dimensional Structure of Mathematics Assessments. Educational Assessment 24(1), 13–32 (2019)
  19. H. Wainer, D. Thissen, Combining Multiple-Choice and Constructed-Response Test Scores: Toward a Marxist Theory of Test Construction. Applied Measurement in Education 6(2), 103–118 (1993)
  20. O.A. Reshetnikova, Chto okazyvayet vliyaniye na izmeneniye kontrolnykh izmeritelnykh materialov dlya gosudarstvennoy itogovoy attestatsii? (What influences changes in the control measuring materials for the state final certification?). Pedagogicheskie izmerenija 2, 5–9 (2016)
  21. C. Hoyt, Test reliability estimated by analysis of variance. Psychometrika 6, 153–160 (1941)
  22. Ch.E. Osgood, G.J. Suci, P. Tannenbaum, The Measurement of Meaning (University of Illinois Press, Champaign, 1957)
  23. J. Cohen, Statistical Power Analysis for the Behavioral Sciences, 2nd edn. (Lawrence Erlbaum Associates, Hillsdale, 1988)
  24. G.W. Snedecor, Statistical Methods Applied to Experiments in Agriculture and Biology (Collegiate Press, Ames, 1937)
  25. A.V. Hryvko, Yu.O. Zhuk, Vykorystannia zasobiv IKT u protsesi eksperymentalnoho doslidzhennia emotyvno-otsinnoho stavlennia uchniv do riznykh form testovykh zavdan z ukrainskoi movy (Using ICT tools in an experimental study of students' emotive-evaluative attitudes toward different forms of test tasks in the Ukrainian language). Information Technologies and Learning Tools 70(2), 285–297 (2019). doi:10.33407/itlt.v70i2.2621
  26. C. Jonick, J. Schneider, D. Boylan, The effect of accounting question response formats on student performance. Accounting Education 26(4), 291–315 (2017)
  27. E. Lesage, M. Valcke, E. Sabbe, Scoring methods for multiple choice assessment in higher education: Is it still a matter of number right scoring or negative marking? Studies in Educational Evaluation 39, 188–193 (2013)
