Expert Validation of a Python Test, Reliability, Difficulty and Discrimination Indices

Hector Manuel Belmar Garrido

Abstract


In recent years, several countries have introduced the teaching of computer programming from the first grade of school, with the aim of developing computational thinking as a new way of thinking and a necessary skill for scientific and technological progress, a fundamental axis of development in the 21st century. Recent reviews of the state of the art show that this task is being carried out only by countries that have given technology and science a prominent place, incorporating them at the elementary school level. Developing countries, however, have not yet recognized the significance of the issue and have therefore not taken the necessary steps in this direction. Learning computer programming is also essential if countries are to participate in technological development as creators of technology rather than mere users of it. The problem is that there is neither a didactic framework for teaching programming nor validated assessment instruments to quantify the learning of computer programming. The objective of this research is to validate a programming test that evaluates technical skills in the Python programming language. The proposed instrument is a 90-item test, which was reduced to 70 items after expert validation and, following a psychometric analysis comprising the calculation of reliability, difficulty, and discrimination indices, resulted in the proposed 45-item instrument as a standard instrument for evaluating learning of the Python programming language. It should be noted that the validation methodology followed the classical theory of psychometric analysis (classical test theory).
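The item statistics named in the abstract — difficulty, discrimination, and reliability under classical test theory — can be illustrated with a short sketch. This is not the authors' actual analysis code; it is a minimal, hypothetical implementation assuming dichotomously scored (0/1) items, using the proportion-correct difficulty index, the upper/lower 27% discrimination index, and the KR-20 reliability coefficient.

```python
def item_statistics(responses):
    """Classical-test-theory item analysis for a 0/1 response matrix.

    responses: list of rows, one per examinee, each a list of 0/1 item scores.
    Returns (difficulty, discrimination, kr20).
    """
    n = len(responses)          # number of examinees
    k = len(responses[0])       # number of items
    totals = [sum(row) for row in responses]

    # Difficulty index p: proportion of examinees answering each item correctly.
    p = [sum(row[i] for row in responses) / n for i in range(k)]

    # Discrimination index D: difference in proportion correct between the
    # top 27% and bottom 27% of examinees ranked by total score.
    order = sorted(range(n), key=lambda r: totals[r], reverse=True)
    g = max(1, round(0.27 * n))
    upper, lower = order[:g], order[-g:]
    d = [
        sum(responses[r][i] for r in upper) / g
        - sum(responses[r][i] for r in lower) / g
        for i in range(k)
    ]

    # KR-20 reliability for dichotomous items:
    # kr20 = k/(k-1) * (1 - sum(p*q) / var(total scores)).
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    pq = sum(pi * (1 - pi) for pi in p)
    kr20 = (k / (k - 1)) * (1 - pq / var_t) if var_t > 0 else 0.0

    return p, d, kr20
```

Under these conventions, items with very high or very low p (too easy or too hard) and items with low D are candidates for removal, which is the kind of filtering that reduces a 70-item pool toward a shorter standard instrument.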



DOI: https://doi.org/10.20849/jed.v7i1.1320



This work is licensed under a Creative Commons Attribution 4.0 International License.

Journal of Education and Development  ISSN 2529-7996 (Print)  ISSN 2591-7250 (Online)

Copyright © July Press
