On the use of concept maps as an assessment tool in science: what we know so far.

María Araceli Ruiz Primo



Abstract


This article describes the use of concept maps as an assessment tool for measuring the organization of propositional (declarative) knowledge in science achievement. As an assessment tool, a concept map comprises a task that invites the student to represent the organization of his or her knowledge of a specific topic; a response format; and a scoring system. One construct-interpretation problem posed by the use of concept maps is that different combinations of task, response format, and scoring system yield different concept-mapping techniques, which may elicit different forms of knowledge representation from students. This article presents an overview of research on concept maps as an assessment tool. Some studies that have evaluated the reliability and validity of concept maps are briefly described, and a synthesis of what is known so far about this type of instrument is presented.

Keywords


Concept map-based assessment; assessment of students' knowledge; alternative assessment in science.

References


Anderson, R. C. (1984). Some reflections on the acquisition of knowledge. Educational Researcher, 13(10), 5-10.

Anderson, T. H. & Huang, S-C. C. (1989). On using concept maps to assess the comprehension effects of reading expository text (Technical Report No. 483). Urbana-Champaign: Center for the Study of Reading, University of Illinois at Urbana-Champaign. (ERIC Document Reproduction Service No. ED 310 368).

Baxter, G.P., Elder, A.D., & Glaser, R. (1996). Knowledge-based cognition and performance assessment in the science classroom. Educational Psychologist, 31(2), 133-140.

Chi, M.T.H., Feltovich, P.J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.

Chi, M.T.H., Glaser, R., & Farr, M.J. (1988). The nature of expertise. Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers.

Cronbach, L. J. (1990). Essentials of psychological testing (Fifth ed.). New York: Harper & Row Publishers.

Cronbach, L.J., Gleser, G.C., Nanda, H., & Rajaratnam, N. (1972). The dependability of behavioral measurements. New York: John Wiley.

Dochy, F. J. R. C. (1996). Assessment of domain-specific and domain-transcending prior knowledge: Entry assessment and the use of profile analysis. In M. Birenbaum & F. J. R. C. Dochy (Eds.) Alternatives in assessment of achievements, learning process and prior knowledge (pp. 93-129). Boston, MA: Kluwer Academic Publishers.

Glaser, R. (1996). Changing the agency for learning: Acquiring expert performance. In K. A. Ericsson (Ed.) The road to excellence: The acquisition of expert performance in the art, sciences, sports, and games (pp 303-311). Mahwah, NJ: Erlbaum.

Glaser, R. & Bassok, M. (1989). Learning theory and the study of instruction. Annual Review of Psychology, 40, 631-666.

Glaser, R. & Baxter, G.P. (1997). Improving the theory and practice of achievement testing. Paper presented at the BOTA Meeting. National Academy of Sciences/National Research Council. Washington, DC.

Goldsmith, T. E., Johnson, P. J., & Acton, W. H. (1991). Assessing structural knowledge. Journal of Educational Psychology, 83(1), 88-96.

Lomask, M., Baron, J. B., Greig, J. & Harrison, C. (1992, March). ConnMap: Connecticut's use of concept mapping to assess the structure of students' knowledge of science. Paper presented at the annual meeting of the National Association for Research in Science Teaching. Cambridge, MA.

McClure, J. R., & Bell, P. E. (1990). Effects of an environmental education-related STS approach instruction on cognitive structures of preservice science teachers. University Park, PA: Pennsylvania State University. (ERIC Document Reproduction Service No. ED 341 582).

Meng, X. L., Rosenthal, R. & Rubin, D. B. (1992). Comparing correlated correlation coefficients. Psychological Bulletin, 111(1), 172-175.

Mintzes, J.J., Wandersee, J.H., & Novak, J.D. (1997). Teaching science for understanding. San Diego: Academic Press.

Novak, J. D., & Gowin, D. B. (1984). Learning how to learn. New York: Cambridge University Press.

Novak, J. D., Gowin, D. B., & Johansen, G. T. (1983). The use of concept mapping and knowledge vee mapping with junior high school science students. Science Education, 67(5), 625-645.

Pearsall, N.R., Skipper, J.E.J., & Mintzes, J.J. (1997). Knowledge restructuring in the life sciences. A longitudinal study of conceptual change in biology. Science Education, 81(2), 193-215.

Rice, D.C., Ryan, J.M. & Samson, S.M. (1998). Using concept maps to assess student learning in the science classroom: Must different methods compete? Journal of Research in Science Teaching, 35(10), 503-534.

Ruiz-Primo, M. A. & Shavelson, R. J. (1996a). Problems and issues in the use of concept maps in science assessment. Journal of Research in Science Teaching, 33(6), 569-600.

Ruiz-Primo, M. A. & Shavelson, R. J. (1996b). Rhetoric and reality in science performance assessment. Journal of Research in Science Teaching, 33(10), 1045-1063.

Ruiz-Primo, M.A., Schultz, E. S., & Shavelson, R.J. (1996, April). Concept map-based assessments in science: An exploratory study. Paper presented at the annual meeting of the American Educational Research Association, New York, NY.

Ruiz-Primo, M.A., Schultz, E. S., & Shavelson, R.J. (1997, March). On the validity of concept map-based assessment interpretations: An experiment testing the assumption of hierarchical concept maps in science. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Ruiz-Primo, M.A., Schultz, E. S., Li, M., & Shavelson, R.J. (1999). Comparison of the Reliability and Validity of Scores From Two Concept-Mapping Techniques. Manuscript submitted for publication.

Ruiz-Primo, M.A., Shavelson, R.J., Li, M., & Schultz, E. S. (2000). On the validity of cognitive interpretations of scores from alternative concept-mapping techniques. Manuscript submitted for publication.

Schau, C., & Mattern, N. (1997). Use of map techniques in teaching applied statistics courses. The American Statistician, 51, 171-175.

Schau, C., Mattern, N., Weber, R., Minnick, K., & Witt, C. (1997, March). Use of fill-in concept maps to assess middle school students' connected understanding of science. Paper presented at the AERA Annual Meeting, Chicago, IL.

Shavelson R. J. (1972). Some aspects of the correspondence between content structure and cognitive structure in physics instruction. Journal of Educational Psychology, 63, 225-234.

Shavelson, R.J., & Ruiz-Primo, M.A. (1999). Leistungsbewertung im naturwissenschaftlichen Unterricht (On the assessment of science achievement). Unterrichtswissenschaft. Zeitschrift für Lernforschung, 27 (2), 102-127.

Shavelson, R.J., & Ruiz-Primo, M.A. (2000). On the psychometrics of assessing science understanding. In J. Mintzes, J. Wandersee, & J. Novak (Eds.), Assessing science understanding (pp. 303-341). San Diego: Academic Press.

Shavelson, R. J. & Webb, N.M. (1991). Generalizability theory: A primer. Newbury Park, CA: Sage.

Shavelson, R.J., Webb, N.M., & Rowley, G. (1989). Generalizability theory. American Psychologist, 44(6), 922-932.

Surber, J.R. (1984). Mapping as a testing and diagnostic device. In C.D. Holley & D.F. Dansereau (Eds.), Spatial learning strategies: Techniques, applications, and related issues (pp. 213-233). Orlando: Academic Press.

White, R. T., & Gunstone, R. (1992). Probing understanding. New York: Falmer Press.

Zajchowski, R. & Martin, J. (1993). Differences in the problem solving of stronger and weaker novices in physics: Knowledge, strategies, or knowledge structure. Journal of Research in Science Teaching, 30(5), 459-470.





