Chemical Literacy: Performance of First Year Chemistry Students on Chemical Kinetics

Muntholib Muntholib(1), Suhadi Ibnu(2), Sri Rahayu(3*), Fauziatul Fajaroh(4), Sentot Kusairi(5), Bambang Kuswandi(6)

(1) Postgraduate Program, Universitas Negeri Malang, Indonesia, Jl. Semarang No. 5 Malang 65145, East Java, Indonesia
(2) Department of Chemistry, Universitas Negeri Malang, Indonesia, Jl. Semarang No. 5 Malang 65145, East Java, Indonesia
(3) Department of Chemistry, Universitas Negeri Malang, Indonesia, Jl. Semarang No. 5 Malang 65145, East Java, Indonesia
(4) Department of Chemistry, Universitas Negeri Malang, Indonesia, Jl. Semarang No. 5 Malang 65145, East Java, Indonesia
(5) Department of Physics, Universitas Negeri Malang, Indonesia, Jl. Semarang No. 5 Malang 65145, East Java, Indonesia
(6) Faculty of Pharmacy, University of Jember, Indonesia, Jl. Kalimantan No. 37 Jember 68121, East Java, Indonesia
(*) Corresponding Author


This study aimed to (1) develop and validate a multiple-choice chemical literacy test instrument (MC-CLTI) on chemical kinetics and (2) conduct a small survey of the chemical literacy of first-year chemistry students. Development of the instrument involved expert consultation and judgment, validation, and two pilot studies. The first pilot study involved 119 first-year chemistry students, while the second involved 197 second-year chemistry students. The final form of the MC-CLTI consists of 30 valid and reliable items (Cronbach's alpha coefficient = 0.744). The survey showed that the respondents' average chemical literacy score was 63.24, which falls within the range of average scores reported by several previous studies.
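The reliability figure quoted above (Cronbach's alpha = 0.744) follows from the standard formula α = k/(k − 1) · (1 − Σ s²ᵢ / s²ₜ), where k is the number of items, s²ᵢ the variance of item i, and s²ₜ the variance of respondents' total scores. The sketch below shows how such a coefficient can be computed from a dichotomous item-score matrix; the data are made up for illustration and are not the study's actual responses.

```python
# Minimal sketch: Cronbach's alpha for a score matrix
# (rows = respondents, columns = items). Illustrative data only.

def cronbach_alpha(scores):
    """scores: list of respondent rows, each a list of item scores."""
    k = len(scores[0])  # number of items

    def variance(values):  # sample variance (n - 1 denominator)
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Made-up 0/1 responses from 5 respondents on 4 items:
data = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(data), 3))  # → 0.79
```

For a real 30-item, 197-respondent matrix the computation is identical; values around 0.7 or above are conventionally treated as acceptable internal consistency for research instruments.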


Keywords: chemical literacy, multiple choice chemical literacy instrument, chemical kinetics





Copyright (c) 2019 Indonesian Journal of Chemistry

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


Indonesian Journal of Chemistry (ISSN 1411-9420 / 2460-1578) - Chemistry Department, Universitas Gadjah Mada, Indonesia.
