An important hands-on component of science courses is laboratory work. Frequently, however, laboratory activities are restricted to walkthroughs of demonstration experiments. Demonstration experiments differ from authentic experiments in that 1) the research question is unclear, 2) knowledge of the background literature is weak, 3) the students are only passive participants, 4) data are collected from only a few subjects, and 5) the data are not analyzed. By contrast, in authentic experiments, 1) the researcher is clear on the research question, 2) a review of the background literature is conducted first, 3) students are active experimenters, 4) data are collected from many subjects, and 5) the data collected are actually analyzed. Researchers in science education (e.g., Lehrer, Schauble, & Petrosino, 2000; Chinn & Malhotra, 2000) have recently argued that demonstration experiments present a distorted and biased view of science, and that students would develop a better understanding of science and the scientific process if they were given authentic experimentation experiences rather than demonstration experiments.
In cognitive psychology research methods classes, there has been little opportunity for authentic experimentation. Until recently, students were restricted to either paper-and-pencil tasks or very minor variations of canned computer experiments. Since most modern cognitive psychology experiments are computer-based, this is a severe restriction. Only the few students with very strong computer skills, or with many hours of free time, were able to develop more sophisticated experiments. Instructors therefore typically opted for one of two solutions: having students do demonstration experiments on the computer, or having them do some approximation of an experiment without the computer. Both are poor substitutes for authentic experimentation: students learn little about experimentation from the former, and are highly frustrated by the limitations of the latter.
The existence of modern experiment design and analysis software presents an opportunity for cognitive psychology students to engage in more authentic experimentation. For experimental design, we now have E-Prime and PsyScope. For analysis, we have the analysis utilities in E-Prime and easy-to-use statistical packages like Excel, DataDesk, and Statview. This modern software is easier both to learn and to use, making it possible for students to design, run, and analyze a real experiment. Or so one would hope.
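To give a concrete sense of the kind of experiment logic students build with such tools, here is a minimal, language-neutral sketch in Python of the two halves of the workflow: generating a randomized trial list (design) and computing mean reaction times per condition (analysis). This is purely illustrative; the function names, the Stroop-style condition labels, and the sample reaction times are all hypothetical, and the actual E-Prime and Statview workflows are not reproduced here.

```python
import random
from statistics import mean

def make_trials(conditions, reps, seed=None):
    """Build a randomized trial list with each condition repeated `reps` times."""
    trials = [cond for cond in conditions for _ in range(reps)]
    random.Random(seed).shuffle(trials)
    return trials

def summarize(results):
    """Mean reaction time per condition, from (condition, rt_ms) pairs."""
    by_condition = {}
    for cond, rt in results:
        by_condition.setdefault(cond, []).append(rt)
    return {cond: mean(rts) for cond, rts in by_condition.items()}

# Hypothetical conditions and recorded reaction times (milliseconds).
trials = make_trials(["congruent", "incongruent"], reps=2, seed=1)
results = [("congruent", 450), ("congruent", 470),
           ("incongruent", 520), ("incongruent", 540)]
print(summarize(results))  # {'congruent': 460, 'incongruent': 530}
```

In a real student project, the reaction times would of course come from presenting stimuli and timing keypresses rather than from a hand-written list, but the design-then-summarize structure is the same.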
In previous research, I ran a study in which students in a research methods class were assigned to one of two conditions: 1) a previously existing lab curriculum based on demonstration experiments and a non-computer research project; or 2) a new curriculum in which students learned E-Prime and Statview, were walked through the design and analysis of two full experiments, and then conducted their own research project (on the computer or not, as they chose).
The students in the revised curriculum learned to use the software and designed more sophisticated experiments that were more likely to work, since they could better control the experimental situation and measure subjects' behavior more accurately. These students also reported a better understanding of the research process and felt that the classroom exercises more adequately prepared them for their end-of-semester project.
This site describes the details of the new curriculum, showing how one can fit the learning of E-Prime and Statview into a one-semester research methods lab course alongside the other instructional goals of such a course (e.g., learning to write APA-style research papers, learning experimental design issues, etc.). It also presents the problems that arose and the solutions that were developed.