Bulletin of the American Physical Society
2007 APS April Meeting
Volume 52, Number 3
Saturday–Tuesday, April 14–17, 2007; Jacksonville, Florida
Session K9: Topics from the new Physics Education Research Journal
Sponsoring Units: FEd
Chair: Robert Beichner, North Carolina State University
Room: Hyatt Regency Jacksonville Riverfront, City Terrace 5
Sunday, April 15, 2007, 1:15PM - 1:51PM
K9.00001: Using resource graphs to model learning in physics. Invited Speaker: Physics education researchers have many valuable ways of describing student reasoning while learning physics. One can, for example, describe the correct physics and catalog specific student difficulties, though that does not quite address how the latter develops into the former. A recent model (building on work by A.A. diSessa and D. Hammer) uses resource graphs: networks of connected, small-scale ideas that describe reasoning about a specific physics topic in a specific physics context. We can compare resource graphs before and after instruction to represent the conceptual changes that occur during learning. The representation describes several well-documented forms of conceptual change and suggests others. I will apply the resource-graph representation to describe reasoning about energy loss in quantum tunneling, and will end the talk with a brief discussion (in the context of Newton's laws) of how a resource perspective affects our instructional choices.
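The abstract above describes a resource graph as a network of connected, small-scale ideas, with learning represented as a change in the graph. A minimal sketch of that bookkeeping, using hypothetical resources and links that are illustrative assumptions and not taken from the talk:

```python
# A resource graph sketched as a set of links between small-scale ideas
# ("resources"). All resource names and links here are hypothetical
# illustrations of the representation, not content from the talk.

# Before instruction: "tunneling" is linked, via doing work on a barrier,
# to the (incorrect) idea of losing energy.
before = {
    ("tunneling", "passing through a barrier"),
    ("passing through a barrier", "doing work"),
    ("doing work", "losing energy"),
}

# After instruction: the energy-loss link is replaced by links to
# constant-energy reasoning about the transmitted wave.
after = {
    ("tunneling", "passing through a barrier"),
    ("passing through a barrier", "constant total energy"),
    ("constant total energy", "reduced amplitude, not energy"),
}

# Conceptual change appears as the difference between the two edge sets:
# links dropped, links added, and links that persist through instruction.
dropped = before - after
added = after - before
persistent = before & after
print(sorted(dropped))
print(sorted(added))
print(sorted(persistent))
```

Comparing the edge sets this way mirrors the abstract's point that different forms of conceptual change (adding links, deleting links, keeping a stable core) are all visible in one representation.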
Sunday, April 15, 2007, 1:51PM - 2:27PM
K9.00002: Model analysis: Representing and assessing the dynamics of student learning. Invited Speaker: Decades of education research have shown that students can simultaneously hold alternative knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and the Force and Motion Conceptual Evaluation give instructors tools to probe their students' conceptual knowledge of physics. However, many existing quantitative analysis methods focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of standardized multiple-choice tests to assess students' alternative knowledge. In addition, context dependence, which suggests that a student may apply the correct knowledge in some situations and revert to using alternative types of knowledge in others, is often treated as random noise in current analyses. Through research, we developed a new modeling method, model analysis. It applies the results of qualitative studies to establish a quantitative representation framework, with which the space of students' knowledge, and the probabilities that students use different types of knowledge across a range of equivalent contexts, can be quantitatively represented and analyzed. This provides a new method for quantitative assessment in education, which can generate much richer information than score-based analysis.
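The quantitative framework described above can be sketched concretely. In published model-analysis work (Bao and Redish), each student's pattern of model use over m equivalent questions is mapped to a unit vector, and the class is summarized by a density-matrix-like average whose eigenstructure shows which models dominate and how mixed their use is. The function names, model labels, and data below are illustrative assumptions, not results from the talk:

```python
import numpy as np

# Illustrative sketch of the model-analysis density-matrix construction
# (after Bao & Redish). The model labels and counts are hypothetical.
# For each student, count how often each "model" was used over m questions,
# e.g. model 1 = Newtonian, model 2 = common alternative, model 3 = other.

def model_vector(counts):
    """Map one student's model-use counts (n_1, ..., n_k) over m questions
    to the unit vector u with components sqrt(n_k / m)."""
    counts = np.asarray(counts, dtype=float)
    return np.sqrt(counts / counts.sum())

def class_density_matrix(all_counts):
    """Average the outer products u u^T over all students in the class."""
    vectors = [model_vector(c) for c in all_counts]
    return sum(np.outer(u, u) for u in vectors) / len(vectors)

# Hypothetical class: 4 students, 10 questions each,
# counts of (model 1, model 2, model 3) use.
counts = [(8, 2, 0), (5, 5, 0), (3, 6, 1), (9, 1, 0)]
D = class_density_matrix(counts)

# The eigenvalues of D (each in [0, 1], summing to 1) measure how much of
# the class's reasoning each composite model state captures; an eigenvector
# with large components on two models signals context-dependent mixing,
# which a simple right/wrong score would treat as noise.
evals, evecs = np.linalg.eigh(D)
print(np.round(D, 3))
print(np.round(evals[::-1], 3))
```

Because each student vector has unit length, the trace of D is exactly 1, so the eigenvalues partition the class's model use into interpretable fractions rather than a single score.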
Sunday, April 15, 2007, 2:27PM - 3:03PM
K9.00003: Impact of Animation on Assessment of Conceptual Understanding in Physics. Invited Speaker: This study investigates the effect of computer animation on assessment and the conditions under which animation may improve or hinder the assessment of conceptual understanding in physics. An instrument was developed by replacing the static pictures and descriptions of motion on the Force Concept Inventory, a commonly used pencil-and-paper test, with computer animations. Both quantitative and qualitative data were collected: the animated and static versions of the test were given to students and the results were statistically analyzed, and think-aloud interviews were conducted to provide additional insight into the statistical findings. We found that good verbal skills tended to increase performance on the static version but not on the animated version of the test. In general, students had a better understanding of the intent of a question when viewing an animation and gave an answer more indicative of their actual understanding, as reflected in separate interviews. In some situations this led students to the correct answer and in others it did not. Overall, we found that animation can improve assessment under some conditions by increasing the validity of the instrument.