Wednesday, November 28, 2007

Week 13 Response

I found the Edelson, Gordin, and Pea article particularly interesting because (I think) it is the first article we've read that follows a single learning technology as it was adapted and changed across several versions. It was interesting to read what the creators learned throughout their process. I was surprised and amused by their decision not to design any curriculum for the first version of their program, Climate Visualizer. They simply said, "we did not feel predesigned curricula were appropriate because we were hoping to foster an entirely student-driven form of inquiry-based learning in which student would generate and pursue their own research questions" (Edelson et al., 1999, p. 404). Of the many changes they made to the program throughout its four (or so) reincarnations, this seems to be the one they changed most significantly. Later (and this is the part that amused me) they developed a five-week curriculum composed of structured activities, lessons, and even a culminating mock conference. And throughout all of this, the actual software seemed to play only a small part! That is a far cry from the "predesigned curricula" they initially felt would be "inappropriate."

This leads me to two thoughts. First, in creating this wonderful tool that students were supposed to dive into and explore, the designers were expecting students -- novices -- to behave like experts. The authors acknowledge the difficulties they had in creating an "authentic" tool for students to use, one similar to what experts would use, while still making it accessible to students. But the very premise of this idea, the expectation that students should behave like experts, seems faulty to me. Students know they're students, not experts, and they aren't fooled by technologies that are merely disguised as the "real" tools that "real" scientists use. While I appreciate the authors' motives -- by adapting these tools for student use, the developers were trying to increase student learning by providing an experience close to the way professionals work -- I am still skeptical of their (or anyone's) ability to strike a balance between accessibility and authenticity. I suppose I think their end product was better because it gave students a taste of working with an expert-like tool, pushing them to new understandings, but did so within a well-designed and quite structured (and perhaps more traditional?) curriculum.

The second thought I had is that once again this article shows the theme of imposing a really cool and advanced technology on classrooms whose teachers aren't comfortable with either the technology component of the program or the domain knowledge required to teach it . . . and then the learning technology fails to be used to its full potential. I would love to read something that describes teachers coming together with technology experts to design a tool that solves a real problem or fills a genuine void that teachers actually have in their classrooms. Now that would be authentic!