James Laffey
  Professor, University of Missouri
  http://sislt.missouri.edu/author/laffeyj/
  Mission HydroSci: A Virtual Environment for Next Generation Science Learning
  https://mhs.missouri.edu/

Joe Griffin
  Technical Director, Adroit Studios
  Mission HydroSci: A Virtual Environment for Next Generation Science Learning
  https://mhs.missouri.edu/

Justin Sigoloff
  Creative Director, Adroit Studios, University of Missouri
  Mission HydroSci: A Virtual Environment for Next Generation Science Learning
  https://mhs.missouri.edu/
Public Discussion
  • James Laffey

    Lead Presenter
    May 13, 2019 | 10:25 a.m.

    Hi everyone, thanks for coming to our video. We have field tested a fairly complete version of the game, and while we are still examining the results, we have significant findings for student achievement. Teachers have also been excited by their own observations of engagement and learning in their classrooms, and we are very encouraged by the results. However, we have a few areas that need substantial advancement, plenty of polish work to do, and a need to make the game playable on lower-performing computers. We look forward to your comments as we take on these next steps toward what we hope will be another opportunity for teachers and students to use MHS in the fall of 2019.

     
  • Sally Crissman

    Facilitator
    May 13, 2019 | 02:59 p.m.

    Do you hope or expect that using the game will replace other Earth Science curricula? I'm also curious about the nature of the achievement measures. Do you have any evidence that students can apply the argumentation (claims, evidence) strategies in non-game classroom science situations?

    Thinking about different kinds of learners, do you ever encounter students who don't "take to" games? If so, what classroom strategies does a teacher employ? I can see games as a strategy to hook students who are disengaged from more traditional hands-on, textbook approaches, but I wonder about the converse.

    Sally

     

  • James Laffey

    Lead Presenter
    May 13, 2019 | 09:14 p.m.

     Hi Sally,

    Thanks for your interest and questions. The game is about 6 to 8 class periods of play, so we see it as replacing the water systems unit in an Earth Science curriculum. Our evidence for achievement comes from pre- and post-tests taken outside of the game experience, as well as from progress through the game activities and challenges.

    Based on reports from teachers, as well as student feedback on questionnaires, there are a number of students who are not comfortable with or experienced at playing games. Our first unit tries to scaffold the player into the game play experience as well as into the curriculum. We have numerous reports from teachers about how kids who typically don't do well in science become leaders in the class, and also how classes can become very social places, with kids helping kids. I believe that by the end of game play most kids (even if reluctant at the start) are enjoying the game play as part of class.

  • DeLene Hoffner

    Facilitator
    May 14, 2019 | 04:27 a.m.

    Thank you for sharing your video and the wonderful 3D virtual environment in Mission HydroSci. I am intrigued by the teacher dashboard mentioned in the video. What results does it show, and what feedback does it give teachers? I wonder whether it shows any understanding of science concepts or alerts teachers to misconceptions? Thanks!

  • James Laffey

    Lead Presenter
    May 14, 2019 | 09:09 a.m.

    Hi DeLene,

    Thanks for checking out our video and for your question. The dashboard really helps teachers keep track of student progress and see when kids are moving ahead of or falling behind expectations, but other than knowing that they rely on it quite a bit, we don't know much more about the results of using the dashboard. We have done two field tests and have teacher interviews for each, but we have not yet systematically reviewed the second field test's interviews.

    After the first field test the teachers asked for counts on the argumentation activities so they could see how many attempts it took kids to complete each activity. We added that for the second field test but have not yet reviewed the teacher comments.

    Our long-range view is to be able to give teachers more specific commentary about student progress and what to do about it, potentially including ideas for addressing content errors or misconceptions, but we are not there yet. Thanks for the question. Let me know if you want me to follow up any more on my response.

  • Acacia McKenna

    Facilitator
    May 14, 2019 | 11:56 a.m.

    Thank you for sharing your program and its potential use in Earth Science lessons. Do you envision this type of gaming to be a model for other science topics? How do you address unfamiliarity with gaming in your audiences - teachers, students, parents, etc beyond the scaffolding unit? Did you find that you needed more support for some learners and/or learning environments? Are there other lessons learned/outcomes that you have observed within the course of the study that were unexpected?

  • James Laffey

    Lead Presenter
    May 14, 2019 | 10:42 p.m.

     Hi Acacia,

    Thanks for reviewing our video. Your question about our game being a model for other science topics is a good one. We hope that it can be. Many games used in classrooms are meant to be short experiences that lead into more traditional teaching as follow-up, using the game to jump start the experience. That is a great use of games, but we wanted to explore a different role, where the game carries the responsibility for teaching and providing the learning experience. We think game experiences like MHS (which is intended for 6 to 8 class sessions of student work) can provide experiential learning and appropriate feedback while managing student behavior. Of course there is a role for the teacher in supporting the student experience, but not as content provider. We think games like MHS also have great potential for asking students to put knowledge into practice, such as argumentation. So yes, we would love to see MHS-like games across a number of content and practice domains.

    We provide an online course for teachers to orient them to teaching with MHS, and we provide a teacher guide, so they should (hopefully) not feel lost about where the kids are or the kinds of problems the kids may need to resolve. This seems to have worked well, but of course we have only had volunteer teachers.

    For the kids, most seem to adjust well. Games can be very social experiences even though MHS is single-player, with kids helping kids. We have also designed MHS through multiple iterations, in which we have tried to work out the best ways for kids to learn how to play MHS while they are also learning its content.

    I don't think it was unexpected, but the result that most impressed us was the number of teachers who found kids who typically did not do well in, nor wanted to be in, science class becoming leaders and taking pride in their accomplishments. Seeing how moved and excited the teachers were as those things were happening left me really impressed with the amazing people who work as teachers, and more committed (if that were possible :) to getting MHS into the hands of teachers and kids.

    Thanks for your questions.

  • Sally Crissman

    Facilitator
    May 15, 2019 | 12:34 p.m.

    Could you add a few specifics re: your online course? How long is it? Is it interactive, or designed for a solo learner?

    Does the content relate to the specifics of MHS, or does it include more generic pedagogical content? It will be interesting to see how your next group of teachers (not these pioneers, who tend to be a curious and motivated group) respond. Can you track their participation in the course? What have you learned?

    Can you tell that I've spent a lot of my time at TERC designing online courses for teacher preparation and thinking about their affordances and challenges!

    Sally

     

  • James Laffey

    Lead Presenter
    May 15, 2019 | 07:09 p.m.

     Hi Sally,

    The online orientation course for the teachers is about 3 hours of self-paced work through modules in a Canvas environment. There is a discussion board for interaction with me and with the other teachers. The first two modules are fairly general, about self-paced learning and learning with games, and the remaining modules deal with specific aspects of our implementation. We also provide support via a series of emails during game play, dealing with specific topics in the sequence of the kids playing through MHS. We tried to keep it really straightforward, focused on getting ready for day 1, while also trying to share some of our thinking as to why using MHS could be powerful with kids.

    Unfortunately, much of what we deal with in support is technical issues, which are often a combination of software issues still to be worked out in MHS and the limitations of the computer systems in schools.

    While a few of our teachers are game enthusiasts, most are pretty new to playing or teaching with games. I believe most of their concerns are handled by the orientation and support, as well as by seeing the kids take to the course and the kids' openness to helping each other out.

  • Sheila Homburger

    Researcher
    May 16, 2019 | 07:53 p.m.

    I loved your quote from the student who said the game can make you learn without even knowing you're learning. This is how all educational games should be! Too often, the learning is a tedious task that students have to wade through in order to get to the fun game play. I appreciate the challenges in developing a game where the learning is so closely tied to the game play that students are motivated to learn--and it sounds like you've pulled it off. Well done.

    What do your assessment measures look like for measuring students' skill in argumentation? Do you feel like you've found something that can be used at a large scale (hundreds or thousands of students) and reliably measures this skill? I know there are lots of other folks out there who would love to be able to do this : )

  • James Laffey

    Lead Presenter
    May 17, 2019 | 10:00 a.m.

    I think I was lucky to start with some colleagues who were highly motivated to make a fun game, and another colleague who had worked on a prior game and concluded that although they achieved some good results, it was not fun.

    The argumentation assessments are scenario-based but paper-and-pencil-ish, implemented in Qualtrics, so yes, they could be fairly easily scaled, though they still take work to evaluate. One of our GRAs (recently graduated) is working on automated scoring, so that is a possibility for the future.

    Womack, A. J., Sadler, T. D., & Wulff, E. P. (2018, April). Automated scoring of scientific practices through open-ended, scenario-based assessments. Paper presented at the annual conference of the National Association for Research in Science Teaching (NARST), Atlanta, GA.

    Womack, A. J., Sadler, T. D., & Wulff, E. P. (2018, May). Automated scoring of scientific practices using the Next Generation Science learning assessment system. Paper presented at the annual meeting of the American Educational Research Association (AERA), New York, NY.

    Ultimately we would like to do this form of assessment within the game, but while we have many of the mechanisms in the game, making sense of performance requires a lot more investigation than we have been able to undertake.
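    The kind of automated scoring mentioned above could begin as simply as rubric keyword matching. A toy sketch follows; the rubric criteria and keyword lists here are invented for illustration and are not the project's actual system, which (per the papers cited) is far more involved:

```python
# Toy sketch of rubric-based scoring for short scenario responses.
# The criteria and keyword lists are invented for illustration only.

RUBRIC = {
    "claim":     ("flows north", "flows south"),
    "evidence":  ("elevation", "contour"),
    "reasoning": ("downhill", "gravity"),
}

def score_response(text):
    """Return a 0-3 score: one point per rubric criterion matched in the text."""
    lowered = text.lower()
    return sum(
        any(term in lowered for term in terms)
        for terms in RUBRIC.values()
    )
```

    A production scorer would replace keyword matching with a trained model and report per-criterion diagnostics rather than a single total.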

  • Lindsey Tropf

    Founder & CEO
    May 20, 2019 | 02:27 p.m.

    Thanks for sharing the articles -- will be looking at those!  

  • DeLene Hoffner

    Facilitator
    May 17, 2019 | 01:26 a.m.

    From your work on this project, what have you learned the "hard way" that you might want to forewarn others about? We all learn by doing and can improve the next time. What would you do differently next time?

  • Joe Griffin

    Co-Presenter
    May 17, 2019 | 12:56 p.m.

    Hi DeLene,

    I would forewarn others to test early and often, usually before you think you are ready. We had to make a pretty major pivot early on after finding that our initial design wasn't going to be feasible. We went from a very planet-oriented simulation concept to the much more immersive adventure-type game we have now.

    Currently, after completing a field test in schools throughout Missouri, we are finding that performance issues are still our major concern; so my other word of caution would be to focus on optimization from the start, and make sure to test on low-end machines to identify your minimum specifications.

     
  • Sonia Ellis

    Instructional Designer
    May 17, 2019 | 07:59 a.m.

    This looks like a very effective way to engage students - particularly, as you've noted, those learners who may not otherwise be class "leaders." I'm interested in the storyline and characters you've created: what was your process for developing those to be appealing and relevant to middle schoolers? Great work!

  • Justin Sigoloff

    Co-Presenter
    May 17, 2019 | 10:37 a.m.

    Hi Sonia -

    I think my approach to the storyline and characters was to make them accessible to students, but also dramatic enough that there were some stakes involved. For the former, I tried to craft characters that each had a different viewpoint on how science can be used: one NPC wants to conserve the environment while another seeks to profit from it, and neither is presented as the right or wrong choice. I then layered on how they arrived at such conclusions, such as how the aforementioned NPC learned about conservation from her activist father. I then sprinkled in some flaws and attitudes, because it was important for us to create characters that felt "real" to our middle school audience.

     

     
  • James Laffey

    Lead Presenter
    May 17, 2019 | 01:31 p.m.

    Just to add a comment to Justin's response: MHS has a great variety of learning activities, from narrative interaction with other characters and exploration of space to solving puzzles to get through "dungeons" and using our argumentation interface. Similarly, the terrain changes from unit to unit, making it visually interesting and novel. I imagine the kids get surprised quite a bit by what comes next.

     
  • Lindsey Tropf

    Founder & CEO
    May 20, 2019 | 02:33 p.m.

    Came over after James commented on our pretty similar work -- great to see how others are approaching it! 

    I am especially interested in your argumentation/CER framework. We have a mechanic for the same purpose, but of course it looks totally different, so it was great to see how your team approached it. Do you mind sharing how your mechanic works? It seemed like there was some sort of sorting mechanism (with the "training" part), I assume earlier in the process (sorting first is something we've been discussing adding to ours), and then piecing it together later in the orbital view?

  • James Laffey

    Lead Presenter
    May 20, 2019 | 07:09 p.m.

    Hi Lindsey,

    We use Osborne's argumentation learning progression framework as a basis for the argumentation activities in MHS (some references at the end of this note).

    We have a number of mechanisms for argumentation in MHS: the solar system argument construction system, a mini-game for identifying CRE in statements, a number of narrative sections meant to help students critique arguments, and a number of (for lack of a better term) advance organizers for orienting students to argumentation tasks.

    The solar system tasks are carefully constructed sets of CRE choices for students to select from to make an argument. There are too many choices to simply game the task by selecting options without reading (although many have tried :), but we try to limit the options so as to not make the construction task too difficult. Students get feedback as they make choices and submit responses, feedback that is meant to help them think through the argument construction.
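    As a minimal sketch of this kind of choice-based construction task: each option is tagged with its role, the student's selection is checked, and role-level feedback comes back. All option texts, roles, and feedback strings below are invented for illustration; this is not the actual MHS implementation:

```python
# Minimal sketch of a choice-based argument-construction task.
# Options, statements, and feedback strings are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Option:
    text: str
    role: str      # "claim", "evidence", or "reasoning"
    correct: bool  # whether this option belongs in the target argument

OPTIONS = [
    Option("The stream flows north.", "claim", True),
    Option("Elevation decreases toward the north.", "evidence", True),
    Option("Water flows downhill, from higher to lower elevation.", "reasoning", True),
    Option("The stream flows south.", "claim", False),
    Option("The water in the stream looks clear.", "evidence", False),
    Option("Streams are usually long.", "reasoning", False),
]

def check_argument(selected_indices):
    """Check a selection of option indices; return (complete, feedback list)."""
    chosen = [OPTIONS[i] for i in selected_indices]
    feedback = []
    for role in ("claim", "evidence", "reasoning"):
        picks = [o for o in chosen if o.role == role]
        if not picks:
            feedback.append(f"Your argument has no {role} yet.")
        elif all(o.correct for o in picks):
            feedback.append(f"Your {role} fits the argument.")
        else:
            feedback.append(f"Take another look at your {role}.")
    complete = len(chosen) == 3 and all(o.correct for o in chosen)
    return complete, feedback
```

    A real task would draw its options from the unit's content and give richer, choice-specific feedback on each submission, as described above.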

     

     

    Osborne, J. F., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95(4), 627-638.

    Osborne, J., Henderson, B., MacPherson, A., & Szu, E. (2013). Building a learning progression for argument in science. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.

    Osborne, J., Henderson, B., MacPherson, A., Wild, A., & Friend, M. (2014). IRT analysis of items probing a unidimensional learning progression for argumentation of increasingly complex structure. Paper presented at the annual conference of the National Association for Research in Science Teaching, Pittsburgh, PA.

  • Further posting is closed as the showcase has ended.