Presenters: Jason Jones, David McConnell
Project: Recalibrating Student Learning in the Geosciences
Website: http://www.classforlearning.com
Institution: North Carolina State University
Public Discussion


  • Cristina Heffernan

    Researcher
    May 13, 2019 | 09:17 a.m.

    This looks interesting.  Where does the content come from? 

  • Jason Jones

    Lead Presenter
    May 13, 2019 | 12:48 p.m.

    In general, the content is created by the instructors using the tool; any instructor who adopts the system can generate whatever questions they wish.

    We are, however, using the data gained from testing the tool in a relatively large (~100-student) introductory physical geology course to craft a suite of data-validated assessment questions that will eventually be made available to any interested geology instructors who use the tool. In effect, if a geology instructor wanted to sign up for the system, we could immediately provide them with a suite of quizzes based on learning objectives they select. We have plans for similar endeavors in other STEM fields as well (e.g., chemistry, biology), but we started with the geosciences because that is our "home" discipline and those are the courses we teach :)

  • Stephen Alkins

    Facilitator
    Diversity, Equity, and Inclusion Officer
    May 13, 2019 | 11:17 a.m.

    Cool video!

    This seems like it would be a great tool for both students and teachers. I definitely see this as a professional development tool to help teachers assess what content they perhaps should focus on or review more. For students, do you find that explaining the data measurements to them deters them from using it? Is this voluntary or mandatory? It seems the students may need to be trained on how to understand what CLASS is measuring.

    Also, what happens following the assessment? What tools are in place to support students, to ensure that the content gets learned and that they gain confidence and proficiency?

  • Jason Jones

    Lead Presenter
    May 13, 2019 | 01:10 p.m.

    Hi Stephen, thanks for your comment! I will address each point in sequence:

    - The PD angle is a great suggestion and one that I definitely could see utility in pursuing! Thanks for that!

    - Most students (from interviews I've conducted) buy into the system and enjoy using it because it's their own data that we are providing, not just generic feedback. They all want to "get a good grade," and many seem to appreciate not only the feedback the system provides but also the opportunity to see a model of our assessment style prior to course exams (as we have generated the quizzes and use the same learning objectives course-wide).

    - In developing/testing the tool on our own physical geology students, we have tried a few different arrangements for integrating it into the course. At first it was completely optional, and there was near-ubiquitous buy-in (~90% of students taking quizzes), but students were taking the quizzes immediately prior to the exam (a strategy not supported by learning research). So now we ask them to distribute their attempts by having weekly quizzes that count for a grade (unlimited attempts, highest score counts).

    - We provide some general training on the data and measurements that CLASS provides at the beginning of the semester, and we come back to students' trends as the semester progresses.

    - As far as what happens after the assessment: they use the quizzes to prepare for course exams, and we consider their trends and results to communicate strategies for where to go for more help (e.g., book pages x, y, z). In addition, we can augment in-class instruction on low-performing or high-bias (disparity between confidence and performance) concepts as a result of data gained from students' quiz results (a minimal sketch of this bias computation follows below).
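    A minimal sketch (hypothetical, not the actual CLASS implementation) of how the bias metric described above could be computed from quiz responses; the function name, record layout, and example objectives are invented for illustration:

    ```python
    # Per-objective calibration bias: mean confidence minus mean performance.
    # Positive values indicate overconfidence on that learning objective.
    from collections import defaultdict

    def objective_bias(responses):
        """responses: iterable of (objective, confidence, correct) tuples,
        with confidence on a 0-1 scale and correct as 0 or 1."""
        totals = defaultdict(lambda: [0.0, 0.0, 0])  # [confidence, score, n]
        for objective, confidence, correct in responses:
            t = totals[objective]
            t[0] += confidence
            t[1] += correct
            t[2] += 1
        return {obj: (c / n) - (s / n) for obj, (c, s, n) in totals.items()}

    responses = [
        ("plate tectonics", 0.9, 1),  # confident and correct
        ("plate tectonics", 0.8, 0),  # confident but wrong -> overconfidence
        ("mineral ID", 0.4, 1),       # unsure but correct -> underconfidence
    ]
    print(objective_bias(responses))  # plate tectonics ≈ +0.35, mineral ID = -0.6
    ```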

  • Bess Caplan

    Ecology Education Program Leader
    May 13, 2019 | 12:51 p.m.

    What an interesting tool.  Can you share any results you might have showing whether or not students who use this tool actually do better on exams? Have you piloted this tool in high school classrooms?

  • Jason Jones

    Lead Presenter
    May 13, 2019 | 01:22 p.m.

    Hi Bess,

    We have piloted the tool in introductory (undergraduate) physical geology courses here at NCSU and saw that (for example) students performed better and were more accurate in predicting their exam grades than in prior semesters, but we have not done any testing in high school classrooms as yet. We are, however, also testing the tool in similar courses at two 2-year colleges (one in NC and one in RI) to investigate comparative effects on students at different types of institutions. As the demographics and academic histories of the students at public research institutions and two-year colleges are generally very different, we are seeking to determine the best way to use the tool to support students of all backgrounds and levels of academic experience. 

  • Patricia Marsteller

    Facilitator
    Assoc Dean and Professor of Practice
    May 13, 2019 | 04:47 p.m.

    Interesting tool.  In your classes do you require participation?  How would you introduce this tool to new instructors? In a different content area?

    Can you use it to ask for reflections from students on how they might modify study habits?

    In your experience do students close the gap more as they use the tool more?

  • Jason Jones

    Lead Presenter
    May 14, 2019 | 02:01 p.m.

    Hi Patricia, thanks for your comment! I will address each point in sequence:

    - Regarding participation, Stephen asked a similar question above, so I will share my response to him below:

    "In developing/testing the tool on our own physical geology students, we have had a few different layouts of how it was integrated into the course. First it was completely optional and there was near ubiquitous buy-in (~90% of students taking quizzes), but they were taking them immediately prior to the exam (a strategy not supported by learning research). So now we ask them to distribute their attempts by having weekly quizzes that are for a grade (unlimited attempts, highest score counts)."

    - We have put a lot of time into generating a suite of quizzes and questions for introductory physical geology, but the tool itself can be used for anything that a) someone wants to learn/teach and b) can be assessed via questions, so it will definitely accommodate other content areas. The instructor would, however, need defined learning objectives to assess in order to use the tool effectively. Following the principles of backwards design (Wiggins & McTighe, 1998), these objectives should define the assessments, the instruction, and the skills the student should have as a result of completing the course activities. So to introduce the tool to new instructors, I would suggest a short primer on backwards design and on the nature of student confidence judgments and their accuracy (called "calibration") prior to generating quizzes and trying them out on their students. Towards this end, we will be seeking to partner with colleagues in other STEM disciplines (e.g., chemistry, biology) to generate curated suites of quizzes (like the ones we currently have for geology) that would be ready to use for instructors who may not want to generate their own quizzes from scratch.

    - We ask for reflections on their CLASS feedback and related study habits as part of a weekly assignment in our course management system, so not in the system itself, but CLASS can provide specific feedback and data for them to track across the semester, provided you prompt them to be mindful of its results. For example, we can ask students to identify their highest-bias learning objective at the time of the assignment (the one with the largest disparity between confidence and performance) and to come up with a plan to remedy that disparity. In addition, we can view that same metric for the course as a whole (from the instructor side) and suggest new reference material or augment instruction to help address the issue.

    - We have evidence that students do close the gap more as a result of using the tool (particularly in being able to predict their eventual exam score), but we have more analysis to complete in the near future in hopes of further elucidating the details of the relationship and how it may vary across student variables (e.g., demographics, academic history, institution).

  • A Daniel Johnson

    Higher Ed Faculty
    May 13, 2019 | 09:02 p.m.

    Wearing my practical hat, how difficult is it to use CLASS on the instructors' side to build out quizzes?

     One of our challenges is to get students to actually take supplemental instruction seriously. Do you have any data to say that students are gaining more than they would by just engaging in routine practice on their own?

  • Jason Jones

    Lead Presenter
    May 14, 2019 | 01:42 p.m.

    Hi Daniel,

    - For a brand new instructor in a brand new content area, it would work like any other quizzing system in that quizzes would need to be built by entering questions one at a time, albeit tying them to specific, defined learning objectives. Once this process is completed, however, instructors would be able to generate new quizzes from existing objectives (e.g., a new quiz with objective A from quiz X and objective B from quiz Y), share them with other instructors, or make their quizzes public for others to take as students or use as instructors (a hypothetical sketch of this quiz/objective structure follows this reply). In the future, we will be seeking to partner with colleagues in other STEM disciplines (e.g., chemistry, biology) to generate curated suites of quizzes (like the ones we currently have for geology) to make this "build out," as you call it, more seamless.

    - We have quantitative data supporting increased exam scores and increased accuracy in students' judgments (we have mounds of data and more analysis to come, so we will know more of the nuance in the near future). In addition, we have qualitative data (student interviews) supporting the notion that they may be taking CLASS more seriously than other supplemental instruction efforts because they see utility in the feedback we are providing ("After taking some quizzes, I saw I needed to really study concept A and not concept B") and because they do not see the quiz-taking process as "extra work" (anecdotally a common student view of supplemental instruction in general). A lot of students seek out practice quizzes on the internet as part of their study habits anyway; we are just providing them with ones that are not only generated by their instructors, but also aligned to course objectives and able to provide information about their learning beyond simple performance feedback. Finally, when we ask students to rank the components of our course that they felt helped them learn the most (e.g., the book, quizzes, lectures, homework assignments, videos), they have ranked the quizzes statistically significantly higher each semester they have been on CLASS (mode = #1) than when the same quizzes were administered via our course management system. We are, however, always trying to increase the student utility of the tool and hope to develop best practices for its use in college STEM classrooms.
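    A hypothetical sketch of the quiz-building flow from the first point above; the class and field names are invented for illustration and are not CLASS's actual schema:

    ```python
    # Questions are entered one at a time and tied to a learning objective;
    # new quizzes can then be composed from objectives in existing quizzes.
    from dataclasses import dataclass, field

    @dataclass
    class Question:
        text: str
        objective: str  # e.g., "LO-A: relative dating of rock layers"

    @dataclass
    class Quiz:
        title: str
        questions: list[Question] = field(default_factory=list)

    def compose_quiz(title, sources, wanted_objectives):
        """Build a new quiz from the questions in existing quizzes whose
        learning objectives are in wanted_objectives (e.g., objective A
        from quiz X plus objective B from quiz Y)."""
        picked = [q for quiz in sources for q in quiz.questions
                  if q.objective in wanted_objectives]
        return Quiz(title, picked)
    ```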

  • A Daniel Johnson

    Higher Ed Faculty
    May 16, 2019 | 04:30 p.m.

    Thanks for the detailed response. I'm looking forward to incorporating what you describe into our own new course sequence.

    D

  • Frank Davis

    Researcher
    May 14, 2019 | 08:48 p.m.

    Hi Jason,

    I think this is a very creative approach to helping students develop what some see as a metacognitive skill: reflecting on one's own learning. Because of some work I am doing around the idea of learning progressions, I was wondering whether, in a series of these calibration studies over the course of a semester, you could see how various concepts/ideas in physical geology are linked developmentally: whether a weakness or lack of confidence at a particular point in the course predicts further problems later on.

  • Jason Jones

    Lead Presenter
    May 15, 2019 | 12:19 p.m.

    Hi Frank,

    That is a great idea! Thanks for the suggestion! We are considering similar lines of inquiry via quantitative analysis (e.g., multiple regression models to investigate predictive relationships between student variables such as demographics and CLASS use, and exam outcomes). The idea, however, of developing a path model that could isolate some "turning points" in course content, if you will, that lead to variable achievement for students (or some subset of students) would be a great path forward (*pun intended*).
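    An illustrative sketch of the kind of multiple regression model mentioned above; the file, column names, and predictors are invented for the example and do not reflect the project's actual analysis:

    ```python
    # Predicting exam outcomes from (hypothetical) student and CLASS-use
    # variables with an ordinary least squares regression.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("class_usage.csv")  # hypothetical per-student export

    model = smf.ols("exam_score ~ quiz_attempts + mean_bias + prior_gpa",
                    data=df).fit()
    print(model.summary())  # which variables, if any, predict exam scores
    ```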

  • Alex Rudolph

    Facilitator
    Professor
    May 15, 2019 | 10:08 a.m.

    This is a very cool idea! I do have a few questions:

    Is it freely available to any instructor at any university? How is the data stored? Could you run into database management issues if this becomes very popular? Or do you offer the system to other universities to run locally? Also, are there any training tools for instructors, or is the system intuitive enough to use without it? Finally, what are your plans for sustainability (maintenance) and dissemination?

  • Jason Jones

    Lead Presenter
    May 16, 2019 | 03:06 p.m.

     Hi Alex, thanks for your comment! I will address each point in sequence:

    - Yes, with some (temporary) caveats. Anyone can freely sign up for the system as a student and attempt public quizzes, but we ask potential instructors who want to adopt the tool to provide some information regarding their goals and courses before we contact them about potential use. We are always developing new features to increase ease of use for both students and instructors, but we are still at a point where we want to have a dialogue with all potential instructors before and during use, so that their experiences can further improve the system and its utility for users.

    - The data is stored in our AWS cloud database. We monitor use and investigate bandwidth draw during peak usage, and we currently have more than adequate room for growth without worry of overloading our capacity, but you are definitely right: this is something we will have to monitor and be prepared to manage into the future (which is also another reason for the scaled instructor rollout described above). The cloud server model is nice for this as well, as we can scale up capacity as needed.

    - Tied to the scaled instructor rollout, we are looking to collaborate with colleagues both in geology and in other disciplines to develop suites of curated quizzes that would be ready for a potential instructor immediately upon sign-up. This would limit the need for much of the initial training related to quiz construction, and the tool has been designed to be intuitive (though of course I am biased). We are, however, planning to develop short training videos (similar to the overall introduction above) that address many of the tasks required to use the system effectively.

    - In addition to seeking further development support from NSF to expand the system to more use cases (e.g., mobile devices, a classroom response system) and to develop more curated resources in other disciplines, we are committed to making the resource available to all at the end of the (current) project and to promoting its use by reaching out directly to teachers/faculty at professional conferences and to the general public via online marketing campaigns.

  • Jonathan Lewis

    Higher Ed Faculty
    May 20, 2019 | 07:38 a.m.

    Hi Jason

    Nice project! This addresses a substantial challenge in the geosciences. Is the calibration gap recognized in other STEM disciplines? I'll be on the lookout for opportunities to use CLASS and I'll spread the word to my colleagues.

    I have one thought about one of your frameworks for using online quizzes. I've noticed that when online quizzes are available for unlimited attempts and the high score is retained, it seems (!) to contribute to the calibration gap. My impression is that when the stakes are low, students are less sincere in their efforts than when they have a finite number of opportunities (e.g., 2 or 3).

  • Jason Jones

    Lead Presenter
    May 20, 2019 | 04:38 p.m.

    Hi Jonathan,

    - Absolutely. The phenomenon has been recognized in pretty much every situation where people are asked to ascertain their ability to do a task: skilled people generally underestimate their (high) performance only slightly, while unskilled individuals generally vastly overestimate their low ability (a small illustration of this pattern follows this reply). The topic has been extensively studied in educational psychology; however, less research has been done in STEM specifically. There are studies that have investigated introductory chemistry (Hawker et al., 2016) and biology (Stanton et al., 2015) courses, but there is definitely much left to investigate regarding the gap and how best to help students reduce it during the process of STEM learning.

    - We are actively researching this change in requirement (i.e., optional vs. required vs. graded) and will certainly have more specific results to report in the future! That is a great point, however, and something we hope to have data speak to for instructors trying to determine best practices for situating formative assessment in their courses.
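    A small, self-contained illustration of the calibration-gap pattern described in the first point above; the numbers are fabricated for the example, not study data:

    ```python
    # Group (simulated) students by actual-score quartile and compare the
    # average gap between predicted and actual exam scores per quartile.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    actual = rng.uniform(40, 100, 200)  # simulated exam scores
    # Fabricated pattern: weaker students overpredict more than stronger ones.
    predicted = actual + (85 - actual) * 0.5 + rng.normal(0, 5, 200)

    df = pd.DataFrame({"actual": actual, "bias": predicted - actual})
    df["quartile"] = pd.qcut(df["actual"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
    # Q1 (lowest scorers) shows large positive bias; Q4 is near zero or negative.
    print(df.groupby("quartile", observed=True)["bias"].mean())
    ```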

     