Susan Sunbury
Project Manager
Professional Development Models and Outcomes for Science Teachers (PDMOST)
Center for Astrophysics | Harvard & Smithsonian

Cynthia Crockett
Science Education Specialist, Research Associate
Professional Development Models and Outcomes for Science Teachers (PDMOST)
Center for Astrophysics | Harvard & Smithsonian

Jacqueline Doyle
Professional Development Models and Outcomes for Science Teachers (PDMOST)
Center for Astrophysics | Harvard & Smithsonian
Facilitators’ Choice
Public Discussion
  • Susan Sunbury

    Lead Presenter
    May 12, 2019 | 10:24 p.m.

    Welcome to PDMOST – Professional Development Models and Outcomes for Science Teachers

     We are in the final stretch of the project. In addition to disseminating our research findings, we are currently developing a website to offer resources for PD providers and science educators (available this fall through our MOSART website https://www.cfa.harvard.edu/smgphp/mosart/). The site contains a repository of online assessments in all grade bands and in all science subject areas to test educators’ knowledge of both science content and student misconceptions. If you are a PD provider, we’d like to know how you assess educators in your programs.

  • Stephanie Palumbo

    K-12 Teacher
    May 16, 2019 | 07:44 a.m.

    Professional development is so important to keep teachers' practices relevant. If you're looking for an interesting international PD model, watch our video

    https://videohall.com/p/1468 :) 

  • Susan Sunbury

    Lead Presenter
    May 16, 2019 | 10:21 a.m.

    Stephanie,

    I totally agree. I enjoyed watching your video. Teaching-abroad experience would definitely have been an interesting variable to include in our model.

  • Nicole Wong

    Researcher
    May 13, 2019 | 04:00 p.m.

    Hi Susan, Cynthia, and Jacqueline,

    Thank you for this important and interesting work. In the past, I have used the MOSART student tests, or portions of them, successfully with teachers, and I am so excited to hear about your recent study.

    In relation to your finding about time spent on foundational concepts, have you gotten any sense of how much is "enough"? In several studies, including a large-scale RCT, my colleagues and I have found that 5-day Making Sense of SCIENCE courses designed to support teacher content knowledge and PCK significantly improved teacher content knowledge, PCK, observed classroom practices, and student content knowledge (Heller et al., 2012; Little et al., 2019). However, in our recent study of PD scaling efforts, we are finding that, unsurprisingly, administrators and science leaders want or need shorter courses. Many PL providers either opt for courses that are available in a shorter 2-day or 3-day format, or they shorten the 5-day courses to fit their needs.

    I wonder whether your team has gained any additional insight into how much contact time is needed to improve teacher knowledge and what qualities/features of that time are most supportive of teacher learning. I ask these questions, of course, understanding that there are many factors at play. I would also love to hear more about the other PD factors you have found to be linked with improved teacher knowledge. I'm looking forward to future discussions. Thank you!

     
  • Jacqueline Doyle

    Co-Presenter
    May 13, 2019 | 05:48 p.m.

    We didn't find anything like a "break point" where increasing amounts of time help up to that point but yield diminishing returns afterward. One drawback to our measurement is that we have "frequency of learning foundational concepts" and "total duration of the PD", but not "total duration of learning foundational concepts", which would be better for answering your question. I expect that there is a level of "enough", but our measurement design was less sensitive to it.

     

    As for other qualities or factors, it seems that a focused PD does better than one which is trying to do many things at once; if you're going to train teachers by teaching them foundational concepts, do that and only that, and save the other lessons for when you're not trying to increase their knowledge. Also helpful: having interested and engaged attendees. Teachers who expressed more personal interest in the PD and its topics seemed to learn more than those who didn't. Hopefully fostering excitement and engagement gives you and your colleagues a potential path forward to maintain effectiveness if the administrators and science leaders insist on cutting your programs shorter.

  • Cynthia Crockett

    Co-Presenter
    May 13, 2019 | 05:06 p.m.

    Hi Nicole, thanks so much for your question. We did look a bit at duration of PD and did not find an effect. That said, I am going to hand this to Jackie, who did the analysis and can speak to it better. It is interesting that administrators want to compress the PD. We found an effect with more content as a contributor to teacher knowledge, but we did not see anything for duration, and we had programs ranging from one day to all summer. I will ask Jackie to elaborate on this. Thanks!

  • Daniel Capps

    Facilitator
    May 13, 2019 | 10:50 p.m.

    Hi all,

    I enjoyed watching your video, and it got me thinking about other ways to assess and think about teacher knowledge and learning beyond pre- and posttests, to get at what teachers might need or want from PD experiences. Your video made me start thinking about the literature on adult learning and what motivates people to want to learn. Starting from misconceptions is one way, but I wonder if you’ve ever begun PD from self-identified needs and/or interests. Do you have any thoughts on this?

  • Cynthia Crockett

    Co-Presenter
    May 14, 2019 | 09:21 a.m.

    Good morning, Daniel,

    Thank you for your comment. We looked at programs that took place in the summer only, and we aimed to measure teachers' subject matter knowledge as well as their knowledge of students' misconceptions in science; hence the content-related pre/post approach to assessment. We did not specifically look at participants' needs/interests for being at the PD; however, we did ask questions about their program, including why they were attending. The largest effect (although still small) was for people who found it 'interesting or personally rewarding', which was associated with a small gain in SMK. Participants could also mark that they attended because it was an area 'they needed to brush up on' (so, needs).

    We have proposed using the assessment tool in Methods classes with preservice teachers to assess subject matter weaknesses and identify areas of 'need' BEFORE they get into the classroom. That is something we still hope to do at some point. 

  • Molly Stuhlsatz

    Facilitator
    May 14, 2019 | 10:38 a.m.

    Thanks for the well done video! Along the same lines as Daniel's question, I was wondering if you collected any other information from the teachers in the study, such as self-efficacy, motivation, or even information about whether they were required by their district to attend PD?

    Did you use the same assessment for all PD offerings, or did you use an equating technique across multiple assessments?

     

  • Jacqueline Doyle

    Co-Presenter
    May 14, 2019 | 01:35 p.m.

    We used the same questions regarding teacher demographics/background and PD characteristics for all PD offerings; Cynthia describes several of the types of questions below. The assessments for SMK and KOSM were tailored to each grade band and subject, so all high school physics PDs got the same SMK/KOSM assessment, but that assessment was different than the one given to, e.g., middle school physical science and high school life science. 

    Because the assessments were not calibrated against each other prior to data collection, we included among our control variables the grade band and subject of the assessment, in case a particular subject/grade was easier to learn than others. The differences in knowledge were all taken in a pairwise fashion, so if one assessment were harder than another but the subject were just as easy to improve in, both pre- and post-scores would be lower by a similar amount due to difficulty and it wouldn't affect the final results. When using pre-scores of SMK and KOSM in the regression, the distributions of each grade band + subject were normalized separately to put them on a similar scale and account for missing difficulty calibration.
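    The per-group normalization step described above can be sketched in a few lines. This is an illustrative reconstruction, not the project's actual code, and the field names (`grade_band`, `subject`, `pre_score`) are assumptions:

    ```python
    # Sketch: z-score each pre-score within its own (grade band, subject) group,
    # so scores from differently calibrated assessments land on a similar scale.
    from statistics import mean, stdev
    from collections import defaultdict

    def normalize_by_group(records):
        """Add a 'pre_score_z' field: the pre_score standardized within its group."""
        # Collect scores per (grade_band, subject) group
        groups = defaultdict(list)
        for r in records:
            groups[(r["grade_band"], r["subject"])].append(r["pre_score"])
        # Per-group mean and sample standard deviation
        stats = {g: (mean(v), stdev(v)) for g, v in groups.items()}
        out = []
        for r in records:
            m, s = stats[(r["grade_band"], r["subject"])]
            z = (r["pre_score"] - m) / s if s else 0.0
            out.append({**r, "pre_score_z": z})
        return out

    # Hypothetical data: two assessments with different raw difficulty levels
    records = [
        {"grade_band": "9-12", "subject": "physics", "pre_score": 10},
        {"grade_band": "9-12", "subject": "physics", "pre_score": 14},
        {"grade_band": "5-8", "subject": "life science", "pre_score": 20},
        {"grade_band": "5-8", "subject": "life science", "pre_score": 24},
    ]
    normed = normalize_by_group(records)
    ```

    After normalization, a teacher one standard deviation above the mean on the physics assessment and one a standard deviation above the mean on the life science assessment get comparable values, regardless of the raw difficulty of each test.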

     
  • Molly Stuhlsatz

    Facilitator
    May 14, 2019 | 06:51 p.m.

    Thanks for the clarification!

  • Cynthia Crockett

    Co-Presenter
    May 14, 2019 | 10:54 a.m.

    Hi Molly,

    Thank you for your question. We did collect other information from the teacher PD participants as to why they were attending the program, such as: it 'addressed a science subject they needed to brush up on', 'provided opportunity to learn new/innovative methods of teaching science', 'looked fun, challenging, or personally rewarding', 'provided PD credits and/or a stipend', or 'was required by district, school, licensing, etc.'. Those who attended a program because it 'provided an opportunity to learn new or innovative methods of teaching science' or looked 'fun, rewarding, personally challenging' saw gains in both subject matter knowledge and knowledge of students' misconceptions. Being required to attend by one's district doesn't seem to be helpful.

    We also collected demographic information on participants' teaching experience, certifications, in/out of field teaching, etc.

    We did use the same assessment with identical questions for all participants in all participating PD offerings. Jackie can speak to the methodology of the analysis that was used.

    Thank you for your comments!

     
  • Cynthia Crockett

    Co-Presenter
    May 14, 2019 | 01:52 p.m.

    Thank you for clarifying that! I was thinking of the questions "about the program" outside of the content/grade level. My apologies for skipping mentioning the grade band and content area questions. Our grade bands were K-4, 5-8, and 9-12. Content areas are as follows: K-4 and 5-8:  Physical Science, Life Science, Earth Science, Astronomy; 9-12: Life Science, Astronomy, Chemistry, Physics.

     
  • Courtney Arthur

    Facilitator
    May 14, 2019 | 04:17 p.m.

    This is a really well done video! I, too, have found in my own work that administrators are often trying to shorten a pre-set PD from a week down to three days, or two days compressed into one. I am curious about the factors that either helped or hindered an extended PD session. Was there any data collected around teacher preference for the format and duration of PD (e.g., after school, weekends, summer institutes)?

  • Cynthia Crockett

    Co-Presenter
    May 14, 2019 | 04:55 p.m.

    Hi Courtney,

    Thank you for your comments! We did not necessarily look at extended PD offerings per se, but compared programs that took place only in the summer. If a program had an extended yearlong component, we did not look at that. However, we were able to look at duration as a factor because the many participating programs ran from one day, to three weeks, to all summer (on an independent-work basis). Despite other reports of longer duration being better, we did not find this to be true: "for most of the activities, time spent did not have a distinguishable effect". It is also notable that "program duration showed no significant association with either [subject matter knowledge] or [knowledge of students' misconceptions] gains", which is what we were measuring. A factor of extended yearlong PD seems to be the collaboration and implementation of the PD piece among colleagues, which would be interesting to try to measure.

    We did not collect data around teachers' preference for duration of PD, merely what was the duration of the PD they were attending that summer.

  • May 14, 2019 | 11:47 p.m.

    Have you thought about integrating coding into the 9-12 cohort?

    The STEMcoding project does online training for high school physics teachers. Do you have any advice for us on best practices for teacher PD?

  • Cynthia Crockett

    Co-Presenter
    May 15, 2019 | 09:09 a.m.

    Good morning Chris,

    Thank you for your question. We have not looked at integrating coding into the 9-12 PD groups. We use the MOSART suite of assessments, which are geared toward the science content standards, both from the NRC NSES and, more recently, the NGSS. We have incorporated only the science standards (DCIs) of the NGSS into the 9-12 Life Science, Chemistry, and Physics assessments; we did not include any Crosscutting Concepts or Science and Engineering Practices.

    In our findings, a key piece of improving subject matter knowledge (in the sciences) and knowledge of students' misconceptions was time spent on content and on learning foundational concepts. In addition, the more engaging the PD, the more participants seemed to learn. Also, "participants [in our study] reporting they attended the program because it ‘provided an opportunity to learn new or innovative methods of teaching science’ or ‘looked fun, challenging, or personally rewarding’ tended to have higher gains in both SMK and KOSM." These seem to be the most beneficial directions when planning and putting on a PD offering. I hope that helps. For more, please see the paper with our findings; the link is given in the video.

    I will look at your STEMcoding project, it sounds interesting! 

  • Further posting is closed as the showcase has ended.