DEVELOPMENT AND EMPIRICAL RECOVERY FOR A LEARNING PROGRESSION-BASED ASSESSMENT OF THE FUNCTION CONCEPT

  • Edith Graf, Research Scientist, Educational Testing Service (ETS)
  • Frank Davis, Evaluation & Research Consulting, Frank E. Davis Consulting
  • Chad Milner, National Director of Tech & Media, Young People's Project
  • Maisha Moses, Executive Director, Young People's Project
  • Sarah Ohls, Sr. Research Project Manager, Educational Testing Service (ETS)
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Edith Graf

    Lead Presenter
    Research Scientist
    May 12, 2019 | 06:58 p.m.

    Thank you for visiting! We welcome your comments and questions about our work, and would also like to hear about work you are doing! Here are a few questions to start the discussion:

    1) How can student voice be incorporated into the design of LPs/LTs and/or assessment? We've used focus groups and cognitive interviews, but we'd be interested to hear about other approaches.

    2) What are your thoughts about assessment and instruction with respect to the concept of function?

    3) What are your thoughts about how teacher voice might be incorporated into this work?

    4) What are your thoughts about psychometric approaches for validating learning progressions?

     

  • Gillian Puttick

    Facilitator
    Senior Scientist
    May 13, 2019 | 11:35 a.m.

    I believe that engaging students in embodied learning is a valuable method for incorporating student voice, though in a slightly different way than I think you're asking about!  I would be interested in hearing more about how your current findings from focus groups and cognitive interviews have informed design to date.

  • Chad Milner

    Co-Presenter
    National Director of Tech & Media
    May 13, 2019 | 12:31 p.m.


    Hi Gillian, thanks for your question. One area we observed was the intersection of the language used in the questions and the related math content. When vocabulary was new to students, or used in a new context, it often became a barrier to assessing students' understanding of the underlying mathematics (i.e., their learning progression level). Hence, some of the feedback from students during the cognitive interview process suggested simplifying or modifying the way that questions were asked; some of these modifications involved specific mathematical terminology, while others involved the more general wording of questions.


    Mathematical language/terminology
    In some of the geometry tasks, many students expressed confusion about the statement "reflection about a line or a point" and informed us that it would make more sense to them to say "reflection across a line or around a point". We discussed this with our team, including the research mathematicians, and concurred that the wording could be swapped out without compromising the mathematical integrity of the item.


    Another example along the same lines was phrasing items as "functions are [something]" versus "functions representing [something]"; students preferred the latter. (screenshot)


    More general grammatical language choice in items
    Many students expressed a preference for more concise language in some situations (e.g., using simple verb tenses and not including excessive information), while in other items they preferred more elaborate or precise language (e.g., not dropping optional prepositions).


    For example, in the geometry items, the directive for students to "plot points" was often used in questions, and (surprisingly) created confusion for a number of students. The term was interpreted as an (unfamiliar) noun phrase, so students read "plot points" as objects, as in "the plot points", instead of as the verb + noun directive "Plot (the) points". Students suggested that an alternative verb such as "mark" would work better: "Mark the points". Often, this confusion was independent of an individual student's learning progression level on that item. (screenshot2)


    This feedback will be incorporated into the final design of the computer tasks.


  • Gillian Puttick

    Facilitator
    Senior Scientist
    May 17, 2019 | 09:28 a.m.

    Thanks Chad. I like the way you describe the intersection of vocabulary and content, and I think of the process you describe as good research around the design of instruments, regardless of the community in which the instruments are to be used. I'm wondering about including student voice in a more expansive sense: taking culture or everyday experience into account, for example, by carefully choosing the contexts in which to pose math problems. I think Frank addressed this in his later comment. I look forward to hearing about how your work continues!

  • Edith Graf

    Lead Presenter
    Research Scientist
    May 13, 2019 | 01:06 p.m.

    Thank you for the comment. As Chad suggests, the focus groups and the cognitive interviews provided the team with information about how students were interpreting both the mathematics and the language in the tasks. In the focus groups, students provided the team with direct feedback about the relatability of the contexts and the clarity of the language. In the cognitive interviews, the students verbalized while working on the tasks, so the interviewers inferred how the tasks were being interpreted through the verbalizations and student work, as well as through follow-up questions. These observations were discussed by a cross-disciplinary team, including mathematicians, learning scientists, and staff from the Young People’s Project. This resulted in some interesting discussion. For example, though “rotate about a point” is standard mathematical language, the team concurred that in this case there was no reason that “rotate around a point” could not be used instead.

    We would be interested in ideas about how embodied learning might be applied to this work.

  • Frank Davis

    Co-Presenter
    Evaluation & Research Consulting
    May 13, 2019 | 01:33 p.m.

    Hi Gillian,

    Another element of “Student Voice” that is reflected in the issues of language Chad and Edith discussed is the broader context of experiences that students may or may not have, and that assessment items may assume: both the more regimented language that mathematicians and math educators have developed as part of their practice, and sometimes assumed experiences, such as a camping trip or a car or subway trip, that an assessment task may be built around. The student focus groups were asked whether such experiences would be familiar or unfamiliar to students like themselves.

    As evaluator of this project, I had pointed out that the names of imaginary students used in tasks might also denote different social contexts, depending on both old and new naming traditions, for example James compared to Jamal, or Mary versus Maisha. But even this point is more complex. I asked some of the older young people in the project whether they thought this was a problem, and some said they had become used to seeing certain names in textbooks that didn’t fully represent their experiences.

    The explicit recognition of these types of issues in the project by incorporating “Student Voice” has been important in helping the project to keep in sight its goal to produce an LP and assessments that are valid for a diverse group of students.

  • Matt Fisher

    Facilitator
    Professor
    May 14, 2019 | 10:13 p.m.

    Based on your experience to this point with incorporating "Student Voice" in the context of your project, are there any recommendations you would offer to others who would want to create similar resources that could work with a diverse group of students?

  • Sarah Ohls

    Co-Presenter
    Sr. Research Project Manager
    May 15, 2019 | 11:54 a.m.

    Hi Matt,

    One recommendation I would offer is that it's very much worth the time and effort involved to collect observations and feedback from the students you are creating the materials for. Our project team is made up of a variety of researchers and practitioners, some with extensive experience working with students from diverse backgrounds. All of these people were involved in developing and reviewing materials, and they identified and resolved many potential issues before those issues ever made it to students. But even so, the actual student interactions surfaced unanticipated issues, in some cases ones that significantly interfered with students' ability to fully display their understanding. The one-on-one interactions with students gave insights into the reasoning behind many of these instances that we wouldn't have been able to get by looking at their work alone.

    A second observation is that, particularly in a field like mathematics, there can be a delicate balance between maintaining mathematical precision and using language that is familiar and accessible to all students. Students often expressed greater comfort with colloquial phrasing, and in many cases, on reviewing the items, we were able to make those substitutions. In other cases, a preferred phrasing would have compromised the mathematical precision of the item, and the team was left looking for other ways to resolve the complexity that this created. Mathematicians tend to prefer the most mathematically precise terminology in all cases, but for some students that may interfere with their ability to display the conceptual understanding they have, if the development of their mathematical vocabulary is less advanced.

  • Edith Graf

    Lead Presenter
    Research Scientist
    May 15, 2019 | 12:38 p.m.

    Hi Matt,

    Thank you for the question. Here are some recommendations, which I think could be applied to learning progression-based task design, but also to many other resources (I think colleagues in educational software or game design may like to speak to how they have used these techniques!)

    Use multiple methods to collect input. We used both cognitive interviews and focus groups. I think these two methods provided different, but complementary information. For example, during the cognitive interviews the students worked on the tasks and verbalized their thought processes, which provided rich information about solution strategies (how they were approaching the mathematics). During the focus groups, students provided feedback directly about what they found clear or unclear.

    Provide a set of guiding questions, but also offer the opportunity for students to comment on issues you may not have considered. The guiding questions can take the form of a questionnaire, or they can be asked more informally. They are useful for eliciting information about particular aspects of the resource you may have questions about. In a focus group, it may be better to ask guiding questions at the beginning, to frame the discussion, and then ask at the end, “Is there anything else you would like to tell us?” In a cognitive interview, it may be better to ask retrospective questions after the student has completed a task, so you do not interrupt the thought process.

    Convene a decision-making team to address the input you have gathered. As Sarah pointed out in her post, there is a delicate balance between using technically precise mathematics vocabulary and assessing conceptual understanding. To make matters more complicated, many mathematical terms have multiple mathematical and colloquial meanings (the word “range” is an example). Discussions among team members about how to address such issues are key.

    Consider gathering student input at multiple time points in the design process. For example, your team might conduct an early brainstorming session before design of the resource begins (I would still recommend providing a set of guiding questions), gather feedback once a first draft or prototype has been prepared, and gather it again after revisions have been made (did the revisions incorporated by the decision-making team address the issue?).

  • Peg Cagle

    Facilitator
    math teacher & math department chair
    May 16, 2019 | 04:02 a.m.

    I have been familiar with, though never directly involved with, the work of the Algebra Project and the Young People's Project for many years. It is exciting to see this partnership with ETS, which implies that some of the many lessons learned by Dr. Moses and his team will gain a larger audience and have a positive impact on greater numbers of students. I am curious how the materials currently being developed will be disseminated, and how you see them interfacing with the more traditional curricular materials being used in most schools across the country.

  • Edith Graf

    Lead Presenter
    Research Scientist
    May 16, 2019 | 06:23 p.m.

    Hi Peg,

    Thank you for the question. Although I am posting the response, it reflects the collective thoughts of our project team. Our current work is a research project and so we intend to disseminate research findings with respect to the learning progression (LP) and the tasks through conferences and research articles.

    The provisional LP includes three content strands: 1) the “traditional” strand, which assumes that functions from real numbers to real numbers are the main focus; 2) the “finite to finite” strand, which involves mappings between finite sets; and 3) the “geometry” strand, which focuses on geometric transformations as functions. The latter two strands are aligned with experiential curriculum modules developed by the Algebra Project, although the ideas could be found in more traditional materials. The “traditional” strand, which grew out of work at ETS, is engaged by Algebra Project and non-Algebra Project students. It is what most educators would think of as the content of algebra courses.
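
    To make the strands a bit more concrete, here is a minimal sketch with one generic example of a function for each strand; these examples are illustrative only, chosen for this discussion, and are not items from the assessment:

    \documentclass{article}
    \usepackage{amsmath,amssymb}
    \begin{document}
    % Illustrative examples only; not items from the assessment.
    \begin{align*}
      &\text{``Traditional'' strand:}      && f : \mathbb{R} \to \mathbb{R}, \quad f(x) = 2x + 3 \\
      &\text{``Finite to finite'' strand:} && g : \{1,2,3\} \to \{a,b\}, \quad g(1)=a,\ g(2)=b,\ g(3)=a \\
      &\text{``Geometry'' strand:}         && r : \mathbb{R}^2 \to \mathbb{R}^2, \quad r(x,y) = (-x,\,y) \text{ (reflection across the $y$-axis)}
    \end{align*}
    \end{document}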

    The goal is to provide a research-based LP that is applicable to different mathematical contexts in which functions appear, and that has been validated across the diversity of students in our schools.

    The tasks we are developing are being used provisionally in professional development efforts. At the end of the project, our hope is that the LP and the tasks could be used in further research efforts around the application of the materials to professional development and formative assessment. In our view, validating the interpretation and use of an LP entails not only answering questions about whether the cognitive model of learning is a reasonable one, but also about whether use of the LP in the classroom leads to instruction that effectively guides student learning. It is this question that we are interested in investigating next, and it is our expectation that answering it will involve the combined efforts of researchers and experts in professional development.

    We are interested in collaborating with others who might want to use these materials in a research context, so please contact us if you have interest!

    In addition to the materials, the team is interested in extending some of the techniques (cognitive interviews by near peers in particular) to inform teaching and learning. Also, our team comprises organizations that participate in an emerging national “We the People – Math Literacy for All” Alliance, https://iris.siue.edu/math-literacy/, which was initiated through an NSF INCLUDES DDLP award (#1649342). Classrooms, schools, and districts participating in the Alliance would be one potential context for this future research and professional development work.

  • Leanne Ketterlin-Geller

    Higher Ed Faculty
    May 16, 2019 | 06:40 p.m.

    I enjoyed your video and appreciate the work you are doing to enhance instruction and classroom assessment procedures through the integration of learning progressions. I am particularly intrigued by your consideration of the relatability of tasks. The video mentioned that you examined their relatability through cognitive interviews. I'm curious about the criteria you used to create relatable tasks, and then how you evaluated the students' responses to determine if the tasks were indeed relatable, and to whom.

     
  • Edith Graf

    Lead Presenter
    Research Scientist
    May 16, 2019 | 07:16 p.m.

    Hi Leanne! Thank you for the question. The question about relatability was asked directly, during focus groups: 

    "Is the situation described in this problem one that you, personally, can relate to?  If not, explain why not."

    It mostly applied to tasks that were contextualized in a real-world setting (we also developed tasks that were not).

    The question was posed to near-peer mentors (staff of the Young People's Project).

    I think another way we could approach this would be to have initial brainstorming sessions where we provide students with a list of activities with potential mathematical applications, and ask them to indicate which are familiar to their experience.

  • Further posting is closed as the showcase has ended.