Working Group: Curriculum Documents and Assessment
Chairs: Phil Daro, Malcolm Swan
The following papers provide the background for the work of this group. Please bear in mind that some of these papers are informal or represent work in progress. To enable the working group sessions to focus on discussing these and other issues in relation to the conference themes, we suggest that delegates familiarize themselves with the papers before the session.
Designing Assessment of Performance in Mathematics
Hugh Burkhardt and Malcolm Swan – Mathematics Assessment Resource Service/Shell Centre
The effective implementation of intended curricula that emphasise problem-solving processes requires high-stakes tests that will recognise and reward these aspects of performance across a range of contexts and content. In this paper we discuss the challenge of designing such tests, a set of principles for doing so well, and strategies and tactics for turning those principles into tasks and tests that will work well in practice. While the context is England, the issues raised have wider relevance.
Common Failures in Strategic Design
Hugh Burkhardt – Mathematics Assessment Resource Service/Shell Centre
These two examples, one from assessment design and one from the design of curriculum documents, are from my forthcoming paper for “On Strategic Design”.
Mapping children’s mathematical development in the early years
Brian Doig – Deakin University, Australia
The tasks described here, and those still under development, are in a highly structured format, but the look and feel of the tasks, from the children’s perspective, is that of games. These games cover a range of aspects of mathematics, including number, chance, measurement, and mathematical structure. All games use simple equipment. An over-arching feature of these games is that play should begin with concrete materials and then move on to being played mentally. To date, several children have been interviewed, revealing issues with the games. Two examples of games are detailed, and interested parties are invited to comment or to participate in further development.
Supporting Targeted Connections: A Call for Cross-Curricular Design
Dr. Cheryl Malm, Northwest Missouri State University;
Dr. Patricia Lucido, Rockhurst University
Current efforts by the National Governors Association’s Common Core Standards Initiative seek to articulate expectations for what students should know and be able to do, grade level by grade level, within a content area (NGS, 2009). There is no indication, however, that any attempt is being made to correlate topic placement between content areas. Close examination of mathematics and science concepts to identify supporting ideas, processes, and skills would allow the design of parallel curricula that take advantage of “targeted connections” arising naturally within the study of a unit. Such parallel programs expand the definitions of integrated curricula and correlated lessons to include the idea of correlated conceptual explorations (Berlin & White, 1994; Offer & Vasquez-Mireles, 2009). The supporting connections that exist between mathematics and science would be made explicit through the process of aligning the sequence of topics for each grade level across the content areas. This paper will look at the differences between integrated curricula and correlated lessons and the challenges each faces. Targeted connections and parallel curricula will be defined, and an example will be given of the type of connections that could be exploited to enhance student exploration and understanding of each content area.
Computers in Mathematics Assessment (shared with Software group)
Daniel Pead – Mathematics Assessment Resource Service/Shell Centre team, University of Nottingham, UK
This paper details recent research and development undertaken at the Mathematics Assessment Resource Service, University of Nottingham, and focuses on three different computer-based assessment projects: the development of problem-solving tests for the World Class Arena project, an evaluation of a new digital version of an existing paper assessment, and a small-scale design research project looking at issues that might arise from computerising an established high-stakes assessment. The computer is, ultimately, a delivery medium and not tied to any pedagogical theory: these case studies show that solutions can be found to support – and hopefully enhance – very different assessment cultures. They also highlight many technical, practical and organisational issues, and show how these could, in some cases, unintentionally subvert the educational aspirations of a project.
SMART Assessment for Learning
Kaye Stacey, Beth Price, Vicki Steinle, Helen Chick, Eugene Gvozdenko – University of Melbourne, Australia
“Specific Mathematics Assessments that Reveal Thinking,” which we abbreviate to “smart tests,” provide teachers with a quick and easy way to conduct assessment for learning. Using the internet, students in Years 7, 8, and 9 undertake a short test that is focused strongly on a topic selected by their teacher. Students’ stages of development are diagnosed and sent to the teacher immediately. Where available, on-line teaching resources are linked to each diagnosis to guide teachers in moving students to the next stage. Many smart tests are now being trialled in schools, and their impact on students’ and teachers’ learning is being evaluated. Design issues are discussed.
Designing Questions to Probe Relational or Structural Thinking in Arithmetic
Max Stephens – University of Melbourne, Australia
How do we probe more deeply into connections between structural thinking in arithmetic, on the one hand, and mathematical structure, on the other, to learn more about shifts from particular to structural understandings? The importance of structural understandings in these contexts is that they offer students a source of control which allows them to move beyond the particular situation. In designing a research instrument we need to capture the extent to which this control is open to growth – that is, it is open to increasing levels of generality.