
Assessment for Next Generation Learning

A new article from EdSurge describes an MIT effort to design assessments for next generation learning. “Playful assessment” captures curiosity, creativity, and critical thinking within the natural context of student learning activities. “It emphasizes recognizing and reflecting on what works and what doesn’t, and in response, identifying skills to improve on moving forward.”

While such habits of mind are recognized as essential for today’s learners and are frequently embedded in curriculum and lesson design, they are also difficult to assess systematically and accurately. Instruments such as the Mission Skills Assessment and the SSAT Character Skills Snapshot have emerged in recent years but remain disconnected from classroom curricula. Effective teacher assessment is needed both to measure and to deepen lasting next generation learning for students.


Uses of Technology to Enhance Formative Assessment and Differentiated Instruction

Academic Technology Director Jeff Tillinghast and I have co-authored an article for Curriculum In Context, the journal of the Washington State Association for Supervision and Curriculum Development, an ASCD affiliate. We wrote a practitioner’s view of how our teachers use contemporary computing technologies to provide specific, rapid, and varied feedback to students and then adjust individual students’ instruction accordingly. Read the article (PDF) or access the full issue. Many thanks to Seattle Pacific University professor David Denton for inviting us to contribute to the journal.


Quantitative study of school programs

On reviewing last winter’s issue of Independent School Magazine, I was struck by stories of schools conducting rigorous studies of their own practice, particularly quantitative studies. Granted, the issue theme was “Assessing What We Value,” but turning the lens of assessment inward onto school practice represented a significant additional step in my mind.

In the article “The Role of Noncognitive Assessment in Admissions,” the author described several schools that are collecting new kinds of information about students: traits that might help predict school success. One school (Choate Rosemary Hall) found statistically significant correlations between GPA and students’ self-reported self-efficacy, locus of control, and intrinsic motivation.
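For readers curious what such an analysis involves, here is a minimal Python sketch of a correlation test of this kind. The numbers are invented for illustration; they are not Choate’s data, and the survey scale is an assumption.

```python
# Hypothetical example: does a self-reported trait track with GPA?
# The data below is invented; a real study would use a validated instrument.
from scipy.stats import pearsonr

self_efficacy = [3.2, 4.1, 2.8, 4.5, 3.9, 2.5, 4.8, 3.4]  # 1-5 survey scale
gpa           = [3.0, 3.6, 2.7, 3.9, 3.5, 2.9, 4.0, 3.1]  # year-end GPA

r, p = pearsonr(self_efficacy, gpa)
print(f"r = {r:.2f}, p = {p:.4f}")  # "significant" conventionally means p < 0.05
```

A real study would of course need a much larger sample and attention to confounds; the point is only that the basic computation now fits in a few lines.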

Winners of 2013 E. E. Ford grant awards included Castilleja School, funded to support the development of “meaningful and valid assessments of experiential learning, to apply these tools to improve the effectiveness of innovative experiential programs, and to share these best practices with other educators.” The effort is supported by $1 million, three-quarters of it raised by the school.

I am following a similar path here at U Prep. Whether the question is the predictive power of standardized assessments or the meeting agendas of our instructional leadership team, I find myself quantifying behavioral data, seeking patterns, and sharing the findings with colleagues. Is this just coincidence?

While I have not rigorously studied and confirmed the possible existence of a trend toward quantitative program analysis (irony intended), it seems to me that several contributing factors might exist. Quantitative data is more easily collected, processed and shared than before. The setup of a Google Form is trivial, compared to the “old days” (actually just 10 years ago) when we used to write online forms in Perl on our school web server. Data visualization has grown as a field, to the point where major news corporations prominently feature beautiful, illustrative graphic representations of data, and programming libraries make the process easier. Publication and presentation tools easily incorporate such graphics. Use of data to support conclusions has remained a respectable practice, notwithstanding occasional misuse.
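To make the contrast with hand-coded Perl forms concrete, here is a minimal sketch of how little code the modern workflow requires. The file name and column name are assumptions for illustration, standing in for a CSV export of form responses.

```python
# Minimal sketch: chart survey responses exported as CSV (e.g., from a Google Form).
# "survey_responses.csv" and the "rating" column are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.read_csv("survey_responses.csv")
responses["rating"].value_counts().sort_index().plot(kind="bar")
plt.xlabel("Rating (1-5)")
plt.ylabel("Number of responses")
plt.title("Survey results")
plt.tight_layout()
plt.savefig("survey_results.png")
```

The point is the collapsed barrier: collection, processing, and charting that once required a hand-written CGI script now fit in a dozen lines.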

In years past, schools would rarely conduct quantitative study of their own work without substantial external help or an internal reassignment. This lent a measure of respectability to the work, as one would expect valid work from a consultant or internal member of the faculty or staff. Now, with people like me studying school practice within the scope of our full-time jobs, the risk exists that we will reach conclusions that are not well supported by the data or not well compared against results from other institutions. We have to be careful, as well as thorough.