
Tracking Portfolios, Badges, and IPAs

Published by Laura Sexton

The first three-ring circus of grades is done for this semester. Three rings, because this was my first attempt at incorporating Integrated Performance Assessments as tests instead of portfolios, and I’m trying to make use of badges in a meaningful way now too.

As you can imagine, though, managing all three rings has been a little tricky. Between remembering to publish badges on ForAllRubrics so kids could actually see them and sorting out who earned what and what they need to do about it, quite frankly, I’m beat.

But it’s a good beat. I feel like something worthwhile is happening, is getting done.

It’s also a learning curve, and I’ve had to ask myself some serious questions.


1. Is switching up what makes an A each six weeks really necessary?

I decided that the sliding proficiency/grading scale I worked out here really only worked for IPAs. The portfolios only account for 20% of the grade as “quizzes,” whereas IPAs represent 65% as “tests.” So I’m satisfied with the integrity of the standards-based/modes-based grade this way.
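In rough numbers, the weighting boils down to something like the little sketch below. (The remaining 15% isn’t broken down here, so it’s lumped together as “other” purely for illustration; the function name and example scores are placeholders too.)

```python
# Sketch of the grade weighting: 20% portfolios ("quizzes"), 65% IPAs ("tests").
# The leftover 15% is lumped into "other" just for illustration.
WEIGHTS = {"portfolios": 0.20, "ipas": 0.65, "other": 0.15}

def weighted_grade(portfolio_avg, ipa_avg, other_avg):
    """Combine category averages (0-100) into one final grade."""
    return (WEIGHTS["portfolios"] * portfolio_avg
            + WEIGHTS["ipas"] * ipa_avg
            + WEIGHTS["other"] * other_avg)

# Strong IPAs carry most of the weight, but a weaker portfolio still shows.
print(round(weighted_grade(portfolio_avg=80, ipa_avg=96, other_avg=90), 1))  # ~91.9
```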

Color-code system and record keeping for differentiated portfolio goals

Besides, the “quizzes” need to be what guides students’ practice as they prepare for spontaneous performance on the IPAs. So if a kid’s stuck at a Novice Mid level for listening, by golly, she needs to get a 10/10 on Novice Mid listening before she has to stress about demonstrating Novice High listening. If both portfolio and IPA say she’s not there yet, I think it’s pretty safe to say she needs more work, so she should keep revising that portfolio section without penalty.

Mind you, we’ll still practice Novice High skills in class, but when she’s setting personal goals for her homework for the grading period or working on submitting a portfolio section for class, she needs to focus on her personal needs!

The Catch: This means I have to maintain a spreadsheet of portfolio progress (which I have done before) that allows me to track each individual kid’s level for each communication mode and what they need to work on next. Not so bad, but I could not do this if I had 90 kids at a time.
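If that spreadsheet were a tiny script instead, it would boil down to something like this; the student names, the level ladder, and the columns here are placeholders, not necessarily what the actual sheet looks like.

```python
# A placeholder version of the tracking spreadsheet: current level per mode
# for each kid, plus the next level to shoot for. Names and data are made up.
LEVELS = ["Novice Low", "Novice Mid", "Novice High", "Intermediate Low"]

tracker = {
    "Student A": {"listening": "Novice Mid", "reading": "Novice High"},
    "Student B": {"listening": "Novice High", "reading": "Novice Mid"},
}

def next_target(current_level):
    """Once the current level is solid (10/10), this is the next goal."""
    i = LEVELS.index(current_level)
    return LEVELS[i + 1] if i + 1 < len(LEVELS) else current_level

for student, modes in tracker.items():
    for mode, level in modes.items():
        print(f"{student}: {mode} at {level}, next goal {next_target(level)}")
```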

2. How do I get kids to reflect on their current proficiency levels to set appropriate goals for the next grading period?

I mean, if you have a badge, you gotta do something with it, right? So on our snow day last Tuesday, I sent my own kiddos to watch cartoons in the other room and recorded my first screencast on Screencastify.

That right there is probably my tenth take, though. Remember: learning curve.

Still, this way kids have a solid visual for what they are done with and what they need to work on next. If they go to their badge page, they can quickly see which level they should be working on for each skill.

The Catch: If I did not go through and click publish on each and every individual badge, they ain’t gonna see nothin’. I had to make myself a student account to really get a feel for it from their end, then log out and log back in a bunch (I didn’t feel like downloading another browser). Furthermore, I had to make sure I labeled all of my evaluations of them under Activities on ForAllRubrics as “Teacher Evaluation” so I could quickly sort by activity to find the “official” version (since I’d had them experiment with pledging and peer evaluation and such).

PS: The badge rundown page I see as a teacher shows that a student earned a badge even if it was other students who issued it, which was slightly confusing. Also, I couldn’t make pledges they sent me go away until I scored them on their pledge rubrics, even if I’d scored the same badge rubric separately.

3. Shouldn’t a kid who clearly shows in two different IPAs that he is capable of solid Novice Mid level work get to move on to Novice High?

IPAs are perfect justifications for a bump! 10 out of 10 both times on that section of the IPA? BADGE! If the kid can do it spontaneously two times, I feel safe nudging an 8 or 9 to a 10. It’s not quite as easy to justify if their portfolio is abysmal, because then they wouldn’t have enough evidence to make their case should employers or college placement folk be lurking about.
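As a rule of thumb, the bump logic comes down to something like this; the function name and the exact cutoff for “close enough to nudge” are just shorthand for illustration, not a hard policy.

```python
# Rough sketch of the badge bump: two spontaneous IPA performances at (or
# very near) 10/10 earn the badge, as long as the portfolio backs it up.
# The cutoff of 8 for "close enough to nudge to a 10" is just an illustration.
def earns_level_badge(ipa_scores, portfolio_has_evidence):
    """ipa_scores: this section's scores (out of 10) from the two IPAs."""
    if len(ipa_scores) < 2 or not portfolio_has_evidence:
        return False
    return all(score >= 8 for score in ipa_scores)

print(earns_level_badge([10, 10], portfolio_has_evidence=True))  # True: automatic badge
print(earns_level_badge([9, 8], portfolio_has_evidence=True))    # True: nudged up to a 10
print(earns_level_badge([10, 9], portfolio_has_evidence=False))  # False: not enough evidence to back it up
```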

Also, if I factor IPAs into the badge distribution, I can see who really is struggling with the skill and who is perhaps just slacking a bit.

Green means go: it’s a quick way for me to see who is ready to move on, so I can leave the portfolio numbers be and just color-code the score when the IPA says they’re ready. It’s also worth making note of whose IPAs indicated they were struggling, so red means I need to stop and pull them aside. (I also made it a point to include the next-step suggestions from the AAPPL rubrics in report card comments for those in the red!)
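In spreadsheet terms, the color coding is roughly this; the numeric cutoffs are placeholders for illustration, not fixed rules.

```python
# Rough version of the color coding: green = the IPA says they're ready to
# move on, red = the IPA says they're struggling and need to be pulled aside.
# The cutoffs (8 and 5) are placeholders, not official thresholds.
def flag_color(ipa_score):
    if ipa_score >= 8:
        return "green"   # ready to move on; leave the portfolio numbers be
    if ipa_score <= 5:
        return "red"     # stop and pull aside; add AAPPL next-step comments
    return "none"        # keep practicing at the current level

for score in (10, 7, 4):
    print(score, "->", flag_color(score))
```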

The Catch: It’s hard to remember to go through and 1) issue each of these bonus badges and 2) PUBLISH each of the bonus badges so they can add them to their portfolio. Plus there’s the spreadsheet to be color coded and updated.