What I got was a rushed melee of evidence and analysis I couldn't hear or see half the time.
The problems... Man, I love me some Adobe Spark, but it was NOT the super-solution I had anticipated. There were a few problems with the platform:
- Adobe Spark apparently doesn't work at all on MacBooks, the device of choice for a handful of students in each class.
- If they were using desktops, say during lab time, they were blocked by the school firewall.
- Screencastify records videos as .WEBM files. Spark only accepts .MP4 and .MOV files, and file conversion took forever if it worked at all (I have no idea what they were doing that made it fail, though).
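For what it's worth, a command-line tool like ffmpeg can handle that .webm-to-.mp4 conversion in bulk; this is just a hypothetical sketch (it assumes ffmpeg is installed and the recordings are sitting in one folder), not what we actually did in class:

```shell
# Hypothetical batch conversion of Screencastify .webm recordings
# to the .mp4 format Spark accepts (assumes ffmpeg is installed).
for f in *.webm; do
  # ${f%.webm} strips the .webm suffix, so "clip.webm" becomes "clip.mp4"
  ffmpeg -i "$f" -c:v libx264 -c:a aac "${f%.webm}.mp4"
done
```

Of course, that only helps if students can run it, which on locked-down school machines is its own adventure.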
My kiddos did figure out how to get around the 30-second snippet "suggestions" for videos, but that made me wish all the more for a fast-forward option when I ended up with 7-10 minute videos for speaking and listening. Had I but required that they download the files to submit on Classroom, I could have used Google's x2 function! But silly me, I thought allowing them to just submit the link would be the most humane thing for them.
Also, they almost never checked to make sure I could actually HEAR their speaking/listening samples over the music. And of course if they had issues uploading, well, I was just expected to pause the video, open a new window, and type out the Google Drive link they had added into the URL bar (sometimes graciously shortened) so I could bask in their abilities.
Note: I did no such typing and just took off points.
As for the samples themselves, I thought it was a pretty simple matter to take a screencast of a Voicethread from last year or a Vib from the beginning of this year. Then they could just go back and add more detail and do another!
Note: they did not agree on the matter of the simplicity of screencasting.
I will confess, though, that updating a speaking sample was probably actually pretty nearly impossible--unless they happened to own the Spark video and could just rerecord their own slides and republish. Otherwise they had to have a WHOLE new conversation. Heaven forfend.
So all in all, it was a melange of technical difficulties, instructional ignoring, and time constraints that got us, well, wherever we ended up.
I don't think it had to be that way though.
Solution #1: Start at the beginning
I think this all could have gone a lot more smoothly if this had been my plan at the beginning of the semester. However, I had pipe dreams of empowering portfolios and student choice in their own assessment.
Note to self: they still don't know what they don't know in Spanish 2. "Make your own objectives" was also a disaster. Maybe it was how I graded, or maybe it was giving too much choice all at once, or probably some combination of the two plus just a really weird year overall.
BUT if I had, say, had them add one sample at the beginning of the year, revise that sample, and reflect on the level that one sample showed, wash, rinse, repeat each six weeks, we might have been in better shape.
If we had done that, they would have had a video from the very beginning that they just kept adding to, and thus--hopefully--revising for legibility and audibility's sake at the very least. Closing the feedback loop, as Dra. Tharrington says.
But even before all that, I think I'll have them review some of the awesomer examples I collected this year and answer questions like:
- How much time do you need to actually read the written samples?
- How much time do you need to tell how well they can speak or hear?
- How long are you willing to watch and/or listen to one sample?
I'm hoping this will make them see that I really don't need more than 2 minutes to tell how well they can read/write/hear/speak, but I do need to actually have enough time to, you know, process what they're showing me. (One kid thought the 30-second limit was for the whole video, not each slide...that took some pausing and rewinding, let me tell you.)
Solution #2: Substitute slides
I still like the idea of having students describe their own abilities in terms of functions, text types, and strategies. I still like the idea of them stringing samples together in ever-improving sequences.
I also like to be able to control what I'm looking at.
Yeah, it sounded all well and good to have something future employers could just hit play on. But frankly I needed to backtrack sometimes, and there was a lot of annoying guessing as to where that thing I missed might be in the timeline. Google Slides, however, would allow me to progress at my own pace, moving on when I'm ready and finding the previous sample for simpler comparison without endlessly scooting the cursor back and forth.
There would also be more flexibility for me to compare the actual skills/levels students say they're demonstrating with the actual samples they provide me.
Also, you can embed .webm files straight from Google Drive.
Solution #3: RUBRICS
Well, just see the list of problems above.
So I want to spell out exactly what I need from my little grasshoppers, what it is a portfolio has to have to be worthwhile, and you know, if there's a point value, so be it.
I really liked the single-point rubrics my amigo pointed out after Dr. November went all scorched-earth on rubrics as destroyers of creativity at a district professional development session. So I offer these "single points" to go in between the "concerns" and "advanced" columns:
- Professionalism - Analysis and examples are clear and attractively organized
- Self-evaluation - Accurate description of skill level emphasizing function, text type, and strategies along with ACTFL performance descriptors
- Growth - Revised samples appropriately address errors and demonstrate increased proficiency
- Reflection - Thoughtful assessment of improvements and strategies for continued growth