All too often, transitioning to computer-based testing is perceived as a daunting challenge, particularly in schools where the technology is mobile and students and teachers are more familiar with traditional paper testing. But it doesn't have to be, as Jason Agress, Network Media Specialist for Newton Public Schools, explained.
During Wednesday's JAMF Nation User Conference (JNUC) session, Agress shared his learning experience from Newton's journey to technical readiness when preparing to administer Common Core State Standards exams online.
Agress distilled his journey into four main lessons learned:
- Get hands-on with the testing platform and user experience
- Test extensively and solicit user feedback
- Leverage Casper’s reporting tools
- Support and monitor the implementation
He discussed the technical details of preparing, deploying, and supporting the technology, and how getting those details right lets students and teachers focus on the test itself, not the technology behind it. Agress had groups of students try to “break” the testing environment so he could fix problems before they surfaced in a live exam.
“Test, test and test some more.” Agress couldn’t stress to the audience enough how important this was. The team tested and re-tested until they knew exactly what the user experience would be.
Naturally, it was important for Agress to know which machines were ready for testing. To pull a list of machines that still needed remediation, he used policies, smart groups, saved searches, and extension attributes.
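To illustrate the general pattern, here is a minimal sketch of the kind of readiness check that could back a Jamf extension attribute and feed a smart group. The specific criteria (a minimum OS version and the presence of a secure testing browser) are assumptions for illustration, not Newton's actual requirements:

```python
# Hypothetical readiness check in the style of a Jamf extension attribute.
# The criteria below (minimum OS version, secure-browser presence) are
# illustrative only; a real deployment would check its own requirements.

def is_test_ready(os_version: str, has_secure_browser: bool,
                  min_version: tuple = (10, 9)) -> bool:
    """Return True if this Mac meets the (assumed) testing requirements."""
    major_minor = tuple(int(p) for p in os_version.split(".")[:2])
    return major_minor >= min_version and has_secure_browser

def extension_attribute_result(os_version: str, has_secure_browser: bool) -> str:
    """An extension attribute reports its value inside <result> tags,
    which a smart group can then match on (e.g. 'Testing Ready is No')."""
    ready = is_test_ready(os_version, has_secure_browser)
    return "<result>{}</result>".format("Yes" if ready else "No")
```

A smart group scoped to machines reporting "No" gives exactly the remediation list described above.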
When a machine needed to be updated immediately before testing, his team had Self Service policies ready. These were so easy to run that in many cases teachers began triggering them in the classroom themselves, without the IT department’s intervention.
One of the last gems he shared was using extension attributes for reporting in the JSS without pulling full JSS inventory reports. Scripts built on the API added barely any load to the JSS, and he credited JAMF for helping implement these time-savers, which cut operational drag and made reporting on the critical data points far more efficient.
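The approach can be sketched as below: rather than a full inventory pull, request only the extension-attribute subset for a computer via the JSS Classic API and parse out the one value you care about. The hostname, credentials, endpoint casing, and attribute name here are placeholders, not Newton's actual setup:

```python
# A minimal sketch of targeted API reporting against the JSS Classic API.
# JSS_URL, the credentials, and the "Testing Ready" attribute name are
# placeholders for illustration.
import base64
import urllib.request
import xml.etree.ElementTree as ET

JSS_URL = "https://jss.example.org:8443"  # placeholder hostname

def subset_url(computer_id: int) -> str:
    # Requesting only the extension-attribute subset keeps the payload,
    # and the load on the JSS, far smaller than a full inventory report.
    return (f"{JSS_URL}/JSSResource/computers/id/{computer_id}"
            "/subset/extension_attributes")

def build_request(url: str, user: str, password: str) -> urllib.request.Request:
    """Prepare (but do not send) a Basic-auth GET for the Classic API."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url, headers={"Authorization": "Basic " + token, "Accept": "text/xml"})

def parse_attribute(xml_text: str, name: str) -> str:
    """Pull one extension attribute's value out of an API response."""
    root = ET.fromstring(xml_text)
    for ea in root.iter("extension_attribute"):
        if ea.findtext("name") == name:
            return ea.findtext("value", default="")
    return ""
```

Looping such a call over computer IDs yields a report on just the critical data points, which is the light-touch pattern the session described.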
The Q&A covered topics ranging from Java updating at inconvenient times to how much time Agress spent with Pearson’s support. Circle back to this blog in a few weeks to watch the full session video.