The Effectiveness of STC Technology Recommendations: A retrospective informed by the 2021 AT Survey
In 2021, the Arts & Sciences Support of Education Through Technology (ASSETT) team conducted its latest Academic Technology survey across the CU Boulder campus. Those results, analyzed by Shane Schwikert (Learning Analytics & Assessment Lead) and Michael Schneider (Educational Data Analyst), give us a sense of how prepared undergraduate students are for the technologies used in the classroom, how perceptions of undergraduate technology use differ between instructors and students, and how the technologies instructors use in the classroom compare to the technologies students deem most helpful.
As the manager of the Student Technology Consultants, a group that recommends technologies to instructors and helps them implement those tools in their classrooms, I was particularly interested in that last element. How well do the technologies we’ve been recommending line up with what students deem helpful in the classroom? While many of our recommendations are informed by the specific projects instructors bring to our team, we’ve been able to move more and more into building instructional guides or suggested-use articles for technology tools of our choosing. If we can tailor those “reverse-engineered” tutorials to the data gathered from the 2021 AT survey, we could avoid much of the guesswork that comes from trying to map the experiences of a handful of students onto the entire student body.
Luckily, Z MacLean (Education & Technology Consultant) and I had been working on building a database of all of the tools we could find that our team had recommended in the past. From there, we were able to tag them with the same nomenclature used to tag the technologies in the AT survey. By overlaying that information with the structure of the Learning Technology Tools scatter plot, I created the following visual.
I first counted the number of times a tool in each tag had been recommended by an STC. Then, I gave each tag coordinates corresponding to its position on the Learning Technology Tools scatter plot. Finally, I scaled each bubble so that it grew with the number of times a tool in that tag had been recommended. “Negative” bubbles, or tags that have not yet been recommended by an STC, were left blank.
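To make those steps concrete, here is a minimal Python sketch of how the bubble data could be assembled. The tag names, coordinates, and recommendation log below are hypothetical placeholders standing in for our actual database, not the real survey or STC data:

```python
from collections import Counter

# Hypothetical recommendation log: one entry per STC recommendation,
# already labeled with its AT survey tag.
recommendations = [
    "Online Polling", "Online Polling", "Screencasting",
    "Data Visualization Tools", "Online Polling",
]

# Step 1: count how many times a tool in each tag was recommended.
counts = Counter(recommendations)

# Step 2: hypothetical (x, y) coordinates matching each tag's position
# on the Learning Technology Tools scatter plot.
coords = {
    "Online Polling": (0.7, 0.3),
    "Screencasting": (0.6, 0.2),
    "Data Visualization Tools": (0.8, 0.4),
}

# Step 3: scale bubble size with the recommendation count; a tag with a
# count of zero would be drawn as an empty ("negative") bubble.
bubbles = [
    {"tag": tag, "x": x, "y": y, "size": 100 * counts.get(tag, 0)}
    for tag, (x, y) in coords.items()
]
```

The `bubbles` list can then be handed to any plotting library (for example, the marker-size argument of a scatter plot) to reproduce the visual described above.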
So how did we do? The data collected by the AT survey was meant to encourage ASSETT to focus on the technologies in Quadrant 4 of the scatter plot: OER, Data Analysis Tools, Data Visualization Tools, Online Practice, Online Polling, Tablets, Online Training, Screencasting, Chat-based office hours, Graphic Design tools, Doc Cameras, Web hosting, and E-portfolios. Of those 13 tags, we have recommended tools in 8, or 61.5% of them. Of the 40 recommendations that fall within survey tags, 20, or 50% of our applicable efforts, are already directed toward these Quadrant 4 tools. While that certainly leaves us room to grow, it also affirms our efforts thus far.
It’s worth pointing out that many of the kinds of technologies we’ve recommended are not included in the tags listed on the AT survey, so we can’t make a reasonable estimate of how “helpful” students perceive those technologies to be. Since that means we couldn’t include them on the STC Tech Recs bubble chart, I instead made a separate bar chart showing the frequency of those recommendations (tools we never recommended were omitted from this chart).
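As a rough sketch, the filtering behind that bar chart might look like the following, with hypothetical tool names and counts standing in for our actual recommendation database:

```python
# Hypothetical recommendation counts for tools that fall outside
# the AT survey tags.
untagged_counts = {"3D Modeling": 4, "Podcasting": 2, "VR Headsets": 0}

# Tools that were never recommended are omitted, mirroring how the
# published bar chart drops zero-count entries; remaining tools are
# sorted by frequency, most-recommended first.
bar_data = sorted(
    ((tool, n) for tool, n in untagged_counts.items() if n > 0),
    key=lambda item: item[1],
    reverse=True,
)
```

Each `(tool, count)` pair in `bar_data` then becomes one bar in the chart.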
Moving forward, I would recommend we continue to address Quadrant 4 technologies without feeling completely bound by them. Some technologies—data analysis tools, for example—may have a skewed usefulness for our students in the College of Arts & Sciences because of how helpful they are in the other colleges included in this campus-wide survey. That isn’t to say data analysis isn’t relevant for A&S students (I’m writing this data-informed article as someone with a graduate theatre degree from A&S, after all!), but the non-numerical visualization tools we already recommend and use more frequently might cover some of those bases.
Additionally, I suspect other campus entities like OIT and the CU Library will already have wonderful tutorials on the more hardware- or research-focused technologies like Doc Cams and Lit Search Tools. Our efforts might be better spent promoting these pre-existing tutorials rather than trying to create our own. We should also continue to earnestly support the Quadrant 1 technologies; they are already frequently used, after all, so helping instructors use them well may have a greater immediate impact.
Finally, our bar graph shows that we’re supporting many technologies that haven’t made it into a campus-wide assessment. This trend will likely continue as technology grows and shifts at a pace we can’t quite keep up with. Continuing to try new, wonderful tools, with a renewed focus on gathering TIP feedback from instructors and students, can keep our efforts informed as we delve into the unknown.