
One Test To Measure It All

Taylor Abbey | tabbey19@dtechhs.org | February 1, 2019

What are the needs of these users?

As January intersession began to wind down, many students scrambled to put together their coveted photo essays–each tailored to display a student's unique d.lab experience. However, amidst the hustle and bustle of stashing low-quality photos into a Google slide deck, a far greater threat to our d.lab grades was thrown into the mix: the design thinking test.

In typical d.tech fashion, details about the test were issued sparingly upon its announcement. With only a couple of days leading up to the exam, students knew just two things: one–it was a written exam, and two–we had 30 minutes to take it.

Director of Learning Nicole Cerra was the driving force behind administering the design thinking test, an idea that soon garnered support from School Director Melissa Mizel, who assisted Cerra in spearheading the initiative. After the test's first launch on January 18th, 2019, discussion and uproar sparked among students over what it means to accurately measure one's design thinking abilities.

Initial student perception of the test proved to be polarizing across all grade levels. Jasper Bull, an 11th grader in the VR d.lab this past intersession, noted how perplexed he felt upon hearing about the concept. "The idea of a design thinking test is absurd," Bull recalled. "Like, how do you test a mindset?"

Sophomore Mari Managadze, by contrast, felt largely neutral about taking the test. "I didn't really care," Managadze noted. "However, there were people around me that were saying, you know, 'This is crazy, this is insane–we're taking a test about design thinking?'"

Among the senior class, however, the consensus was that the test felt conventional. Senior Maya Pratt-Bauman initially viewed a written design thinking test as bearing too much resemblance to a traditional high school assessment, and questioned d.tech's shift toward more traditional means of evaluation. "I personally really liked the POL presentations because it gave me a chance to reflect on everything," Pratt-Bauman said. "So I thought, 'Really? A written test?'"

Despite the whirlwind of confusion surrounding the test itself, it's not the first of its kind. The idea of quantifying and measuring design thinking comes from Adam Royalty. In addition to his time at the Stanford d.school, Royalty is a Columbia professor who administers a similar test to his Introduction to Design Thinking class.

Cerra, along with Mizel, consulted Royalty’s methods and thus pioneered the testing initiative, viewing the test as an opportunity to see if students could transfer their design thinking skills over into new scenarios they hadn’t encountered before.

When discussing the advantages of the test as opposed to POLs, Mizel commented, “POLs focus more on growth as a designer. The test is more like you’re looking at these different tasks and they’re asking you to focus on different aspects of [the design thinking process].”

By isolating these skills in a test format, Mizel notes how the test can serve as a teaching mechanism for students who may not have gotten adequate design experience via their d.lab classes. “It’s like we’re not necessarily aligned around, ‘You make sure that you teach students how to do [design thinking]’,” Mizel stated, “So [the test] might have been, for some people, the first time in the [school] year they were practicing this.”

As students sluggishly emerged from their @d.tech classes on the last day of intersession, having been subjected to the much-anticipated assessment, opinions erupted about all facets of the testing experience–from the questions and rubric to the 30-minute time limit.

The test itself came equipped with three sections in total–with the number of sections you had to complete dependent on your d.lab level (first-year design lab students only had to complete one of the three sections, second-year students two sections, and so on).

The overarching objective was for students to carefully examine an image or written scenario and develop as many solutions as possible to the perceived needs of the people within it. Drawing from real-world instances, the test featured an image of a coffee shop for which students had to identify potential needs, as well as a prompt to design simulations that would make SpaceX participants' experiences in space more bearable while confined to a tiny capsule.

However, students found that the strict time limit, as well as a particularly low-quality image depicting the coffee shop scenario in the first section, greatly influenced their testing experience as a whole–leaving many hung up on the first section at the hands of a faulty print job.

“Taking the test was sort of a bad experience because it was like, ‘Look at this picture and empathize with it,’” Bull commented. “But I was just looking at a black and gray square and I could barely make out the details at all.”

Some were also quick to point out the redundancy of the expectations listed in certain scenarios. In order to achieve a grade of Proficient or higher, quantity of ideas took precedence over quality. In the third scenario, Managadze found herself repeating her statements–a tactic many students relied on to score higher than a Developing. “It was a little strange trying to design something for the one where you had to create different scenarios for what it would feel like inside a space capsule,” Managadze continued. “I felt like I had to repeat my [answers] constantly.”

Pratt-Bauman voiced similar concerns. While she felt the assessment was adequate in asking open-ended questions and having students explain their thought process, she also stated that she “felt like we didn’t have enough time and [the test] could’ve been more precise in what they were asking for.”

While the administration has yet to settle on a reliable mechanism for measuring a student’s ability to utilize design thinking, it seems as though we can expect this test in future intersessions. While opinions about the test remain overwhelmingly negative, one question prevails among students and staff: is there truly one, solid way to test design thinking?
