CLIO Evaluation
Latest revision as of 12:06, 12 May 2022
Our evaluation process often happened concurrently with our design and research process, allowing us to make tweaks and changes as we continued to learn more about how we could use the project. During development, evaluation was performed in the form of user testing as well as concept evaluation. We performed formal evaluation, which included user surveys and testing, along with app critique sessions and web seminars aimed at addressing concerns about open-source technologies.
==Formal Evaluation==
===Burke Museum===
While working to integrate open-source technologies into an existing Burke educational program, we performed concept evaluation of the strengths and weaknesses of technology exhibits currently used in the museum field, and we used this information to shape our own technology exhibit concepts. Internal concept evaluation of each project proposal we created was based heavily on the MUSETECH model. As the project took shape as a kiosk with exhibit software, we performed internal user interface and interaction testing with a sample of Education staff, facilitators, and volunteers. During development, an online virtual kiosk was deployed so the team could evaluate the current exhibit software prototype, including the educational content drafts, between our monthly project meetings. Due to the COVID-19 pandemic, social distancing restrictions made it impossible to complete visitor evaluation according to our original plans.
===Slater Museum===
For the Slater Museum of Natural History, we would not be focusing on creating content for a physical kiosk as we had in the past: due to the COVID-19 pandemic, social distancing made it impossible to safely facilitate live sessions. As a team, we performed initial concept evaluation around the museum's existing education programs and how we could bring them online so they were available to remote students. We settled on a hybrid approach, using a combination of open-source technologies to help remote lessons and activities feel more personal and interactive. We evaluated using Open Broadcaster Software[1] to display interactive CLIO activities through video conference software such as Zoom, Google Meet, or Microsoft Teams, as well as how to give students remote access to the same activities as the educators.
We ran two pilot test programs across more than twenty upper elementary classrooms, each with between five and thirty students. Teachers were given an online experience survey where they could express concerns or approval about the pilot programs. Students were also given a qualitative pre- and post-survey designed to evaluate a change in reasoning skills rather than a change in numerical score.
{{See|Slater Museum Evaluation}}
===Evergreen State College===
This project was a collaborative effort to create exhibition content that engaged the expertise and educational resources of multiple project partners. We worked with the Evergreen Natural History Museum, the Daniel J. Evans Library, the House of Welcome, and the Evergreen Gallery to provide physical items, digital media, and text-based interpretation for these exhibits. Activity content was drafted by students and volunteers using word-processing templates, then evaluated as a group before being coded into a final activity prototype. We used an evaluative instrument to run preliminary user testing of the kiosk software, hardware, and interpretive content, including an in-person survey for faculty and students with questions on the hardware, software, user interface, user experience, and content. Additionally, we created a completely remote evaluative survey option for activity content that allowed us to involve the wider community.
{{See|Evergreen State College Evaluation}}
==Critique==
===MuseWeb 2020===
We attended the MuseWeb 2020 conference, which was held virtually as a result of the COVID-19 pandemic. We took part in an app critique session in which leaders in the museum field evaluated and critiqued technology-based projects. We also wrote the paper "Designing CLIO, An Open-Source Toolkit for Museum Pop-Up Digital Interactives"[2], which was peer reviewed prior to publication.
===Lunch and Learn Webinar===
To inform the museum community about the CLIO toolkit and gather feedback on interest in, and barriers to, deploying technology like CLIO, two online informational and discussion sessions were held with 25 museum professionals on May 13 and 14, 2020. We have provided transcripts of the discussion portions of both sessions.
==References==
<references />
{{DevelopmentNavigation}}