Background
As part of one of my core courses at Georgia Tech, Psychological Research Methods for HCI, I completed a semester-long research and design project. Because the project took place during the COVID-19 pandemic, all research and design work was conducted entirely remotely. From a list of topics selected by our professor, we chose to "Help people create more butterfly, hummingbird, and bee-friendly gardens."
What I Did:
UX Researcher + UX Designer
Tools Used:
Qualtrics
Miro
Adobe XD
Axure RP
BlueJeans
Team:
Ellen Guo
Timeline:
August 2020 – December 2020
Phase I: Problem Space
Information about pollinators and pollinator-friendly spaces in urban areas is currently difficult for users to assimilate. Urban users need an enjoyable and effective way to learn what pollinator-friendly spaces are, why they matter, and how to create them.
Phase II: Identifying User Needs
Expert Interview
To better understand ongoing research in this space at Georgia Tech, we conducted a remote, informal interview with Joey Bishop from the BeekeeperGo team. We were particularly interested in their decision to develop a game to encourage citizen science data collection. We learned that the team chose to make a game in the hopes of creating a supportive community and increasing the app's popularity; however, the team had not conducted any user research.
Survey
With our initial survey, our team aimed to gather a broad overview of our user group’s attitudes towards pollinators and pollinator-friendly spaces. We also used the survey to snowball recruit users who were interested in participating in an interview.
We utilized multiple-choice, Likert scale, and open-ended question types. Our survey was hosted on Qualtrics and was distributed via various online messaging platforms, including Slack and GroupMe.
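As an illustration of how responses to a 5-point Likert item can be summarized for visualization, here is a minimal sketch; the response values below are hypothetical, not our actual survey data:

```python
from collections import Counter

# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly agree)
responses = [4, 5, 3, 4, 5, 2, 4, 4]

distribution = Counter(responses)            # count of responses per scale point
mean_score = sum(responses) / len(responses)

print(dict(sorted(distribution.items())))    # {2: 1, 3: 1, 4: 4, 5: 2}
print(round(mean_score, 2))                  # 3.88
```

The per-point distribution (rather than the mean alone) is what typically drives a stacked-bar visualization of Likert results.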

Likert scale question and results (visualization by Kelsie Thomas)

Semi-Structured Interviews
We interviewed 3 potential users to gain richer insights into how users like to learn. These 3 informants were recruited via our survey. We used the affinity mapping framework to thematically analyze our qualitative data. From our analysis, we found three primary takeaways:
Users want us to streamline the information-finding process by synthesizing data from reputable sources with peer-produced knowledge.
Users want the learning process to feel natural and easy.
Users want more information on how to make a pollinator-friendly space.
Phase III: Iterative Design
Sketches
After conducting our initial research, the next step was to create initial designs to determine what sort of intervention users might like. We developed two concepts: Pollinvite, a gamified progress-tracking app with community challenges that have real-world impact, and BlossomBase, a powerful plant database that uses augmented reality (AR) and computer vision (CV) to help users learn about plant species.
We created a set of sketches and presented them to 3 users for high-level feedback. 
Sketch by me
Sketch by me
Sketch by Kelsie Thomas
Sketch by Kelsie Thomas
Wireframes
Wireframes by me and Tim Trent
Phase IV: Evaluation and Validation
This feedback brought us to our final design concept: Pollinvite. The system provides tasks and information related to pollinator-friendly spaces, sorted into broad categories. Pollinvite tracks users' progress and uses achievements and progress bars to encourage participation; elements like achievements were user favorites from our gamified concept.
Pollinvite also includes broader "community challenges" to add social buy-in. Users can see whether their friends have joined a challenge, but challenges are not framed as direct competition, since our feedback sessions indicated that users did not want competitive pressure.
Remote Moderated User Testing
Once we completed our design, we moderated task-based usability tests remotely with 4 target users. We asked users to "think aloud" while completing a few representative tasks in our system, such as “learn how to plant a native wildflower” or “join a community challenge.”
After users completed these tasks, they filled out a System Usability Scale (SUS) survey. The aggregate results, visualized below, were compared against well-established, validated industry benchmarks to assess how well our design met usability and learnability requirements. Our score of 76.875 sits well above the commonly cited SUS benchmark average of 68 and falls in the "good" range.
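For reference, the standard SUS scoring formula works as follows: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch, using made-up responses rather than our participants' actual answers:

```python
def sus_score(item_responses):
    """Compute one respondent's SUS score from 10 item responses (each 1-5)."""
    assert len(item_responses) == 10
    total = 0
    for i, r in enumerate(item_responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical respondent: agrees with positive items, disagrees with negative ones.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

A study's reported SUS score, like our 76.875, is typically the mean of these per-respondent scores.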
We also conducted debrief interviews after users finished the SUS survey to get more qualitative feedback on our system.
