UXR Case Study — Steiger Dynamics
INTRODUCTION
The computer. The new era. Since the introduction of the personal desktop computer, people's lives have never been the same. And with the recent proliferation of high-definition video, it was only a matter of time until the two were paired.
And so the Home Theater PC (HTPC) was born. The concept is not new, but its commercialization has gained traction among boutique PC builders. It remains a niche in the PC market, with only a few builders who can call themselves true HTPC specialists.
One of these true HTPC specialists is Steiger Dynamics (SD). SD is a Bay Area PC builder specializing in HTPCs and offering a variety of products. SD is the focus of my user experience case study, as it fits the self-imposed criteria for user research analysis. The company was founded in 2012, and the last major website revamp was completed in 2018, as indicated by the copyright in the site's footer.
STUDY GOALS
As this is a Phase 1 (Discover) research initiative, the research methods were crafted to gain insight into SD's current user experience, to identify major pain points, and to provide relevant insight to guide UX improvements. There's no need to address every issue and answer every question concerning the website's usability in this initiative.
The primary goal is to understand the user: what they think and feel while using the SD website. Every subsequent question echoes this research mindset. We're out to improve the site as a whole, leaving specific redesigns for the next iteration.
Keeping in line with the research goal, I chose three methods to effectively identify broad usability issues and to capture the user’s journey in equally broad strokes.
The following methods were used:
- Heuristic Evaluation
- Remote Unmoderated Usability Survey
- Competitive Analysis
Each method is cost-effective and provides a fundamental understanding of the user's journey through the SD website. My choice of research methods was quite limited, not only by the pandemic but also by the lack of third-party tools, since I'm not affiliated with a research company and have no access to them.
This limitation, however, did not impact the effectiveness of the heuristic evaluation or the usability survey as much as it did the competitive analysis, specifically its quantitative elements.
Nonetheless, each method yielded enough information to synthesize and validate several key insights, which will be discussed later.
HEURISTIC EVALUATION
Crafting the Study
The first method, a heuristic evaluation (HE), assessed the current design of Steiger Dynamics.
There are several reasons behind the choice of a heuristic evaluation:
- The number of available participants. I know five evaluators who are all technically oriented and have bought pre-built desktop systems from similar services, though none within the last six months.
- Cost efficiency. All five evaluators agreed to evaluate SD to help me build this case study. The cost of participation was a coffee (ordered remotely, of course).
- A controlled review of the experience. All evaluators were given tasks to complete before providing their evaluation.
I used Jakob Nielsen's 10 Usability Heuristics for User Interface Design as the foundation for the HE, as seen below:
All five evaluators completed the HE sheet.
For each heuristic, the evaluator assigned a severity rating from 0 (not a usability problem) to 4 (usability catastrophe). Once the severity was established, the evaluator provided detailed commentary on the issues to reinforce their rating. And to prevent confusion, a clear definition of each heuristic was added to the sheet.
Only the severity of design violations and the issues themselves were measured; recommendations were added at the end of the study. Once all of the HE sheets were filled in, I aggregated the results and created a color-coded chart to show where the majority of issues lie.
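For reference, here's a minimal Python sketch of how that aggregation step could look, assuming each completed sheet is reduced to a row of severity scores. The heuristic names are Nielsen's ten; the severity values are placeholders for illustration only, not the study's actual results.

```python
# Sketch: aggregate completed HE sheets into a color-coded severity chart.
# The severity values below are illustrative placeholders, NOT the study's data.
import matplotlib.pyplot as plt
import numpy as np

heuristics = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize and recover from errors",
    "Help and documentation",
]

# One row per evaluator, one column per heuristic (placeholder severities 0-4).
sheets = np.array([
    [1, 0, 2, 2, 1, 1, 2, 4, 1, 3],
    [0, 1, 2, 3, 1, 0, 1, 3, 2, 4],
    [1, 1, 3, 2, 0, 1, 2, 4, 1, 3],
    [0, 0, 2, 2, 1, 1, 1, 3, 1, 4],
    [1, 1, 2, 3, 1, 0, 2, 4, 2, 3],
])

fig, ax = plt.subplots(figsize=(10, 4))
im = ax.imshow(sheets, cmap="YlOrRd", vmin=0, vmax=4, aspect="auto")
ax.set_xticks(range(len(heuristics)))
ax.set_xticklabels(heuristics, rotation=45, ha="right", fontsize=7)
ax.set_yticks(range(len(sheets)))
ax.set_yticklabels([f"Evaluator {i + 1}" for i in range(len(sheets))])
fig.colorbar(im, ax=ax, label="Severity (0 = none, 4 = catastrophe)")
fig.tight_layout()
plt.show()

# Mean severity per heuristic shows where the majority of issues cluster.
for name, mean in zip(heuristics, sheets.mean(axis=0)):
    print(f"{name}: {mean:.1f}")
```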
The Findings
The primary concerns surrounded the actual UI design and the site's help and documentation. A minor concern came down to user control: users could work their way into a "tight spot" from which a "hard return" was required to restore page functionality.
As the evaluators do not fully represent the true user, the heuristic evaluation results were simply folded into the main usability survey I crafted for participants. The HE results served as tailored benchmarks to assist in analyzing the results of the remote unmoderated survey further down the line.
Lastly, the evaluators could have provided more insight had the evaluation been done in person, where I could speak with each evaluator as they worked through the SD experience.
REMOTE USABILITY SURVEY — GOOGLE FORMS
Crafting the Study
The bulk of the information came from the second method: the usability survey created through Google Forms. The survey was completed by six participants.
Each participant was chosen based on their interest in desktop PCs and their approximate demographic fit. Some participants were found on Reddit; others came from other forums and Discord servers.
The survey was set up as one long sheet rather than broken up into sections. I felt this increased the transparency of the study and gave participants a clear indication of how much time they'd have to commit.
Collecting all participant data took just over a week. In the interest of obtaining results from the correct demographic, I took the time to find appropriate participants, which extended the survey deadline.
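As a side note on tooling: Google Forms can export all responses as a CSV, which makes light summarization straightforward. Below is a minimal sketch; the column names ("Navigation difficulty (1-5)", "Overall impression") are hypothetical stand-ins, not the survey's actual question titles.

```python
# Sketch: summarize a Google Forms CSV export with pandas.
# The column names below are hypothetical stand-ins for the survey's
# actual question titles.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Central tendency for a Likert-style rating question.
nav = responses["Navigation difficulty (1-5)"]
print(f"Navigation difficulty: median={nav.median()}, mean={nav.mean():.1f}")

# Rough word frequencies for a free-text question, to surface
# recurring terms such as "redesign" or "dark".
words = (
    responses["Overall impression"]
    .str.lower()
    .str.split()
    .explode()
    .value_counts()
)
print(words.head(10))
```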
The Study
I’m including a link to the survey here. And below are key questions from the study with their respective results.
The User — Results
Although the survey was unmoderated, I found that the questions posed allowed me to create a viable user persona. From the six participants, the following details were synthesized:
Usability Task #1
For the first usability task, I was looking for a much more generalized response from the participants, focusing on their gut feeling rather than a detailed play-by-play. Their responses centered on the design and the ease of use of the website. Here's a summary of the results:
- Design is outdated; too dark
- Simplistic navigation
- Redesign is needed
One participant said, “It’s pretty dark and looks a little old. Like web 1.0. It’s a little off putting.” Another said, “I mean, when was this made? It looks kinda old. I think a redesign is definitely needed here. The black theme is so 2001.”
The navigation did not take a big hit, as no participant rated it harder than a 3:
The same was echoed in the answers regarding design and layout: more mentions of "redesign needed" and that the "black theme isn't working." There were also mentions of inconsistency between elements across multiple pages: buttons disappearing, information misaligned, and so on. One participant simply wrote "standard" in regard to the overall experience.
Usability Task #2
This was a more in-depth task: each participant was asked to build an HTPC system from the products page. It is unclear whether every participant completed their build, as some indicated otherwise. Still, even partially built systems yielded notable details. Results summary below:
On a side note, the biggest non-UX insight is the price:
- All participants complained about the high system price, with over half writing that they'd rather build their own systems in an HTPC case.
- Even with a reasonable price, two participants mentioned that they’d rather build their own system regardless of cost.
The Findings
The results were actually not far from what was expected. The majority of participants called for a redesign and felt SD's overall color scheme and layout were certifiably outdated. Only one participant actually owned an HTPC, but he had built it himself, and it was a self-proclaimed HTPC at that.
A majority of the participants had previous technical knowledge of consumer hardware, and this led to the same concern surrounding price.
Five of the six participants (roughly 83%) stated that they would rather build the computer in an HTPC case than buy it pre-built. (This was also echoed in the competitive analysis below.)
Question on Desirability
To jump-start the collection of design-oriented data, I included a desirability question on color. Each color palette option represents the actual palette of a popular website for building and/or purchasing desktop PCs. Here are the options:
Desirability Findings
The results on color preference matched my initial hypothesis: participants chose a lighter color palette with a high level of contrast (complementary colors). PCPartPicker garnered the most votes with four, while the CyberPowerPC and Newegg palettes tied for second with one vote each.
There's a clear indication that users prefer a lighter, complementary color scheme, and very much prefer a scheme similar to that of PCPartPicker or Newegg.
Navigation is easy for simple tasks, but as tasks mount up, specifically when customizing a system, cognitive load increases sharply as the user searches for relevant components or looks for help on a particular issue.
A review of the information architecture should be completed before the redesign, especially as it relates to the help documentation and the organization of customization options.
COMPETITIVE ANALYSIS
The final research method used was a competitive analysis of a direct competitor. Given how narrow the commercial HTPC niche is, I found only one competitor similar on many levels: Assassin PC (APC). APC's website, especially the landing page, is similar to SD's.
And below is the complete competitive analysis, covering both quantitative and qualitative measures and not limited to usability design.
Assassin PC (APC) vs. Steiger Dynamics (SD)
Tertiary Analysis
The Findings
Against its direct competitor, SD holds up in navigation, information architecture, and pricing. However, its UI and overall presentation are lacking. Both APC and SD need a more robust help system, although Assassin PC has a much more intuitive customization tool.
Against general competitors such as Falcon Northwest and CyberPowerPC, both APC and SD fall far below expectations. Pricing is still high across the board among these competitors, but their approach reflects a more modern understanding of user experience.
INSIGHTS
Insight #1
A redesign is highly desirable. This has been emphasized by both the heuristic evaluators and the usability participants. Effort should be dedicated to redesigning SD's information foundation and to initiating a complete redesign of the UI.
The color theme should match the Steiger Dynamics company profile while maximizing the use of dynamic complementary colors to ensure a comfortable and friendly user experience.
Insight #2
A better system for managing customization components is needed. 90% of potential buyers of high-end desktop systems will customize a substantial portion of their system.
Designing a visually clear, collapsible component system with a focus on popular options for each HTPC system would surely increase engagement. Moreover, some criticism from both evaluators and participants was directed at the number of products offered, so consider streamlining the product line to three key products with comprehensive customization of features.
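To make the idea concrete, here's one possible data shape for such a collapsible customization tree, sketched in Python. Everything here is hypothetical: the Option and ComponentGroup types, the price_delta field, and the sample components are illustrative only, not SD's product data.

```python
# Sketch: one possible data shape for a collapsible customization tree.
# Types, fields, and sample components are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Option:
    name: str
    price_delta: float  # price change relative to the base configuration
    popular: bool = False

@dataclass
class ComponentGroup:
    name: str
    collapsed: bool = True  # collapsed by default to reduce visual load
    options: list[Option] = field(default_factory=list)

    def featured(self) -> list[Option]:
        """Popular options surfaced even while the group is collapsed."""
        return [o for o in self.options if o.popular]

gpu = ComponentGroup("Graphics Card", options=[
    Option("Mid-range GPU", 0.0, popular=True),
    Option("High-end GPU", 400.0),
])
print([o.name for o in gpu.featured()])  # -> ['Mid-range GPU']
```

Keeping groups collapsed by default while surfacing a few popular picks addresses the cognitive-load issue raised in the survey findings.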
Insight #3
As an HTPC is a sizable investment, potential buyers visiting the website should be provided with a simple yet robust help system beyond a comprehensive FAQ.
Implementing a direct chat system with a responsive design will help establish rapport with potential buyers. Including a chatbot may help boost user confidence, but it should be the focus of future redesign iterations. Adding in-window hints while browsing and customizing systems could nip particular issues in the bud without requiring the user to visit the dedicated help page.
General Insight
Non-UX problems can cause steep drops in the experience and in expectations of the service and its products. Too many choices without clear differences between them, combined with a product cost that is high for a large majority of users, will bounce potential buyers off the site, particularly because most potential buyers have the knowledge and means to build their own system at two-thirds of the total cost.
Consider a wider research effort involving product, marketing and business teams to look into revising the business and marketing strategies.
PHASE 2 (DESIGN) RECOMMENDATIONS
Once a rough redesign has been completed, 2–3 usability tests should be designed and run to flesh out the design along with any other elements pertaining to the user experience. Steer away from quantitative methods unless absolutely necessary; if information design still poses a problem, consider card sorting coupled with heat maps within the usability tests.
By the completion of the third usability test, all UX questions should be addressed and work on the final design should commence.
PHASE 3 (DELIVER) RECOMMENDATIONS
With the final design delivered and released into the wild, consider running and tracking the usual metrics via a robust platform such as Google Analytics. Once all of the relevant information, such as audience engagement, traffic sources, site search behavior, and page bounce rates, has been collected, promptly compare each element with the pre-redesign results. These quantitative results should be sufficient to assess the quality of the redesign and to lead the next iteration of the website.
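As a rough illustration of that comparison step, here's a minimal sketch that diffs pre- and post-redesign metrics. The metric names and values are hypothetical placeholders, not real analytics data.

```python
# Sketch: compare pre- vs. post-redesign metrics side by side.
# All metric names and values are hypothetical placeholders, not real data.
pre = {"bounce_rate": 0.62, "avg_session_min": 2.1, "pages_per_session": 3.4}
post = {"bounce_rate": 0.48, "avg_session_min": 3.0, "pages_per_session": 4.1}

for metric in pre:
    delta = post[metric] - pre[metric]
    # Lower is better for bounce rate; higher is better for the rest.
    improved = delta < 0 if metric == "bounce_rate" else delta > 0
    status = "improved" if improved else "regressed"
    print(f"{metric}: {pre[metric]} -> {post[metric]} ({status}, {delta:+.2f})")
```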
LAST REMARKS
Please let me know your thoughts on where I can improve in my methodology, report, and presentation. I'm eager to learn as much as I can on my UXR journey. Thanks for reading this far. You get a cookie from the Bespin of my imagination.